API: Detect whether required fields nested within optionals can produce nulls (#14270)

* API: Detect whether required fields nested within optionals can produce nulls

This partially reverts some of the changes around the `Accessor` API that were introduced by https://github.com/apache/iceberg/pull/13804 and instead uses a Schema visitor
to detect whether any parent field of a nested required field is optional.
This information is then used when IS_NULL / NOT_NULL predicates are evaluated.

* only check parent fields on IS_NULL/NOT_NULL
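
Below is a minimal sketch, not taken from the change itself, of the situation the commit addresses. It assumes only the public iceberg-api types (Schema, Types, Expressions); the field names are illustrative. Because the parent struct "loc" is optional, its required child "loc.lat" can still produce nulls, which is why IS_NULL / NOT_NULL evaluation must account for optional ancestors.

import org.apache.iceberg.Schema;
import org.apache.iceberg.expressions.Expression;
import org.apache.iceberg.expressions.Expressions;
import org.apache.iceberg.types.Types;

public class NestedRequiredSketch {
  public static void main(String[] args) {
    // "lat" and "lon" are required, but their parent struct "loc" is optional.
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "loc", Types.StructType.of(
            Types.NestedField.required(3, "lat", Types.DoubleType.get()),
            Types.NestedField.required(4, "lon", Types.DoubleType.get()))));

    // This predicate can match rows where "loc" itself is null, even though
    // "lat" is declared required inside the struct, so it must not be
    // simplified away based on the field's own required flag alone.
    Expression isNull = Expressions.isNull("loc.lat");
    System.out.println(schema.asStruct() + " -> " + isNull);
  }
}
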
README.md

Iceberg

Iceberg is a high-performance format for huge analytic tables. Iceberg brings the reliability and simplicity of SQL tables to big data, while making it possible for engines like Spark, Trino, Flink, Presto, Hive and Impala to safely work with the same tables, at the same time.

Background and documentation are available at https://iceberg.apache.org

Status

Iceberg is under active development at the Apache Software Foundation.

The Iceberg format specification is stable and new features are added with each version.

The core Java library is located in this repository and is the reference implementation for other libraries.

Documentation is available for all libraries and integrations.

Collaboration

Iceberg tracks issues in GitHub and prefers to receive contributions as pull requests.

Community discussions happen primarily on the dev mailing list or on specific issues.

Building

Iceberg is built using Gradle with Java 11, 17, or 21.

  • To invoke a build and run tests: ./gradlew build
  • To skip tests: ./gradlew build -x test -x integrationTest
  • To fix code style for default versions: ./gradlew spotlessApply
  • To fix code style for all versions of Spark/Hive/Flink: ./gradlew spotlessApply -DallModules

Iceberg table support is organized in library modules:

  • iceberg-common contains utility classes used in other modules
  • iceberg-api contains the public Iceberg API
  • iceberg-core contains implementations of the Iceberg API and support for Avro data files; this is what processing engines should depend on
  • iceberg-parquet is an optional module for working with tables backed by Parquet files
  • iceberg-arrow is an optional module for reading Parquet into Arrow memory
  • iceberg-orc is an optional module for working with tables backed by ORC files
  • iceberg-hive-metastore is an implementation of Iceberg tables backed by the Hive metastore Thrift client
  • iceberg-data is an optional module for working with tables directly from JVM applications
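
As a rough illustration of how these modules fit together, here is a minimal sketch that assumes only iceberg-api and iceberg-core (plus Hadoop) on the classpath; the warehouse path and field names are made up.

import org.apache.hadoop.conf.Configuration;
import org.apache.iceberg.PartitionSpec;
import org.apache.iceberg.Schema;
import org.apache.iceberg.Table;
import org.apache.iceberg.hadoop.HadoopTables;
import org.apache.iceberg.types.Types;

public class CreateTableSketch {
  public static void main(String[] args) {
    // Schema and Types come from iceberg-api.
    Schema schema = new Schema(
        Types.NestedField.required(1, "id", Types.LongType.get()),
        Types.NestedField.optional(2, "data", Types.StringType.get()));

    // HadoopTables is an iceberg-core implementation of the API; writers for
    // Parquet or ORC data files live in the optional iceberg-parquet and
    // iceberg-orc modules.
    HadoopTables tables = new HadoopTables(new Configuration());
    Table table = tables.create(schema, PartitionSpec.unpartitioned(), "file:///tmp/warehouse/demo");
    System.out.println("Created table at " + table.location());
  }
}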

Iceberg also has modules for adding Iceberg support to processing engines:

  • iceberg-spark is an implementation of Spark's Datasource V2 API for Iceberg with submodules for each Spark version (use runtime jars for a shaded version)
  • iceberg-flink contains classes for integrating with Apache Flink (use iceberg-flink-runtime for a shaded version)
  • iceberg-mr contains an InputFormat and other classes for integrating with Apache Hive

NOTE

The tests require Docker to execute. On macOS (with Docker Desktop), you might need to create a symbolic link to the Docker socket so that the tests can detect it:

sudo ln -s $HOME/.docker/run/docker.sock /var/run/docker.sock

In some cases the test container may exit with an initialization error caused by an illegal state exception in GenericContainer. One workaround for this problem is to set SELinux to permissive mode before running the tests.

sudo setenforce Permissive
./gradlew ...
sudo setenforce Enforcing

Engine Compatibility

See the Multi-Engine Support page to learn about Iceberg compatibility with different Spark, Flink, and Hive versions. For other engines such as Presto or Trino, please visit their websites for Iceberg integration details.

Implementations

This repository contains the Java implementation of Iceberg. Other implementations can be found at: