
# Flink Table Store

Flink Table Store is a unified streaming and batch store for building dynamic tables on Apache Flink.

Flink Table Store is developed under the umbrella of Apache Flink.

## Documentation & Getting Started

Please check out the full documentation, hosted by the ASF, for detailed information and user guides.

Check our quick-start guide for simple setup instructions to get you started with the table store.
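For a flavor of what the quick-start covers, here is a minimal sketch of registering a table store catalog and writing to a dynamic table from Flink's Table API. It assumes the distribution JAR (see Building below) is on the Flink classpath; the catalog options (`'type' = 'table-store'`, `'warehouse' = ...`) and the `word_count` table are illustrative examples taken from the documented quick-start, not from this README.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class TableStoreQuickStart {
    public static void main(String[] args) throws Exception {
        // Streaming mode; the same tables can also be queried in batch mode.
        TableEnvironment tEnv =
                TableEnvironment.create(
                        EnvironmentSettings.newInstance().inStreamingMode().build());

        // Register a table store catalog backed by a local warehouse directory
        // (catalog options assumed from the quick-start documentation).
        tEnv.executeSql(
                "CREATE CATALOG my_catalog WITH ("
                        + " 'type' = 'table-store',"
                        + " 'warehouse' = 'file:/tmp/table_store')");
        tEnv.executeSql("USE CATALOG my_catalog");

        // Create a dynamic table and write a couple of rows into it.
        tEnv.executeSql(
                "CREATE TABLE IF NOT EXISTS word_count ("
                        + " word STRING,"
                        + " cnt BIGINT,"
                        + " PRIMARY KEY (word) NOT ENFORCED)");
        tEnv.executeSql("INSERT INTO word_count VALUES ('hello', 1), ('world', 2)")
                .await();
    }
}
```

A streaming or batch query against the same table is then an ordinary `SELECT * FROM word_count`.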

## Building

To build the project, run `mvn clean package -DskipTests`.

Then you will find a JAR file with all shaded dependencies at `flink-table-store-dist/target/flink-table-store-dist-<version>.jar`.

## Contributing

You can learn more about how to contribute on the Apache Flink website. For code contributions, please read the Contributing Code section carefully for an overview of ongoing community work.

## License

The code in this repository is licensed under the Apache Software License, Version 2.0.