Add support for Java 17 from Spark 3.5.0

### What changes were proposed in this pull request?
1. Create Java 17 base images alongside the existing Java 11 images, starting from Spark 3.5.0 (see the sketch after this list).
2. Change the Ubuntu version to 22.04 for the `scala2.12-java17-*` images.
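
As a rough illustration only, a minimal sketch of what a Java 17 base image on Ubuntu 22.04 could look like. The base image, package list, and `JAVA_HOME` path below are assumptions made for the sketch; the actual images are generated from `Dockerfile.template`, which remains the source of truth.

```dockerfile
# Illustrative sketch of a Java 17 base image on Ubuntu 22.04.
# Names and packages are assumptions, not the repository's actual template.
FROM ubuntu:22.04

ARG spark_uid=185

# Install a Java 17 runtime plus the utilities Spark needs at runtime.
RUN set -ex && \
    apt-get update && \
    apt-get install -y --no-install-recommends \
        openjdk-17-jre-headless \
        bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools && \
    rm -rf /var/lib/apt/lists/*

ENV JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64

# The Spark distribution, entrypoint script, and non-root ${spark_uid} user
# would be added here, mirroring the existing Java 11 images.
```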

### Why are the changes needed?

Spark supports multiple Java versions, but the images are currently built only with Java 11.

### Does this PR introduce _any_ user-facing change?

Yes. New Java 17-based images will be available in the image repositories alongside the existing Java 11 images.
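
For example, the new images could be pulled next to the existing ones. The tag names below are assumptions based on the `scala2.12-java17-*` naming used in this change, not a list of the published tags:

```bash
# Existing Java 11-based image
docker pull apache/spark:3.5.0-scala2.12-java11-python3-ubuntu

# New Java 17-based image added by this change (illustrative tag)
docker pull apache/spark:3.5.0-scala2.12-java17-python3-ubuntu
```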

### How was this patch tested?

Closes #56 from vakarisbk/master.

Authored-by: vakarisbk <vakaris.bashkirov@gmail.com>
Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
14 files changed
tree: caf2d3932bf2272b5b4b7f2235988b62070f5ca2
  - .github/
  - 3.3.0/
  - 3.3.1/
  - 3.3.2/
  - 3.3.3/
  - 3.4.0/
  - 3.4.1/
  - 3.5.0/
  - testing/
  - tools/
  - .asf.yaml
  - add-dockerfiles.sh
  - awesome-spark-docker.md
  - CONTRIBUTING.md
  - Dockerfile.template
  - entrypoint.sh.template
  - LICENSE
  - merge_spark_docker_pr.py
  - NOTICE
  - OVERVIEW.md
  - r-python.template
  - README.md
  - versions.json
README.md

# Apache Spark Official Dockerfiles

## What is Apache Spark?

Spark is a unified analytics engine for large-scale data processing. It provides high-level APIs in Scala, Java, Python, and R, and an optimized engine that supports general computation graphs for data analysis. It also supports a rich set of higher-level tools including Spark SQL for SQL and DataFrames, pandas API on Spark for pandas workloads, MLlib for machine learning, GraphX for graph processing, and Structured Streaming for stream processing.

https://spark.apache.org/

## About this repository

This repository contains the Dockerfiles used to build the Apache Spark Docker Image.

See more in SPARK-40513: SPIP: Support Docker Official Image for Spark.
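
As a rough illustration (the directory name below is an assumption based on the per-version folders and the `scala2.12-java17-*` naming above; check the `3.5.0/` directory for the actual layout), an image can be built locally from one of these Dockerfiles with a plain `docker build`:

```bash
# Build a local Spark 3.5.0 / Java 17 image from this repository.
# The subdirectory name is illustrative; use the actual one under 3.5.0/.
docker build -t my-spark:3.5.0-java17 3.5.0/scala2.12-java17-ubuntu/
```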