[SPARK-46209] Add java 11 only yml for version before 3.5

### What changes were proposed in this pull request?
Add a Java 11-only publish workflow for versions before 3.5.0.
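As a rough illustration of what such a workflow could look like, here is a minimal sketch of a Java 11-only GitHub Actions publish workflow. This is an assumption for illustration only: the file name, input names, and the reusable-workflow path are hypothetical and are not the actual contents of the repository's `publish-java11.yml`.

```yaml
# Hypothetical sketch of a Java 11-only publish workflow.
# All names (inputs, the called workflow path) are illustrative,
# not the real .github/workflows/publish-java11.yml.
name: Publish (Java 11 only)

on:
  workflow_dispatch:
    inputs:
      spark_version:
        description: 'Spark version to publish (expected to be < 3.5.0)'
        required: true

jobs:
  publish:
    strategy:
      matrix:
        # Versions before 3.5.0 ship only Java 11 Dockerfiles,
        # so Java 17 is deliberately excluded from the matrix.
        java: [11]
    uses: ./.github/workflows/main.yml
    with:
      spark: ${{ inputs.spark_version }}
      java: ${{ matrix.java }}
```

The key point is the restricted `java` matrix: by never requesting Java 17, the workflow avoids looking for Dockerfiles that do not exist in the pre-3.5.0 version directories.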

### Why are the changes needed?
Otherwise, publishing will fail, because no Java 17 Dockerfiles exist for versions before 3.5.0.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Tested on my repo: https://github.com/Yikun/spark-docker/actions/workflows/publish-java11.yml

Closes #58 from Yikun/java11-publish.

Authored-by: Yikun Jiang <yikunkero@gmail.com>
Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
2 files changed
tree: 480416d7ea6cdfac3aa49d18264cee1ba7afe91a
  .github/
  3.3.0/
  3.3.1/
  3.3.2/
  3.3.3/
  3.4.0/
  3.4.1/
  3.4.2/
  3.5.0/
  testing/
  tools/
  .asf.yaml
  add-dockerfiles.sh
  awesome-spark-docker.md
  CONTRIBUTING.md
  Dockerfile.template
  entrypoint.sh.template
  LICENSE
  merge_spark_docker_pr.py
  NOTICE
  OVERVIEW.md
  r-python.template
  README.md
  versions.json
README.md

Apache Spark Official Dockerfiles

What is Apache Spark?

Spark is a unified analytics engine for large-scale data processing. It provides high-level APIs in Scala, Java, Python, and R, and an optimized engine that supports general computation graphs for data analysis. It also supports a rich set of higher-level tools including Spark SQL for SQL and DataFrames, pandas API on Spark for pandas workloads, MLlib for machine learning, GraphX for graph processing, and Structured Streaming for stream processing.

https://spark.apache.org/

About this repository

This repository contains the Dockerfiles used to build the Apache Spark Docker images.

See more in SPARK-40513: SPIP: Support Docker Official Image for Spark.