| commit | 7b3dc1c77ecaf1309127bdb7d494ef81915bec34 | [log] [tgz] |
|---|---|---|
| author | Dongjoon Hyun <dongjoon@apache.org> | Sat Aug 17 16:14:35 2024 -0700 |
| committer | Dongjoon Hyun <dongjoon@apache.org> | Sat Aug 17 16:14:35 2024 -0700 |
| tree | 944bad66878932e825ddf8c5e5cafc1879127e4f | |
| parent | 83be95338c26bb7114325fdc785fbc87af066a8a [diff] |
[SPARK-49276] Use API Group `spark.apache.org`

### What changes were proposed in this pull request?

This PR aims to use API Group `spark.apache.org`.

```
-apiVersion: org.apache.spark/...
+apiVersion: spark.apache.org/...
```

### Why are the changes needed?

K8s convention follows domain name styles instead of package name styles.

- https://kubernetes.io/docs/tasks/extend-kubernetes/custom-resources/custom-resource-definitions/#create-a-customresourcedefinition

```
apiVersion: apiextensions.k8s.io/v1
```

```
# group name to use for REST API: /apis/<group>/<version>
group: stable.example.com
```

### Does this PR introduce _any_ user-facing change?

No, this is not released yet.

### How was this patch tested?

Pass the CIs. In addition, we can check via the K8s API Server.

**BEFORE**
```
$ kubectl proxy --port=8080 &
$ curl -s http://localhost:8080/ | grep spark
    "/apis/org.apache.spark",
    "/apis/org.apache.spark/v1alpha1",
```

**AFTER**
```
$ kubectl proxy --port=8080 &
$ curl -s http://localhost:8080/ | grep spark
    "/apis/spark.apache.org",
    "/apis/spark.apache.org/v1alpha1",
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #55 from dongjoon-hyun/SPARK-49276.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
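As a small illustration of the convention this commit adopts, the K8s REST discovery path is derived from the API group and version as `/apis/<group>/<version>`. The sketch below (the helper script itself is not part of the repository) builds that path for the new group:

```shell
#!/bin/sh
# Sketch: construct the K8s REST discovery path /apis/<group>/<version>
# for the API group introduced by this commit. Values come from the
# commit message above; the script itself is illustrative only.
group="spark.apache.org"
version="v1alpha1"
echo "/apis/${group}/${version}"
# → /apis/spark.apache.org/v1alpha1
```

This is the same path that appears in the `curl` verification shown in the "AFTER" block above.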
Apache Spark K8s Operator is a subproject of Apache Spark that aims to extend the Kubernetes resource manager to manage Apache Spark applications via the Operator pattern.
Spark K8s Operator is built using Gradle. To build while skipping tests:

```
$ ./gradlew build -x test
```

To build and run the tests:

```
$ ./gradlew build
```
To build the Docker image:

```
$ docker build --build-arg APP_VERSION=0.1.0 -t spark-kubernetes-operator:0.1.0 -f build-tools/docker/Dockerfile .
```
To relocate the generated CRD and install the operator via Helm:

```
$ ./gradlew spark-operator-api:relocateGeneratedCRD
$ helm install spark-kubernetes-operator --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
```
To run an example application, check its state, and clean it up:

```
$ kubectl apply -f examples/pi.yaml

$ kubectl get sparkapp
NAME   CURRENT STATE      AGE
pi     ResourceReleased   4m10s

$ kubectl delete sparkapp/pi
```
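Because the API group changed from `org.apache.spark` to `spark.apache.org` in the commit above, a simple preflight check can confirm a manifest targets the new group before applying it. The sketch below uses a hypothetical stand-in manifest rather than the real `examples/pi.yaml` (whose full contents are not shown here); only `apiVersion` and `kind` are taken from the document:

```shell
#!/bin/sh
# Preflight sketch: verify a manifest uses the new API group
# `spark.apache.org` rather than the pre-release `org.apache.spark`.
# /tmp/pi.yaml below is a hypothetical stand-in for examples/pi.yaml.
cat > /tmp/pi.yaml <<'EOF'
apiVersion: spark.apache.org/v1alpha1
kind: SparkApplication
metadata:
  name: pi
EOF
if grep -q '^apiVersion: spark.apache.org/' /tmp/pi.yaml; then
  echo "OK: manifest uses the spark.apache.org API group"
else
  echo "WARN: manifest still uses the old org.apache.spark group" >&2
fi
```

A manifest still pinned to the old group would fail with an unknown-resource error on `kubectl apply`, so checking up front gives a clearer message.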
Please review the Contribution to Spark guide for information on how to get started contributing to the project.