[SPARK-49276] Use API Group `spark.apache.org`

### What changes were proposed in this pull request?

This PR aims to change the custom resource API group from `org.apache.spark` to `spark.apache.org`.
```
-apiVersion: org.apache.spark/...
+apiVersion: spark.apache.org/...
```

### Why are the changes needed?

The K8s convention is to name API groups after a DNS domain (e.g., `apiextensions.k8s.io`) rather than a Java-style package name.
- https://kubernetes.io/docs/tasks/extend-kubernetes/custom-resources/custom-resource-definitions/#create-a-customresourcedefinition

```
apiVersion: apiextensions.k8s.io/v1
```

```
# group name to use for REST API: /apis/<group>/<version>
group: stable.example.com
```
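For illustration, a CRD registered under the new group follows the same pattern. This is a minimal sketch assuming the resource names used by this operator (`SparkApplication`, short name `sparkapp`); the actual generated CRD lives under `spark-operator-api`:

```yaml
apiVersion: apiextensions.k8s.io/v1
kind: CustomResourceDefinition
metadata:
  # name must match <plural>.<group>
  name: sparkapplications.spark.apache.org
spec:
  # group name to use for REST API: /apis/<group>/<version>
  group: spark.apache.org
  scope: Namespaced
  names:
    plural: sparkapplications
    singular: sparkapplication
    kind: SparkApplication
    shortNames:
      - sparkapp
  versions:
    - name: v1alpha1
      served: true
      storage: true
      schema:
        openAPIV3Schema:
          type: object
          x-kubernetes-preserve-unknown-fields: true
```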

### Does this PR introduce _any_ user-facing change?

No. This project has not been released yet.

### How was this patch tested?

Pass the CIs.

In addition, we can verify the change via the K8s API server.

**BEFORE**
```
$ kubectl proxy --port=8080 &
$ curl -s http://localhost:8080/ | grep spark
    "/apis/org.apache.spark",
    "/apis/org.apache.spark/v1alpha1",
```

**AFTER**
```
$ kubectl proxy --port=8080 &
$ curl -s http://localhost:8080/ | grep spark
    "/apis/spark.apache.org",
    "/apis/spark.apache.org/v1alpha1",
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #55 from dongjoon-hyun/SPARK-49276.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>

# Apache Spark K8s Operator

Apache Spark K8s Operator is a subproject of Apache Spark that aims to extend the Kubernetes resource manager to manage Apache Spark applications via the Operator Pattern.


## Building Spark K8s Operator

Spark K8s Operator is built using Gradle. To build, run:

```
$ ./gradlew build -x test
```

## Running Tests

```
$ ./gradlew build
```

## Build Docker Image

```
$ docker build --build-arg APP_VERSION=0.1.0 -t spark-kubernetes-operator:0.1.0 -f build-tools/docker/Dockerfile .
```

## Install Helm Chart

```
$ ./gradlew spark-operator-api:relocateGeneratedCRD

$ helm install spark-kubernetes-operator --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
```

## Run Spark Pi App

```
$ kubectl apply -f examples/pi.yaml

$ kubectl get sparkapp
NAME   CURRENT STATE      AGE
pi     ResourceReleased   4m10s

$ kubectl delete sparkapp/pi
```
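Since this PR changes the API group, example manifests like `examples/pi.yaml` must declare the new `apiVersion`. A minimal sketch of such a resource (the `kind` is assumed from the `sparkapp` short name above; see the actual example file for the full spec):

```yaml
apiVersion: spark.apache.org/v1alpha1
kind: SparkApplication  # assumed kind; `sparkapp` is its short name
metadata:
  name: pi
```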

## Contributing

Please review the Contribution to Spark guide for information on how to get started contributing to the project.