Before starting the Sedona journey, you need to make sure your Apache Spark cluster is ready.
There are two ways to use a Scala or Java library with Apache Spark: download the jars on the fly from Maven Central with the `--packages` option, or point Spark to jars you have downloaded yourself with the `--jars` option. You can use either one to run Sedona.
1. Have your Spark cluster ready.
2. Run the Spark shell with the `--packages` option. This command will automatically download the Sedona jars from Maven Central.
```
./bin/spark-shell --packages MavenCoordinates
```
Please refer to the Sedona Maven Central coordinates to select the Sedona packages that match your Spark version.
* Local mode: test Sedona without setting up a cluster
```
./bin/spark-shell --packages org.apache.sedona:sedona-spark-shaded-3.3_2.12:{{ sedona.current_version }},org.datasyslab:geotools-wrapper:{{ sedona.current_geotools }}
```
* Cluster mode: you need to specify the Spark master URL
```
./bin/spark-shell --master spark://localhost:7077 --packages org.apache.sedona:sedona-spark-shaded-3.3_2.12:{{ sedona.current_version }},org.datasyslab:geotools-wrapper:{{ sedona.current_geotools }}
```
1. Have your Spark cluster ready.
2. Download the Sedona jars.
3. Run the Spark shell with the `--jars` option.
```
./bin/spark-shell --jars /Path/To/SedonaJars.jar
```
Please use jars whose filenames include the Spark major.minor version, such as `sedona-spark-shaded-3.3_2.12-{{ sedona.current_version }}.jar`.
* Local mode: test Sedona without setting up a cluster
```
./bin/spark-shell --jars /path/to/sedona-spark-shaded-3.3_2.12-{{ sedona.current_version }}.jar,/path/to/geotools-wrapper-{{ sedona.current_geotools }}.jar
```
* Cluster mode: you need to specify the Spark master URL
```
./bin/spark-shell --master spark://localhost:7077 --jars /path/to/sedona-spark-shaded-3.3_2.12-{{ sedona.current_version }}.jar,/path/to/geotools-wrapper-{{ sedona.current_geotools }}.jar
```
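Once the shell starts with either method, you can check that Sedona is actually on the classpath by registering it and running a simple spatial query. The snippet below is a minimal sketch, assuming a recent Sedona release that provides `SedonaContext` and the `spark` session that the Spark shell creates for you:

```scala
// Inside spark-shell: `spark` is the SparkSession created by the shell.
import org.apache.sedona.spark.SedonaContext

// Register Sedona's spatial SQL functions (ST_Point, ST_Contains, ...).
val sedona = SedonaContext.create(spark)

// A trivial query to confirm the spatial functions are available.
sedona.sql("SELECT ST_AsText(ST_Point(1.0, 2.0))").show()
```

If the jars were not picked up, the `import` or the `ST_Point` call will fail, which makes this a quick sanity check before writing real queries.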
Please see the "Use Sedona in a pure SQL environment" guide.
A self-contained project allows you to create multiple Scala / Java files and write complex logic in one place. To use Sedona in your self-contained Spark project, you just need to add Sedona as a dependency in your `pom.xml` or `build.sbt`.
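For example, with sbt the dependencies could be declared roughly as follows. This is a sketch based on the same shaded artifacts used in the shell commands above; adjust the Spark/Scala version suffix to match your cluster:

```scala
// build.sbt (sketch) -- artifact names and versions taken from the
// shaded jars shown above; change 3.3_2.12 to match your Spark/Scala build.
libraryDependencies ++= Seq(
  "org.apache.sedona" % "sedona-spark-shaded-3.3_2.12" % "{{ sedona.current_version }}",
  "org.datasyslab"    % "geotools-wrapper"             % "{{ sedona.current_geotools }}"
)
```

The Maven equivalent is two `<dependency>` entries with the same group, artifact, and version coordinates.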
```
./bin/spark-submit --master spark://YOUR-IP:7077 /Path/To/YourJar.jar
```
!!!note
    The detailed explanation of `spark-submit` is available on the Spark website.