[SPARK-48240][DOCS] Replace `local[...]` with `"local[...]"` in the docs

### What changes were proposed in this pull request?
This PR aims to replace `local[...]` with `"local[...]"` in the docs.

### Why are the changes needed?
1. When I recently switched from `bash` to `zsh` and ran `./bin/spark-shell --master local[8]` locally, the following error was printed:
<img width="570" alt="image" src="https://github.com/apache/spark/assets/15246973/d6ad0113-942a-4370-904e-70cb2780f818">

2. Some descriptions in the existing docs already use the quoted form `--master "local[n]"`, e.g.:
https://github.com/apache/spark/blob/f699f556d8a09bb755e9c8558661a36fbdb42e73/docs/index.md?plain=1#L49

3. The root cause is explained at: https://blog.peiyingchi.com/2017/03/20/spark-zsh-no-matches-found-local/
<img width="942" alt="image" src="https://github.com/apache/spark/assets/15246973/11ff03b1-bc60-48e3-b55c-984cbc053cef">

### Does this PR introduce _any_ user-facing change?
Yes. As `zsh` becomes the mainstream shell, this avoids confusing Spark users when they submit apps with `./bin/spark-shell --master "local[n]" ...`, `./bin/spark-sql --master "local[n]" ...`, etc.

### How was this patch tested?
Manually tested: whether the user runs `bash` or `zsh`, the quoted form `--master "local[n]"` above executes successfully as expected.
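
The difference can also be reproduced without launching Spark at all, by letting each shell expand the argument for an `echo` (a sketch; the quoted form is what these docs now use, while escaping and `noglob` are zsh-side alternatives):

```sh
# bash leaves an unmatched glob untouched, so the bare form happens to work:
bash -c 'echo spark-shell --master local[8]'

# zsh fails fast on the unmatched glob:
zsh -c 'echo spark-shell --master local[8]'      # zsh: no matches found: local[8]

# Quoting (used in these docs), escaping, or `noglob` all fix it in zsh:
zsh -c 'echo spark-shell --master "local[8]"'
zsh -c 'echo spark-shell --master local\[8\]'
zsh -c 'noglob echo spark-shell --master local[8]'
```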

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #46535 from panbingkun/SPARK-48240.

Authored-by: panbingkun <panbingkun@baidu.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
diff --git a/docs/configuration.md b/docs/configuration.md
index c018b9f..7884a2a 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -91,7 +91,7 @@
 ```sh
 ./bin/spark-submit \
   --name "My app" \
-  --master local[4] \
+  --master "local[4]" \
   --conf spark.eventLog.enabled=false \
   --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
   myApp.jar
@@ -3750,7 +3750,7 @@
 {% highlight bash %}
 ./bin/spark-submit \
   --name "My app" \
-  --master local[4] \
+  --master "local[4]" \
   --conf spark.eventLog.enabled=false \
   --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
   --conf spark.hadoop.abc.def=xyz \
diff --git a/docs/quick-start.md b/docs/quick-start.md
index 366970c..5a03af9 100644
--- a/docs/quick-start.md
+++ b/docs/quick-start.md
@@ -286,7 +286,7 @@
 {% highlight bash %}
 # Use spark-submit to run your application
 $ YOUR_SPARK_HOME/bin/spark-submit \
-  --master local[4] \
+  --master "local[4]" \
   SimpleApp.py
 ...
 Lines with a: 46, Lines with b: 23
@@ -371,7 +371,7 @@
 # Use spark-submit to run your application
 $ YOUR_SPARK_HOME/bin/spark-submit \
   --class "SimpleApp" \
-  --master local[4] \
+  --master "local[4]" \
   target/scala-{{site.SCALA_BINARY_VERSION}}/simple-project_{{site.SCALA_BINARY_VERSION}}-1.0.jar
 ...
 Lines with a: 46, Lines with b: 23
@@ -452,7 +452,7 @@
 # Use spark-submit to run your application
 $ YOUR_SPARK_HOME/bin/spark-submit \
   --class "SimpleApp" \
-  --master local[4] \
+  --master "local[4]" \
   target/simple-project-1.0.jar
 ...
 Lines with a: 46, Lines with b: 23
diff --git a/docs/rdd-programming-guide.md b/docs/rdd-programming-guide.md
index f75bda0..cbbce4c 100644
--- a/docs/rdd-programming-guide.md
+++ b/docs/rdd-programming-guide.md
@@ -214,13 +214,13 @@
 `bin/pyspark` on exactly four cores, use:
 
 {% highlight bash %}
-$ ./bin/pyspark --master local[4]
+$ ./bin/pyspark --master "local[4]"
 {% endhighlight %}
 
 Or, to also add `code.py` to the search path (in order to later be able to `import code`), use:
 
 {% highlight bash %}
-$ ./bin/pyspark --master local[4] --py-files code.py
+$ ./bin/pyspark --master "local[4]" --py-files code.py
 {% endhighlight %}
 
 For a complete list of options, run `pyspark --help`. Behind the scenes,
@@ -260,19 +260,19 @@
 four cores, use:
 
 {% highlight bash %}
-$ ./bin/spark-shell --master local[4]
+$ ./bin/spark-shell --master "local[4]"
 {% endhighlight %}
 
 Or, to also add `code.jar` to its classpath, use:
 
 {% highlight bash %}
-$ ./bin/spark-shell --master local[4] --jars code.jar
+$ ./bin/spark-shell --master "local[4]" --jars code.jar
 {% endhighlight %}
 
 To include a dependency using Maven coordinates:
 
 {% highlight bash %}
-$ ./bin/spark-shell --master local[4] --packages "org.example:example:0.1"
+$ ./bin/spark-shell --master "local[4]" --packages "org.example:example:0.1"
 {% endhighlight %}
 
 For a complete list of options, run `spark-shell --help`. Behind the scenes,
@@ -781,7 +781,7 @@
 
 #### Example
 
-Consider the naive RDD element sum below, which may behave differently depending on whether execution is happening within the same JVM. A common example of this is when running Spark in `local` mode (`--master = local[n]`) versus deploying a Spark application to a cluster (e.g. via spark-submit to YARN):
+Consider the naive RDD element sum below, which may behave differently depending on whether execution is happening within the same JVM. A common example of this is when running Spark in `local` mode (`--master = "local[n]"`) versus deploying a Spark application to a cluster (e.g. via spark-submit to YARN):
 
 <div class="codetabs">
 
diff --git a/docs/submitting-applications.md b/docs/submitting-applications.md
index bf02ec1..3a99151 100644
--- a/docs/submitting-applications.md
+++ b/docs/submitting-applications.md
@@ -91,7 +91,7 @@
 # Run application locally on 8 cores
 ./bin/spark-submit \
   --class org.apache.spark.examples.SparkPi \
-  --master local[8] \
+  --master "local[8]" \
   /path/to/examples.jar \
   100