[SPARK-48240][DOCS] Replace Local[..] with "Local[...]" in the docs
### What changes were proposed in this pull request?
This PR aims to quote the master URL in the docs, replacing `local[...]` with `"local[...]"`.

### Why are the changes needed?
1. When I recently switched from `bash` to `zsh` and ran `./bin/spark-shell --master local[8]` locally, the following error was printed:
<img width="570" alt="image" src="https://github.com/apache/spark/assets/15246973/d6ad0113-942a-4370-904e-70cb2780f818">

2. Some examples in the existing docs are already written as `--master "local[n]"`, e.g.:
https://github.com/apache/spark/blob/f699f556d8a09bb755e9c8558661a36fbdb42e73/docs/index.md?plain=1#L49

3. The root cause is explained at https://blog.peiyingchi.com/2017/03/20/spark-zsh-no-matches-found-local/ (see the sketch below this list):
<img width="942" alt="image" src="https://github.com/apache/spark/assets/15246973/11ff03b1-bc60-48e3-b55c-984cbc053cef">

### Does this PR introduce _any_ user-facing change?
Yes. With `zsh` becoming the mainstream shell, this avoids confusing Spark users when they submit apps with `./bin/spark-shell --master "local[n]" ...`, `./bin/spark-sql --master "local[n]" ...`, etc.

### How was this patch tested?
Manually tested.
Whether the user uses `bash` or `zsh`, the `--master "local[n]"` commands above execute successfully in the expected way.
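A quick cross-shell sanity check might look like the following (illustrative commands, not the exact ones from the review; assumes a Spark checkout and uses `local[8]` as an example core count):

```sh
# Both shells should now start a local Spark shell without
# zsh's "no matches found" globbing error:
$ bash -c './bin/spark-shell --master "local[8]"'
$ zsh  -c './bin/spark-shell --master "local[8]"'
```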

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #46535 from panbingkun/SPARK-48240.

Authored-by: panbingkun <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
panbingkun authored and HyukjinKwon committed May 11, 2024
1 parent f699f55 commit 57b2077
Showing 4 changed files with 12 additions and 12 deletions.
docs/configuration.md (4 changes: 2 additions & 2 deletions)
@@ -91,7 +91,7 @@ Then, you can supply configuration values at runtime:
```sh
./bin/spark-submit \
--name "My app" \
- --master local[4] \
+ --master "local[4]" \
--conf spark.eventLog.enabled=false \
--conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
myApp.jar
@@ -3750,7 +3750,7 @@ Also, you can modify or add configurations at runtime:
{% highlight bash %}
./bin/spark-submit \
--name "My app" \
- --master local[4] \
+ --master "local[4]" \
--conf spark.eventLog.enabled=false \
--conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
--conf spark.hadoop.abc.def=xyz \
docs/quick-start.md (6 changes: 3 additions & 3 deletions)
@@ -286,7 +286,7 @@ We can run this application using the `bin/spark-submit` script:
{% highlight bash %}
# Use spark-submit to run your application
$ YOUR_SPARK_HOME/bin/spark-submit \
- --master local[4] \
+ --master "local[4]" \
SimpleApp.py
...
Lines with a: 46, Lines with b: 23
@@ -371,7 +371,7 @@ $ sbt package
# Use spark-submit to run your application
$ YOUR_SPARK_HOME/bin/spark-submit \
--class "SimpleApp" \
- --master local[4] \
+ --master "local[4]" \
target/scala-{{site.SCALA_BINARY_VERSION}}/simple-project_{{site.SCALA_BINARY_VERSION}}-1.0.jar
...
Lines with a: 46, Lines with b: 23
@@ -452,7 +452,7 @@ $ mvn package
# Use spark-submit to run your application
$ YOUR_SPARK_HOME/bin/spark-submit \
--class "SimpleApp" \
- --master local[4] \
+ --master "local[4]" \
target/simple-project-1.0.jar
...
Lines with a: 46, Lines with b: 23
docs/rdd-programming-guide.md (12 changes: 6 additions & 6 deletions)
@@ -214,13 +214,13 @@ can be passed to the `--repositories` argument. For example, to run
`bin/pyspark` on exactly four cores, use:

{% highlight bash %}
- $ ./bin/pyspark --master local[4]
+ $ ./bin/pyspark --master "local[4]"
{% endhighlight %}

Or, to also add `code.py` to the search path (in order to later be able to `import code`), use:

{% highlight bash %}
- $ ./bin/pyspark --master local[4] --py-files code.py
+ $ ./bin/pyspark --master "local[4]" --py-files code.py
{% endhighlight %}

For a complete list of options, run `pyspark --help`. Behind the scenes,
@@ -260,19 +260,19 @@ can be passed to the `--repositories` argument. For example, to run `bin/spark-shell` on exactly
four cores, use:

{% highlight bash %}
- $ ./bin/spark-shell --master local[4]
+ $ ./bin/spark-shell --master "local[4]"
{% endhighlight %}

Or, to also add `code.jar` to its classpath, use:

{% highlight bash %}
- $ ./bin/spark-shell --master local[4] --jars code.jar
+ $ ./bin/spark-shell --master "local[4]" --jars code.jar
{% endhighlight %}

To include a dependency using Maven coordinates:

{% highlight bash %}
- $ ./bin/spark-shell --master local[4] --packages "org.example:example:0.1"
+ $ ./bin/spark-shell --master "local[4]" --packages "org.example:example:0.1"
{% endhighlight %}

For a complete list of options, run `spark-shell --help`. Behind the scenes,
@@ -781,7 +781,7 @@ One of the harder things about Spark is understanding the scope and life cycle of

#### Example

- Consider the naive RDD element sum below, which may behave differently depending on whether execution is happening within the same JVM. A common example of this is when running Spark in `local` mode (`--master = local[n]`) versus deploying a Spark application to a cluster (e.g. via spark-submit to YARN):
+ Consider the naive RDD element sum below, which may behave differently depending on whether execution is happening within the same JVM. A common example of this is when running Spark in `local` mode (`--master = "local[n]"`) versus deploying a Spark application to a cluster (e.g. via spark-submit to YARN):

<div class="codetabs">

docs/submitting-applications.md (2 changes: 1 addition & 1 deletion)
@@ -91,7 +91,7 @@ run it with `--help`. Here are a few examples of common options:
# Run application locally on 8 cores
./bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
- --master local[8] \
+ --master "local[8]" \
/path/to/examples.jar \
100

