[SPARK-1824] Remove <master> from Python examples #802
Conversation
Merged build triggered.
Merged build started.
Note that this reflects changes incorporated in apache#799.
```diff
@@ -46,9 +46,10 @@ locally with one thread, or `local[N]` to run locally with N threads. You should
 Spark also provides a Python interface. To run an example Spark application written in Python, use
 `bin/pyspark <program> [params]`. For example,

-    ./bin/pyspark examples/src/main/python/pi.py local[2] 10
+    ./bin/pyspark examples/src/main/python/pi.py 10
```
We should also use bin/spark-submit instead of bin/pyspark here.
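For reference, a sketch of what the reviewer's suggestion would look like in the docs (the path matches the diff above; the exact flags available depend on the Spark version, so treat this as illustrative rather than the final wording):

```shell
# Run the Python pi example through spark-submit instead of bin/pyspark
./bin/spark-submit --master "local[2]" examples/src/main/python/pi.py 10
```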
I think we should deprecate using
Merged build finished. All automated tests passed.
I see. I didn't realize we want to deprecate using
Yeah, it wasn't super obvious, but that's a good direction to go. I replaced pyspark with spark-submit in many of the docs on running applications, but I missed some.
Merged build finished. All automated tests passed.
This is ready to merge (after #799)
A recent PR (#552) fixed this for all Scala / Java examples. We need to do it for Python too. Note that this blocks on #799, which makes `bin/pyspark` go through Spark submit. With only the changes in this PR, the only way to run these examples is through Spark submit. Once #799 goes in, you can use `bin/pyspark` to run them too. For example,

```
bin/pyspark examples/src/main/python/pi.py 100 --master local-cluster[4,1,512]
```

Author: Andrew Or <andrewor14@gmail.com>

Closes #802 from andrewor14/python-examples and squashes the following commits:

cf50b9f [Andrew Or] De-indent python comments (minor)
50f80b1 [Andrew Or] Remove pyFiles from SparkContext construction
c362f69 [Andrew Or] Update docs to use spark-submit for python applications
7072c6a [Andrew Or] Merge branch 'master' of github.com:apache/spark into python-examples
427a5f0 [Andrew Or] Update docs
d32072c [Andrew Or] Remove <master> from examples + update usages

(cherry picked from commit cf6cbe9)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
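For context, the `pi.py` example referenced above estimates π by Monte Carlo sampling. A minimal plain-Python sketch of the same computation (no Spark dependency; function name and seed are illustrative, not from the Spark source) is:

```python
import random

def estimate_pi(num_samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling points in the unit square and counting
    how many fall inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter circle / area of square = pi/4
    return 4.0 * inside / num_samples

if __name__ == "__main__":
    print(estimate_pi(100_000))
```

The Spark version distributes the sampling loop across partitions and sums the counts with a reduce; the master URL for that distribution is exactly what this PR moves out of the example's arguments and into `spark-submit`.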