<h1 id="Who-created-Spark">Who created Spark<a class="anchor-link" href="#Who-created-Spark">¶</a></h1><ul>
<li><p>Spark began as a PhD student project at UC Berkeley.</p>
</li>
<li><p><a href="https://cs.stanford.edu/people/matei/">Matei Zaharia</a> was the main contributor, creating Spark during his PhD at UC Berkeley starting in 2009.</p>
<h3 id="Ease-of-Use">Ease of Use<a class="anchor-link" href="#Ease-of-Use">¶</a></h3><ul>
<li><p>Spark offers over 80 high-level operators that make it easy to build parallel apps, and you can use it interactively from the Scala, Python, R, and SQL shells (see the sketch after these lists).</p>
</li>
</ul>
<ul>
<li>DataFrame with pandas API support</li>
</ul>
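<p>A minimal PySpark sketch of both points above, assuming Spark 3.2+ with the <code>pyspark</code> package installed; the data and names are illustrative:</p>
<pre><code># High-level DataFrame operators and the pandas API on Spark (illustrative).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("operators-demo").getOrCreate()

# High-level operators: filter, groupBy, count
df = spark.createDataFrame(
    [("alice", 3), ("bob", 5), ("alice", 7)], ["name", "score"])
df.filter(df.score > 3).groupBy("name").count().show()

# pandas API on Spark: pandas-style syntax over the same distributed data
psdf = df.pandas_api()
print(psdf.groupby("name")["score"].mean())

spark.stop()</code></pre>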
</div>
</div>
<h2 id="Launching-Applications-with-spark-submit">Launching Applications with <code>spark-submit</code><a class="anchor-link" href="#Launching-Applications-with-spark-submit">¶</a></h2>
<pre><code>./bin/spark-submit \
--class <main-class> \
--master <master-url> \
--deploy-mode <deploy-mode> \
--conf <key>=<value> \
... # other options
<application> \
[application-arguments]</code></pre>
</div>
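<p>For concreteness, a hedged example invocation (the script name and its arguments are hypothetical): submitting a Python application to a local master with one extra configuration setting.</p>
<pre><code># Illustrative only; my_app.py and its arguments are placeholders.
./bin/spark-submit \
--master local[4] \
--deploy-mode client \
--conf spark.executor.memory=2g \
my_app.py input.txt output_dir</code></pre>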
<h2 id="Run-on-a-YARN-cluster">Run on a YARN cluster<a class="anchor-link" href="#Run-on-a-YARN-cluster">¶</a></h2>
<h2 id="Run-a-Python-application-on-a-Spark-standalone-cluster">Run a Python application on a Spark standalone cluster<a class="anchor-link" href="#Run-a-Python-application-on-a-Spark-standalone-cluster">¶</a></h2>
<h2 id="Run-a-Python-application-on-a-Spark-on-YARN-cluster">Run a Python application on a Spark-on-YARN cluster<a class="anchor-link" href="#Run-a-Python-application-on-a-Spark-on-YARN-cluster">¶</a></h2>
<p>If you have problems starting a PySpark interactive session due to system limitations, you can submit your Spark job via the <code>spark-submit</code> command, as shown below.</p>
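<p>A minimal sketch, assuming a working Hadoop/YARN client configuration; <code>my_app.py</code> and the resource sizes are illustrative:</p>
<pre><code># Submit a PySpark job to YARN in cluster mode.
spark-submit \
--master yarn \
--deploy-mode cluster \
--num-executors 4 \
--executor-memory 2g \
my_app.py</code></pre>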