This repository was archived by the owner on Dec 22, 2021. It is now read-only.
While I was going through hello_world.ipynb, I hit this error: `ValueError: Cannot run multiple SparkContexts at once`. It is a fairly common error that occurs when a SparkContext has already been initialized.
I had to call `sc.stop()` to stop the earlier context before creating a new one. @birdsarah Should I maybe add a cell just after this code snippet?
```python
import findspark
findspark.init('/opt/spark')  # Adjust for the location where you installed Spark
from pyspark import SparkContext
from pyspark.sql import SparkSession

sc = SparkContext(appName="Overscripted")
spark = SparkSession(sc)
```
```python
# If you are already running a context, run this cell and then rerun the cell above
sc.stop()
```