This repository was archived by the owner on Dec 22, 2021. It is now read-only.

Changes in hello_world.ipynb (Spark Context) #44

@asquare14

Description

While I was going through hello_world.ipynb, I ran into this error: ValueError: Cannot run multiple SparkContexts at once. It is a fairly common error that occurs because a SparkContext has already been initialized.

I had to call sc.stop() to stop the earlier context before creating a new one. @birdsarah Should I add a cell like the one below just after this code snippet?

import findspark
findspark.init('/opt/spark')  # Adjust for the location where you installed Spark

from pyspark import SparkContext
from pyspark.sql import SparkSession

sc = SparkContext(appName="Overscripted")
spark = SparkSession(sc)

# If you are already running a context, run this cell, then rerun the cell above
sc.stop()
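
Alternatively, instead of adding a stop-and-rerun cell, the notebook could avoid the error entirely by using getOrCreate(), which reuses an existing context or session rather than raising ValueError. A minimal sketch, assuming the same /opt/spark install path and "Overscripted" app name as above:

import findspark
findspark.init('/opt/spark')  # Adjust for the location where you installed Spark

from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

conf = SparkConf().setAppName("Overscripted")
sc = SparkContext.getOrCreate(conf)          # returns the running context if one exists
spark = SparkSession.builder.getOrCreate()   # likewise reuses any existing session

This makes the cell safe to rerun any number of times, which might be friendlier for a hello-world notebook, though it changes the snippet more than just appending a sc.stop() cell would.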
