OK. You will need to run the following code (the key is to explicitly call SedonaContext.create):

# Imports: Spark SQL functions plus the Sedona Python API (SedonaContext, ST_* functions)
from pyspark.sql import functions as f
from sedona.spark import *

# Explicitly create the Sedona context from the existing Spark session
sedona = SedonaContext.create(spark)

df = sedona.sql("SELECT array(0.0, 1.0, 2.0) AS values")

# Take the min/max of the array column and build a point from them
min_value = f.array_min("values")
max_value = f.array_max("values")

df = df.select(ST_Point(min_value, max_value))
df.show()
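
For this sample array the minimum is 0.0 and the maximum is 2.0, so df.show() should print a single row whose geometry renders as POINT (0 2).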

I don’t know why the explicit create call is needed, since Sedona on Databricks already does the same thing via SedonaSqlExtensions in the cluster config. But anyway, this fixes the issue.
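
For reference, a rough sketch of what that cluster config amounts to when building a session yourself outside Databricks (not the exact Databricks setup; the Kryo registrator class name may differ depending on your Sedona version):

# Sketch: register the Sedona SQL extensions and Kryo serializer via Spark config,
# which is roughly what the Databricks cluster Spark config entries do.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("sedona-example")
    .config("spark.sql.extensions", "org.apache.sedona.sql.SedonaSqlExtensions")
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .config("spark.kryo.registrator", "org.apache.sedona.core.serde.SedonaKryoRegistrator")
    .getOrCreate()
)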
