I noticed that the latest release notes mention "experimental support for Spark 4.0", along with some related improvements such as the TPC-H/DS connectors and SPARK_HOME detection now supporting Spark 4.0. I'm considering using Spark 4.0 in our project and have a few questions:
Thank you~
@Echan0808 thanks for your interest in the Spark 4.0 support.
Basically there are two aspects. Firstly, the current integration tests are based on Spark 4.0.0-preview1, which is not a stable release and does not guarantee a stable API, so it may not be fully compatible with the final Spark 4.0.0. Secondly, we have made all functionality in the Spark SQL engine work with Spark 4.0.0-preview1, which means you can use Kyuubi to bootstrap Spark 4.0.0-preview1 applications, run SQL/Python/Scala queries, retrieve results, etc., but plugins like authz and the Spark extensions do not work yet.
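For reference, here is a minimal sketch of what "run SQL queries and retrieve results" looks like from the client side. It assumes a Kyuubi server running locally on the default Thrift binary port 10009 with SPARK_HOME pointing at a Spark 4.0.0-preview1 distribution, and uses PyHive since Kyuubi speaks the HiveServer2 Thrift protocol; the host, port, and username below are placeholders, not anything specific to this setup:

```python
# Minimal sketch: connect to a Kyuubi server whose engine is a Spark
# 4.0.0-preview1 application, run a query, and fetch the result.
# Kyuubi exposes the HiveServer2 Thrift protocol, so any HS2-compatible
# client (PyHive here) works. Host/port/username are assumptions.
from pyhive import hive

conn = hive.connect(host="localhost", port=10009, username="anonymous")
cursor = conn.cursor()

# The engine launched by Kyuubi is an ordinary Spark SQL session,
# so plain Spark SQL statements run as-is.
cursor.execute("SELECT version()")
print(cursor.fetchone())  # e.g. the Spark version string of the launched engine

cursor.close()
conn.close()
```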
Please read https://spark.apache.org/news/spark-4.0.0-preview1.html.
Kyuubi will publish a version soon (usually within a few days) after Spark 4.0.0 is out, with full Spark 4.0 support. AFAIK, Spark 4.0.0-preview2 is coming soon, and the final 4.0.0 is likely to be out in Q4. You can subscribe to the Spark mailing list to receive more information.