[WIP] Use SPARK_HIVE to determine if we include Hive in packaging
Previously, we decided whether to include the datanucleus jars based on the
existence of a spark-hive-assembly jar, which was incidentally built whenever
"sbt assembly" was run. This meant that a typical and previously supported
workflow would unintentionally start pulling Hive jars into the assembly.
This patch has the following features/bug fixes:
- Use of SPARK_HIVE (default false) to determine whether we should include Hive
in the assembly jar (see the sketch after this list).
- Analogous feature in Maven via the -Phive profile.
- assemble-deps fixed, since we no longer use a separate ASSEMBLY_DIR.
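For illustration only, a minimal sketch of how an sbt build could read SPARK_HIVE and gate the Hive project on it; the object and helper names below are hypothetical and are not the actual SparkBuild.scala changes:

```scala
import scala.util.Properties

// Hypothetical sketch of the SPARK_HIVE gate (names are illustrative only).
object HivePackagingSketch {
  // SPARK_HIVE defaults to false when unset; anything other than "true" disables Hive.
  val includeHive: Boolean =
    Properties.envOrElse("SPARK_HIVE", "false").trim.toLowerCase == "true"

  // An sbt build could use this flag to decide which sub-projects feed the
  // assembly, e.g. appending the hive project reference only when enabled.
  def assemblyDeps(base: Seq[String], hiveProject: String): Seq[String] =
    if (includeHive) base :+ hiveProject else base
}
```

With a gate like this, a Hive-enabled assembly would presumably be built with something along the lines of `SPARK_HIVE=true sbt/sbt assembly` on the sbt side and `mvn -Phive package` on the Maven side (exact invocations may differ).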
Still TODO before this is mergeable:
- We need to download the datanucleus jars outside of sbt. Perhaps we can have
spark-class download them when SPARK_HIVE is set, similar to how sbt downloads
itself.
- Spark SQL documentation updates.