[SPARK-44914][BUILD] Upgrade Apache Ivy to 2.5.2
### What changes were proposed in this pull request?

This PR aims to upgrade Apache Ivy to 2.5.2 and to protect old Ivy-based systems, such as older Spark releases, from Apache Ivy 2.5.2's incompatibility by introducing a new `.ivy2.5.2` directory.

- Apache Spark 4.0.0 will create this directory once and reuse it, while all other systems, such as older Spark releases, keep using the old one, `.ivy2`. So, the behavior is the same as when Apache Spark 4.0.0 is installed and used on a new machine.

- In environments with a user-provided Ivy path, users might still hit the incompatibility. However, they can mitigate it because they already have full control over their Ivy paths.
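The directory-selection behavior described above can be sketched standalone. This is an illustration under assumed precedence (explicit user-provided path, then the `ivy.home` system property, then `~/.ivy2.5.2`), not Spark's actual code; the object and method names are hypothetical.

```scala
import java.io.File

// Hypothetical sketch of the Ivy user-directory selection, not Spark's code.
// Precedence: explicit user path > `ivy.home` system property > ~/.ivy2.5.2.
object IvyDirSketch {
  def resolveIvyDir(userPath: Option[String]): File = {
    val dir = userPath.filterNot(_.trim.isEmpty).getOrElse {
      // New default that isolates Spark 4.0.0 from older Ivy-based systems.
      System.getProperty("ivy.home",
        System.getProperty("user.home") + File.separator + ".ivy2.5.2")
    }
    new File(dir)
  }

  def main(args: Array[String]): Unit =
    println(resolveIvyDir(None))
}
```

With no user-provided path and no `ivy.home` property set, this resolves to `~/.ivy2.5.2`, matching the new default behavior this PR proposes.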

### Why are the changes needed?

This upgrade was attempted once before and logically reverted due to Java 11 and Java 17 failures in the daily CIs:
- apache#42613
- apache#42668

The PR Builder also fails as of now. Once this PR passes the CIs, we achieve the following:

- [Release notes](https://lists.apache.org/thread/9gcz4xrsn8c7o9gb377xfzvkb8jltffr)
    - FIX: CVE-2022-46751: Apache Ivy Is Vulnerable to XML External Entity Injections

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Pass the CIs, including `HiveExternalCatalogVersionsSuite`.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#45075 from dongjoon-hyun/SPARK-44914.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
dongjoon-hyun authored and ericm-db committed Mar 5, 2024
1 parent 25bc625 commit 552cbaf
Showing 7 changed files with 24 additions and 12 deletions.
17 changes: 14 additions & 3 deletions common/utils/src/main/scala/org/apache/spark/util/MavenUtils.scala
@@ -324,6 +324,14 @@ private[spark] object MavenUtils extends Logging {
val ivySettings: IvySettings = new IvySettings
try {
ivySettings.load(file)
+      if (ivySettings.getDefaultIvyUserDir == null && ivySettings.getDefaultCache == null) {
+        // To protect old Ivy-based systems like old Spark from Apache Ivy 2.5.2's incompatibility.
+        // `processIvyPathArg` can overwrite these later.
+        val alternateIvyDir = System.getProperty("ivy.home",
+          System.getProperty("user.home") + File.separator + ".ivy2.5.2")
+        ivySettings.setDefaultIvyUserDir(new File(alternateIvyDir))
+        ivySettings.setDefaultCache(new File(alternateIvyDir, "cache"))
+      }
} catch {
case e @ (_: IOException | _: ParseException) =>
throw new SparkException(s"Failed when loading Ivy settings from $settingsFile", e)
@@ -335,10 +343,13 @@ private[spark] object MavenUtils extends Logging {

/* Set ivy settings for location of cache, if option is supplied */
private def processIvyPathArg(ivySettings: IvySettings, ivyPath: Option[String]): Unit = {
-    ivyPath.filterNot(_.trim.isEmpty).foreach { alternateIvyDir =>
-      ivySettings.setDefaultIvyUserDir(new File(alternateIvyDir))
-      ivySettings.setDefaultCache(new File(alternateIvyDir, "cache"))
+    val alternateIvyDir = ivyPath.filterNot(_.trim.isEmpty).getOrElse {
+      // To protect old Ivy-based systems like old Spark from Apache Ivy 2.5.2's incompatibility.
+      System.getProperty("ivy.home",
+        System.getProperty("user.home") + File.separator + ".ivy2.5.2")
     }
+    ivySettings.setDefaultIvyUserDir(new File(alternateIvyDir))
+    ivySettings.setDefaultCache(new File(alternateIvyDir, "cache"))
}

/* Add any optional additional remote repositories */
@@ -374,7 +374,8 @@ private[spark] object IvyTestUtils {
f(repo.toURI.toString)
} finally {
// Clean up
-      if (repo.toString.contains(".m2") || repo.toString.contains(".ivy2")) {
+      if (repo.toString.contains(".m2") || repo.toString.contains(".ivy2") ||
+          repo.toString.contains(".ivy2.5.2")) {
val groupDir = getBaseGroupDirectory(artifact, useIvyLayout)
FileUtils.deleteDirectory(new File(repo, groupDir + File.separator + artifact.artifactId))
deps.foreach { _.foreach { dep =>
@@ -2491,10 +2491,10 @@ package object config {
.doc("Path to specify the Ivy user directory, used for the local Ivy cache and " +
"package files from spark.jars.packages. " +
"This will override the Ivy property ivy.default.ivy.user.dir " +
-        "which defaults to ~/.ivy2.")
+        "which defaults to ~/.ivy2.5.2")
.version("1.3.0")
.stringConf
-      .createOptional
+      .createWithDefault("~/.ivy2.5.2")

private[spark] val JAR_IVY_SETTING_PATH =
ConfigBuilder(MavenUtils.JAR_IVY_SETTING_PATH_KEY)
2 changes: 1 addition & 1 deletion dev/deps/spark-deps-hadoop-3-hive-2.3
@@ -102,7 +102,7 @@ httpcore/4.4.16//httpcore-4.4.16.jar
icu4j/72.1//icu4j-72.1.jar
ini4j/0.5.4//ini4j-0.5.4.jar
istack-commons-runtime/3.0.8//istack-commons-runtime-3.0.8.jar
-ivy/2.5.1//ivy-2.5.1.jar
+ivy/2.5.2//ivy-2.5.2.jar
jackson-annotations/2.16.1//jackson-annotations-2.16.1.jar
jackson-core-asl/1.9.13//jackson-core-asl-1.9.13.jar
jackson-core/2.16.1//jackson-core-2.16.1.jar
2 changes: 2 additions & 0 deletions dev/run-tests.py
@@ -478,6 +478,8 @@ def main():
rm_r(os.path.join(SPARK_HOME, "work"))
rm_r(os.path.join(USER_HOME, ".ivy2", "local", "org.apache.spark"))
rm_r(os.path.join(USER_HOME, ".ivy2", "cache", "org.apache.spark"))
+    rm_r(os.path.join(USER_HOME, ".ivy2.5.2", "local", "org.apache.spark"))
+    rm_r(os.path.join(USER_HOME, ".ivy2.5.2", "cache", "org.apache.spark"))

os.environ["CURRENT_BLOCK"] = str(ERROR_CODES["BLOCK_GENERAL"])

2 changes: 2 additions & 0 deletions docs/core-migration-guide.md
@@ -36,6 +36,8 @@ license: |

- Since Spark 4.0, Spark uses `ReadWriteOncePod` instead of `ReadWriteOnce` access mode in persistence volume claims. To restore the legacy behavior, you can set `spark.kubernetes.legacy.useReadWriteOnceAccessMode` to `true`.

- Since Spark 4.0, Spark uses `~/.ivy2.5.2` as Ivy user directory by default to isolate the existing systems from Apache Ivy's incompatibility. To restore the legacy behavior, you can set `spark.jars.ivy` to `~/.ivy2`.
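As an illustrative sketch of the restore step described in the migration note above (a config fragment, not part of this commit's diff), the legacy location could be set in `spark-defaults.conf`:

```
# spark-defaults.conf: restore the pre-4.0 Ivy user directory
spark.jars.ivy    ~/.ivy2
```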

## Upgrading from Core 3.4 to 3.5

- Since Spark 3.5, `spark.yarn.executor.failuresValidityInterval` is deprecated. Use `spark.executor.failuresValidityInterval` instead.
6 changes: 1 addition & 5 deletions pom.xml
@@ -146,11 +146,7 @@
<jetty.version>10.0.19</jetty.version>
<jakartaservlet.version>4.0.3</jakartaservlet.version>
<chill.version>0.10.0</chill.version>
-    <!--
-      SPARK-44968: don't upgrade Ivy to version 2.5.2 until the test aborted of
-      `HiveExternalCatalogVersionsSuite` in Java 11/17 daily tests is resolved.
-    -->
-    <ivy.version>2.5.1</ivy.version>
+    <ivy.version>2.5.2</ivy.version>
<oro.version>2.0.8</oro.version>
<!--
If you change codahale.metrics.version, you also need to change
