[SPARK-49221][BUILD] Fix sbt compilation warnings related to `Regular tasks always evaluate task dependencies (.value) regardless of if expressions`

### What changes were proposed in this pull request?
This PR uses `Def.taskDyn` to define the task dynamically, based on the value of `moduleName.value.contains("assembly")`, in order to fix the following compilation warning (a minimal sketch of the problem and the fix follows the log):

```
[warn] /home/runner/work/spark/spark/project/SparkBuild.scala:1554:77: value lookup of `/` inside an `if` expression
[warn]
[warn] problem: `/.value` is inside an `if` expression of a regular task.
[warn]   Regular tasks always evaluate task dependencies (`.value`) regardless of `if` expressions.
[warn] solution:
[warn]   1. Use a conditional task `Def.taskIf(...)` to evaluate it when the `if` predicate is true or false.
[warn]   2. Or turn the task body into a single `if` expression; the task is then auto-converted to a conditional task.
[warn]   3. Or make the static evaluation explicit by declaring `/.value` outside the `if` expression.
[warn]   4. If you still want to force the static lookup, you may annotate the task lookup with `sbtUnchecked`, e.g. `(/.value: sbtUnchecked)`.
[warn]   5. Add `import sbt.dsl.LinterLevel.Ignore` to your build file to disable all task linting.
[warn]
[warn]         val replClasspathes = (LocalProject("connect-client-jvm") / Compile / dependencyClasspath)
[warn]                                                                             ^
[warn] one warning found
```
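
The warning arises because sbt resolves every `.value` call in a regular task statically, so a task dependency inside an `if` runs regardless of the predicate. Below is a minimal, self-contained sketch of the problem and the `Def.taskDyn` fix; the keys (`isAssembly`, `expensive`, `copyDepsExample`) are hypothetical and only for illustration, not part of this PR:

```scala
import sbt._
import sbt.Keys._

// Hypothetical keys, for illustration only.
val isAssembly = settingKey[Boolean]("whether this module is an assembly module")
val expensive = taskKey[Unit]("an expensive upstream task")
val copyDepsExample = taskKey[Unit]("a task that should run `expensive` conditionally")

// Problematic shape (triggers the warning above): `expensive.value` is a
// statically collected dependency, so it runs even when isAssembly is false.
// copyDepsExample := { if (isAssembly.value) expensive.value }

// Fixed shape: Def.taskDyn selects the branch at task-evaluation time, so
// the dependency on `expensive` is only wired in when the predicate is true.
copyDepsExample := Def.taskDyn {
  if (isAssembly.value) Def.task { expensive.value }
  else Def.task {} // no-op branch: no dependency on `expensive`
}.value
```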

### Why are the changes needed?
Fixes compilation warnings in the GitHub Actions (GA) compilation log:

- https://github.com/apache/spark/actions/runs/10365742495/job/28693551621

<img width="880" alt="image" src="https://github.com/user-attachments/assets/e1a94017-cde3-4d55-9a67-197bb2f3e484">

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
- Pass GitHub Actions
- Check the GA compilation log and ensure the warnings above no longer appear.
- Manual check:

```
build/sbt clean assembly/package -Phive
ls -l assembly/target/scala-2.13/jars/connect-repl
```

Before and after this PR, the contents of the `assembly/target/scala-2.13/jars/connect-repl` directory are identical.

### Was this patch authored or co-authored using generative AI tooling?
No

Closes apache#47739 from LuciferYang/dynTask.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
LuciferYang authored and HyukjinKwon committed Aug 13, 2024
1 parent b8060e5 commit 717cedf
Showing 1 changed file with 30 additions and 23 deletions.
project/SparkBuild.scala:

```diff
@@ -1549,30 +1549,37 @@ object CopyDependencies {
 
       // Here we get the full classpathes required for Spark Connect client including
       // ammonite, and exclude all spark-* jars. After that, we add the shaded Spark
-      // Connect cleint assembly manually.
-      if (moduleName.value.contains("assembly")) {
-        val replClasspathes = (LocalProject("connect-client-jvm") / Compile / dependencyClasspath)
-          .value.map(_.data).filter(_.isFile())
-        val scalaBinaryVer = SbtPomKeys.effectivePom.value.getProperties.get(
-          "scala.binary.version").asInstanceOf[String]
-        val sparkVer = SbtPomKeys.effectivePom.value.getProperties.get(
-          "spark.version").asInstanceOf[String]
-        val destDir = new File(dest, "connect-repl").toPath
-        Files.createDirectories(destDir)
-
-        val sourceAssemblyJar = Paths.get(
-          BuildCommons.sparkHome.getAbsolutePath, "connector", "connect", "client",
-          "jvm", "target", s"scala-$scalaBinaryVer", s"spark-connect-client-jvm-assembly-$sparkVer.jar")
-        val destAssemblyJar = Paths.get(destDir.toString, s"spark-connect-client-jvm-assembly-$sparkVer.jar")
-        Files.copy(sourceAssemblyJar, destAssemblyJar, StandardCopyOption.REPLACE_EXISTING)
-
-        replClasspathes.foreach { f =>
-          val destFile = Paths.get(destDir.toString, f.getName)
-          if (!f.getName.startsWith("spark-")) {
-            Files.copy(f.toPath, destFile, StandardCopyOption.REPLACE_EXISTING)
-          }
+      // Connect client assembly manually.
+      Def.taskDyn {
+        if (moduleName.value.contains("assembly")) {
+          Def.task {
+            val replClasspathes = (LocalProject("connect-client-jvm") / Compile / dependencyClasspath)
+              .value.map(_.data).filter(_.isFile())
+            val scalaBinaryVer = SbtPomKeys.effectivePom.value.getProperties.get(
+              "scala.binary.version").asInstanceOf[String]
+            val sparkVer = SbtPomKeys.effectivePom.value.getProperties.get(
+              "spark.version").asInstanceOf[String]
+            val dest = destPath.value
+            val destDir = new File(dest, "connect-repl").toPath
+            Files.createDirectories(destDir)
+
+            val sourceAssemblyJar = Paths.get(
+              BuildCommons.sparkHome.getAbsolutePath, "connector", "connect", "client",
+              "jvm", "target", s"scala-$scalaBinaryVer", s"spark-connect-client-jvm-assembly-$sparkVer.jar")
+            val destAssemblyJar = Paths.get(destDir.toString, s"spark-connect-client-jvm-assembly-$sparkVer.jar")
+            Files.copy(sourceAssemblyJar, destAssemblyJar, StandardCopyOption.REPLACE_EXISTING)
+
+            replClasspathes.foreach { f =>
+              val destFile = Paths.get(destDir.toString, f.getName)
+              if (!f.getName.startsWith("spark-")) {
+                Files.copy(f.toPath, destFile, StandardCopyOption.REPLACE_EXISTING)
+              }
+            }
+          }.dependsOn(LocalProject("connect-client-jvm") / assembly)
+        } else {
+          Def.task {}
         }
-      }
+      }.value
     },
     (Compile / packageBin / crossTarget) := destPath.value,
     (Compile / packageBin) := (Compile / packageBin).dependsOn(copyDeps).value
```
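For reference, the warning's first suggested solution, `Def.taskIf`, would look roughly like the sketch below (reusing the hypothetical keys from the earlier sketch; this is an assumption about the alternative, not what the PR does). One plausible reason the PR uses `Def.taskDyn` instead is that its assembly branch also attaches `.dependsOn(LocalProject("connect-client-jvm") / assembly)`, ensuring the shaded client assembly jar exists before it is copied:

```scala
// Sketch of solution 1 from the warning, using the hypothetical keys above:
// Def.taskIf turns the body into a conditional task, so `expensive.value`
// is only evaluated when the predicate is true.
copyDepsExample := Def.taskIf {
  if (isAssembly.value) expensive.value
  else ()
}.value
```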
