[SPARK-45593][BUILD] Building a runnable distribution from master code running spark-sql raise error #43436


Closed
wants to merge 2 commits into apache:master from Yikf/SPARK-45593

Conversation

yikf
Contributor

@yikf yikf commented Oct 18, 2023

What changes were proposed in this pull request?

Fix a build issue: when building a runnable distribution from master code, running spark-sql raises an error:

```
Caused by: java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess
	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
	... 58 more
```

The problem is due to a Guava dependency in the spark-connect-common POM that conflicts with the shade plugin of the parent POM:

  • spark-connect-common contains the `connect.guava.version` version of Guava, and it is relocated as `${spark.shade.packageName}.guava`, not `${spark.shade.packageName}.connect.guava`;
  • spark-network-common also contains Guava-related classes; they have likewise been relocated to `${spark.shade.packageName}.guava`, but at Guava version `${guava.version}`;
  • As a result, the classpath contains different versions of classes under `org.sparkproject.guava.xx`.

In addition, after investigation, it seems that the spark-connect-common module does not actually use Guava, so we can remove the Guava dependency from spark-connect-common.
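The conflict described above amounts to two relocation rules competing over the same classes. A minimal sketch of the two shade-plugin configurations (illustrative, reconstructed from the discussion in this PR; the real POMs contain more settings):

```xml
<!-- Parent pom: relocates Guava ${guava.version} (used by network-common),
     and is inherited by connect-common unless overridden. -->
<relocation>
  <pattern>com.google.common</pattern>
  <shadedPattern>${spark.shade.packageName}.guava</shadedPattern>
</relocation>

<!-- Connect modules: relocate Guava ${connect.guava.version} under a
     separate prefix to keep the two versions apart. -->
<relocation>
  <pattern>com.google.common</pattern>
  <shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
  <includes>
    <include>com.google.common.**</include>
  </includes>
</relocation>
```

Because connect-common inherited the first rule while carrying the newer Guava, both Guava versions ended up under `org.sparkproject.guava` on the classpath.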

Why are the changes needed?

A runnable distribution built from master code fails to run.

Does this PR introduce any user-facing change?

No

How was this patch tested?

I manually ran the build command to produce a runnable distribution package for testing;

Build command:

```
./dev/make-distribution.sh --name ui --pip --tgz  -Phive -Phive-thriftserver -Pyarn -Pconnect
```

Test result:
(screenshot omitted)

I also checked the org.sparkproject.guava.cache.LocalCache from jars dir;
Before:

```
➜  jars grep -lr 'org.sparkproject.guava.cache.LocalCache' ./
.//spark-connect_2.13-4.0.0-SNAPSHOT.jar
.//spark-network-common_2.13-4.0.0-SNAPSHOT.jar
.//spark-connect-common_2.13-4.0.0-SNAPSHOT.jar
```

Now:

```
➜  jars grep -lr 'org.sparkproject.guava.cache.LocalCache' ./
.//spark-network-common_2.13-4.0.0-SNAPSHOT.jar
```

Was this patch authored or co-authored using generative AI tooling?

No

@yikf
Contributor Author

yikf commented Oct 18, 2023

Kindly ping @juliuszsompolski @LuciferYang @dongjoon-hyun, could you please take a look when you find a moment?

@yikf yikf changed the title [SPARK-45593][BUILD] Building a runnable distribution from master code running spark-sql raise error "java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess" [SPARK-45593][BUILD] Building a runnable distribution from master code running spark-sql raise error Oct 18, 2023
@LuciferYang
Contributor

hmm... also cc @hvanhovell

@LuciferYang
Contributor

LuciferYang commented Oct 18, 2023

Kindly ping @juliuszsompolski @LuciferYang @dongjoon-hyun , Could you please take a look if you find a moment ~

@yikf I will discuss the specifics with you offline tomorrow.

@@ -47,18 +47,6 @@
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
</dependency>
<dependency>
Contributor

Spark Connect requires a higher version of Guava, so we can't remove this dependency. We should reconfigure the maven-shade-plugin in connect-common to exclude this Guava from packaging.
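One way to do that (a sketch, assuming connect-common inherits the parent's shade execution; element placement within the POM is abbreviated) is to override the maven-shade-plugin in connect-common and exclude Guava from the artifact set, so it is neither bundled nor relocated there:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <excludes>
        <!-- Leave Guava (and its companion failureaccess) out of this jar;
             the connect server/client modules shade their own copies. -->
        <exclude>com.google.guava:guava</exclude>
        <exclude>com.google.guava:failureaccess</exclude>
      </excludes>
    </artifactSet>
  </configuration>
</plugin>
```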

Contributor Author

Both connect-server and connect-client have their own independent guava dependencies : )

I think we should have connect-common shade Guava, and have connect-client and connect-server shade connect-common, so that connect-server and connect-client use the same Guava dependency.

In addition, because connect-common is included in the connect-server shade, the runnable distribution can also exclude connect-common.

What do you think of the current commit?

Contributor Author

@yikf yikf Oct 19, 2023

I tested it as described in #43195 (comment) and it worked as expected.
(screenshot omitted)

@LuciferYang
Contributor

A bit busy today, I'll check this PR later.

<exclusions>
<exclusion>
<groupId>org.apache.spark</groupId>
<artifactId>spark-connect-common_${scala.binary.version}</artifactId>
Contributor

Why this change?

Contributor Author

The runnable distribution tar already contains connect-server, and connect-server shades connect-common, so the runnable distribution tar no longer needs to contain connect-common.

<include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>

@@ -124,6 +124,10 @@
<include>io.grpc.**</include>
</includes>
</relocation>
<relocation>
Contributor

So in how many places will we shade Guava now? 2? or 3? AFAICT we have the relocation defined by the parent, this one, and the one below that might come into play?

Contributor Author

@yikf yikf Oct 26, 2023

The current Apache Spark includes two versions of Guava (the historical reasons are not clear to me; @LuciferYang may know some background):

  1. spark/pom.xml, line 203 at 8cdcfd2: `<guava.version>14.0.1</guava.version>`
  2. spark/pom.xml, line 292 at 8cdcfd2: `<connect.guava.version>32.0.1-jre</connect.guava.version>`

The current Apache Spark has shading in the following three places for these two versions:

  1. for guava.version
    <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <scope>compile</scope>
    </dependency>
  2. for connect.guava.version
    <relocation>
    <pattern>com.google.common</pattern>
    <shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
    <includes>
    <include>com.google.common.**</include>
    </includes>
    </relocation>
  3. for conflicting versions: connect-common does not override the parent's shade plugin, so its Guava is relocated to `${spark.shade.packageName}.guava`. However, connect-common's Guava version is `connect.guava.version`, which results in two versions of the same package on the classpath.
    <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>${connect.guava.version}</version>
    <scope>compile</scope>
    </dependency>
    <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>failureaccess</artifactId>
    <version>${guava.failureaccess.version}</version>
    <scope>compile</scope>
    </dependency>

Based on the above, it will lead to:

  • The server with the connect profile has version 'guava.version' of '${spark.shade.packageName}.guava'.
  • The server with the connect profile has version 'connect.guava.version' of '${spark.shade.packageName}.guava'.
  • The server with the connect profile has version 'connect.guava.version' of '${spark.shade.packageName}.connect.guava'.
  • The server with the connect profile does not have version 'connect.guava.version' of 'failureaccess'

This PR makes modifications along these lines: have connect-common shade Guava, and have connect-client and connect-server shade connect-common, so that connect-server and connect-client use the same Guava dependency.
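A sketch of that layout (illustrative; the exact includes and patterns in the final commit may differ):

```xml
<!-- connect-common: shade its own Guava (connect.guava.version) once. -->
<relocation>
  <pattern>com.google.common</pattern>
  <shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
</relocation>

<!-- connect-server and connect-client: shade the already-relocated
     connect-common instead of shading Guava separately. -->
<artifactSet>
  <includes>
    <include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
  </includes>
</artifactSet>
```

This way only one copy of the relocated Guava classes exists, and both Connect modules pick it up from the same shaded connect-common jar.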

Contributor

The current Apache Spark includes two versions of Guava, which are (The historical reasons are not clear to me. @LuciferYang may know some background):

Sorry I missed this ping. 14.0.1 is used because Hive 2.3.9 depends on Guava 14.0.1; it is shaded in the network-common module and used by other non-Connect modules. 32.0.1-jre is dedicated to Connect-related modules. Currently, the Connect server and Connect JVM client modules should each shade it, and for Guava 32.0.1-jre the dependency com.google.guava:failureaccess should also be shaded.
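The failureaccess point matters because Guava 32.x references `InternalFutureFailureAccess` from the separate `com.google.guava:failureaccess` artifact, whose classes live under `com.google.common.util.concurrent.internal` — exactly the class missing in the original stack trace. A sketch of shading both artifacts together (assumed configuration, not the literal POM):

```xml
<artifactSet>
  <includes>
    <include>com.google.guava:guava</include>
    <!-- failureaccess must ride along, or the relocated Guava will
         reference a class that no longer exists on the classpath. -->
    <include>com.google.guava:failureaccess</include>
  </includes>
</artifactSet>
<relocation>
  <!-- One pattern covers both artifacts, since failureaccess classes
       also live under com.google.common. -->
  <pattern>com.google.common</pattern>
  <shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
</relocation>
```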

Contributor

From the error message, it seems that the connect-common module has inherited the shade plugin configuration from the parent pom.xml, and then performed shade + relocation on Guava 32.0.1-jre, but missed com.google.guava:failureaccess. Perhaps overwriting the configuration of the shade plugin in the connect-common module to do nothing would solve the problem?

Contributor Author

yea


Thank you for working on this PR; it helped us with our issue. We had an additional problem with the same error message when we built Spark with an upgraded version of Guava that depends on failureaccess. In that case, the shading done in network-common misses the failureaccess dependency, since shading is done only on Guava, omitting the classes in failureaccess.

@@ -140,6 +140,35 @@
</execution>
</executions>
</plugin>
<plugin>
Contributor

Why do we need to shade common? For the off-chance someone takes a direct dependency on it?

@yikf
Contributor Author

yikf commented Jan 10, 2024

Any other suggestion on the PR? @hvanhovell @LuciferYang

@@ -137,6 +137,10 @@
<include>io.grpc.**</include>
</includes>
</relocation>
<relocation>
Contributor

still need this change?

Contributor Author

You are right, it is unnecessary.

Now, we perform shading of the Guava library within the connect-common module to ensure both connect-server and connect-client modules maintain consistent and accurate Guava dependencies.

Contributor Author

BTW, I also tested Connect as described in the PR description, as well as the Spark SQL shell using `./bin/spark-sql`; both worked as expected.

Contributor

@LuciferYang LuciferYang left a comment

+1, LGTM

@LuciferYang
Contributor

@yikf Can you re-trigger the failed GA tasks?

@hvanhovell Do you have any further suggestions for this PR? If not, I will merge it once it passes all GA tests. Thanks

@LuciferYang
Contributor

Merged into master. Thanks @yikf @hvanhovell @c3-ffomenko

@yikf yikf deleted the SPARK-45593 branch January 18, 2024 03:34
LuciferYang pushed a commit that referenced this pull request Jan 24, 2024
…dency

### What changes were proposed in this pull request?

This PR aims to correct the relocation of the Connect Guava dependency and remove the duplicate connect-common from SBT-built jars.

**Item 1:** In #43436, we fixed the Connect module's dependency on Guava, but the Guava dependency was relocated incorrectly.
- connect server and connect client jvm do not relocate the Guava dependency, which risks causing conflicts;
- the connect common relocation does not take effect because it defines relocation rules that conflict with the parent pom (now we remove the Guava dependency from connect-common, as it never uses this library);

**Item 2:** Remove the duplicate connect-common from SBT-built jars, as it is shaded into the Spark Connect jar. In fact, before this PR, in the output jars built with SBT, connect-common and connect-server were the same thing, because both matched the `jar.getName.contains("spark-connect")` condition.

### Why are the changes needed?

Bugfix

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

GA

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #44801 from Yikf/SPARK-45593-SBT.

Authored-by: yikaifei <yikaifei@apache.org>
Signed-off-by: yangjie01 <yangjie01@baidu.com>
okumin pushed a commit to zookage/spark that referenced this pull request Mar 31, 2024
…e running spark-sql raise error


Closes apache#43436 from Yikf/SPARK-45593.

Authored-by: yikaifei <yikaifei@apache.org>
Signed-off-by: yangjie01 <yangjie01@baidu.com>
LuciferYang pushed a commit that referenced this pull request Apr 1, 2024
### What changes were proposed in this pull request?

This PR aims to correct the relocation of the Connect Guava dependency and remove the duplicate connect-common from SBT-built jars.

This PR cherry-pick from #43436 and #44801 as a backport to 3.5 branch.

### Why are the changes needed?

Bugfix

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Follow the steps described at #43195 (comment) to test manually.

In addition, will continue to observe the GA situation in recent days.

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #45775 from Yikf/branch-3.5.

Authored-by: yikaifei <yikaifei@apache.org>
Signed-off-by: yangjie01 <yangjie01@baidu.com>