[SPARK-45593][BUILD] Building a runnable distribution from master code running spark-sql raise error #43436
Conversation
Kindly ping @juliuszsompolski @LuciferYang @dongjoon-hyun, could you please take a look when you find a moment ~
hmm... also cc @hvanhovell
@yikf I will discuss the specifics with you offline tomorrow.
```
@@ -47,18 +47,6 @@
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
    </dependency>
    <dependency>
```
Spark connect requires a higher version of guava, so we can't remove this dependency. We should reconfigure the maven-shade-plugin in connect-common to exclude this Guava from the packaging.
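A rough sketch of what that reconfiguration could look like in connect-common's pom.xml (the exclusion coordinates are illustrative assumptions, not the final config):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <excludes>
        <!-- illustrative: keep Guava and its failureaccess companion out of this shaded jar -->
        <exclude>com.google.guava:guava</exclude>
        <exclude>com.google.guava:failureaccess</exclude>
      </excludes>
    </artifactSet>
  </configuration>
</plugin>
```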
Both connect-server and connect-client have their own independent guava dependencies : )
I think we should have connect-common shade guava, and have connect-client and connect-server shade connect-common, so that connect-server and connect-client use the same guava dependency.
In addition, because connect-common is included in the connect-server shade, the runnable distribution can also exclude connect-common.
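For reference, the relocation rule connect-server already uses (and which connect-common would effectively share under this approach) looks roughly like:

```xml
<relocation>
  <pattern>com.google.common</pattern>
  <shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
  <includes>
    <include>com.google.common.**</include>
  </includes>
</relocation>
```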
What do you think of the current commit?
I tested it as described in #43195 (comment) and it worked as expected.
A bit busy today; I'll check this PR later.
```
    <exclusions>
      <exclusion>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
```
Why this change?
The runnable distribution tar already contains connect-server, and connect-server shades connect-common, so the runnable distribution tar no longer needs to contain connect-common.
spark/connector/connect/server/pom.xml, line 306 in 8cdcfd2:
```xml
<include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
```
connector/connect/client/jvm/pom.xml (Outdated)
```
@@ -124,6 +124,10 @@
        <include>io.grpc.**</include>
      </includes>
    </relocation>
    <relocation>
```
So in how many places will we shade Guava now? 2? or 3? AFAICT we have the relocation defined by the parent, this one, and the one below that might come into play?
The current Apache Spark includes two versions of Guava (the historical reasons are not clear to me; @LuciferYang may know some background):

Line 203 in 8cdcfd2:
```xml
<guava.version>14.0.1</guava.version>
```
Line 292 in 8cdcfd2:
```xml
<connect.guava.version>32.0.1-jre</connect.guava.version>
```
The current Apache Spark shades these two versions in the following three places:
- for guava.version: spark/common/network-common/pom.xml, lines 125 to 129 in 8cdcfd2:
```xml
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <scope>compile</scope>
</dependency>
```
- for connect.guava.version: spark/connector/connect/server/pom.xml, lines 310 to 316 in 8cdcfd2:
```xml
<relocation>
  <pattern>com.google.common</pattern>
  <shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
  <includes>
    <include>com.google.common.**</include>
  </includes>
</relocation>
```
- for the conflicting versions: connect-common does not override the parent's shade plugin, so its Guava is relocated to `${spark.shade.packageName}.guava`. However, connect-common's Guava version is connect.guava.version, which results in two versions of the same package on the classpath. spark/connector/connect/common/pom.xml, lines 50 to 61 in 8cdcfd2:
```xml
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>${connect.guava.version}</version>
  <scope>compile</scope>
</dependency>
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>failureaccess</artifactId>
  <version>${guava.failureaccess.version}</version>
  <scope>compile</scope>
</dependency>
```
Based on the above, this leads to:
- The server with the connect profile has version guava.version of `${spark.shade.packageName}.guava`.
- The server with the connect profile also has version connect.guava.version of `${spark.shade.packageName}.guava`.
- The server with the connect profile has version connect.guava.version of `${spark.shade.packageName}.connect.guava`.
- The server with the connect profile does not have version connect.guava.version of failureaccess.
This PR plans to make modifications along this line of thought: have connect-common shade guava, and have connect-client and connect-server shade connect-common, so that connect-server and connect-client use the same guava dependency.
> The current Apache Spark includes two versions of Guava (the historical reasons are not clear to me; @LuciferYang may know some background):
Sorry I missed this ping. 14.0.1 is because Hive 2.3.9 depends on Guava 14.0.1, which is shaded in the network-common module and used by other non-connect modules. 32.0.1-jre is dedicated to Connect-related modules. Currently, the connect server and connect jvm client modules should each shade it, and for Guava 32.0.1-jre the dependency com.google.guava:failureaccess should also be shaded.
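A minimal sketch of shading the two artifacts together (artifactSet fragment only; the enclosing maven-shade-plugin configuration is assumed):

```xml
<artifactSet>
  <includes>
    <!-- Guava 32.x classes extend types from failureaccess
         (e.g. InternalFutureFailureAccess), so both artifacts
         must be shaded and relocated together -->
    <include>com.google.guava:guava</include>
    <include>com.google.guava:failureaccess</include>
  </includes>
</artifactSet>
```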
From the error message, it seems that the connect-common module has inherited the shade plugin configuration from the parent pom.xml and performed shade + relocation on Guava 32.0.1-jre, but missed com.google.guava:failureaccess. Perhaps overriding the shade plugin configuration in the connect-common module to do nothing would solve the problem?
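A minimal sketch of such a no-op override, assuming Maven's combine.self="override" semantics clear the inherited relocation rules:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <!-- assumption: replace the relocations inherited from the parent pom
         with an empty set, so this module's Guava classes are left untouched -->
    <relocations combine.self="override"/>
  </configuration>
</plugin>
```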
yea
Thank you for working on this PR; it helped us with our issue. We had an additional problem with the same error message when we built Spark with an upgraded version of Guava that depends on failureaccess. In that case the shading done in network-common misses the failureaccess dependency, since shading is done only on guava, omitting the classes in failureaccess.
```
@@ -140,6 +140,35 @@
        </execution>
      </executions>
    </plugin>
    <plugin>
```
Why do we need to shade common? For the off-chance someone takes a direct dependency on it?
Any other suggestions on the PR? @hvanhovell @LuciferYang
connector/connect/client/jvm/pom.xml (Outdated)
```
@@ -137,6 +137,10 @@
        <include>io.grpc.**</include>
      </includes>
    </relocation>
    <relocation>
```
still need this change?
You are right, it is unnecessary.
Now, we perform shading of the Guava library within the connect-common module to ensure both the connect-server and connect-client modules maintain consistent and accurate Guava dependencies.
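A hedged sketch of what connect-common's shade configuration could look like under this scheme (artifact coordinates and relocation pattern are taken from the discussion above, not the exact merged config):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <includes>
        <!-- failureaccess classes live under com.google.common.util.concurrent.internal,
             so the artifact must be included for the relocation below to cover them -->
        <include>com.google.guava:guava</include>
        <include>com.google.guava:failureaccess</include>
      </includes>
    </artifactSet>
    <relocations>
      <relocation>
        <pattern>com.google.common</pattern>
        <shadedPattern>${spark.shade.packageName}.connect.guava</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```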
BTW, I also tested connect as described in the PR description, and the spark-sql shell using `./bin/spark-sql`, and it worked as expected.
+1, LGTM
@yikf Can you re-trigger the failed GA tasks? @hvanhovell Do you have any further suggestions for this PR? If not, I will merge it after it passes all GA tests. Thanks
Merged into master. Thanks @yikf @hvanhovell @c3-ffomenko
…dency

### What changes were proposed in this pull request?
This PR aims to correct the relocation of the connect guava dependency and remove the duplicate connect-common from SBT build jars.

**Item 1:** In #43436, we fixed the connect module dependency on guava, but the guava dependency was relocated incorrectly.
- connect server and connect client jvm don't relocate the guava dependency, which risks conflict problems;
- the connect common relocation does not take effect because it defines relocation rules that conflict with the parent pom (now we remove the guava dependency from connect-common, as it never uses this library);

**Item 2:** Remove the duplicate connect-common from SBT build jars, as it is shaded into spark connect. In fact, before this PR, in the output jars built using SBT, connect-common and connect-server were the same thing, because they both hit the `jar.getName.contains("spark-connect")` condition.

### Why are the changes needed?
Bugfix

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
GA

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes #44801 from Yikf/SPARK-45593-SBT.

Authored-by: yikaifei <yikaifei@apache.org>
Signed-off-by: yangjie01 <yangjie01@baidu.com>
…e running spark-sql raise error

### What changes were proposed in this pull request?
Fix a build issue: when building a runnable distribution from master code, running spark-sql raises an error:
```
Caused by: java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess
	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
	... 58 more
```
The problem is due to a guava dependency in the spark-connect-common POM that **conflicts** with the shade plugin of the parent pom.
- spark-connect-common contains the `connect.guava.version` version of guava, and it is relocated to `${spark.shade.packageName}.guava`, not `${spark.shade.packageName}.connect.guava`;
- spark-network-common also contains guava-related classes, likewise relocated to `${spark.shade.packageName}.guava`, but with guava version `${guava.version}`;
- as a result, different versions of the same org.sparkproject.guava classes are present on the classpath.

In addition, after investigation, it seems that the spark-connect-common module does not use guava, so we can remove the guava dependency from spark-connect-common.

### Why are the changes needed?
A runnable distribution built from master code is not actually runnable.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
I ran the build command to output a runnable distribution package and tested it manually.

Build command:
```
./dev/make-distribution.sh --name ui --pip --tgz -Phive -Phive-thriftserver -Pyarn -Pconnect
```
Test result:
<img width="1276" alt="image" src="https://github.com/apache/spark/assets/51110188/aefbc433-ea5c-4287-8ebd-367806043ac8">

I also checked for `org.sparkproject.guava.cache.LocalCache` in the jars dir.

Before:
```
➜  jars grep -lr 'org.sparkproject.guava.cache.LocalCache' ./
.//spark-connect_2.13-4.0.0-SNAPSHOT.jar
.//spark-network-common_2.13-4.0.0-SNAPSHOT.jar
.//spark-connect-common_2.13-4.0.0-SNAPSHOT.jar
```
Now:
```
➜  jars grep -lr 'org.sparkproject.guava.cache.LocalCache' ./
.//spark-network-common_2.13-4.0.0-SNAPSHOT.jar
```

### Was this patch authored or co-authored using generative AI tooling?
No

Closes apache#43436 from Yikf/SPARK-45593.

Authored-by: yikaifei <yikaifei@apache.org>
Signed-off-by: yangjie01 <yangjie01@baidu.com>
### What changes were proposed in this pull request?
This PR aims to correct the relocation of the connect guava dependency and remove the duplicate connect-common from SBT build jars. This PR cherry-picks from #43436 and #44801 as a backport to the 3.5 branch.

### Why are the changes needed?
Bugfix

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Followed the steps described at #43195 (comment) to test manually. In addition, we will continue to observe the GA situation over the coming days.

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #45775 from Yikf/branch-3.5.

Authored-by: yikaifei <yikaifei@apache.org>
Signed-off-by: yangjie01 <yangjie01@baidu.com>
### What changes were proposed in this pull request?
Fix a build issue: when building a runnable distribution from master code, running spark-sql raises an error:
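```
Caused by: java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess
	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
	... 58 more
```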
The problem is due to a guava dependency in the spark-connect-common POM that conflicts with the shade plugin of the parent pom:

- spark-connect-common contains the `connect.guava.version` version of guava, and it is relocated to `${spark.shade.packageName}.guava`, not `${spark.shade.packageName}.connect.guava`;
- spark-network-common also contains guava-related classes, likewise relocated to `${spark.shade.packageName}.guava`, but with guava version `${guava.version}`;
- as a result, different versions of the same org.sparkproject.guava classes are present on the classpath.

In addition, after investigation, it seems that the spark-connect-common module does not use guava, so we can remove the guava dependency from spark-connect-common.
### Why are the changes needed?
A runnable distribution built from master code is not actually runnable.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
I ran the build command to output a runnable distribution package and tested it manually.

Build command:
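```
./dev/make-distribution.sh --name ui --pip --tgz -Phive -Phive-thriftserver -Pyarn -Pconnect
```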
Test result:

I also checked for `org.sparkproject.guava.cache.LocalCache` in the jars dir.

Before:
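```
➜  jars grep -lr 'org.sparkproject.guava.cache.LocalCache' ./
.//spark-connect_2.13-4.0.0-SNAPSHOT.jar
.//spark-network-common_2.13-4.0.0-SNAPSHOT.jar
.//spark-connect-common_2.13-4.0.0-SNAPSHOT.jar
```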
Now:
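```
➜  jars grep -lr 'org.sparkproject.guava.cache.LocalCache' ./
.//spark-network-common_2.13-4.0.0-SNAPSHOT.jar
```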
### Was this patch authored or co-authored using generative AI tooling?
No