[SPARK-47706][BUILD] Bump json4s 4.0.7
### What changes were proposed in this pull request?

Bump json4s from 3.7.0-M11 to 4.0.7

### Why are the changes needed?

4.0.7 is the latest stable version of json4s.

https://mvnrepository.com/artifact/org.json4s/json4s-jackson

### Does this PR introduce _any_ user-facing change?

No, all MiMa complaints are about private APIs.

### How was this patch tested?

Pass GHA (GitHub Actions).

### Was this patch authored or co-authored using generative AI tooling?

No

Closes apache#45838 from pan3793/SPARK-47706.

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
pan3793 authored and dongjoon-hyun committed Apr 10, 2024
1 parent b53ec00 commit 7996b03
Showing 4 changed files with 12 additions and 8 deletions.
@@ -22,7 +22,6 @@ import java.time.Duration
import scala.jdk.CollectionConverters._

import com.google.protobuf.{Any => AnyProto, BoolValue, ByteString, BytesValue, DoubleValue, DynamicMessage, FloatValue, Int32Value, Int64Value, StringValue, UInt32Value, UInt64Value}
-import org.json4s.StringInput
import org.json4s.jackson.JsonMethods

import org.apache.spark.sql.{AnalysisException, Column, DataFrame, QueryTest, Row}
@@ -1339,7 +1338,7 @@ class ProtobufFunctionsSuite extends QueryTest with SharedSparkSession with Prot

// Takes json string and return a json with all the extra whitespace removed.
def compactJson(json: String): String = {
-val jsonValue = JsonMethods.parse(StringInput(json))
+val jsonValue = JsonMethods.parse(json)
JsonMethods.compact(jsonValue)
}

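As context (not part of the patch itself), a minimal, self-contained sketch of the simplified call is shown below; the object name `CompactJsonSketch` and the `main` method are illustrative additions. With json4s 4.0.7, `JsonMethods.parse` accepts a plain `String` through the library's implicit `String => JsonInput` conversion, so the explicit `StringInput` wrapper and its import are no longer needed:

```scala
// Sketch only: CompactJsonSketch is a hypothetical object, not part of the patch.
import org.json4s.JValue
import org.json4s.jackson.JsonMethods

object CompactJsonSketch {
  // Takes a JSON string and returns it with all extra whitespace removed.
  def compactJson(json: String): String = {
    // The String argument is converted to a JsonInput implicitly by json4s.
    val jsonValue: JValue = JsonMethods.parse(json)
    JsonMethods.compact(jsonValue)
  }

  def main(args: Array[String]): Unit = {
    // Expected output: {"a":1,"b":[1,2]}
    println(compactJson("""{ "a" : 1,  "b" : [ 1, 2 ] }"""))
  }
}
```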
9 changes: 5 additions & 4 deletions dev/deps/spark-deps-hadoop-3-hive-2.3
@@ -147,10 +147,11 @@ joda-time/2.12.7//joda-time-2.12.7.jar
jodd-core/3.5.2//jodd-core-3.5.2.jar
jpam/1.1//jpam-1.1.jar
json/1.8//json-1.8.jar
-json4s-ast_2.13/3.7.0-M11//json4s-ast_2.13-3.7.0-M11.jar
-json4s-core_2.13/3.7.0-M11//json4s-core_2.13-3.7.0-M11.jar
-json4s-jackson_2.13/3.7.0-M11//json4s-jackson_2.13-3.7.0-M11.jar
-json4s-scalap_2.13/3.7.0-M11//json4s-scalap_2.13-3.7.0-M11.jar
+json4s-ast_2.13/4.0.7//json4s-ast_2.13-4.0.7.jar
+json4s-core_2.13/4.0.7//json4s-core_2.13-4.0.7.jar
+json4s-jackson-core_2.13/4.0.7//json4s-jackson-core_2.13-4.0.7.jar
+json4s-jackson_2.13/4.0.7//json4s-jackson_2.13-4.0.7.jar
+json4s-scalap_2.13/4.0.7//json4s-scalap_2.13-4.0.7.jar
jsr305/3.0.0//jsr305-3.0.0.jar
jta/1.1//jta-1.1.jar
jul-to-slf4j/2.0.12//jul-to-slf4j-2.0.12.jar
2 changes: 1 addition & 1 deletion pom.xml
@@ -1119,7 +1119,7 @@
<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-jackson_${scala.binary.version}</artifactId>
-<version>3.7.0-M11</version>
+<version>4.0.7</version>
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
6 changes: 5 additions & 1 deletion project/MimaExcludes.scala
@@ -84,7 +84,11 @@ object MimaExcludes {
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.sql.jdbc.MySQLDialect#MySQLSQLBuilder.this"),
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.sql.jdbc.MySQLDialect#MySQLSQLQueryBuilder.this"),
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.sql.jdbc.OracleDialect#OracleSQLBuilder.this"),
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.sql.jdbc.OracleDialect#OracleSQLQueryBuilder.this")
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.sql.jdbc.OracleDialect#OracleSQLQueryBuilder.this"),
// SPARK-47706: Bump json4s from 3.7.0-M11 to 4.0.7
ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.expressions.MutableAggregationBuffer.jsonValue"),
ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.types.DataType#JSortedObject.unapplySeq"),
ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.mllib.tree.model.TreeEnsembleModel#SaveLoadV1_0.readMetadata")
)

// Default exclude rules
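For readers unfamiliar with the MimaExcludes mechanism, here is a hedged sketch of how such filters work; the `org.example.Internal.*` names are hypothetical placeholders, not Spark APIs. Each `ProblemFilters.exclude` entry tells the MiMa binary-compatibility check to ignore one specific reported incompatibility, which is acceptable in this PR because the affected members are private APIs:

```scala
// Illustrative only: org.example.Internal.* does not exist in Spark.
import com.typesafe.tools.mima.core._

object ExampleMimaExcludes {
  // Each filter suppresses one specific incompatibility report, e.g. a changed
  // result type or a removed method on an internal (non-public-API) class.
  val filters = Seq(
    ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.example.Internal.jsonValue"),
    ProblemFilters.exclude[DirectMissingMethodProblem]("org.example.Internal.legacyHelper")
  )
}
```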
