Merge remote-tracking branch 'parent/master' into SPARK-45265 #54
build_main.yml
on: push

Jobs:
- Run / Check changes (41s)
- Run / Breaking change detection with Buf (branch-3.5) (1m 1s)
- Run / Run TPC-DS queries with SF=1 (0s)
- Run / Run Docker integration tests (0s)
- Run / Run Spark on Kubernetes Integration test (1h 15m)
- Matrix: Run / build
- Matrix: Run / java-other-versions
- Run / Build modules: sparkr (41m 43s)
- Run / Linters, licenses, dependencies and documentation generation (2h 1m)
- Matrix: Run / pyspark
Annotations
62 errors and 5 warnings
Run / Run Spark on Kubernetes Integration test:
- HashSet() did not contain "decomtest-4502808ae30285de-exec-1".
- HashSet() did not contain "decomtest-1645d38ae303a005-exec-1".
- sleep interrupted
- Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$675/0x00007fe3585898c0@59f9dd13 rejected from java.util.concurrent.ThreadPoolExecutor@68013193[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 340]
- sleep interrupted
- Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$675/0x00007fe3585898c0@4e27fdea rejected from java.util.concurrent.ThreadPoolExecutor@68013193[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 341]
- HashSet() did not contain "decomtest-9660448ae319d5a0-exec-1".
- HashSet() did not contain "decomtest-7336be8ae31af86b-exec-1".
- HashSet() did not contain "decomtest-c86eca8ae31f0aa4-exec-1".
- Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-e0eb16348e2c46b087e64751db67f020-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-e0eb16348e2c46b087e64751db67f020-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})
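A minimal Python analogue (not Spark's actual code) of the "Task ... rejected from java.util.concurrent.ThreadPoolExecutor[Shutting down, ...]" errors above: work submitted to an executor that is already shutting down gets rejected. Java throws RejectedExecutionException; Python's concurrent.futures raises RuntimeError for the same situation.

```python
# Sketch: submitting work after shutdown is rejected, as in the errors above.
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=2)
pool.shutdown(wait=True)  # pool is now "Shutting down" / terminated
try:
    pool.submit(print, "too late")
    err = ""
except RuntimeError as e:
    err = str(e)  # message explains that new futures cannot be scheduled
print(err)
```

This usually means the test harness tore down its Kubernetes client while watch/cleanup tasks were still being queued, not that the tasks themselves were faulty.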
Run / Build modules: pyspark-sql, pyspark-resource, pyspark-testing:
- Process completed with exit code 19.

Run / Build modules: hive - other tests:
- Process completed with exit code 18.
HiveClientSuite.4.0: create client (HiveClientSuite#L64):
java.lang.RuntimeException: [unresolved dependency: org.apache.hive#hive-metastore;4.0.0: not found, unresolved dependency: org.apache.hive#hive-exec;4.0.0: not found, unresolved dependency: org.apache.hive#hive-common;4.0.0: not found, unresolved dependency: org.apache.hive#hive-serde;4.0.0: not found]

Every remaining HiveClientSuite.4.0 test then fails with the same java.lang.NullPointerException: the HiveClient method cannot be invoked because the return value of HiveClientSuite.client() is null (the client above was never created). Listed below as test (suite line): method invoked on the null client.

- createDatabase (#L95): HiveClient.createDatabase(CatalogDatabase, boolean)
- create/get/alter database should pick right user name as owner (#L115): HiveClient.createDatabase(CatalogDatabase, boolean)
- createDatabase with null description (#L134): HiveClient.createDatabase(CatalogDatabase, boolean)
- setCurrentDatabase (#L143): HiveClient.setCurrentDatabase(String)
- getDatabase (#L148): HiveClient.getDatabase(String)
- databaseExists (#L153): HiveClient.databaseExists(String)
- listDatabases (#L158): HiveClient.listDatabases(String)
- alterDatabase (#L162): HiveClient.getDatabase(String)
- dropDatabase (#L184): HiveClient.databaseExists(String)
- createTable (#L204): HiveClient.createTable(CatalogTable, boolean)
- loadTable (#L215): HiveClient.loadTable(String, String, boolean, boolean)
- tableExists (#L220): HiveClient.tableExists(String, String)
- getTable (#L226): HiveClient.getTable(String, String)
- getTableOption (#L230): HiveClient.getTableOption(String, String)
- getTablesByName (#L234): HiveClient.getTablesByName(String, Seq)
- getTablesByName when multiple tables (#L239): HiveClient.getTablesByName(String, Seq)
- getTablesByName when some tables do not exist (#L244): HiveClient.getTablesByName(String, Seq)
- getTablesByName when contains invalid name (#L252): HiveClient.getTablesByName(String, Seq)
- getTablesByName when empty (#L257): HiveClient.getTablesByName(String, Seq)
- alterTable(table: CatalogTable) (#L261): HiveClient.getTable(String, String)
- alterTable - should respect the original catalog table's owner name (#L268): HiveClient.getTable(String, String)
- alterTable(dbName: String, tableName: String, table: CatalogTable) (#L280): HiveClient.getTable(String, String)
- alterTable - rename (#L286): HiveClient.getTable(String, String)
- alterTable - change database (#L299): HiveClient.createDatabase(CatalogDatabase, boolean)
- alterTable - change database and table names (#L312): HiveClient.getTable(String, String)
- listTables(database) (#L323): HiveClient.listTables(String)
- listTables(database, pattern) (#L327): HiveClient.listTables(String, String)
- listTablesByType(database, pattern, tableType) (#L333): HiveClient.listTablesByType(String, String, CatalogTableType)
- dropTable (#L345): HiveClient.dropTable(String, String, boolean, boolean)
- sql create partitioned table (#L392): HiveClient.createTable(CatalogTable, boolean)
- createPartitions (#L402): HiveClient.createPartitions(String, String, Seq, boolean)
- getPartitionNames(catalogTable) (#L407): HiveClient.getTable(String, String)
- getPartitions(db, table, spec) (#L412): HiveClient.getPartitions(String, String, Option)
- getPartitionsByFilter (#L417): HiveClient.getRawHiveTable(String, String)
- getPartition (#L430): HiveClient.getPartition(String, String, Map)
- getPartitionOption(db: String, table: String, spec: TablePartitionSpec) (#L435): HiveClient.getPartitionOption(String, String, Map)
- getPartitionOption(table: CatalogTable, spec: TablePartitionSpec) (#L441): HiveClient.getRawHiveTable(String, String)
- getPartitions(db: String, table: String) (#L446): HiveClient.getPartitions(String, String, Option)
- loadPartition (#L461): HiveClient.loadPartition(String, String, String, LinkedHashMap, boolean, boolean, boolean)
- loadDynamicPartitions (#L475): HiveClient.loadDynamicPartitions(String, String, String, LinkedHashMap, boolean, int)
- renamePartitions (#L481): HiveClient.renamePartitions(String, String, Seq, Seq)
- alterPartitions (#L496): HiveClient.alterPartitions(String, String, Seq)
- dropPartitions (#L511): HiveClient.dropPartitions(String, String, Seq, boolean, boolean, boolean)
- createPartitions if already exists (#L543): HiveClient.dropPartitions(String, String, Seq, boolean, boolean, boolean)
- createFunction (#L564): HiveClient.createFunction(String, CatalogFunction)
- functionExists (#L573): HiveClient.functionExists(String, String)
- renameFunction (#L584): HiveClient.renameFunction(String, String, String)
- alterFunction (#L597): HiveClient.alterFunction(String, CatalogFunction)
- getFunction (#L609): HiveClient.getFunction(String, String)
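A hypothetical sketch (names invented, not Spark's actual code) of the cascade in the HiveClientSuite errors above: the suite's one-time setup fails to resolve the Hive 4.0 jars, so the shared client field is never assigned, and every later test then trips over the missing client instead of failing on its own assertion.

```python
# Sketch of a test suite whose setup failure leaves a shared client unset,
# so every test fails with the Python equivalent of an NPE.
class FakeHiveClientSuite:
    def __init__(self):
        try:
            # stand-in for the dependency resolution that failed in CI
            raise RuntimeError("unresolved dependency: org.apache.hive#hive-metastore;4.0.0")
        except RuntimeError:
            self.client = None  # setup failure leaves the client unset

    def test_create_database(self):
        # analogous to HiveClient.createDatabase(...) invoked on a null client
        return self.client.create_database("default")

suite = FakeHiveClientSuite()
try:
    suite.test_create_database()
except AttributeError as e:
    failure = str(e)
print(failure)  # 'NoneType' object has no attribute 'create_database'
```

In other words, only the first error ("create client") is informative; the 50 NPEs are downstream of it.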
The following jobs all reported the same warning: No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
- Run / Build modules: sql - extended tests
- Run / Build modules: sql - slow tests
- Run / Build modules: sql - other tests
- Run / Build modules: pyspark-core, pyspark-streaming
- Run / Build modules: pyspark-errors
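An illustration (assumed mechanics) of the "No files were found" warning above: the report-upload step globs for `**/target/test-reports/*.xml`, and when a job dies before its tests produce any JUnit XML, the glob matches nothing and the upload is skipped.

```python
# Sketch: an existing but empty report directory yields an empty glob,
# which is what triggers the "No artifacts will be uploaded" warning.
import glob
import os
import tempfile

workspace = tempfile.mkdtemp()
os.makedirs(os.path.join(workspace, "target", "test-reports"))  # exists, but empty
pattern = os.path.join(workspace, "**", "target", "test-reports", "*.xml")
matches = glob.glob(pattern, recursive=True)
print(matches)  # []
```

So these warnings are a symptom of the build failures above, not an independent problem with the upload configuration.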
Artifacts

Produced during runtime (all expired):

Name | Size
---|---
site | 58.5 MB
test-results-catalyst, hive-thriftserver--17-hadoop3-hive2.3 | 199 KB
test-results-core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx--17-hadoop3-hive2.3 | 17.4 KB
test-results-hive-- other tests-17-hadoop3-hive2.3 | 1.31 MB
test-results-hive-- slow tests-17-hadoop3-hive2.3 | 853 KB
test-results-pyspark-connect--8-hadoop3-hive2.3 | 407 KB
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--8-hadoop3-hive2.3 | 1.27 MB
test-results-pyspark-pandas--8-hadoop3-hive2.3 | 1.14 MB
test-results-pyspark-pandas-connect-part0--8-hadoop3-hive2.3 | 978 KB
test-results-pyspark-pandas-connect-part1--8-hadoop3-hive2.3 | 966 KB
test-results-pyspark-pandas-connect-part2--8-hadoop3-hive2.3 | 637 KB
test-results-pyspark-pandas-connect-part3--8-hadoop3-hive2.3 | 326 KB
test-results-pyspark-pandas-slow--8-hadoop3-hive2.3 | 1.85 MB
test-results-pyspark-sql, pyspark-resource, pyspark-testing--8-hadoop3-hive2.3 | 228 KB
test-results-sparkr--8-hadoop3-hive2.3 | 280 KB
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 | 547 KB
unit-tests-log-hive-- other tests-17-hadoop3-hive2.3 | 107 MB
unit-tests-log-pyspark-sql, pyspark-resource, pyspark-testing--8-hadoop3-hive2.3 | 960 MB