[HUDI-7096] Improving incremental query to fetch partitions based on commit metadata #29525
Triggered via pull request on November 22, 2023 01:11
Status: Success
Total duration: 2h 46m 43s
Artifacts: –
bot.yml
on: pull_request
validate-source (23s)
Matrix: docker-java17-test
Matrix: integration-tests
Matrix: test-flink
Matrix: test-hudi-hadoop-mr-and-hudi-java-client
Matrix: test-spark-java17
Matrix: test-spark
Matrix: validate-bundles
Matrix: validate-release-candidate-bundles
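
Each Matrix: entry above fans one job out over combinations of Flink, Spark, and Scala profiles (the cells named in the annotations below). For orientation, here is a rough local stand-in in shell that loops a build over a few of the same profile pairs; the -Dspark3.x and -Dscala-2.12 switches are assumptions about the Hudi Maven build, not values copied from bot.yml:

```bash
#!/usr/bin/env bash
# Rough local stand-in for the CI matrix: loop the build over profile pairs.
# The -D profile switches are assumptions about the Hudi build, not from bot.yml.
set -euo pipefail

SCALA="scala-2.12"
for SPARK in spark3.3 spark3.4 spark3.5; do
  echo "=== building for $SPARK / $SCALA ==="
  mvn clean install -DskipTests "-D$SPARK" "-D$SCALA" -pl hudi-spark-datasource -am
done
```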
Annotations
8 errors and 80 warnings
Errors (8). Every failing job reports the same message, "Cannot resolve conflicts for overlapping writes":
test-spark-java17 (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
test-spark-java17 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
test-spark-java17 (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark (scala-2.11, spark2.4, hudi-spark-datasource/hudi-spark2)
test-spark (scala-2.12, spark3.1, hudi-spark-datasource/hudi-spark3.1.x)
test-spark (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
test-spark (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
test-spark (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
Warnings (80 total). The two docker-java17-test jobs, (flink1.18, spark3.4, spark3.4.0) and (flink1.18, spark3.5, spark3.5.0), each logged the same ten lines:
docker_test_java17.sh Building Hudi with Java 8
docker_test_java17.sh Done building Hudi with Java 8
docker_test_java17.sh copying hadoop conf
docker_test_java17.sh starting hadoop hdfs
docker_test_java17.sh starting datanode:1
docker_test_java17.sh starting datanode:2
docker_test_java17.sh starting datanode:3
docker_test_java17.sh starting hadoop hdfs, hdfs report
docker_test_java17.sh Running tests with Java 17
docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
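
These docker-java17-test lines trace a three-phase flow: build Hudi with Java 8, copy the Hadoop config and bring up a small HDFS cluster (a namenode plus three datanodes), then run the Maven tests under Java 17. A minimal shell sketch of that flow follows; the JDK paths, config locations, and module selection are assumptions for illustration, not taken from the real docker_test_java17.sh:

```bash
#!/usr/bin/env bash
# Sketch of the docker-java17-test flow; every path and flag below is an
# assumption for illustration, not read from the real docker_test_java17.sh.
set -euo pipefail

# Phase 1: build Hudi with Java 8.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # assumed JDK location
echo "Building Hudi with Java 8"
mvn clean install -DskipTests
echo "Done building Hudi with Java 8"

# Phase 2: copy the Hadoop config and bring up a small HDFS cluster.
echo "copying hadoop conf"
cp -r ./docker/hadoop/conf /opt/hadoop/etc/hadoop    # assumed source/target paths

echo "starting hadoop hdfs"
hdfs namenode -format -nonInteractive
hdfs --daemon start namenode
for i in 1 2 3; do
  echo "starting datanode:$i"
  # on a single host each datanode needs its own conf/data dirs (assumed layout)
  HADOOP_CONF_DIR="/opt/hadoop/conf-dn$i" hdfs --daemon start datanode
done
hdfs dfsadmin -report    # "starting hadoop hdfs, hdfs report"

# Phase 3: run the Maven tests under Java 17 against the Java 8 build.
echo "Running tests with Java 17"
export JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64  # assumed JDK location
mvn test -pl hudi-spark-datasource/hudi-spark3.4.x -am   # assumed module selection
```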
The six validate-bundles jobs, (flink1.15, spark3.3, spark3.3.1), (flink1.16, spark3.3, spark3.3.2), (flink1.17, spark3.3, spark3.3.2), (flink1.18, spark3.3, spark3.3.2), (flink1.18, spark3.4, spark3.4.0), and (flink1.18, spark3.5, spark3.5.0), each logged the same ten lines:
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
Use default java runtime under /opt/java/openjdk
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate.sh validating utilities slim bundle
(2 jobs x 10 docker-java17-test lines plus 6 jobs x 10 validate-bundles lines account for the 80 warnings.)
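
The validate-bundles lines follow a single control flow per matrix cell: validate the spark and hadoop-mr bundles (stand up a Hive metastore, write sample data through the Spark DataSource with Hive Sync, query it back via both Spark SQL and HiveQL), skip the full utilities bundle on builds other than spark2.4 and spark3.1, and finish with the utilities slim bundle. A condensed shell sketch of that flow follows; the helper names, jar variables, and the $SPARK_PROFILE switch are assumptions, not the real validate.sh:

```bash
#!/usr/bin/env bash
# Condensed sketch of the validate-bundles flow; helper names, jar variables,
# and $SPARK_PROFILE are assumptions, not read from the real validate.sh.
set -euo pipefail

validate_spark_hadoop_mr_bundles() {
  echo "validating spark & hadoop-mr bundle"
  echo "setting up hive metastore for spark & hadoop-mr bundles validation"
  hive --service metastore &                                         # metastore in the background
  echo "Writing sample data via Spark DataSource and run Hive Sync..."
  spark-shell --jars "$SPARK_BUNDLE_JAR" < write_sample_data.scala   # assumed script
  echo "Query and validate the results using Spark SQL"
  spark-sql --jars "$SPARK_BUNDLE_JAR" -f query_spark_sql.sql        # assumed script
  echo "Query and validate the results using HiveQL"
  hive --auxpath "$HADOOP_MR_BUNDLE_JAR" -f query_hiveql.sql         # assumed script
  echo "spark & hadoop-mr bundles validation was successful."
  echo "done validating spark & hadoop-mr bundle"
}

validate_spark_hadoop_mr_bundles

# The full utilities bundle is only exercised on spark2.4 / spark3.1 builds.
if [[ "$SPARK_PROFILE" == "spark2.4" || "$SPARK_PROFILE" == "spark3.1" ]]; then
  echo "validating utilities bundle"
  # ... full hudi-utilities-bundle checks would run here ...
else
  echo "skip validating utilities bundle for non-spark2.4 & non-spark3.1 build"
fi

echo "validating utilities slim bundle"
# the slim bundle rides alongside the matching spark bundle on the classpath (assumed)
spark-submit --jars "$SPARK_BUNDLE_JAR" \
  --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  "$UTILITIES_SLIM_BUNDLE_JAR" --help
```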