[HUDI-8543] Fixing SI MDT record generation in MDT to not rely on RDD<WriteStatus> #36689
Triggered via pull request on November 19, 2024 at 08:19
Status: Failure
Total duration: 52m 42s
Artifacts: –
Workflow: bot.yml (on: pull_request)
Jobs:
validate-source (2m 57s)
Matrix: build-flink-java17
Matrix: build-spark-java17
Matrix: docker-java17-test
Matrix: integration-tests
Matrix: test-flink
Matrix: test-hudi-hadoop-mr-and-hudi-java-client
Matrix: test-spark-java-tests
Matrix: test-spark-java11-17-java-tests
Matrix: test-spark-java11-17-scala-tests
Matrix: test-spark-java17-java-tests
Matrix: test-spark-java17-scala-tests
Matrix: test-spark-scala-tests
Matrix: validate-bundles-java11
Matrix: validate-bundles
Annotations: 57 errors and 25 warnings
Errors:
test-spark-scala-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): Process completed with exit code 1.
test-spark-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): Process completed with exit code 1.
build-spark-java17 (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): Process completed with exit code 1.
build-spark-java17 (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-java11-17-scala-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): Process completed with exit code 1.
test-spark-scala-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-scala-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The operation was canceled.
test-spark-scala-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-scala-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The operation was canceled.
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0): Process completed with exit code 1.
test-spark-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The operation was canceled.
test-spark-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): The operation was canceled.
test-spark-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): The operation was canceled.
test-spark-java-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-java-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The operation was canceled.
build-spark-java17 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
build-spark-java17 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): The operation was canceled.
build-spark-java17 (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
build-spark-java17 (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The operation was canceled.
test-spark-java11-17-scala-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-java11-17-scala-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The operation was canceled.
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0): The job was canceled because "scala-2_13_flink1_20_spar" failed.
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0): The operation was canceled.
test-spark-java11-17-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): Process completed with exit code 1.
test-spark-java11-17-java-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The job was canceled because "scala-2_13_spark3_5_hudi-" failed.
test-spark-java11-17-java-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x): The operation was canceled.
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1): Process completed with exit code 1.
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1): The job was canceled because "scala-2_13_flink1_18_spar" failed.
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1): Process completed with exit code 1.
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1): The job was canceled because "scala-2_13_flink1_18_spar" failed.
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1): Process completed with exit code 1.
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1): The job was canceled because "scala-2_13_flink1_18_spar" failed.
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1): The operation was canceled.
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3): The job was canceled because "scala-2_13_flink1_18_spar" failed.
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3): The operation was canceled.
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1): The job was canceled because "scala-2_13_flink1_18_spar" failed.
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1): The operation was canceled.
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4): The job was canceled because "scala-2_13_flink1_18_spar" failed.
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4): The operation was canceled.
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0): docker_test_java17.sh Failed building Hudi with Java 8!
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0): Process completed with exit code 1.
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0): The job was canceled because "scala-2_13_flink1_20_spar" failed.
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0): The operation was canceled.
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0): The job was canceled because "scala-2_13_flink1_20_spar" failed.
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0): The operation was canceled.
integration-tests (spark3.5, spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz): the runner ran out of disk space while writing its own diagnostic log:
System.IO.IOException: No space left on device : '/home/runner/runners/2.320.0/_diag/Worker_20241119-081920-utc.log'
   at System.IO.RandomAccess.WriteAtOffset(SafeFileHandle handle, ReadOnlySpan`1 buffer, Int64 fileOffset)
   at System.IO.Strategies.BufferedFileStreamStrategy.FlushWrite()
   at System.IO.StreamWriter.Flush(Boolean flushStream, Boolean flushEncoder)
   at System.Diagnostics.TextWriterTraceListener.Flush()
   at GitHub.Runner.Common.HostTraceListener.WriteHeader(String source, TraceEventType eventType, Int32 id)
   at GitHub.Runner.Common.HostTraceListener.TraceEvent(TraceEventCache eventCache, String source, TraceEventType eventType, Int32 id, String message)
   at System.Diagnostics.TraceSource.TraceEvent(TraceEventType eventType, Int32 id, String message)
   at GitHub.Runner.Worker.Worker.RunAsync(String pipeIn, String pipeOut)
   at GitHub.Runner.Worker.Program.MainAsync(IHostContext context, String[] args)
The same IOException was raised again while the runner tried to log the error (GitHub.Runner.Common.Tracing.Error) and a third time, unhandled, while the worker shut down and disposed its tracing (GitHub.Runner.Common.TraceManager.Dispose and HostContext.Dispose in GitHub.Runner.Worker.Program.Main). A disk-space mitigation is sketched after the error list.
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): Process completed with exit code 1.
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): Cannot resolve conflicts for overlapping writes
test-spark-java17-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The job was canceled because "scala-2_12_spark3_4_hudi-" failed.
test-spark-java17-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The operation was canceled.
test-spark-java17-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): Cannot resolve conflicts for overlapping writes
test-spark-java17-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x): Process completed with exit code 1.
test-spark-java17-scala-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The job was canceled because "scala-2_12_spark3_4_hudi-" failed.
test-spark-java17-scala-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x): The operation was canceled.
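The integration-tests failure above is an infrastructure problem (the hosted runner's disk filled up) rather than a failure of the change under test. A common mitigation is to reclaim disk space on the runner before the heavy steps run. The step below is only a sketch, assuming the job runs on a standard GitHub-hosted Ubuntu image; it is not taken from the actual bot.yml, and the removed paths are simply the usual large preinstalled toolchains.

```yaml
# Hypothetical cleanup step for the integration-tests job (not from the real bot.yml).
steps:
  - name: Free up runner disk space
    run: |
      # Remove large preinstalled toolchains that a Maven/Spark build does not need,
      # then drop unused Docker data; together this typically frees tens of GB.
      sudo rm -rf /usr/share/dotnet /usr/local/lib/android /opt/ghc
      sudo docker system prune --all --force
      df -h /   # record the remaining space in the job log
  # ... existing checkout, build, and integration-test steps follow ...
```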
Warnings:
The following warning was reported by each of the jobs listed below (an upgrade sketch follows the warnings list): "The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/"
validate-source
test-spark-scala-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
build-spark-java17 (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark-java11-17-scala-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-flink (flink1.14)
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
build-flink-java17 (scala-2.12, flink1.20)
test-flink (flink1.15)
test-flink (flink1.17)
test-flink (flink1.18)
test-flink (flink1.16)
test-spark-java11-17-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-flink (flink1.19)
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
test-hudi-hadoop-mr-and-hudi-java-client (scala-2.12, spark3.5, flink1.20)
test-flink (flink1.20)
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
test-spark-java17-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
The remaining warnings are docker_test_java17.sh progress lines surfaced as annotations:
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0): docker_test_java17.sh Building Hudi with Java 8
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0): docker_test_java17.sh Building Hudi with Java 8
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0): docker_test_java17.sh Done building Hudi with Java 8
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0): docker_test_java17.sh copying hadoop conf
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0): docker_test_java17.sh Building Hudi with Java 8
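The Node.js deprecation warning above goes away once the workflow moves from the v3 to the v4 releases of the flagged actions, which run on Node 20. The excerpt below only illustrates that bump and is not the actual contents of bot.yml; the distribution and Java version shown are assumptions.

```yaml
# Hypothetical excerpt of a job in bot.yml after upgrading the deprecated actions.
steps:
  - uses: actions/checkout@v4        # was actions/checkout@v3 (node16, deprecated)
  - uses: actions/setup-java@v4      # was actions/setup-java@v3 (node16, deprecated)
    with:
      distribution: 'temurin'        # assumed; keep whatever distribution the workflow already uses
      java-version: '17'             # assumed; match the Java version of the job's matrix entry
```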