The following test failed when Auron is enabled.

Stack trace:
```
===== TEST OUTPUT FOR o.a.s.sql.AuronInnerJoinSuite: 'inner join, multiple matches using ShuffledHashJoin (build=right) (whole-stage-codegen on)' =====
- inner join, multiple matches using ShuffledHashJoin (build=right) (whole-stage-codegen off) *** FAILED ***
Exception thrown while executing Spark plan:
ShuffledHashJoin [a#47], [a#58], Inner, BuildRight
:- Exchange hashpartitioning(a#47, 1), ENSURE_REQUIREMENTS, [plan_id=797]
:  +- Project [_1#42 AS a#47, _2#43 AS b#48]
:     +- Filter (_1#42 = 1)
:        +- LocalTableScan [_1#42, _2#43]
+- Exchange hashpartitioning(a#58, 1), ENSURE_REQUIREMENTS, [plan_id=798]
   +- Project [_1#53 AS a#58, _2#54 AS b#59]
      +- Filter (_1#53 = 1)
         +- LocalTableScan [_1#53, _2#54]
== Exception ==
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 48.0 failed 1 times, most recent failure: Lost task 0.0 in stage 48.0 (TID 90) (192.168.78.226 executor driver): java.lang.RuntimeException: called `Result::unwrap()` on an `Err` value: Execution("cannot create execution plan: ArrowError(SchemaError(\"Unable to get field named \\\"#58\\\". Valid fields: [\\\"#47\\\", \\\"#48\\\"]\"))")
at org.apache.auron.jni.JniBridge.callNative(Native Method)
at org.apache.auron.jni.AuronCallNativeWrapper.<init>(AuronCallNativeWrapper.java:94)
at org.apache.spark.sql.auron.NativeHelper$.executeNativePlan(NativeHelper.scala:110)
at org.apache.spark.sql.auron.NativeRDD.compute(NativeRDD.scala:68)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:136)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Driver stacktrace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 48.0 failed 1 times, most recent failure: Lost task 0.0 in stage 48.0 (TID 90) (192.168.78.226 executor driver): java.lang.RuntimeException: called `Result::unwrap()` on an `Err` value: Execution("cannot create execution plan: ArrowError(SchemaError(\"Unable to get field named \\\"#58\\\". Valid fields: [\\\"#47\\\", \\\"#48\\\"]\"))")
at org.apache.auron.jni.JniBridge.callNative(Native Method)
at org.apache.auron.jni.AuronCallNativeWrapper.<init>(AuronCallNativeWrapper.java:94)
at org.apache.spark.sql.auron.NativeHelper$.executeNativePlan(NativeHelper.scala:110)
at org.apache.spark.sql.auron.NativeRDD.compute(NativeRDD.scala:68)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:136)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2668)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2604)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2603)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2603)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1178)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1178)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1178)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2856)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2798)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2787)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:952)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2238)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2259)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2278)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2303)
at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1021)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:406)
at org.apache.spark.rdd.RDD.collect(RDD.scala:1020)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:424)
at org.apache.spark.sql.execution.SparkPlan.executeCollectPublic(SparkPlan.scala:451)
at org.apache.spark.sql.execution.SparkPlanTest$.executePlan(SparkPlanTest.scala:251)
at org.apache.spark.sql.execution.SparkPlanTest$.checkAnswer(SparkPlanTest.scala:211)
at org.apache.spark.sql.execution.SparkPlanTest.doCheckAnswer(SparkPlanTest.scala:94)
at org.apache.spark.sql.execution.SparkPlanTest.checkAnswer2(SparkPlanTest.scala:76)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.$anonfun$testInnerJoin$20(InnerJoinSuite.scala:177)
at org.apache.spark.sql.catalyst.plans.SQLHelper.withSQLConf(SQLHelper.scala:54)
at org.apache.spark.sql.catalyst.plans.SQLHelper.withSQLConf$(SQLHelper.scala:38)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.org$apache$spark$sql$test$SQLTestUtilsBase$$super$withSQLConf(InnerJoinSuite.scala:32)
at org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf(SQLTestUtils.scala:247)
at org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf$(SQLTestUtils.scala:245)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.withSQLConf(InnerJoinSuite.scala:32)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.$anonfun$testInnerJoin$19(InnerJoinSuite.scala:173)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.$anonfun$testInnerJoin$19$adapted(InnerJoinSuite.scala:171)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.$anonfun$testInnerJoin$18(InnerJoinSuite.scala:171)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.$anonfun$testInnerJoin$18$adapted(InnerJoinSuite.scala:170)
at org.apache.spark.sql.test.SQLTestUtils.$anonfun$testWithWholeStageCodegenOnAndOff$3(SQLTestUtils.scala:92)
at org.apache.spark.sql.catalyst.plans.SQLHelper.withSQLConf(SQLHelper.scala:54)
at org.apache.spark.sql.catalyst.plans.SQLHelper.withSQLConf$(SQLHelper.scala:38)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.org$apache$spark$sql$test$SQLTestUtilsBase$$super$withSQLConf(InnerJoinSuite.scala:32)
at org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf(SQLTestUtils.scala:247)
at org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf$(SQLTestUtils.scala:245)
at org.apache.spark.sql.execution.joins.InnerJoinSuite.withSQLConf(InnerJoinSuite.scala:32)
at org.apache.spark.sql.test.SQLTestUtils.$anonfun$testWithWholeStageCodegenOnAndOff$2(SQLTestUtils.scala:92)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
at org.scalatest.Transformer.apply(Transformer.scala:20)
at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:203)
at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:64)
at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:64)
at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
at org.scalatest.Suite.run(Suite.scala:1112)
at org.scalatest.Suite.run$(Suite.scala:1094)
at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:64)
at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:64)
at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
at org.scalatest.Suite.run(Suite.scala:1109)
at org.scalatest.Suite.run$(Suite.scala:1094)
at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
at scala.collection.immutable.List.foreach(List.scala:431)
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
at org.scalatest.tools.Runner$.main(Runner.scala:775)
at org.scalatest.tools.Runner.main(Runner.scala)
Caused by: java.lang.RuntimeException: called `Result::unwrap()` on an `Err` value: Execution("cannot create execution plan: ArrowError(SchemaError(\"Unable to get field named \\\"#58\\\". Valid fields: [\\\"#47\\\", \\\"#48\\\"]\"))")
at org.apache.auron.jni.JniBridge.callNative(Native Method)
at org.apache.auron.jni.AuronCallNativeWrapper.<init>(AuronCallNativeWrapper.java:94)
at org.apache.spark.sql.auron.NativeHelper$.executeNativePlan(NativeHelper.scala:110)
at org.apache.spark.sql.auron.NativeRDD.compute(NativeRDD.scala:68)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:136)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750) (SparkPlanTest.scala:95)
```
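The native error is the most informative part: the right-side join key a#58 (field "#58" in the native schema) is being resolved against the left child's output schema, which only exposes "#47" and "#48". That points at column binding in the NativeShuffledHashJoin conversion rather than at the data itself. For reference, the failing scenario can be approximated standalone. The sketch below is hedged: the data, column names, and configs are inferred from the physical plan above, not copied from AuronInnerJoinSuite, and it assumes a session already wired up with the Auron extension.

```scala
// Hypothetical standalone reproduction, inferred from the physical plan above.
// Assumes the Auron extension is configured on the SparkSession; that wiring
// is deployment-specific and omitted here.
import org.apache.spark.sql.SparkSession

object ShuffledHashJoinRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("auron-shj-repro")
      // Steer the planner toward ShuffledHashJoin, matching the failing plan.
      .config("spark.sql.join.preferSortMergeJoin", "false")
      .config("spark.sql.autoBroadcastJoinThreshold", "-1")
      .config("spark.sql.shuffle.partitions", "1")
      .getOrCreate()
    import spark.implicits._

    // Mirrors LocalTableScan [_1, _2] -> Filter (_1 = 1) -> Project [a, b].
    val left  = Seq((1, 1), (1, 2), (2, 1)).toDF("a", "b").where($"a" === 1)
    val right = Seq((1, 1), (1, 2), (2, 1)).toDF("a", "b").where($"a" === 1)

    // Hint the right side so it becomes the build side (BuildRight).
    val r = right.hint("shuffle_hash")
    left.join(r, left("a") === r("a"), "inner").collect().foreach(println)

    spark.stop()
  }
}
```

The Auron conversion logs and the transformed plan from the same run follow.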
```
26/01/01 23:49:58 INFO BaseSessionStateBuilder$$anon$2: Optimization rule 'org.apache.spark.sql.catalyst.optimizer.ConvertToLocalRelation' is excluded from the optimizer.
26/01/01 23:49:58 INFO BaseSessionStateBuilder$$anon$2: Optimization rule 'org.apache.spark.sql.catalyst.optimizer.ConvertToLocalRelation' is excluded from the optimizer.
26/01/01 23:49:58 INFO AuronColumnarOverrides$$anon$1: Auron convert strategy for current stage:
26/01/01 23:49:58 INFO AuronSparkSessionExtension: + ShuffledHashJoin (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +- Exchange (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +-- Project (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +--- Filter (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +---- LocalTableScan (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +- Exchange (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +-- Project (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +--- Filter (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +---- LocalTableScan (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronColumnarOverrides$$anon$1: Auron convert result for current stage:
26/01/01 23:49:58 INFO AuronSparkSessionExtension: + NativeShuffledHashJoin (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +- NativeShuffleExchange (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +-- NativeProject (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +--- NativeFilter (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +---- ConvertToNative (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +----- LocalTableScan (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +- NativeShuffleExchange (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +-- NativeProject (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +--- NativeFilter (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +---- ConvertToNative (convertible=false, strategy=Default)
26/01/01 23:49:58 INFO AuronSparkSessionExtension: +----- LocalTableScan (convertible=true, strategy=AlwaysConvert)
26/01/01 23:49:58 INFO AuronColumnarOverrides$$anon$1: Transformed spark plan after preColumnarTransitions:
NativeShuffledHashJoin [a#47], [a#58], Inner, BuildRight
:- NativeShuffleExchange hashpartitioning(a#47, 1), ENSURE_REQUIREMENTS, [plan_id=920]
:  +- NativeProject [_1#42 AS a#47, _2#43 AS b#48]
:     +- NativeFilter (_1#42 = 1)
:        +- ConvertToNative
:           +- LocalTableScan [_1#42, _2#43]
+- NativeShuffleExchange hashpartitioning(a#58, 1), ENSURE_REQUIREMENTS, [plan_id=930]
   +- NativeProject [_1#53 AS a#58, _2#54 AS b#59]
      +- NativeFilter (_1#53 = 1)
         +- ConvertToNative
            +- LocalTableScan [_1#53, _2#54]
```
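In the converted plan, attributes are identified by their Spark exprIds (a#47 becomes field "#47", and so on), which is the naming visible in the native plan below. The SchemaError is consistent with the right key "#58" being looked up in the left child's Arrow schema. The snippet below is an illustration only, not Auron's code: it reproduces the shape of the failing lookup with Arrow's Java API, whereas the actual error is raised by arrow-rs inside the native engine.

```scala
// Illustration of the failing field lookup, using Arrow's Java API.
import java.util.Arrays
import org.apache.arrow.vector.types.pojo.{ArrowType, Field, Schema}

object SchemaLookupDemo {
  def main(args: Array[String]): Unit = {
    // Left child's output schema, as printed in the native plan: [#47, #48].
    val leftSchema = new Schema(Arrays.asList(
      Field.nullable("#47", new ArrowType.Int(32, true)),
      Field.nullable("#48", new ArrowType.Int(32, true))))

    // Looking up the right-side key "#58" finds no field, mirroring
    // ArrowError(SchemaError("Unable to get field named \"#58\" ...")).
    val found = leftSchema.getFields.stream().anyMatch(f => f.getName == "#58")
    println(s"field #58 present in left schema: $found") // prints: false
  }
}
```

The job execution and native-side logs follow; note that only the left-side shuffle writers (schema [#47, #48]) start before the failure.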
```
26/01/01 23:49:58 INFO SparkContext: Starting job: apply at OutcomeOf.scala:85
26/01/01 23:49:58 INFO DAGScheduler: Registering RDD 251 (apply at OutcomeOf.scala:85) as input to shuffle 15
26/01/01 23:49:58 INFO DAGScheduler: Got job 34 (apply at OutcomeOf.scala:85) with 1 output partitions
26/01/01 23:49:58 INFO DAGScheduler: Final stage: ResultStage 50 (apply at OutcomeOf.scala:85)
26/01/01 23:49:58 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 49)
26/01/01 23:49:58 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 49)
26/01/01 23:49:58 INFO DAGScheduler: Submitting ShuffleMapStage 49 (MapPartitionsRDD[251] at apply at OutcomeOf.scala:85), which has no missing parents
26/01/01 23:49:58 INFO MemoryStore: Block broadcast_61 stored as values in memory (estimated size 26.3 KiB, free 6.2 GiB)
26/01/01 23:49:58 INFO MemoryStore: Block broadcast_61_piece0 stored as bytes in memory (estimated size 12.5 KiB, free 6.2 GiB)
26/01/01 23:49:58 INFO BlockManagerInfo: Added broadcast_61_piece0 in memory on 192.168.78.226:57318 (size: 12.5 KiB, free: 6.2 GiB)
26/01/01 23:49:58 INFO SparkContext: Created broadcast 61 from broadcast at DAGScheduler.scala:1509
26/01/01 23:49:58 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 49 (MapPartitionsRDD[251] at apply at OutcomeOf.scala:85) (first 15 tasks are for partitions Vector(0, 1))
26/01/01 23:49:58 INFO TaskSchedulerImpl: Adding task set 49.0 with 2 tasks resource profile 0
26/01/01 23:49:58 INFO TaskSetManager: Starting task 0.0 in stage 49.0 (TID 91) (192.168.78.226, executor driver, partition 0, PROCESS_LOCAL, 7576 bytes) taskResourceAssignments Map()
26/01/01 23:49:58 INFO TaskSetManager: Starting task 1.0 in stage 49.0 (TID 92) (192.168.78.226, executor driver, partition 1, PROCESS_LOCAL, 7576 bytes) taskResourceAssignments Map()
26/01/01 23:49:58 INFO Executor: Running task 0.0 in stage 49.0 (TID 91)
26/01/01 23:49:58 INFO Executor: Running task 1.0 in stage 49.0 (TID 92)
26/01/01 23:49:58 WARN AuronCallNativeWrapper: Start executing native plan
26/01/01 23:49:58 WARN AuronCallNativeWrapper: Start executing native plan
2026-01-01 23:49:58.785 (+1.900s) [INFO] [auron::rt:144] (stage: 49, partition: 0, tid: 91) - start executing plan:
ShuffleWriterExec: partitioning=HashPartitioning([Column { name: "#47", index: 0 }], 1), schema=[#47:Int32, #48:Int32]
  ProjectExec [#42@0 AS #47, #43@1 AS #48], schema=[#47:Int32, #48:Int32]
    FilterExec [#42@0 = 1], schema=[#42:Int32, #43:Int32]
      FFIReader, schema=[#42:Int32, #43:Int32]
2026-01-01 23:49:58.785 (+1.900s) [INFO] [auron::rt:144] (stage: 49, partition: 1, tid: 92) - start executing plan:
ShuffleWriterExec: partitioning=HashPartitioning([Column { name: "#47", index: 0 }], 1), schema=[#47:Int32, #48:Int32]
  ProjectExec [#42@0 AS #47, #43@1 AS #48], schema=[#47:Int32, #48:Int32]
    FilterExec [#42@0 = 1], schema=[#42:Int32, #43:Int32]
      FFIReader, schema=[#42:Int32, #43:Int32]
```