Description
Describe the bug, including details regarding any error messages, version, and platform.
It seems that #36211, which updated from PoolThreadCache to PoolArenasCache, has caused our nightly integration tests against previous and current Spark development versions to fail:
- test-conda-python-3.10-spark-master
- test-conda-python-3.8-spark-v3.1.2
- test-conda-python-3.9-spark-v3.2.0
The error:
02:07:30.759 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 24.0 (TID 30) (ab33723e6432 executor driver): java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocatorL$InnerAllocator.threadCache()Lio/netty/buffer/PoolArenasCache;
Spark hasn't yet updated to Netty 4.1.94.Final.
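As a quick way to tell which side of the rename a given classpath is on, here is a minimal diagnostic sketch. The two class names are taken from the rename and the stack trace above; the helper class itself is hypothetical and not part of Arrow or Spark:

```java
// Reports whether the pre-rename or post-rename Netty cache class is visible
// at run time on the current classpath.
public class NettyCacheCheck {
  public static void main(String[] args) {
    String[] candidates = {
        "io.netty.buffer.PoolThreadCache",   // name used by Netty before 4.1.94.Final
        "io.netty.buffer.PoolArenasCache"    // name expected by code built against 4.1.94.Final
    };
    for (String name : candidates) {
      try {
        Class.forName(name, false, NettyCacheCheck.class.getClassLoader());
        System.out.println("present: " + name);
      } catch (ClassNotFoundException e) {
        System.out.println("missing: " + name);
      }
    }
  }
}
```

Run against the Spark executor classpath, this would show whether code compiled against the new Netty can resolve PoolArenasCache at all, which is what the NoSuchMethodError above suggests it cannot.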
I am unsure how we should fix this, but does this mean we break backwards compatibility with previous Spark versions?
Component(s)
Continuous Integration, Java