
[CI][Java] Integration jobs with Spark fail with NoSuchMethodError:io.netty.buffer.PooledByteBufAllocator #36332

Closed
@raulcd

Description


Describe the bug, including details regarding any error messages, version, and platform.

It seems that #36211, which updated `PoolThreadCache` to `PoolArenasCache`, has caused our nightly integration tests against both previous and current development versions of Spark to fail.

The error:

 02:07:30.759 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 24.0 (TID 30) (ab33723e6432 executor driver): java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator$InnerAllocator.threadCache()Lio/netty/buffer/PoolArenasCache;

Spark hasn't yet updated to Netty 4.1.94.Final.
I am unsure how we should fix this, but does this mean we break backwards compatibility with previous Spark versions?
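For context on why this surfaces as a `NoSuchMethodError` rather than a compile error: the return type is part of the method descriptor the JVM links against, so code compiled expecting `threadCache()Lio/netty/buffer/PoolThreadCache;` fails to link once the return type becomes `PoolArenasCache`, even though the method name is unchanged. One common mitigation (a sketch only, using a hypothetical stand-in class rather than the real Netty allocator, and not necessarily how Arrow would choose to fix it) is to look the method up reflectively, since `Class.getDeclaredMethod` matches on name and parameter types but not on return type:

```java
import java.lang.reflect.Method;

public class ThreadCacheProbe {
    // Hypothetical stand-in for io.netty.buffer.PooledByteBufAllocator's inner
    // allocator; in real Netty the return type of threadCache() changed between
    // releases, which changes the descriptor direct callers are linked against.
    static class Allocator {
        Object threadCache() { return "cache"; }
    }

    // Reflective lookup binds only to the method name and parameter list,
    // so it keeps working regardless of which cache type is returned.
    static Object callThreadCache(Object allocator) {
        try {
            Method m = allocator.getClass().getDeclaredMethod("threadCache");
            m.setAccessible(true);
            return m.invoke(allocator);
        } catch (ReflectiveOperationException e) {
            return null; // method absent or inaccessible: degrade gracefully
        }
    }

    public static void main(String[] args) {
        System.out.println(callThreadCache(new Allocator()));
    }
}
```

The trade-off is losing compile-time checking and paying a small reflection cost, which is why matching the Netty version Spark ships against (or shading Netty) is often preferred instead.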

Component(s)

Continuous Integration, Java
