Clarify memory requirements for Spark OSS with Comet vs Spark OSS #595

@lmouhib

Description

What is the problem the feature request solves?

In the published benchmark comparing Spark with Comet against Spark OSS, the executor memory when using Comet for acceleration is double that of Spark OSS (64G vs 32G). Could we provide documentation explaining how Comet impacts memory usage?
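For illustration, here is a minimal sketch of the kind of executor memory split such documentation could explain, assuming the doubled figure reflects Comet's native operators needing off-heap memory in addition to the regular JVM heap. The specific sizes and the split shown are assumptions for the example, not the benchmark's actual settings:

```scala
import org.apache.spark.sql.SparkSession

// Hedged sketch: where executor memory goes when Comet is enabled.
// Assumption: the 64G-vs-32G gap comes from Comet's native execution
// allocating buffers off-heap, on top of the usual JVM heap.
val spark = SparkSession.builder()
  .appName("comet-memory-sketch")
  // Load the Comet plugin and enable native execution.
  .config("spark.plugins", "org.apache.spark.CometPlugin")
  .config("spark.comet.enabled", "true")
  .config("spark.comet.exec.enabled", "true")
  // Off-heap memory used by Comet's native operators; the 32g value is
  // purely illustrative of where the "extra" memory is spent.
  .config("spark.memory.offHeap.enabled", "true")
  .config("spark.memory.offHeap.size", "32g")
  // JVM heap for the parts of the job still executed by vanilla Spark.
  .config("spark.executor.memory", "32g")
  .getOrCreate()
```

Documentation that walks through this split would help users decide whether to raise the JVM heap, the off-heap allocation, or both when turning Comet on, rather than simply doubling the executor memory.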

Describe the potential solution

Clarifying this requirement would give users a better experience when setting up and configuring their Spark jobs.

Additional context

No response

Labels: enhancement (New feature or request)
