
Add spark.sql.shuffle.partitions setting to the DefaultSparkConfiguration #76

@Zejnilovic

Description


Background

To lower the run time of tests, add a setting for spark.sql.shuffle.partitions. The default is 200, which makes no sense for Spark-based unit tests that shuffle only a handful of rows.

Feature

Add .config("spark.sql.shuffle.partitions", "1") to the DefaultSparkConfiguration.
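A minimal sketch of where the setting would go when building the test SparkSession. This assumes DefaultSparkConfiguration ends up on a standard SparkSession.Builder; the actual shape of that class in the project may differ.

```scala
import org.apache.spark.sql.SparkSession

// Illustrative sketch only: the exact wiring of DefaultSparkConfiguration
// in the project may differ.
val spark: SparkSession = SparkSession.builder()
  .master("local[*]")
  .appName("unit-tests")
  // With a single shuffle partition, Spark does not schedule 200
  // near-empty tasks for every shuffle over tiny test datasets.
  .config("spark.sql.shuffle.partitions", "1")
  .getOrCreate()
```

Each of those 200 default tasks carries fixed scheduling overhead, which is why shrinking the count helps even though the data volume is unchanged.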

Additional info

I ran the tests a few times. The improvement is not dramatic, but it is consistent.

Spark commons

Without the setting:
sbt test  492.02s user 39.22s system 742% cpu 1:11.54 total
With the setting:
sbt test  305.25s user 18.57s system 556% cpu 58.167 total

Enceladus spark-jobs

Without the setting:
mvn test  881.18s user 87.80s system 327% cpu 4:56.18 total
With the setting:
mvn test  713.31s user 74.34s system 324% cpu 4:02.44 total
