correct crd docs to be inline with 23.11 release docs #327


Merged: 2 commits, Dec 20, 2023.
Changes from 1 commit
.gitignore (3 changes: 2 additions & 1 deletion)

@@ -14,4 +14,5 @@ crate-hashes.json
 result
 image.tar
 
-tilt_options.json
+tilt_options.json
+/.vscode/settings.json
docs/modules/spark-k8s/pages/crd-reference.adoc (91 changes: 50 additions & 41 deletions)

@@ -21,13 +21,7 @@ Below are listed the CRD fields that can be defined by the user:
 |User-supplied image containing spark-job dependencies that will be copied to the specified volume mount.
 
 |`spec.sparkImage`
-| Spark image which will be deployed to driver and executor pods, which must contain spark environment needed by the job e.g. `docker.stackable.tech/stackable/spark-k8s:3.5.0-stackable0.0.0-dev`. Mandatory.
-
-|`spec.sparkImagePullPolicy`
-| Optional Enum (one of `Always`, `IfNotPresent` or `Never`) that determines the pull policy of the spark job image.
-
-|`spec.sparkImagePullSecrets`
-| An optional list of references to secrets in the same namespace to use for pulling any of the images used by a `SparkApplication` resource. Each reference has a single property (`name`) that must contain a reference to a valid secret.
+| Spark image which will be deployed to driver and executor pods, which must contain spark environment needed by the job. See xref:concepts:product_image_selection.adoc[] for more details of the structure of this property. Mandatory.
 
 |`spec.mainApplicationFile`
 |The actual application file that will be called by `spark-submit`. Mandatory.
@@ -54,63 +48,78 @@ Below are listed the CRD fields that can be defined by the user:
 |A list of excluded packages that is passed directly to `spark-submit`.
 
 |`spec.deps.repositories`
-|A list of repositories that is passed directly to `spark-submit`
+|A list of repositories that is passed directly to `spark-submit`.
 
 |`spec.env`
 |A list of environment variables that will be set in the job, driver and executor pods.
 
 |`spec.volumes`
-|A list of volumes
+|A list of volumes.
 
-|`spec.volumes.name`
-|The volume name
+|`spec.job.configOverrides`
+| See xref:concepts:overrides.adoc[] for more information.
 
-|`spec.volumes.persistentVolumeClaim.claimName`
-|The persistent volume claim backing the volume
+|`spec.job.envOverrides`
+|See xref:concepts:overrides.adoc[] for more information.
 
-|`spec.job.resources`
-|Resources specification for the initiating Job
+|`spec.job.podOverrides`
+|See xref:concepts:overrides.adoc[] for more information.
 
-|`spec.driver.resources`
-|Resources specification for the driver Pod
+|`spec.job.config.logging`
+|Logging specification for the initiating Job. See xref:concepts:logging.adoc[] for details.
 
-|`spec.driver.volumeMounts`
-|A list of mounted volumes for the driver
+|`spec.job.config.resources`
+|Resources specification for the initiating Job.
 
-|`spec.driver.volumeMounts.name`
-|Name of mount
+|`spec.driver.configOverrides`
+| See xref:concepts:overrides.adoc[] for more information.
 
-|`spec.driver.volumeMounts.mountPath`
-|Volume mount path
+|`spec.driver.envOverrides`
+|See xref:concepts:overrides.adoc[] for more information.
 
-|`spec.driver.affinity`
-|Driver Pod placement affinity. See xref:usage-guide/operations/pod-placement.adoc[] for details
+|`spec.driver.podOverrides`
+|See xref:concepts:overrides.adoc[] for more information.
 
-|`spec.driver.logging`
-|Logging aggregation for the driver Pod. See xref:concepts:logging.adoc[] for details
+|`spec.driver.config.affinity`
+|Driver Pod placement affinity. See xref:usage-guide/operations/pod-placement.adoc[] for details.
 
-|`spec.executor.resources`
-|Resources specification for the executor Pods
+|`spec.driver.config.logging`
+|Logging aggregation for the driver Pod. See xref:concepts:logging.adoc[] for details.
 
-|`spec.executor.replicas`
-|Number of executor instances launched for this job.
+|`spec.driver.config.resources`
+|Resources specification for the driver Pod.
 
-|`spec.executor.volumeMounts`
-|A list of mounted volumes for each executor.
+|`spec.driver.config.volumeMounts`
+|A list of mounted volumes for the driver.
 
-|`spec.executor.volumeMounts.name`
-|Name of mount.
+|`spec.executor.configOverrides`
+| See xref:concepts:overrides.adoc[] for more information.
 
-|`spec.executor.volumeMounts.mountPath`
-|Volume mount path.
+|`spec.executor.envOverrides`
+|See xref:concepts:overrides.adoc[] for more information.
 
-|`spec.executor.affinity`
+|`spec.executor.podOverrides`
+|See xref:concepts:overrides.adoc[] for more information.
+
+|`spec.executor.replicas`
+|Number of executor instances launched for this job.
+
+|`spec.executor.config.affinity`
 |Driver Pod placement affinity. See xref:usage-guide/operations/pod-placement.adoc[] for details.
 
-|`spec.executor.logging`
+|`spec.executor.config.logging`
 |Logging aggregation for the executor Pods. See xref:concepts:logging.adoc[] for details.
 
-|`spec.logFileDirectory.bucket`
+|`spec.executor.config.resources`
+|Resources specification for the executor Pods.
+
+|`spec.executor.config.volumeMounts`
+|A list of mounted volumes for each executor.
+
+|`spec.logFileDirectory.s3.bucket`
 |S3 bucket definition where applications should publish events for the Spark History server.
 
-|`spec.logFileDirectory.prefix`
+|`spec.logFileDirectory.s3.prefix`
+|Prefix to use when storing events for the Spark History server.
 
 |`spec.driver.jvmSecurity`
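For orientation, the restructured fields in this diff (overrides and `config.*` sub-sections on `job`, `driver`, and `executor`, plus the `logFileDirectory.s3.*` rename) can be sketched as an abbreviated `SparkApplication` manifest. This is a hypothetical illustration, not taken from the PR: the metadata, file path, resource values, and the `spark-history-bucket` reference name are all assumptions.

```yaml
# Hypothetical sketch of the 23.11-style CRD layout described in the diff.
# All concrete values below are illustrative assumptions.
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplication
metadata:
  name: spark-pi                 # assumed name
spec:
  sparkImage:
    productVersion: 3.5.0        # structured image selection; see product_image_selection docs
  mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py  # assumed path
  job:
    podOverrides: {}             # see overrides docs
    config:
      resources:                 # moved here from spec.job.resources
        cpu:
          min: 250m
          max: 500m
        memory:
          limit: 512Mi
  driver:
    config:
      resources:                 # moved here from spec.driver.resources
        cpu:
          min: 500m
          max: "1"
        memory:
          limit: 1Gi
  executor:
    replicas: 2
    config:
      resources:                 # moved here from spec.executor.resources
        cpu:
          min: 500m
          max: "2"
        memory:
          limit: 2Gi
  logFileDirectory:
    s3:                          # renamed from spec.logFileDirectory.bucket / .prefix
      bucket:
        reference: spark-history-bucket   # assumed S3 bucket resource name
      prefix: eventlogs/
```

The overall shape mirrors the field paths in the table above: per-role tuning now lives under each role's `config` block, while `configOverrides`, `envOverrides`, and `podOverrides` sit directly on the role.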