Upgraded to Spark 2.4.5 (kubeflow#798)
liyinan926 authored and breetasinha1109 committed Feb 27, 2020
1 parent 8c49573 commit fc491a2
Showing 17 changed files with 61 additions and 61 deletions.
Dockerfile (2 changes: 1 addition & 1 deletion)

@@ -14,7 +14,7 @@
 # limitations under the License.
 #
 
-ARG SPARK_IMAGE=gcr.io/spark-operator/spark:v2.4.5-SNAPSHOT
+ARG SPARK_IMAGE=gcr.io/spark-operator/spark:v2.4.5
 
 FROM golang:1.12.5-alpine as builder
 ARG DEP_VERSION="0.5.3"
Dockerfile.rh (2 changes: 1 addition & 1 deletion)

@@ -20,7 +20,7 @@
 # 1. Your Docker version is >= 18.09.3
 # 2. export DOCKER_BUILDKIT=1
 
-ARG SPARK_IMAGE=gcr.io/spark-operator/spark:v2.4.4
+ARG SPARK_IMAGE=gcr.io/spark-operator/spark:v2.4.5
 
 FROM golang:1.12.5-alpine as builder
 ARG DEP_VERSION="0.5.3"
README.md (4 changes: 2 additions & 2 deletions)

@@ -45,8 +45,8 @@ The following table lists the most recent few versions of the operator.
 
 | Operator Version | API Version | Kubernetes Version | Base Spark Version | Operator Image Tag |
 | ------------- | ------------- | ------------- | ------------- | ------------- |
-| `latest` (master HEAD) | `v1beta2` | 1.13+ | `2.4.5-SNAPSHOT` | `latest` |
-| `v1beta2-1.0.2-2.4.5-SNAPSHOT` | `v1beta2` | 1.13+ | `2.4.5-SNAPSHOT` | `v1beta2-1.0.2-2.4.5-SNAPSHOT` |
+| `latest` (master HEAD) | `v1beta2` | 1.13+ | `2.4.5` | `latest` |
+| `v1beta2-1.1.0-2.4.5` | `v1beta2` | 1.13+ | `2.4.5` | `v1beta2-1.1.0-2.4.5` |
 | `v1beta2-1.0.1-2.4.4` | `v1beta2` | 1.13+ | `2.4.4` | `v1beta2-1.0.1-2.4.4` |
 | `v1beta2-1.0.0-2.4.4` | `v1beta2` | 1.13+ | `2.4.4` | `v1beta2-1.0.0-2.4.4` |
 | `v1beta1-0.9.0` | `v1beta1` | 1.13+ | `2.4.0` | `v2.4.0-v1beta1-0.9.0` |
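The image-tag scheme in the table above can be checked mechanically. A minimal sketch — the `<apiVersion>-<operatorVersion>-<sparkVersion>` layout is inferred from the `v1beta2` rows; note the older `v1beta1-0.9.0` release uses a different tag layout (`v2.4.0-v1beta1-0.9.0`) and deliberately does not match:

```python
import re

# Inferred tag layout for v1beta2-era releases: <apiVersion>-<operator>-<spark>
TAG_RE = re.compile(r"^(v1beta2)-(\d+\.\d+\.\d+)-(\d+\.\d+\.\d+)$")

def parse_operator_tag(tag):
    """Split an operator image tag such as 'v1beta2-1.1.0-2.4.5'."""
    m = TAG_RE.match(tag)
    if m is None:
        raise ValueError("unrecognized operator tag: " + tag)
    api, operator, spark = m.groups()
    return {"api": api, "operator": operator, "spark": spark}

print(parse_operator_tag("v1beta2-1.1.0-2.4.5"))
# → {'api': 'v1beta2', 'operator': '1.1.0', 'spark': '2.4.5'}
```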
docs/developer-guide.md (2 changes: 1 addition & 1 deletion)

@@ -10,7 +10,7 @@ The easiest way to build the operator without worrying about its dependencies is
 $ docker build -t <image-tag> .
 ```
 
-The operator image is built upon a base Spark image that defaults to `gcr.io/spark-operator/spark:v2.4.4`. If you want to use your own Spark image (e.g., an image with a different version of Spark or some custom dependencies), specify the argument `SPARK_IMAGE` as the following example shows:
+The operator image is built upon a base Spark image that defaults to `gcr.io/spark-operator/spark:v2.4.5`. If you want to use your own Spark image (e.g., an image with a different version of Spark or some custom dependencies), specify the argument `SPARK_IMAGE` as the following example shows:
 
 ```bash
 $ docker build --build-arg SPARK_IMAGE=<your Spark image> -t <image-tag> .
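The developer-guide change above only moves the default base image; overriding it still works the same way. A sketch assuming a local checkout of the repo — `spark-operator:local` is an illustrative tag, not one from the commit, and the command is echoed rather than executed so it can be inspected without Docker installed:

```shell
#!/bin/sh
# New default base image after this commit
SPARK_IMAGE="gcr.io/spark-operator/spark:v2.4.5"
# Hypothetical local tag for the resulting operator image
IMAGE_TAG="spark-operator:local"

# Print the exact build command (run it from the repo root to build for real)
echo "docker build --build-arg SPARK_IMAGE=${SPARK_IMAGE} -t ${IMAGE_TAG} ."
```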
docs/quick-start-guide.md (2 changes: 1 addition & 1 deletion)

@@ -88,7 +88,7 @@ spec:
     labels:
       version: 2.3.0
     memory: 512m
-  image: gcr.io/ynli-k8s/spark:v2.4.4
+  image: gcr.io/ynli-k8s/spark:v2.4.5
   mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
   mainClass: org.apache.spark.examples.SparkPi
   mode: cluster
docs/user-guide.md (16 changes: 8 additions & 8 deletions)

@@ -67,9 +67,9 @@ metadata:
 spec:
   type: Scala
   mode: cluster
-  image: gcr.io/spark/spark:v2.4.4
+  image: gcr.io/spark/spark:v2.4.5
   mainClass: org.apache.spark.examples.SparkPi
-  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
+  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar
 ```
 
 ### Specifying Application Dependencies
@@ -133,7 +133,7 @@ spec:
     coreLimit: 200m
     memory: 512m
     labels:
-      version: 2.4.4
+      version: 2.4.5
     serviceAccount: spark
 ```
 
@@ -153,7 +153,7 @@ spec:
     instances: 1
     memory: 512m
     labels:
-      version: 2.4.4
+      version: 2.4.5
 ```
 
 ### Specifying Extra Java Options
@@ -234,7 +234,7 @@ spec:
       name: "amd.com/gpu" # GPU resource name
       quantity: 1 # number of GPUs to request
     labels:
-      version: 2.4.4
+      version: 2.4.5
     serviceAccount: spark
   executor:
     cores: 1
@@ -258,7 +258,7 @@ spec:
     memory: "512m"
     hostNetwork: true
     labels:
-      version: 2.4.4
+      version: 2.4.5
     serviceAccount: spark
   executor:
     cores: 1
@@ -590,7 +590,7 @@ Note that Python binding for PySpark is available in Apache Spark 2.4.
 
 The operator supports using the Spark metric system to expose metrics to a variety of sinks. Particularly, it is able to automatically configure the metric system to expose metrics to [Prometheus](https://prometheus.io/). Specifically, the field `.spec.monitoring` specifies how application monitoring is handled and particularly how metrics are to be reported. The metric system is configured through the configuration file `metrics.properties`, which gets its content from the field `.spec.monitoring.metricsProperties`. The content of [metrics.properties](../spark-docker/conf/metrics.properties) will be used by default if `.spec.monitoring.metricsProperties` is not specified. You can choose to enable or disable reporting driver and executor metrics using the fields `.spec.monitoring.exposeDriverMetrics` and `.spec.monitoring.exposeExecutorMetrics`, respectively.
 
-Further, the field `.spec.monitoring.prometheus` specifies how metrics are exposed to Prometheus using the [Prometheus JMX exporter](https://github.com/prometheus/jmx_exporter). When `.spec.monitoring.prometheus` is specified, the operator automatically configures the JMX exporter to run as a Java agent. The only required field of `.spec.monitoring.prometheus` is `jmxExporterJar`, which specified the path to the Prometheus JMX exporter Java agent jar in the container. If you use the image `gcr.io/spark-operator/spark:v2.4.4-gcs-prometheus`, the jar is located at `/prometheus/jmx_prometheus_javaagent-0.11.0.jar`. The field `.spec.monitoring.prometheus.port` specifies the port the JMX exporter Java agent binds to and defaults to `8090` if not specified. The field `.spec.monitoring.prometheus.configuration` specifies the content of the configuration to be used with the JMX exporter. The content of [prometheus.yaml](../spark-docker/conf/prometheus.yaml) will be used by default if `.spec.monitoring.prometheus.configuration` is not specified.
+Further, the field `.spec.monitoring.prometheus` specifies how metrics are exposed to Prometheus using the [Prometheus JMX exporter](https://github.com/prometheus/jmx_exporter). When `.spec.monitoring.prometheus` is specified, the operator automatically configures the JMX exporter to run as a Java agent. The only required field of `.spec.monitoring.prometheus` is `jmxExporterJar`, which specified the path to the Prometheus JMX exporter Java agent jar in the container. If you use the image `gcr.io/spark-operator/spark:v2.4.5-gcs-prometheus`, the jar is located at `/prometheus/jmx_prometheus_javaagent-0.11.0.jar`. The field `.spec.monitoring.prometheus.port` specifies the port the JMX exporter Java agent binds to and defaults to `8090` if not specified. The field `.spec.monitoring.prometheus.configuration` specifies the content of the configuration to be used with the JMX exporter. The content of [prometheus.yaml](../spark-docker/conf/prometheus.yaml) will be used by default if `.spec.monitoring.prometheus.configuration` is not specified.
 
 Below is an example that shows how to configure the metric system to expose metrics to Prometheus using the Prometheus JMX exporter. Note that the JMX exporter Java agent jar is listed as a dependency and will be downloaded to where `.spec.dep.jarsDownloadDir` points to in Spark 2.3.x, which is `/var/spark-data/spark-jars` by default. Things are different in Spark 2.4 as dependencies will be downloaded to the local working directory instead in Spark 2.4. A complete example can be found in [examples/spark-pi-prometheus.yaml](../examples/spark-pi-prometheus.yaml).
 
@@ -678,7 +678,7 @@ spec:
   template:
     type: Scala
     mode: cluster
-    image: gcr.io/spark/spark:v2.4.4
+    image: gcr.io/spark/spark:v2.4.5
     mainClass: org.apache.spark.examples.SparkPi
     mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
   driver:
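Pulling the monitoring fields described in the hunks above into one place, a minimal `.spec.monitoring` stanza might look like the following sketch — the jar path is the one the text gives for the `-gcs-prometheus` image, and `port: 8090` restates the documented default, so it could be omitted:

```yaml
monitoring:
  exposeDriverMetrics: true
  exposeExecutorMetrics: true
  prometheus:
    jmxExporterJar: "/prometheus/jmx_prometheus_javaagent-0.11.0.jar"
    port: 8090
```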
docs/volcano-integration.md (10 changes: 5 additions & 5 deletions)

@@ -31,11 +31,11 @@ metadata:
 spec:
   type: Scala
   mode: cluster
-  image: "gcr.io/spark-operator/spark:v2.4.4"
+  image: "gcr.io/spark-operator/spark:v2.4.5"
   imagePullPolicy: Always
   mainClass: org.apache.spark.examples.SparkPi
-  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar"
-  sparkVersion: "2.4.4"
+  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar"
+  sparkVersion: "2.4.5"
   batchScheduler: "volcano" #Note: the batch scheduler name must be specified with `volcano`
   restartPolicy:
     type: Never
@@ -49,7 +49,7 @@ spec:
     coreLimit: "1200m"
     memory: "512m"
    labels:
-      version: 2.4.4
+      version: 2.4.5
     serviceAccount: spark
     volumeMounts:
       - name: "test-volume"
@@ -59,7 +59,7 @@ spec:
     instances: 1
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
     volumeMounts:
       - name: "test-volume"
         mountPath: "/tmp"
examples/spark-pi-configmap.yaml (10 changes: 5 additions & 5 deletions)

@@ -21,11 +21,11 @@ metadata:
 spec:
   type: Scala
   mode: cluster
-  image: "gcr.io/spark-operator/spark:v2.4.4"
+  image: "gcr.io/spark-operator/spark:v2.4.5"
   imagePullPolicy: Always
   mainClass: org.apache.spark.examples.SparkPi
-  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar"
-  sparkVersion: "2.4.4"
+  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar"
+  sparkVersion: "2.4.5"
   restartPolicy:
     type: Never
   volumes:
@@ -37,7 +37,7 @@ spec:
     coreLimit: "1200m"
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
     serviceAccount: spark
     volumeMounts:
       - name: config-vol
@@ -47,7 +47,7 @@ spec:
     instances: 1
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
     volumeMounts:
      - name: config-vol
        mountPath: /opt/spark/mycm
examples/spark-pi-prometheus.yaml (10 changes: 5 additions & 5 deletions)

@@ -22,28 +22,28 @@ metadata:
 spec:
   type: Scala
   mode: cluster
-  image: "gcr.io/spark-operator/spark:v2.4.4-gcs-prometheus"
+  image: "gcr.io/spark-operator/spark:v2.4.5-gcs-prometheus"
   imagePullPolicy: Always
   mainClass: org.apache.spark.examples.SparkPi
-  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar"
+  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar"
   arguments:
   - "100000"
-  sparkVersion: "2.4.4"
+  sparkVersion: "2.4.5"
   restartPolicy:
     type: Never
   driver:
     cores: 1
     coreLimit: "1200m"
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
     serviceAccount: spark
   executor:
     cores: 1
     instances: 1
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
   monitoring:
     exposeDriverMetrics: true
     exposeExecutorMetrics: true
examples/spark-pi-schedule.yaml (10 changes: 5 additions & 5 deletions)

@@ -25,23 +25,23 @@ spec:
   template:
     type: Scala
     mode: cluster
-    image: "gcr.io/spark-operator/spark:v2.4.4"
+    image: "gcr.io/spark-operator/spark:v2.4.5"
     imagePullPolicy: Always
     mainClass: org.apache.spark.examples.SparkPi
-    mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar"
-    sparkVersion: "2.4.4"
+    mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar"
+    sparkVersion: "2.4.5"
     restartPolicy:
       type: Never
     driver:
       cores: 1
       coreLimit: "1200m"
       memory: "512m"
       labels:
-        version: 2.4.4
+        version: 2.4.5
       serviceAccount: spark
     executor:
       cores: 1
       instances: 1
       memory: "512m"
       labels:
-        version: 2.4.4
+        version: 2.4.5
examples/spark-pi.yaml (10 changes: 5 additions & 5 deletions)

@@ -21,11 +21,11 @@ metadata:
 spec:
   type: Scala
   mode: cluster
-  image: "gcr.io/spark-operator/spark:v2.4.4"
+  image: "gcr.io/spark-operator/spark:v2.4.5"
   imagePullPolicy: Always
   mainClass: org.apache.spark.examples.SparkPi
-  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar"
-  sparkVersion: "2.4.4"
+  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar"
+  sparkVersion: "2.4.5"
   restartPolicy:
     type: Never
   volumes:
@@ -38,7 +38,7 @@ spec:
     coreLimit: "1200m"
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
     serviceAccount: spark
     volumeMounts:
       - name: "test-volume"
@@ -48,7 +48,7 @@ spec:
     instances: 1
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
     volumeMounts:
       - name: "test-volume"
        mountPath: "/tmp"
examples/spark-py-pi.yaml (8 changes: 4 additions & 4 deletions)

@@ -25,10 +25,10 @@ spec:
   type: Python
   pythonVersion: "2"
   mode: cluster
-  image: "gcr.io/spark-operator/spark-py:v2.4.4"
+  image: "gcr.io/spark-operator/spark-py:v2.4.5"
   imagePullPolicy: Always
   mainApplicationFile: local:///opt/spark/examples/src/main/python/pi.py
-  sparkVersion: "2.4.4"
+  sparkVersion: "2.4.5"
   restartPolicy:
     type: OnFailure
     onFailureRetries: 3
@@ -40,11 +40,11 @@ spec:
     coreLimit: "1200m"
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
     serviceAccount: spark
   executor:
     cores: 1
     instances: 1
     memory: "512m"
     labels:
-      version: 2.4.4
+      version: 2.4.5
manifest/spark-operator-with-metrics.yaml (8 changes: 4 additions & 4 deletions)

@@ -21,13 +21,13 @@ metadata:
   namespace: spark-operator
   labels:
     app.kubernetes.io/name: sparkoperator
-    app.kubernetes.io/version: v2.4.4-v1beta2
+    app.kubernetes.io/version: v2.4.5-v1beta2
 spec:
   replicas: 1
   selector:
     matchLabels:
       app.kubernetes.io/name: sparkoperator
-      app.kubernetes.io/version: v2.4.4-v1beta2
+      app.kubernetes.io/version: v2.4.5-v1beta2
   strategy:
     type: Recreate
   template:
@@ -38,12 +38,12 @@ spec:
         prometheus.io/path: "/metrics"
       labels:
         app.kubernetes.io/name: sparkoperator
-        app.kubernetes.io/version: v2.4.4-v1beta2
+        app.kubernetes.io/version: v2.4.5-v1beta2
     spec:
       serviceAccountName: sparkoperator
       containers:
       - name: sparkoperator
-        image: gcr.io/spark-operator/spark-operator:v2.4.4-v1beta2-latest
+        image: gcr.io/spark-operator/spark-operator:v2.4.5-v1beta2-latest
         imagePullPolicy: Always
         ports:
         - containerPort: 10254
manifest/spark-operator-with-webhook.yaml (16 changes: 8 additions & 8 deletions)

@@ -21,20 +21,20 @@ metadata:
   namespace: spark-operator
   labels:
     app.kubernetes.io/name: sparkoperator
-    app.kubernetes.io/version: v2.4.4-v1beta2
+    app.kubernetes.io/version: v2.4.5-v1beta2
 spec:
   replicas: 1
   selector:
     matchLabels:
       app.kubernetes.io/name: sparkoperator
-      app.kubernetes.io/version: v2.4.4-v1beta2
+      app.kubernetes.io/version: v2.4.5-v1beta2
   strategy:
     type: Recreate
   template:
     metadata:
       labels:
         app.kubernetes.io/name: sparkoperator
-        app.kubernetes.io/version: v2.4.4-v1beta2
+        app.kubernetes.io/version: v2.4.5-v1beta2
     spec:
       serviceAccountName: sparkoperator
       volumes:
@@ -43,7 +43,7 @@ spec:
           secretName: spark-webhook-certs
       containers:
       - name: sparkoperator
-        image: gcr.io/spark-operator/spark-operator:v2.4.4-v1beta2-latest
+        image: gcr.io/spark-operator/spark-operator:v2.4.5-v1beta2-latest
         imagePullPolicy: Always
         volumeMounts:
         - name: webhook-certs
@@ -62,20 +62,20 @@ metadata:
   namespace: spark-operator
   labels:
     app.kubernetes.io/name: sparkoperator
-    app.kubernetes.io/version: v2.4.4-v1beta2
+    app.kubernetes.io/version: v2.4.5-v1beta2
 spec:
   backoffLimit: 3
   template:
     metadata:
       labels:
         app.kubernetes.io/name: sparkoperator
-        app.kubernetes.io/version: v2.4.4-v1beta2
+        app.kubernetes.io/version: v2.4.5-v1beta2
     spec:
       serviceAccountName: sparkoperator
       restartPolicy: Never
      containers:
       - name: main
-        image: gcr.io/spark-operator/spark-operator:v2.4.4-v1beta2-latest
+        image: gcr.io/spark-operator/spark-operator:v2.4.5-v1beta2-latest
         imagePullPolicy: IfNotPresent
         command: ["/usr/bin/gencerts.sh", "-p"]
 ---
@@ -91,4 +91,4 @@ spec:
       name: webhook
   selector:
     app.kubernetes.io/name: sparkoperator
-    app.kubernetes.io/version: v2.4.4-v1beta2
+    app.kubernetes.io/version: v2.4.5-v1beta2