docs/quick-start-guide.md
+2 -2
@@ -184,9 +184,9 @@ The operator submits the Spark Pi example to run once it receives an event indic
The Kubernetes Operator for Apache Spark comes with an optional mutating admission webhook for customizing Spark driver and executor pods based on the specification in `SparkApplication` objects, e.g., mounting user-specified ConfigMaps and volumes, setting pod affinity/anti-affinity, and adding tolerations.
The webhook requires an X509 certificate for TLS for pod admission requests and responses between the Kubernetes API server and the webhook server running inside the operator. For that, the certificate and key files must be accessible by the webhook server.
- The Spark Operator ships with a tool at `hack/gencerts.sh` for generating the CA and server certificate and putting the certificate and key files into a secret named `spark-webhook-certs` in the namespace `spark-operator`. This secret will be mounted into the Spark Operator pod.
+ The Kubernetes Operator for Spark ships with a tool at `hack/gencerts.sh` for generating the CA and server certificate and putting the certificate and key files into a secret named `spark-webhook-certs` in the namespace `spark-operator`. This secret will be mounted into the operator pod.
- Run the following command to create secret with certificate and key files using Batch Job, and install the Spark Operator Deployment with the mutating admission webhook:
+ Run the following command to create the secret with the certificate and key files using a Batch Job, and install the operator Deployment with the mutating admission webhook:
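For readers following the updated text, a minimal sketch of the install flow is shown below. Only the secret name `spark-webhook-certs`, the namespace `spark-operator`, and the `hack/gencerts.sh` tool are taken from the diff above; the manifest path is an assumption and may differ by operator version.

```bash
# Sketch only: the manifest path is an assumption and may differ by operator version.
# The manifest is expected to run a Batch Job that wraps hack/gencerts.sh to create the
# spark-webhook-certs secret, then deploy the operator with the webhook enabled.
kubectl apply -f manifest/spark-operator-with-webhook.yaml

# Alternatively, generate the certificates directly (default script behavior assumed):
# ./hack/gencerts.sh

# Verify that the certificate secret exists in the spark-operator namespace.
kubectl get secret spark-webhook-certs -n spark-operator
```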
test/e2e/README.md
+1 -1
@@ -21,4 +21,4 @@ Note that all tests are run on a live Kubernetes cluster. After the tests are do
* `basic_test.go`
- This test submits `spark-pi.yaml` contained in the `\examples` using `kubectl`. It then checks that the Spark job successfully completes with the right result of Pi.
+ This test submits `spark-pi.yaml` contained in the `examples` directory. It then checks that the Spark job successfully completes with the right result of Pi.
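A hedged sketch of what `basic_test.go` exercises, for reproducing it by hand. The example path, application name, and driver pod name follow the operator's usual conventions but are assumptions here; the log message is the standard output of Spark's Pi example.

```bash
# Sketch only: path, application name, and driver pod name are assumptions based on the
# operator's usual conventions and may differ in your checkout.
kubectl apply -f examples/spark-pi.yaml

# Wait for the SparkApplication to reach the COMPLETED state.
kubectl get sparkapplication spark-pi -o jsonpath='{.status.applicationState.state}'

# The driver log should contain the computed result, e.g. "Pi is roughly 3.14...".
kubectl logs spark-pi-driver | grep "Pi is roughly"
```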