Commit d885503

Additional security statement updates
1 parent 83acd5d commit d885503

2 files changed: +4 -6 lines changed


docs/quick-start.md

Lines changed: 0 additions & 5 deletions

@@ -32,11 +32,6 @@ you can download a package for any version of Hadoop.
 
 Note that, before Spark 2.0, the main programming interface of Spark was the Resilient Distributed Dataset (RDD). After Spark 2.0, RDDs were replaced by Dataset, which is strongly typed like an RDD but with richer optimizations under the hood. The RDD interface is still supported, and you can find a more detailed reference in the [RDD programming guide](rdd-programming-guide.html). However, we highly recommend you switch to Dataset, which has better performance than RDD. See the [SQL programming guide](sql-programming-guide.html) for more information about Dataset.
 
-# Security
-
-Security in Spark is OFF by default. This could mean you are vulnerable to attack by default.
-Please see [Spark Security](security.html) before running Spark.
-
 # Interactive Analysis with the Spark Shell
 
 ## Basics
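The context line above refers to the Dataset API that superseded RDDs as Spark's primary interface. As an illustrative aside, not part of this commit, here is a minimal Scala sketch of using a Dataset from the spark-shell, assuming the shell's predefined `spark` session and a local README.md as input:

```scala
// In the Scala spark-shell, a SparkSession named `spark` is predefined.
// README.md is only an illustrative input file.
val textFile = spark.read.textFile("README.md")          // Dataset[String]
textFile.count()                                          // number of lines in the file
textFile.filter(line => line.contains("Spark")).count()  // lines mentioning "Spark"
```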

docs/security.md

Lines changed: 4 additions & 1 deletion

@@ -23,7 +23,10 @@ license: |
 
 # Spark Security: Things You Need To Know
 
-Security in Spark is OFF by default. This could mean you are vulnerable to attack by default.
+Security features like authentication are not enabled by default. When deploying a cluster that is open to the internet
+or an untrusted network, it's important to secure access to the cluster to prevent unauthorized applications
+from running on the cluster.
+
 Spark supports multiple deployment types and each one supports different levels of security. Not
 all deployment types will be secure in all environments and none are secure by default. Be
 sure to evaluate your environment, what Spark supports, and take the appropriate measure to secure
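As context for the reworded warning, a hedged sketch of the kind of setting it points at: enabling authentication for a Spark application via `SparkConf`. The property names (`spark.authenticate`, `spark.authenticate.secret`, `spark.network.crypto.enabled`) are standard Spark options; the application name and the `SPARK_SECRET` environment variable are illustrative assumptions, and how the secret is distributed depends on the deployment (see security.html).

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Hedged sketch: turn on RPC authentication and wire encryption.
// "my-app" and the SPARK_SECRET environment variable are placeholders.
val conf = new SparkConf()
  .setAppName("my-app")
  .set("spark.authenticate", "true")                          // require authentication for internal connections
  .set("spark.authenticate.secret", sys.env("SPARK_SECRET"))  // shared secret, supplied outside the code
  .set("spark.network.crypto.enabled", "true")                // encrypt RPC traffic using that secret

val spark = SparkSession.builder().config(conf).getOrCreate()
```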

0 commit comments
