@@ -6,7 +6,7 @@ title: Accessing OpenStack Swift from Spark
Spark's support for Hadoop InputFormat allows it to process data in OpenStack Swift using the
same URI formats as in Hadoop. You can specify a path in Swift as input through a
URI of the form <code>swift://container.PROVIDER/path</code>. You will also need to set your
- Swift security credentials, through <code>core-sites.xml</code> or via
+ Swift security credentials, through <code>core-site.xml</code> or via
<code>SparkContext.hadoopConfiguration</code>.
The current Swift driver requires Swift to use the Keystone authentication method.
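
As a sketch of the above, a Swift path can be read like any other Hadoop-compatible URI. The container name <code>my-container</code>, provider name <code>SparkTest</code>, and file path below are hypothetical placeholders, not names defined by this guide:

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

// "my-container" and "SparkTest" are placeholders: an existing Swift
// container and a provider configured in core-site.xml, respectively.
val sc = new SparkContext(new SparkConf().setAppName("SwiftRead"))
val lines = sc.textFile("swift://my-container.SparkTest/data/sample.txt")
println(lines.count())
{% endhighlight %}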
@@ -37,7 +37,7 @@ For example, for Maven support, add the following to the <code>pom.xml</code> fi
# Configuration Parameters

- Create <code>core-sites.xml</code> and place it inside <code>/spark/conf</code> directory.
+ Create <code>core-site.xml</code> and place it inside the <code>/spark/conf</code> directory.
There are two main categories of parameters that should be configured: the declaration of the
Swift driver and the parameters required by Keystone.
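
As an illustration of the first category, the Swift driver is declared by mapping the <code>swift://</code> scheme to the filesystem implementation class shipped in the <code>hadoop-openstack</code> module. The fragment below is a sketch of what that declaration typically looks like, not a verbatim excerpt from this guide:

{% highlight xml %}
<!-- Sketch: binds the swift:// scheme to the hadoop-openstack driver -->
<property>
  <name>fs.swift.impl</name>
  <value>org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem</value>
</property>
{% endhighlight %}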
@@ -100,7 +100,7 @@ contains a list of Keystone mandatory parameters. <code>PROVIDER</code> can be a
</table>

For example, assume <code>PROVIDER=SparkTest</code> and Keystone contains user <code>tester</code> with password <code>testing</code>
- defined for tenant <code>test</code>. Than <code>core-sites.xml</code> should include:
+ defined for tenant <code>test</code>. Then <code>core-site.xml</code> should include:

{% highlight xml %}
<configuration>
@@ -146,7 +146,7 @@ Notice that
<code>fs.swift.service.PROVIDER.tenant</code>,
<code>fs.swift.service.PROVIDER.username</code>,
<code>fs.swift.service.PROVIDER.password</code> contain sensitive information and keeping them in
- <code>core-sites.xml</code> is not always a good approach.
- We suggest to keep those parameters in <code>core-sites.xml</code> for testing purposes when running Spark
+ <code>core-site.xml</code> is not always a good approach.
+ We suggest keeping those parameters in <code>core-site.xml</code> for testing purposes when running Spark
via <code>spark-shell</code>.
For job submissions, they should be provided via <code>sparkContext.hadoopConfiguration</code>.
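
For example, the Keystone parameters can be set programmatically before any Swift path is read. The provider name <code>SparkTest</code> and the <code>auth.url</code> endpoint below are hypothetical placeholders; the tenant, user, and password values reuse the illustrative credentials from the example above:

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("SwiftJob"))

// Hypothetical provider "SparkTest" with placeholder Keystone endpoint and
// credentials; substitute the values for your own deployment.
val hadoopConf = sc.hadoopConfiguration
hadoopConf.set("fs.swift.service.SparkTest.auth.url", "http://127.0.0.1:5000/v2.0/tokens")
hadoopConf.set("fs.swift.service.SparkTest.tenant", "test")
hadoopConf.set("fs.swift.service.SparkTest.username", "tester")
hadoopConf.set("fs.swift.service.SparkTest.password", "testing")
{% endhighlight %}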