[SPARK-47529][DOCS] Use hadoop 3.4.0 in some docs
### What changes were proposed in this pull request?
This PR aims to update the `Hadoop` version referenced in some docs.

### Why are the changes needed?
Currently, the Spark master branch uses Apache Hadoop `3.4.0` by default, so the docs should reference the same version.
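
For reference, one quick way to confirm the default version in a local checkout is to inspect the root `pom.xml`, which declares the `hadoop.version` property used by the `-Dhadoop.version` examples below (a minimal sketch, assuming it is run from the repository root):

    # Print the default Hadoop version declared in Spark's root pom.xml
    grep -m1 '<hadoop.version>' pom.xml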

### Does this PR introduce _any_ user-facing change?
No. This is a doc-only change.

### How was this patch tested?
N/A.

### Was this patch authored or co-authored using generative AI tooling?
No.

Closes apache#45679 from panbingkun/minor_use_hadoop_3.4.

Authored-by: panbingkun <panbingkun@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
panbingkun authored and dongjoon-hyun committed Mar 24, 2024
1 parent c29d132 commit 9d6b9f7
Showing 4 changed files with 4 additions and 4 deletions.
assembly/README (1 addition, 1 deletion)
@@ -9,4 +9,4 @@ This module is off by default. To activate it specify the profile in the command

If you need to build an assembly for a different version of Hadoop the
hadoop-version system property needs to be set as in this example:
-  -Dhadoop.version=3.3.6
+  -Dhadoop.version=3.4.0
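
For illustration only, a hedged sketch of passing that property when building just the assembly module with Spark's Maven wrapper (the `-pl`/`-am` module selection flags are an assumption, not part of this commit):

    # Build the assembly module and the modules it depends on against Hadoop 3.4.0
    ./build/mvn -pl assembly -am -Dhadoop.version=3.4.0 -DskipTests package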
docs/building-spark.md (1 addition, 1 deletion)
@@ -79,7 +79,7 @@ from `hadoop.version`.

Example:

-  ./build/mvn -Pyarn -Dhadoop.version=3.3.0 -DskipTests clean package
+  ./build/mvn -Pyarn -Dhadoop.version=3.4.0 -DskipTests clean package

## Building With Hive and JDBC Support
docs/running-on-kubernetes.md (1 addition, 1 deletion)
@@ -236,7 +236,7 @@ A typical example of this using S3 is via passing the following options:

```
...
-  --packages org.apache.hadoop:hadoop-aws:3.2.2
+  --packages org.apache.hadoop:hadoop-aws:3.4.0
--conf spark.kubernetes.file.upload.path=s3a://<s3-bucket>/path
--conf spark.hadoop.fs.s3a.access.key=...
--conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem
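
To show how those options fit together, a hedged, illustrative `spark-submit` invocation (the API server address, container image, bucket, class, and jar path are placeholders, not part of this commit):

    spark-submit \
      --master k8s://https://<k8s-apiserver-host>:<port> \
      --deploy-mode cluster \
      --name upload-example \
      --packages org.apache.hadoop:hadoop-aws:3.4.0 \
      --conf spark.kubernetes.container.image=<spark-image> \
      --conf spark.kubernetes.file.upload.path=s3a://<s3-bucket>/path \
      --conf spark.hadoop.fs.s3a.access.key=... \
      --conf spark.hadoop.fs.s3a.secret.key=... \
      --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
      --class org.apache.spark.examples.SparkPi \
      /path/to/local/spark-examples.jar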
resource-managers/kubernetes/integration-tests/README.md (1 addition, 1 deletion)
@@ -130,7 +130,7 @@ properties to Maven. For example:

mvn integration-test -am -pl :spark-kubernetes-integration-tests_2.13 \
-Pkubernetes -Pkubernetes-integration-tests \
-  -Phadoop-3 -Dhadoop.version=3.3.6 \
+  -Phadoop-3 -Dhadoop.version=3.4.0 \
-Dspark.kubernetes.test.sparkTgz=spark-4.0.0-SNAPSHOT-bin-example.tgz \
-Dspark.kubernetes.test.imageTag=sometag \
-Dspark.kubernetes.test.imageRepo=docker.io/somerepo \
