
Commit 1533087

Add Scala as --build-arg (#1757)
* add scala version choice
* add ; \ fi
* change checksum and removed default scala version
* remove RUN
* add { } and remove old code
* remove 3 duplicated lines
* Add the commit as a comment
* Add back #Fix
* Rename downloaded file as spark.tgz
* Fix doc
* Update specifics.md
* New fix
* Fix wget
* Remove make link to spark
* Set full path to /usr/local/spark
* Change /usr/local/spark to ${SPARK_HOME}
* fix RUN with if
* Remove empty lines
* Update Dockerfile
* Update Dockerfile
* Update Dockerfile

Co-authored-by: Ayaz Salikhov <mathbunnyru@users.noreply.github.com>
1 parent 0be9e33 commit 1533087

File tree

docs/using/specifics.md
pyspark-notebook/Dockerfile

2 files changed: +20 -14 lines changed

2 files changed

+20
-14
lines changed

docs/using/specifics.md

Lines changed: 2 additions & 3 deletions
@@ -46,14 +46,13 @@ You can build a `pyspark-notebook` image (and also the downstream `all-spark-notebook` image)
 
 - `spark_version`: The Spark version to install (`3.3.0`).
 - `hadoop_version`: The Hadoop version (`3.2`).
-- `scala_version`: The Scala version (`2.13`).
+- `scala_version`: The Scala version (`2.13`, optional).
 - `spark_checksum`: The package checksum (`BFE4540...`).
 - `openjdk_version`: The version of the OpenJDK (JRE headless) distribution (`17`).
   - This version needs to match the version supported by the Spark distribution used above.
   - See [Spark Overview](https://spark.apache.org/docs/latest/#downloading) and [Ubuntu packages](https://packages.ubuntu.com/search?keywords=openjdk).
 
-- Starting with _Spark >= 3.2_ the distribution file contains Scala version, hence building older Spark will not work.
-- Building older version requires modification to the Dockerfile or using it's older version of the Dockerfile.
+- Starting with _Spark >= 3.2_ the distribution file might contain Scala version.
 
 For example here is how to build a `pyspark-notebook` image with Spark `3.2.0`, Hadoop `3.2` and OpenJDK `11`.
 
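As a sketch of how these build args could be passed on the command line (not part of this diff; the image tag is illustrative and the checksum is a placeholder — use the real sha512 published on archive.apache.org for the chosen distribution):

    docker build --rm --force-rm \
        -t my-pyspark-notebook:spark-3.2.0 ./pyspark-notebook \
        --build-arg spark_version="3.2.0" \
        --build-arg hadoop_version="3.2" \
        --build-arg scala_version="2.13" \
        --build-arg spark_checksum="<sha512 of spark-3.2.0-bin-hadoop3.2-scala2.13.tgz>" \
        --build-arg openjdk_version="11"

Omitting `--build-arg scala_version` makes the Dockerfile below fall back to the distribution file without the `-scala` suffix.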

pyspark-notebook/Dockerfile

Lines changed: 18 additions & 11 deletions
@@ -17,8 +17,8 @@ USER root
 # (ARGS are in lower case to distinguish them from ENV)
 ARG spark_version="3.3.0"
 ARG hadoop_version="3"
-ARG scala_version="2.13"
-ARG spark_checksum="4c09dac70e22bf1d5b7b2cabc1dd92aba13237f52a5b682c67982266fc7a0f5e0f964edff9bc76adbd8cb444eb1a00fdc59516147f99e4e2ce068420ff4881f0"
+ARG scala_version
+ARG spark_checksum="1e8234d0c1d2ab4462d6b0dfe5b54f2851dcd883378e0ed756140e10adfb5be4123961b521140f580e364c239872ea5a9f813a20b73c69cb6d4e95da2575c29c"
 ARG openjdk_version="17"
 
 ENV APACHE_SPARK_VERSION="${spark_version}" \
@@ -32,22 +32,29 @@ RUN apt-get update --yes && \
 
 # Spark installation
 WORKDIR /tmp
-RUN wget -q "https://archive.apache.org/dist/spark/spark-${APACHE_SPARK_VERSION}/spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}-scala${scala_version}.tgz" && \
-    echo "${spark_checksum} *spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}-scala${scala_version}.tgz" | sha512sum -c - && \
-    tar xzf "spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}-scala${scala_version}.tgz" -C /usr/local --owner root --group root --no-same-owner && \
-    rm "spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}-scala${scala_version}.tgz"
 
-WORKDIR /usr/local
+RUN if [ -z "${scala_version}" ]; then \
+    wget -qO "spark.tgz" "https://archive.apache.org/dist/spark/spark-${APACHE_SPARK_VERSION}/spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz"; \
+  else \
+    wget -qO "spark.tgz" "https://archive.apache.org/dist/spark/spark-${APACHE_SPARK_VERSION}/spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}-scala${scala_version}.tgz"; \
+  fi && \
+  echo "${spark_checksum} *spark.tgz" | sha512sum -c - && \
+  tar xzf "spark.tgz" -C /usr/local --owner root --group root --no-same-owner && \
+  rm "spark.tgz"
 
 # Configure Spark
 ENV SPARK_HOME=/usr/local/spark
 ENV SPARK_OPTS="--driver-java-options=-Xms1024M --driver-java-options=-Xmx4096M --driver-java-options=-Dlog4j.logLevel=info" \
     PATH="${PATH}:${SPARK_HOME}/bin"
 
-RUN ln -s "spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}-scala${scala_version}" spark && \
-    # Add a link in the before_notebook hook in order to source automatically PYTHONPATH
-    mkdir -p /usr/local/bin/before-notebook.d && \
-    ln -s "${SPARK_HOME}/sbin/spark-config.sh" /usr/local/bin/before-notebook.d/spark-config.sh
+RUN if [ -z "${scala_version}" ]; then \
+    ln -s "spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" "${SPARK_HOME}"; \
+  else \
+    ln -s "spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}-scala${scala_version}" "${SPARK_HOME}"; \
+  fi && \
+  # Add a link in the before_notebook hook in order to source automatically PYTHONPATH && \
+  mkdir -p /usr/local/bin/before-notebook.d && \
+  ln -s "${SPARK_HOME}/sbin/spark-config.sh" /usr/local/bin/before-notebook.d/spark-config.sh
 
 # Configure IPython system-wide
 COPY ipython_kernel_config.py "/etc/ipython/"
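Both new RUN instructions branch on whether `scala_version` is empty; the only difference between the branches is the `-scala${scala_version}` suffix in the tarball and directory names. A minimal stand-alone shell sketch of that naming logic (illustrative values, not part of the image):

    spark_version="3.3.0"
    hadoop_version="3"
    scala_version=""   # leave empty to use the default distribution (no -scala suffix), or set e.g. "2.13"
    if [ -z "${scala_version}" ]; then
        dist="spark-${spark_version}-bin-hadoop${hadoop_version}"
    else
        dist="spark-${spark_version}-bin-hadoop${hadoop_version}-scala${scala_version}"
    fi
    echo "${dist}.tgz"   # -> spark-3.3.0-bin-hadoop3.tgz when scala_version is empty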
