[SPARK-53539][INFRA][4.0] Add libwebp-dev to recover spark-rm/Dockerfile building #52339

Closed

peter-toth wants to merge 1 commit into apache:branch-4.0 from peter-toth/SPARK-53539-4.0
Conversation
[SPARK-53539][INFRA][4.0] Add `libwebp-dev` to recover `spark-rm/Dockerfile` building

### What changes were proposed in this pull request?

This PR aims to add `libwebp-dev` to recover `spark-rm/Dockerfile` building.

### Why are the changes needed?

`Apache Spark` release docker image compilation has been broken for the last 7 days due to the SparkR package compilation.

- https://github.com/apache/spark/actions/workflows/release.yml
- https://github.com/apache/spark/actions/runs/17425825244

```
#11 559.4 No package 'libwebpmux' found
...
#11 559.4 -------------------------- [ERROR MESSAGE] ---------------------------
#11 559.4 <stdin>:1:10: fatal error: ft2build.h: No such file or directory
#11 559.4 compilation terminated.
#11 559.4 --------------------------------------------------------------------
#11 559.4 ERROR: configuration failed for package 'ragg'
```

### Does this PR introduce _any_ user-facing change?

No, this is a fix for the Apache Spark release tool.

### How was this patch tested?

Manual build:

```
$ cd dev/create-release/spark-rm
$ docker build .
```

**BEFORE**

```
...
Dockerfile:83
--------------------
  82 |     # See more in SPARK-39959, roxygen2 < 7.2.1
  83 | >>> RUN Rscript -e "install.packages(c('devtools', 'knitr', 'markdown', \
  84 | >>>     'rmarkdown', 'testthat', 'devtools', 'e1071', 'survival', 'arrow', \
  85 | >>>     'ggplot2', 'mvtnorm', 'statmod', 'xml2'), repos='https://cloud.r-project.org/')" && \
  86 | >>>     Rscript -e "devtools::install_version('roxygen2', version='7.2.0', repos='https://cloud.r-project.org')" && \
  87 | >>>     Rscript -e "devtools::install_version('lintr', version='2.0.1', repos='https://cloud.r-project.org')" && \
  88 | >>>     Rscript -e "devtools::install_version('pkgdown', version='2.0.1', repos='https://cloud.r-project.org')" && \
  89 | >>>     Rscript -e "devtools::install_version('preferably', version='0.4', repos='https://cloud.r-project.org')"
  90 |
--------------------
ERROR: failed to build: failed to solve: ...
```

**AFTER**

```
...
 => [ 6/22] RUN add-apt-repository 'deb https://cloud.r-project.org/bin/linux/ubuntu jammy-cran40/'   3.8s
 => [ 7/22] RUN Rscript -e "install.packages(c('devtools', 'knitr', 'markdown', 'rmarkdown', 'testthat', 'devtools', 'e1071', 'survival', 'arrow',   892.2s
 => [ 8/22] RUN add-apt-repository ppa:pypy/ppa   15.3s
...
```

After merging this PR, we can validate via the daily release dry-run CI.

- https://github.com/apache/spark/actions/workflows/release.yml

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #52290 from dongjoon-hyun/SPARK-53539.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
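For context, the shape of the fix is a one-package addition to the image's apt dependencies. A minimal sketch of such a `RUN` step follows; the layer layout and the cleanup step are assumptions for illustration, not the actual diff:

```dockerfile
# Sketch only: libwebp-dev supplies the 'libwebpmux' pkg-config entry that the
# R 'ragg' package's configure step looks for. The freetype headers (ft2build.h,
# from libfreetype6-dev) are assumed to come from the image's existing
# dependency list rather than this step.
RUN apt-get update && \
    apt-get install -y libwebp-dev && \
    rm -rf /var/lib/apt/lists/*
```

In practice the new package would simply be appended to the Dockerfile's existing `apt-get install` list rather than added as a separate layer.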
yaooqinn approved these changes on Sep 15, 2025.
peter-toth (Contributor, Author): @dongjoon-hyun, while I was preparing the 3.5.7 RC, I ran into the same issue you fixed on `master`. So let's cherry-pick this fix to `branch-4.0`.
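Backporting a fix like this to a release branch is a standard `git cherry-pick`. A self-contained sketch of the flow, using a throwaway repository with made-up branch, file, and identity names (not Spark's actual process):

```shell
set -e
# Throwaway repo simulating the backport flow; all names are illustrative.
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b master repo
cd repo
git -c user.email=rm@example.com -c user.name=rm commit -q --allow-empty -m "base"
git branch branch-4.0                     # release branch forks from the base
echo "libwebp-dev" >> apt-packages.txt    # the fix lands on master first
git add apt-packages.txt
git -c user.email=rm@example.com -c user.name=rm commit -q -m "[SPARK-53539] Add libwebp-dev"
fix_sha=$(git rev-parse HEAD)
git checkout -q branch-4.0                # switch to the release branch
git -c user.email=rm@example.com -c user.name=rm cherry-pick "$fix_sha" >/dev/null
git log --oneline -1                      # the fix now sits on branch-4.0
```

For the real backport, the cherry-picked commit is then pushed to a topic branch and opened as a PR against the release branch, exactly as done here.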
dongjoon-hyun added a commit that referenced this pull request on Sep 15, 2025:
[SPARK-53539][INFRA][4.0] Add `libwebp-dev` to recover `spark-rm/Dockerfile` building (the commit message repeats the PR description above).

Closes #52339 from peter-toth/SPARK-53539-4.0.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
dongjoon-hyun (Member): Merged to `branch-4.0`.
peter-toth (Contributor, Author): Thanks @yaooqinn and @dongjoon-hyun for the review.
zifeif2 pushed a commit to zifeif2/spark that referenced this pull request on Nov 14, 2025:
[SPARK-53539][INFRA][4.0] Add `libwebp-dev` to recover `spark-rm/Dockerfile` building (the commit message repeats the PR description above).

Closes apache#52339 from peter-toth/SPARK-53539-4.0.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>