forked from apache/spark
Introducing StateSchemaV3 for the TransformWithState operator with HDFSMetadataLog #11
Closed
Force-pushed from `9127c53` to `95d5b2a`.
**ericm-db** pushed a commit that referenced this pull request on Oct 9, 2024:
…r `postgreSQL/float4.sql` and `postgreSQL/int8.sql`

### What changes were proposed in this pull request?

This PR regenerates the Java 21 golden files for `postgreSQL/float4.sql` and `postgreSQL/int8.sql` to fix the Java 21 daily test.

### Why are the changes needed?

Fix the Java 21 daily test:

- https://github.com/apache/spark/actions/runs/10823897095/job/30030200710

```
[info] - postgreSQL/float4.sql *** FAILED *** (1 second, 100 milliseconds)
[info]   postgreSQL/float4.sql
[info]   Expected "...arameters" : {
[info]       "[ansiConfig" : "\"spark.sql.ansi.enabled\"",
[info]       "]expression" : "'N A ...", but got "...arameters" : {
[info]       "[]expression" : "'N A ..." Result did not match for query #11
[info]   SELECT float('N A N') (SQLQueryTestSuite.scala:663)
...
[info] - postgreSQL/int8.sql *** FAILED *** (2 seconds, 474 milliseconds)
[info]   postgreSQL/int8.sql
[info]   Expected "...arameters" : {
[info]       "[ansiConfig" : "\"spark.sql.ansi.enabled\"",
[info]       "]sourceType" : "\"BIG...", but got "...arameters" : {
[info]       "[]sourceType" : "\"BIG..." Result did not match for query apache#66
[info]   SELECT CAST(q1 AS int) FROM int8_tbl WHERE q2 <> 456 (SQLQueryTestSuite.scala:663)
...
[info] *** 2 TESTS FAILED ***
[error] Failed: Total 3559, Failed 2, Errors 0, Passed 3557, Ignored 4
[error] Failed tests:
[error]   org.apache.spark.sql.SQLQueryTestSuite
[error] (sql / Test / test) sbt.TestsFailedException: Tests unsuccessful
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

- Pass GitHub Actions
- Manually checked: `build/sbt "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite"` with Java 21; all tests passed

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#48089 from LuciferYang/SPARK-49578-FOLLOWUP.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
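The failures above are ordinary golden-file churn: the suite compares each query's actual output against a checked-in expected file, and an intentional change to the error-message parameters means those files must be regenerated rather than patched by hand. A minimal Python sketch of the compare-or-regenerate logic (the `REGENERATE_GOLDEN_FILES` flag and file layout here are illustrative, not Spark's actual harness):

```python
import os
from pathlib import Path


def check_against_golden(actual: str, golden_path: Path) -> bool:
    """Compare a query's actual output to its stored golden file.

    With REGENERATE_GOLDEN_FILES=1 set (an illustrative flag, not Spark's
    actual switch), rewrite the golden file instead of comparing -- this is
    how golden-file suites are refreshed after an intentional output change,
    such as a removed error-message parameter.
    """
    if os.environ.get("REGENERATE_GOLDEN_FILES") == "1":
        golden_path.write_text(actual)
        return True
    return golden_path.read_text() == actual
```

In regenerate mode the file is simply overwritten with the new output, which is why a follow-up commit like this one touches only the `.sql.out` files and no production code.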
**ericm-db** pushed a commit that referenced this pull request on Nov 18, 2025:
…e` building

### What changes were proposed in this pull request?

This PR aims to add `libwebp-dev` to recover `spark-rm/Dockerfile` building.

### Why are the changes needed?

The `Apache Spark` release Docker image build has been broken for the last 7 days due to the SparkR package compilation.

- https://github.com/apache/spark/actions/workflows/release.yml
- https://github.com/apache/spark/actions/runs/17425825244

```
#11 559.4 No package 'libwebpmux' found
...
#11 559.4 -------------------------- [ERROR MESSAGE] ---------------------------
#11 559.4 <stdin>:1:10: fatal error: ft2build.h: No such file or directory
#11 559.4 compilation terminated.
#11 559.4 --------------------------------------------------------------------
#11 559.4 ERROR: configuration failed for package 'ragg'
```

### Does this PR introduce _any_ user-facing change?

No, this is a fix for the Apache Spark release tool.

### How was this patch tested?

Manual build.

```
$ cd dev/create-release/spark-rm
$ docker build .
```

**BEFORE**

```
...
Dockerfile:83
--------------------
  82 |     # See more in SPARK-39959, roxygen2 < 7.2.1
  83 | >>> RUN Rscript -e "install.packages(c('devtools', 'knitr', 'markdown', \
  84 | >>>     'rmarkdown', 'testthat', 'devtools', 'e1071', 'survival', 'arrow', \
  85 | >>>     'ggplot2', 'mvtnorm', 'statmod', 'xml2'), repos='https://cloud.r-project.org/')" && \
  86 | >>>     Rscript -e "devtools::install_version('roxygen2', version='7.2.0', repos='https://cloud.r-project.org')" && \
  87 | >>>     Rscript -e "devtools::install_version('lintr', version='2.0.1', repos='https://cloud.r-project.org')" && \
  88 | >>>     Rscript -e "devtools::install_version('pkgdown', version='2.0.1', repos='https://cloud.r-project.org')" && \
  89 | >>>     Rscript -e "devtools::install_version('preferably', version='0.4', repos='https://cloud.r-project.org')"
  90 |
--------------------
ERROR: failed to build: failed to solve:
```

**AFTER**

```
...
 => [ 6/22] RUN add-apt-repository 'deb https://cloud.r-project.org/bin/linux/ubuntu jammy-cran40/'   3.8s
 => [ 7/22] RUN Rscript -e "install.packages(c('devtools', 'knitr', 'markdown', 'rmarkdown', 'testthat', 'devtools', 'e1071', 'survival', 'arrow',   892.2s
 => [ 8/22] RUN add-apt-repository ppa:pypy/ppa   15.3s
...
```

After merging this PR, we can validate it via the daily release dry-run CI.

- https://github.com/apache/spark/actions/workflows/release.yml

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#52290 from dongjoon-hyun/SPARK-53539.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
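The failing `ragg` configure step probes for native libraries via pkg-config, so the fix is a one-package addition to the image's apt dependencies. A sketch of the kind of Dockerfile change involved (the placement alongside other build dependencies is assumed; only `libwebp-dev` is the actual addition named in the PR):

```dockerfile
# Assumed placement: with the image's other native build dependencies.
# libwebp-dev supplies the 'libwebpmux' pkg-config entry that the R 'ragg'
# package's configure step was failing to find.
RUN apt-get update && apt-get install -y libwebp-dev
```

Installing the `-dev` package rather than the runtime library matters here: the configure step needs the headers and pkg-config metadata, not just the shared object.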
**ericm-db** pushed a commit that referenced this pull request on Nov 18, 2025:
…rsion at the end

### What changes were proposed in this pull request?

This PR aims to fix the `spark-rm` Dockerfile to install the pinned `pkgdown` version at the end.

### Why are the changes needed?

Although `pkgdown` is supposed to be `2.0.1`, it gets upgraded by the next package installation, as in the log below. We should install `pkgdown` at the end to make sure the pinned version sticks.

https://github.com/apache/spark/blob/0311f44e33e5cf8ba60ccc330de3df4f688f5847/dev/create-release/spark-rm/Dockerfile#L89

- https://github.com/apache/spark/actions/workflows/release.yml
- https://github.com/apache/spark/actions/runs/19386198324/job/55473421715

```
#11 1007.3 Downloading package from url: https://cloud.r-project.org/src/contrib/Archive/preferably/preferably_0.4.tar.gz
#11 1008.9 pkgdown (2.0.1 -> 2.2.0) [CRAN]
#11 1008.9 Installing 1 packages: pkgdown
#11 1008.9 Installing package into '/usr/local/lib/R/site-library'
#11 1008.9 (as 'lib' is unspecified)
#11 1009.4 trying URL 'https://cloud.r-project.org/src/contrib/pkgdown_2.2.0.tar.gz'
#11 1009.7 Content type 'application/x-gzip' length 1280630 bytes (1.2 MB)
#11 1009.7 ==================================================
#11 1009.7 downloaded 1.2 MB
#11 1009.7
#11 1010.2 * installing *source* package 'pkgdown' ...
#11 1010.2 ** package 'pkgdown' successfully unpacked and MD5 sums checked
#11 1010.2 ** using staged installation
#11 1010.3 ** R
#11 1010.3 ** inst
#11 1010.3 ** byte-compile and prepare package for lazy loading
#11 1013.1 ** help
#11 1013.2 *** installing help indices
#11 1013.2 *** copying figures
#11 1013.2 ** building package indices
#11 1013.5 ** installing vignettes
#11 1013.5 ** testing if installed package can be loaded from temporary location
#11 1013.8 ** testing if installed package can be loaded from final location
#11 1014.1 ** testing if installed package keeps a record of temporary installation path
#11 1014.1 * DONE (pkgdown)
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manual review.

```
$ dev/create-release/do-release-docker.sh -d /tmp/spark-4.1.0 -n -s docs
$ docker run -it --rm --entrypoint /bin/bash spark-rm
spark-rm923a388425fa:/opt/spark-rm/output$ Rscript -e 'installed.packages()' | grep pkgdown | head -n1
pkgdown   "pkgdown"   "/usr/local/lib/R/site-library"   "2.0.1"
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#53083 from dongjoon-hyun/SPARK-54371.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
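The ordering issue generalizes: an exact-version pin via `devtools::install_version` can be silently upgraded by any later install that lists the pinned package as a dependency (here, `preferably` pulls in a newer `pkgdown`), so the pin has to come last. A sketch of the reordered Dockerfile step (package list abbreviated from the original `RUN` chain quoted above; the only change is moving the `pkgdown` pin to the final position):

```dockerfile
# Install everything else first; 'preferably' depends on pkgdown and may
# upgrade it, so the exact-version pin must be the last install.
RUN Rscript -e "devtools::install_version('preferably', version='0.4', repos='https://cloud.r-project.org')" && \
    Rscript -e "devtools::install_version('pkgdown', version='2.0.1', repos='https://cloud.r-project.org')"
```

The alternative would be a lockfile-style resolver (e.g. `renv`), but reordering keeps the release image's existing install mechanism unchanged.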