[MINOR][DOCS] Fix R documentation generation instruction for roxygen2
## What changes were proposed in this pull request?
This PR proposes to pin `roxygen2` to `5.0.1` in `docs/README.md` for SparkR documentation generation.
If I use a higher version and generate the docs, it shows the diff below. Not a big deal, but it bothered me.
```diff
diff --git a/R/pkg/DESCRIPTION b/R/pkg/DESCRIPTION
index 855eb5b..159fca6 100644
--- a/R/pkg/DESCRIPTION
+++ b/R/pkg/DESCRIPTION
@@ -57,6 +57,6 @@ Collate:
'types.R'
'utils.R'
'window.R'
-RoxygenNote: 5.0.1
+RoxygenNote: 6.0.1
VignetteBuilder: knitr
NeedsCompilation: no
```
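To avoid this, the installed `roxygen2` version can be checked and, if needed, pinned before regenerating the docs. A minimal sketch, assuming `devtools` is already installed (the CRAN mirror URL is simply the one used elsewhere in `docs/README.md`):

```sh
# Check which roxygen2 version is currently installed.
$ Rscript -e 'packageVersion("roxygen2")'

# If it is not 5.0.1, pin it so that RoxygenNote in R/pkg/DESCRIPTION is not rewritten.
$ Rscript -e 'devtools::install_version("roxygen2", version = "5.0.1", repos = "http://cran.stat.ucla.edu/")'
```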
## How was this patch tested?
Manually tested. I hit this every time I set up a new environment for Spark development, but I kept forgetting to fix it.
Author: hyukjinkwon <gurwls223@apache.org>
Closes #21020 from HyukjinKwon/minor-r-doc.
(cherry picked from commit 87611bb)
Signed-off-by: hyukjinkwon <gurwls223@apache.org>
The corresponding change to `docs/README.md`:

````diff
@@ -26,5 +26,8 @@
+$ sudo Rscript -e 'devtools::install_version("roxygen2", version = "5.0.1", repos="http://cran.stat.ucla.edu/")'
 ```
 
-(Note: If you are on a system with both Ruby 1.9 and Ruby 2.0 you may need to replace gem with gem2.0)
+Note: If you are on a system with both Ruby 1.9 and Ruby 2.0 you may need to replace gem with gem2.0.
+
+Note: Other versions of roxygen2 might work in SparkR documentation generation but `RoxygenNote` field in `$SPARK_HOME/R/pkg/DESCRIPTION` is 5.0.1, which is updated if the version is mismatched.
 
 ## Generating the Documentation HTML
@@ -62,12 +65,12 @@ $ PRODUCTION=1 jekyll build
 
 ## API Docs (Scaladoc, Javadoc, Sphinx, roxygen2, MkDocs)
 
-You can build just the Spark scaladoc and javadoc by running `build/sbt unidoc` from the `SPARK_HOME` directory.
+You can build just the Spark scaladoc and javadoc by running `build/sbt unidoc` from the `$SPARK_HOME` directory.
 
 Similarly, you can build just the PySpark docs by running `make html` from the
-`SPARK_HOME/python/docs` directory. Documentation is only generated for classes that are listed as
-public in `__init__.py`. The SparkR docs can be built by running `SPARK_HOME/R/create-docs.sh`, and
-the SQL docs can be built by running `SPARK_HOME/sql/create-docs.sh`
+`$SPARK_HOME/python/docs` directory. Documentation is only generated for classes that are listed as
+public in `__init__.py`. The SparkR docs can be built by running `$SPARK_HOME/R/create-docs.sh`, and
+the SQL docs can be built by running `$SPARK_HOME/sql/create-docs.sh`
 after [building Spark](https://github.com/apache/spark#building-spark) first.
 
 When you run `jekyll build` in the `docs` directory, it will also copy over the scaladoc and javadoc for the various
````
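For context, the doc-build commands referenced in the updated section, as a rough sketch assuming Spark has already been built and `$SPARK_HOME` points at the source tree:

```sh
# Build just the Scaladoc and Javadoc with sbt's unidoc task.
cd $SPARK_HOME
build/sbt unidoc

# Build the PySpark docs with Sphinx.
(cd python/docs && make html)

# Build the SparkR docs (uses roxygen2, hence the pinned 5.0.1 version).
R/create-docs.sh

# Build the SQL docs.
sql/create-docs.sh
```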