Commit ef671d4

Keep frames in JavaDoc links, and other small tweaks

1 parent 1bf4112 · commit ef671d4

5 files changed (+75, -37 lines)
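The point of the main change: a JavaDoc link of the form `api/java/<path>/Class.html` opens the class page on its own and drops the JavaDoc navigation frames, while the `index.html?<path>/Class.html` form loads the same page inside the frameset. An illustrative before/after (not itself part of the patch):

```
[JavaRDD](api/java/org/apache/spark/api/java/JavaRDD.html)            <- bare page, frames are lost
[JavaRDD](api/java/index.html?org/apache/spark/api/java/JavaRDD.html) <- same page inside the frameset
```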

docs/js/api-docs.js

Lines changed: 20 additions & 3 deletions

@@ -1,10 +1,27 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
 /* Dynamically injected post-processing code for the API docs */
 
 $(document).ready(function() {
   var annotations = $("dt:contains('Annotations')").next("dd").children("span.name");
-  addBadges(annotations, "AlphaComponent", ":: AlphaComponent ::", "<span class='alphaComponent badge'>Alpha Component</span>");
-  addBadges(annotations, "DeveloperApi", ":: DeveloperApi ::", "<span class='developer badge'>Developer API</span>");
-  addBadges(annotations, "Experimental", ":: Experimental ::", "<span class='experimental badge'>Experimental</span>");
+  addBadges(annotations, "AlphaComponent", ":: AlphaComponent ::", '<span class="alphaComponent badge">Alpha Component</span>');
+  addBadges(annotations, "DeveloperApi", ":: DeveloperApi ::", '<span class="developer badge">Developer API</span>');
+  addBadges(annotations, "Experimental", ":: Experimental ::", '<span class="experimental badge">Experimental</span>');
 });
 
 function addBadges(allAnnotations, name, tag, html) {

docs/js/main.js

Lines changed: 21 additions & 0 deletions

@@ -1,3 +1,23 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+/* Custom JavaScript code in the MarkDown docs */
+
+// Enable language-specific code tabs
 function codeTabs() {
   var counter = 0;
   var langImages = {
@@ -62,6 +82,7 @@ function makeCollapsable(elt, accordionClass, accordionBodyId, title) {
   );
 }
 
+// Enable "view solution" sections (for exercises)
 function viewSolution() {
   var counter = 0
   $("div.solution").each(function() {

docs/mllib-guide.md

Lines changed: 2 additions & 2 deletions

@@ -84,9 +84,9 @@ val vector: Vector = Vectors.dense(array) // a dense vector
 <div data-lang="java" markdown="1">
 
 We used to represent a feature vector by `double[]`, which is replaced by
-[`Vector`](api/scala/index.html#org.apache.spark.mllib.linalg.Vector) in v1.0. Algorithms that used
+[`Vector`](api/java/index.html?org/apache/spark/mllib/linalg/Vector.html) in v1.0. Algorithms that used
 to accept `RDD<double[]>` now take
-`RDD<Vector>`. [`LabeledPoint`](api/scala/index.html#org.apache.spark.mllib.regression.LabeledPoint)
+`RDD<Vector>`. [`LabeledPoint`](api/java/index.html?org/apache/spark/mllib/regression/LabeledPoint.html)
 is now a wrapper of `(double, Vector)` instead of `(double, double[])`. Converting `double[]` to
 `Vector` is straightforward:

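To make "straightforward" concrete, a hedged Java sketch of the conversion (the array contents are illustrative, not from the patch):

```java
import org.apache.spark.mllib.linalg.Vector;
import org.apache.spark.mllib.linalg.Vectors;

double[] array = {1.0, 2.0, 3.0};     // the old double[] feature representation
Vector vector = Vectors.dense(array); // the v1.0 Vector representation
```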
docs/programming-guide.md

Lines changed: 15 additions & 15 deletions

@@ -55,7 +55,7 @@ import org.apache.spark.SparkConf
 Spark {{site.SPARK_VERSION}} works with Java 6 and higher. If you are using Java 8, Spark supports
 [lambda expressions](http://docs.oracle.com/javase/tutorial/java/javaOO/lambdaexpressions.html)
 for concisely writing functions, otherwise you can use the classes in the
-[org.apache.spark.api.java.function](api/java/org/apache/spark/api/java/function/package-summary.html) package.
+[org.apache.spark.api.java.function](api/java/index.html?org/apache/spark/api/java/function/package-summary.html) package.
 
 To write a Spark application in Java, you need to add a dependency on Spark. Spark is available through Maven Central at:
 
@@ -126,8 +126,8 @@ new SparkContext(conf)
 
 <div data-lang="java" markdown="1">
 
-The first thing a Spark program must do is to create a [JavaSparkContext](api/java/org/apache/spark/api/java/JavaSparkContext.html) object, which tells Spark
-how to access a cluster. To create a `SparkContext` you first need to build a [SparkConf](api/java/org/apache/spark/SparkConf.html) object
+The first thing a Spark program must do is to create a [JavaSparkContext](api/java/index.html?org/apache/spark/api/java/JavaSparkContext.html) object, which tells Spark
+how to access a cluster. To create a `SparkContext` you first need to build a [SparkConf](api/java/index.html?org/apache/spark/SparkConf.html) object
 that contains information about your application.
 
 {% highlight java %}
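For readers skimming the diff, a minimal hedged sketch of that setup (app name and master URL are illustrative):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf().setAppName("MyApp").setMaster("local[2]");
JavaSparkContext sc = new JavaSparkContext(conf);
```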
@@ -265,7 +265,7 @@ We describe operations on distributed datasets later on.
 
 **Note:** *In this guide, we'll often use the concise Java 8 lambda syntax to specify Java functions, but
 in older versions of Java you can implement the interfaces in the
-[org.apache.spark.api.java.function](api/java/org/apache/spark/api/java/function/package-summary.html) package.
+[org.apache.spark.api.java.function](api/java/index.html?org/apache/spark/api/java/function/package-summary.html) package.
 We describe [passing functions to Spark](#passing-functions-to-spark) in more detail below.*
 
 </div>
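As a hedged illustration of the two styles the note contrasts (assumes an existing `JavaRDD<String>` named `lines`):

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;

// Java 8 lambda syntax
JavaRDD<Integer> lineLengths = lines.map(s -> s.length());

// Equivalent anonymous inner class for older Java versions
JavaRDD<Integer> lineLengths2 = lines.map(new Function<String, Integer>() {
  public Integer call(String s) { return s.length(); }
});
```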
@@ -546,7 +546,7 @@ def doStuff(rdd: RDD[String]): RDD[String] = {
 
 Spark's API relies heavily on passing functions in the driver program to run on the cluster.
 In Java, functions are represented by classes implementing the interfaces in the
-[org.apache.spark.api.java.function](api/java/org/apache/spark/api/java/function/package-summary.html) package.
+[org.apache.spark.api.java.function](api/java/index.html?org/apache/spark/api/java/function/package-summary.html) package.
 There are two ways to create such functions:
 
 * Implement the Function interfaces in your own class, either as an anonymous inner class or a named one,
@@ -697,7 +697,7 @@ from the Scala standard library. You can simply call `new Tuple2(a, b)` to creat
 its fields later with `tuple._1()` and `tuple._2()`.
 
 RDDs of key-value pairs are represented by the
-[JavaPairRDD](api/java/org/apache/spark/api/java/JavaPairRDD.html) class. You can construct
+[JavaPairRDD](api/java/index.html?org/apache/spark/api/java/JavaPairRDD.html) class. You can construct
 JavaPairRDDs from JavaRDDs using special versions of the `map` operations, like
 `mapToPair` and `flatMapToPair`. The JavaPairRDD will have both standard RDD functions and special
 key-value ones.
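A hedged sketch of the pattern being described (assumes the `lines` RDD from above):

```java
import scala.Tuple2;
import org.apache.spark.api.java.JavaPairRDD;

// mapToPair turns a JavaRDD into a JavaPairRDD; reduceByKey is one of the
// key-value-specific operations it gains.
JavaPairRDD<String, Integer> pairs = lines.mapToPair(s -> new Tuple2<>(s, 1));
JavaPairRDD<String, Integer> counts = pairs.reduceByKey((a, b) -> a + b);
```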
@@ -749,11 +749,11 @@ We could also use `counts.sortByKey()`, for example, to sort the pairs alphabeti
 The following table lists some of the common transformations supported by Spark. Refer to the
 RDD API doc
 ([Scala](api/scala/index.html#org.apache.spark.rdd.RDD),
-[Java](api/java/org/apache/spark/api/java/JavaRDD.html),
+[Java](api/java/index.html?org/apache/spark/api/java/JavaRDD.html),
 [Python](api/python/pyspark.rdd.RDD-class.html))
 and pair RDD functions doc
 ([Scala](api/scala/index.html#org.apache.spark.rdd.PairRDDFunctions),
-[Java](api/java/org/apache/spark/api/java/JavaPairRDD.html))
+[Java](api/java/index.html?org/apache/spark/api/java/JavaPairRDD.html))
 for details.
 
 <table class="table">
@@ -852,11 +852,11 @@ for details.
 The following table lists some of the common actions supported by Spark. Refer to the
 RDD API doc
 ([Scala](api/scala/index.html#org.apache.spark.rdd.RDD),
-[Java](api/java/org/apache/spark/api/java/JavaRDD.html),
+[Java](api/java/index.html?org/apache/spark/api/java/JavaRDD.html),
 [Python](api/python/pyspark.rdd.RDD-class.html))
 and pair RDD functions doc
 ([Scala](api/scala/index.html#org.apache.spark.rdd.PairRDDFunctions),
-[Java](api/java/org/apache/spark/api/java/JavaPairRDD.html))
+[Java](api/java/index.html?org/apache/spark/api/java/JavaPairRDD.html))
 for details.
 
 <table class="table">
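A hedged sketch pairing one transformation with one action, in the spirit of the tables being linked (the file name is illustrative; assumes the `sc` context from above):

```java
import java.util.Arrays;
import org.apache.spark.api.java.JavaRDD;

JavaRDD<String> lines = sc.textFile("data.txt");
JavaRDD<String> words = lines.flatMap(s -> Arrays.asList(s.split(" "))); // transformation: lazy
long numWords = words.count();                                           // action: runs the job
```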
@@ -931,7 +931,7 @@ to persist the dataset on disk, persist it in memory but as serialized Java obje
 replicate it across nodes, or store it off-heap in [Tachyon](http://tachyon-project.org/).
 These levels are set by passing a
 `StorageLevel` object ([Scala](api/scala/index.html#org.apache.spark.storage.StorageLevel),
-[Java](api/java/org/apache/spark/storage/StorageLevel.html),
+[Java](api/java/index.html?org/apache/spark/storage/StorageLevel.html),
 [Python](api/python/pyspark.storagelevel.StorageLevel-class.html))
 to `persist()`. The `cache()` method is a shorthand for using the default storage level,
 which is `StorageLevel.MEMORY_ONLY` (store deserialized objects in memory). The full set of
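As a hedged illustration of passing a level to `persist()` from Java (the RDD name is carried over from the sketch above):

```java
import org.apache.spark.storage.StorageLevel;

// Explicit storage level; words.cache() would be the MEMORY_ONLY shorthand.
words.persist(StorageLevel.MEMORY_AND_DISK());
```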
@@ -1150,7 +1150,7 @@ accum.value();
 {% endhighlight %}
 
 While this code used the built-in support for accumulators of type Integer, programmers can also
-create their own types by subclassing [AccumulatorParam](api/java/org/apache/spark/AccumulatorParam.html).
+create their own types by subclassing [AccumulatorParam](api/java/index.html?org/apache/spark/AccumulatorParam.html).
 The AccumulatorParam interface has two methods: `zero` for providing a "zero value" for your data
 type, and `addInPlace` for adding two values together. For example, supposing we had a `Vector` class
 representing mathematical vectors, we could write:
@@ -1166,10 +1166,10 @@ class VectorAccumulatorParam implements AccumulatorParam<Vector> {
 }
 
 // Then, create an Accumulator of this type:
-Accumulator<Vector> vecAccum = sc.accumulator(new Vector(...))(new VectorAccumulatorParam());
+Accumulator<Vector> vecAccum = sc.accumulator(new Vector(...), new VectorAccumulatorParam());
 {% endhighlight %}
 
-In Java, Spark also supports the more general [Accumulable](api/java/org/apache/spark/Accumulable.html)
+In Java, Spark also supports the more general [Accumulable](api/java/index.html?org/apache/spark/Accumulable.html)
 interface to accumulate data where the resulting type is not the same as the elements added (e.g. build
 a list by collecting together elements).
 
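This is one of the "small tweaks": the old snippet used Scala's curried call syntax, which is not valid Java. A hedged sketch of the corrected call (`sc` is a JavaSparkContext; `Vector` and `VectorAccumulatorParam` are the guide's hypothetical classes):

```java
import org.apache.spark.Accumulator;

// Pass the AccumulatorParam as a second argument rather than as a curried call:
Accumulator<Vector> vecAccum =
    sc.accumulator(new Vector(/* ... */), new VectorAccumulatorParam());
```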
@@ -1205,7 +1205,7 @@ class VectorAccumulatorParam(AccumulatorParam):
         return v1
 
 # Then, create an Accumulator of this type:
-vecAccum = sc.accumulator(Vector(...))(VectorAccumulatorParam())
+vecAccum = sc.accumulator(Vector(...), VectorAccumulatorParam())
 {% endhighlight %}
 
 </div>

docs/streaming-programming-guide.md

Lines changed: 17 additions & 17 deletions

@@ -136,7 +136,7 @@ The complete code can be found in the Spark Streaming example
 <div data-lang="java" markdown="1">
 
 First, we create a
-[JavaStreamingContext](api/java/org/apache/spark/streaming/api/java/JavaStreamingContext.html) object,
+[JavaStreamingContext](api/java/index.html?org/apache/spark/streaming/api/java/JavaStreamingContext.html) object,
 which is the main entry point for all streaming
 functionality. Besides Spark's configuration, we specify that any DStream would be processed
 in 1 second batches.
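A minimal hedged sketch of that setup (master and app name are illustrative):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount");
JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(1000)); // 1-second batches
```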
@@ -215,7 +215,7 @@ jssc.awaitTermination(); // Wait for the computation to terminate
 {% endhighlight %}
 
 The complete code can be found in the Spark Streaming example
-[JavaNetworkWordCount]({{site.SPARK_GITHUB_URL}}/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaNetworkWordCount.java).
+[JavaNetworkWordCount]({{site.SPARK_GITHUB_URL}}/blob/master/examples/src/main/java/index.html?org/apache/spark/examples/streaming/JavaNetworkWordCount.java).
 <br>
 
 </div>
@@ -813,8 +813,8 @@ output operators are defined:
 The complete list of DStream operations is available in the API documentation. For the Scala API,
 see [DStream](api/scala/index.html#org.apache.spark.streaming.dstream.DStream)
 and [PairDStreamFunctions](api/scala/index.html#org.apache.spark.streaming.dstream.PairDStreamFunctions).
-For the Java API, see [JavaDStream](api/java/org/apache/spark/streaming/api/java/JavaDStream.html)
-and [JavaPairDStream](api/java/org/apache/spark/streaming/api/java/JavaPairDStream.html).
+For the Java API, see [JavaDStream](api/java/index.html?org/apache/spark/streaming/api/java/JavaDStream.html)
+and [JavaPairDStream](api/java/index.html?org/apache/spark/streaming/api/java/JavaPairDStream.html).
 
 ## Persistence
 Similar to RDDs, DStreams also allow developers to persist the stream's data in memory. That is,
@@ -876,7 +876,7 @@ sending the data to two destinations (i.e., the earlier and upgraded application
 
 - The existing application is shutdown gracefully (see
   [`StreamingContext.stop(...)`](api/scala/index.html#org.apache.spark.streaming.StreamingContext)
-  or [`JavaStreamingContext.stop(...)`](api/java/org/apache/spark/streaming/api/java/JavaStreamingContext.html)
+  or [`JavaStreamingContext.stop(...)`](api/java/index.html?org/apache/spark/streaming/api/java/JavaStreamingContext.html)
   for graceful shutdown options) which ensure data that have been received is completely
   processed before shutdown. Then the
 upgraded application can be started, which will start processing from the same point where the earlier
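A hedged sketch of such a shutdown, assuming the two-argument `stop(stopSparkContext, stopGracefully)` overload described in the linked API docs (check the docs for the exact signature in your version):

```java
// Stop the streaming computation gracefully so received data is fully
// processed, but leave the underlying SparkContext running.
jssc.stop(false, true); // stopSparkContext = false, stopGracefully = true
```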
@@ -1311,10 +1311,10 @@ This section elaborates the steps required to migrate your existing code to 1.0.
 `FlumeUtils.createStream`, etc.) now returns
 [InputDStream](api/scala/index.html#org.apache.spark.streaming.dstream.InputDStream) /
 [ReceiverInputDStream](api/scala/index.html#org.apache.spark.streaming.dstream.ReceiverInputDStream)
-(instead of DStream) for Scala, and [JavaInputDStream](api/java/org/apache/spark/streaming/api/java/JavaInputDStream.html) /
-[JavaPairInputDStream](api/java/org/apache/spark/streaming/api/java/JavaPairInputDStream.html) /
-[JavaReceiverInputDStream](api/java/org/apache/spark/streaming/api/java/JavaReceiverInputDStream.html) /
-[JavaPairReceiverInputDStream](api/java/org/apache/spark/streaming/api/java/JavaPairReceiverInputDStream.html)
+(instead of DStream) for Scala, and [JavaInputDStream](api/java/index.html?org/apache/spark/streaming/api/java/JavaInputDStream.html) /
+[JavaPairInputDStream](api/java/index.html?org/apache/spark/streaming/api/java/JavaPairInputDStream.html) /
+[JavaReceiverInputDStream](api/java/index.html?org/apache/spark/streaming/api/java/JavaReceiverInputDStream.html) /
+[JavaPairReceiverInputDStream](api/java/index.html?org/apache/spark/streaming/api/java/JavaPairReceiverInputDStream.html)
 (instead of JavaDStream) for Java. This ensures that functionality specific to input streams can
 be added to these classes in the future without breaking binary compatibility.
 Note that your existing Spark Streaming applications should not require any change
@@ -1365,14 +1365,14 @@ package and renamed for better clarity.
   [ZeroMQUtils](api/scala/index.html#org.apache.spark.streaming.zeromq.ZeroMQUtils$), and
   [MQTTUtils](api/scala/index.html#org.apache.spark.streaming.mqtt.MQTTUtils$)
 - Java docs
-  * [JavaStreamingContext](api/java/org/apache/spark/streaming/api/java/JavaStreamingContext.html),
-    [JavaDStream](api/java/org/apache/spark/streaming/api/java/JavaDStream.html) and
-    [PairJavaDStream](api/java/org/apache/spark/streaming/api/java/PairJavaDStream.html)
-  * [KafkaUtils](api/java/org/apache/spark/streaming/kafka/KafkaUtils.html),
-    [FlumeUtils](api/java/org/apache/spark/streaming/flume/FlumeUtils.html),
-    [TwitterUtils](api/java/org/apache/spark/streaming/twitter/TwitterUtils.html),
-    [ZeroMQUtils](api/java/org/apache/spark/streaming/zeromq/ZeroMQUtils.html), and
-    [MQTTUtils](api/java/org/apache/spark/streaming/mqtt/MQTTUtils.html)
+  * [JavaStreamingContext](api/java/index.html?org/apache/spark/streaming/api/java/JavaStreamingContext.html),
+    [JavaDStream](api/java/index.html?org/apache/spark/streaming/api/java/JavaDStream.html) and
+    [PairJavaDStream](api/java/index.html?org/apache/spark/streaming/api/java/PairJavaDStream.html)
+  * [KafkaUtils](api/java/index.html?org/apache/spark/streaming/kafka/KafkaUtils.html),
+    [FlumeUtils](api/java/index.html?org/apache/spark/streaming/flume/FlumeUtils.html),
+    [TwitterUtils](api/java/index.html?org/apache/spark/streaming/twitter/TwitterUtils.html),
+    [ZeroMQUtils](api/java/index.html?org/apache/spark/streaming/zeromq/ZeroMQUtils.html), and
+    [MQTTUtils](api/java/index.html?org/apache/spark/streaming/mqtt/MQTTUtils.html)
 
 * More examples in [Scala]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/scala/org/apache/spark/examples/streaming)
   and [Java]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/java/org/apache/spark/examples/streaming)
