
Commit acef8cd

Sphinx-doc absolute path for auto_example images
1 parent 90fe87f commit acef8cd

7 files changed: 35 additions, 37 deletions
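Taken together, the commit rewrites Sphinx image references from source-relative paths to paths resolved from the documentation source root, so the same directive works regardless of how deeply the including ``.rst`` file is nested. A minimal sketch of the two forms (paths taken from the diff; the explanatory comment lines are illustrative, not part of the original sources):

```rst
.. Relative form: resolved against the directory of the including file,
   so it breaks if the directive is moved or included from elsewhere.

.. image:: ../../auto_examples/cluster/images/sphx_glr_plot_cluster_iris_002.png
   :target: ../../auto_examples/cluster/plot_cluster_iris.html

.. Root-absolute form: a leading "/" makes Sphinx resolve the path from
   the documentation source directory, independent of the including file.

.. image:: /auto_examples/cluster/images/sphx_glr_plot_cluster_iris_002.png
   :target: ../../auto_examples/cluster/plot_cluster_iris.html
```

Note that most ``:target:`` options stay relative: Sphinx rewrites image paths when it copies the files, but the target value lands in the generated HTML essentially as-is, so a root-absolute target would presume the final site layout.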

doc/modules/linear_model.rst

Lines changed: 1 addition & 1 deletion

@@ -1081,7 +1081,7 @@ and :class:`RANSACRegressor` because it does not ignore the effect of the outliers

 but gives a lesser weight to them.

 .. figure:: /auto_examples/linear_model/images/sphx_glr_plot_huber_vs_ridge_001.png
-   :target: /auto_examples/linear_model/plot_huber_vs_ridge.html
+   :target: ../auto_examples/linear_model/plot_huber_vs_ridge.html
    :align: center
    :scale: 50%

doc/tutorial/basic/tutorial.rst

Lines changed: 1 addition & 1 deletion

@@ -190,7 +190,7 @@ which we have not used to train the classifier::

 The corresponding image is the following:

-.. image:: ../../auto_examples/datasets/images/sphx_glr_plot_digits_last_image_001.png
+.. image:: /auto_examples/datasets/images/sphx_glr_plot_digits_last_image_001.png
    :target: ../../auto_examples/datasets/plot_digits_last_image.html
    :align: center
    :scale: 50

doc/tutorial/statistical_inference/model_selection.rst

Lines changed: 1 addition & 3 deletions

@@ -183,7 +183,7 @@ scoring method.

 .. topic:: **Exercise**
    :class: green

-   .. image:: ../../auto_examples/exercises/images/sphx_glr_plot_cv_digits_001.png
+   .. image:: /auto_examples/exercises/images/sphx_glr_plot_cv_digits_001.png
       :target: ../../auto_examples/exercises/plot_cv_digits.html
       :align: right
       :scale: 90

@@ -290,5 +290,3 @@ appended to their name.

       :lines: 17-24

    **Solution:** :ref:`sphx_glr_auto_examples_exercises_plot_cv_diabetes.py`
-
-

doc/tutorial/statistical_inference/putting_together.rst

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@ Pipelining

 We have seen that some estimators can transform data and that some estimators
 can predict variables. We can also create combined estimators:

-.. image:: ../../auto_examples/images/sphx_glr_plot_digits_pipe_001.png
+.. image:: /auto_examples/images/sphx_glr_plot_digits_pipe_001.png
    :target: ../../auto_examples/plot_digits_pipe.html
    :scale: 65
    :align: right

doc/tutorial/statistical_inference/settings.rst

Lines changed: 1 addition & 1 deletion

@@ -31,7 +31,7 @@ needs to be preprocessed in order to be used by scikit-learn.

 .. topic:: An example of reshaping data would be the digits dataset

-   .. image:: ../../auto_examples/datasets/images/sphx_glr_plot_digits_last_image_001.png
+   .. image:: /auto_examples/datasets/images/sphx_glr_plot_digits_last_image_001.png
       :target: ../../auto_examples/datasets/plot_digits_last_image.html
       :align: right
       :scale: 60

doc/tutorial/statistical_inference/supervised_learning.rst

Lines changed: 16 additions & 16 deletions

@@ -38,7 +38,7 @@ Nearest neighbor and the curse of dimensionality

 .. topic:: Classifying irises:

-   .. image:: ../../auto_examples/datasets/images/sphx_glr_plot_iris_dataset_001.png
+   .. image:: /auto_examples/datasets/images/sphx_glr_plot_iris_dataset_001.png
       :target: ../../auto_examples/datasets/plot_iris_dataset.html
       :align: right
       :scale: 65

@@ -75,7 +75,7 @@ Scikit-learn documentation for more information about this type of classifier.)

 **KNN (k nearest neighbors) classification example**:

-.. image:: ../../auto_examples/neighbors/images/sphx_glr_plot_classification_001.png
+.. image:: /auto_examples/neighbors/images/sphx_glr_plot_classification_001.png
    :target: ../../auto_examples/neighbors/plot_classification.html
    :align: center
    :scale: 70

@@ -159,7 +159,7 @@ in its simplest form, fits a linear model to the data set by adjusting

 a set of parameters in order to make the sum of the squared residuals
 of the model as small as possible.

-.. image:: ../../auto_examples/linear_model/images/sphx_glr_plot_ols_001.png
+.. image:: /auto_examples/linear_model/images/sphx_glr_plot_ols_001.png
    :target: ../../auto_examples/linear_model/plot_ols.html
    :scale: 40
    :align: right

@@ -200,7 +200,7 @@ Shrinkage

 If there are few data points per dimension, noise in the observations
 induces high variance:

-.. image:: ../../auto_examples/linear_model/images/sphx_glr_plot_ols_ridge_variance_001.png
+.. image:: /auto_examples/linear_model/images/sphx_glr_plot_ols_ridge_variance_001.png
    :target: ../../auto_examples/linear_model/plot_ols_ridge_variance.html
    :scale: 70
    :align: right

@@ -229,7 +229,7 @@ regression coefficients to zero: any two randomly chosen set of

 observations are likely to be uncorrelated. This is called :class:`Ridge`
 regression:

-.. image:: ../../auto_examples/linear_model/images/sphx_glr_plot_ols_ridge_variance_002.png
+.. image:: /auto_examples/linear_model/images/sphx_glr_plot_ols_ridge_variance_002.png
    :target: ../../auto_examples/linear_model/plot_ols_ridge_variance.html
    :scale: 70
    :align: right

@@ -275,15 +275,15 @@ Sparsity

 ----------


-.. |diabetes_ols_1| image:: ../../auto_examples/linear_model/images/sphx_glr_plot_ols_3d_001.png
+.. |diabetes_ols_1| image:: /auto_examples/linear_model/images/sphx_glr_plot_ols_3d_001.png
    :target: ../../auto_examples/linear_model/plot_ols_3d.html
    :scale: 65

-.. |diabetes_ols_3| image:: ../../auto_examples/linear_model/images/sphx_glr_plot_ols_3d_003.png
+.. |diabetes_ols_3| image:: /auto_examples/linear_model/images/sphx_glr_plot_ols_3d_003.png
    :target: ../../auto_examples/linear_model/plot_ols_3d.html
    :scale: 65

-.. |diabetes_ols_2| image:: ../../auto_examples/linear_model/images/sphx_glr_plot_ols_3d_002.png
+.. |diabetes_ols_2| image:: /auto_examples/linear_model/images/sphx_glr_plot_ols_3d_002.png
    :target: ../../auto_examples/linear_model/plot_ols_3d.html
    :scale: 65

@@ -350,7 +350,7 @@ application of Occam's razor: *prefer simpler models*.

 Classification
 ---------------

-.. image:: ../../auto_examples/linear_model/images/sphx_glr_plot_logistic_001.png
+.. image:: /auto_examples/linear_model/images/sphx_glr_plot_logistic_001.png
    :target: ../../auto_examples/linear_model/plot_logistic.html
    :scale: 65
    :align: right

@@ -377,7 +377,7 @@ function or **logistic** function:

 This is known as :class:`LogisticRegression`.

-.. image:: ../../auto_examples/linear_model/images/sphx_glr_plot_iris_logistic_001.png
+.. image:: /auto_examples/linear_model/images/sphx_glr_plot_iris_logistic_001.png
    :target: ../../auto_examples/linear_model/plot_iris_logistic.html
    :scale: 83

@@ -425,11 +425,11 @@ the separating line (less regularization).

 .. currentmodule :: sklearn.svm

-.. |svm_margin_unreg| image:: ../../auto_examples/svm/images/sphx_glr_plot_svm_margin_001.png
+.. |svm_margin_unreg| image:: /auto_examples/svm/images/sphx_glr_plot_svm_margin_001.png
    :target: ../../auto_examples/svm/plot_svm_margin.html
    :scale: 70

-.. |svm_margin_reg| image:: ../../auto_examples/svm/images/sphx_glr_plot_svm_margin_002.png
+.. |svm_margin_reg| image:: /auto_examples/svm/images/sphx_glr_plot_svm_margin_002.png
    :target: ../../auto_examples/svm/plot_svm_margin.html
    :scale: 70

@@ -476,11 +476,11 @@ build a decision function that is not linear but may be polynomial instead.

 This is done using the *kernel trick* that can be seen as
 creating a decision energy by positioning *kernels* on observations:

-.. |svm_kernel_linear| image:: ../../auto_examples/svm/images/sphx_glr_plot_svm_kernels_001.png
+.. |svm_kernel_linear| image:: /auto_examples/svm/images/sphx_glr_plot_svm_kernels_001.png
    :target: ../../auto_examples/svm/plot_svm_kernels.html
    :scale: 65

-.. |svm_kernel_poly| image:: ../../auto_examples/svm/images/sphx_glr_plot_svm_kernels_002.png
+.. |svm_kernel_poly| image:: /auto_examples/svm/images/sphx_glr_plot_svm_kernels_002.png
    :target: ../../auto_examples/svm/plot_svm_kernels.html
    :scale: 65

@@ -518,7 +518,7 @@ creating a decision energy by positioning *kernels* on observations:



-.. |svm_kernel_rbf| image:: ../../auto_examples/svm/images/sphx_glr_plot_svm_kernels_003.png
+.. |svm_kernel_rbf| image:: /auto_examples/svm/images/sphx_glr_plot_svm_kernels_003.png
    :target: ../../auto_examples/svm/plot_svm_kernels.html
    :scale: 65

@@ -551,7 +551,7 @@ creating a decision energy by positioning *kernels* on observations:

 ``svm_gui.py``; add data points of both classes with right and left button,
 fit the model and change parameters and data.

-.. image:: ../../auto_examples/datasets/images/sphx_glr_plot_iris_dataset_001.png
+.. image:: /auto_examples/datasets/images/sphx_glr_plot_iris_dataset_001.png
    :target: ../../auto_examples/datasets/plot_iris_dataset.html
    :align: right
    :scale: 70

doc/tutorial/statistical_inference/unsupervised_learning.rst

Lines changed: 14 additions & 14 deletions

@@ -24,7 +24,7 @@ Note that there exist a lot of different clustering criteria and associated

 algorithms. The simplest clustering algorithm is
 :ref:`k_means`.

-.. image:: ../../auto_examples/cluster/images/sphx_glr_plot_cluster_iris_002.png
+.. image:: /auto_examples/cluster/images/sphx_glr_plot_cluster_iris_002.png
    :target: ../../auto_examples/cluster/plot_cluster_iris.html
    :scale: 70
    :align: right

@@ -45,15 +45,15 @@ algorithms. The simplest clustering algorithm is

 >>> print(y_iris[::10])
 [0 0 0 0 0 1 1 1 1 1 2 2 2 2 2]

-.. |k_means_iris_bad_init| image:: ../../auto_examples/cluster/images/sphx_glr_plot_cluster_iris_003.png
+.. |k_means_iris_bad_init| image:: /auto_examples/cluster/images/sphx_glr_plot_cluster_iris_003.png
    :target: ../../auto_examples/cluster/plot_cluster_iris.html
    :scale: 63

-.. |k_means_iris_8| image:: ../../auto_examples/cluster/images/sphx_glr_plot_cluster_iris_001.png
+.. |k_means_iris_8| image:: /auto_examples/cluster/images/sphx_glr_plot_cluster_iris_001.png
    :target: ../../auto_examples/cluster/plot_cluster_iris.html
    :scale: 63

-.. |cluster_iris_truth| image:: ../../auto_examples/cluster/images/sphx_glr_plot_cluster_iris_004.png
+.. |cluster_iris_truth| image:: /auto_examples/cluster/images/sphx_glr_plot_cluster_iris_004.png
    :target: ../../auto_examples/cluster/plot_cluster_iris.html
    :scale: 63

@@ -85,27 +85,27 @@ algorithms. The simplest clustering algorithm is

 **Don't over-interpret clustering results**

-.. |face| image:: ../../auto_examples/cluster/images/sphx_glr_plot_face_compress_001.png
+.. |face| image:: /auto_examples/cluster/images/sphx_glr_plot_face_compress_001.png
    :target: ../../auto_examples/cluster/plot_face_compress.html
    :scale: 60

-.. |face_regular| image:: ../../auto_examples/cluster/images/sphx_glr_plot_face_compress_002.png
+.. |face_regular| image:: /auto_examples/cluster/images/sphx_glr_plot_face_compress_002.png
    :target: ../../auto_examples/cluster/plot_face_compress.html
    :scale: 60

-.. |face_compressed| image:: ../../auto_examples/cluster/images/sphx_glr_plot_face_compress_003.png
+.. |face_compressed| image:: /auto_examples/cluster/images/sphx_glr_plot_face_compress_003.png
    :target: ../../auto_examples/cluster/plot_face_compress.html
    :scale: 60

-.. |face_histogram| image:: ../../auto_examples/cluster/images/sphx_glr_plot_face_compress_004.png
+.. |face_histogram| image:: /auto_examples/cluster/images/sphx_glr_plot_face_compress_004.png
    :target: ../../auto_examples/cluster/plot_face_compress.html
    :scale: 60

 .. topic:: **Application example: vector quantization**

    Clustering in general and KMeans, in particular, can be seen as a way
    of choosing a small number of exemplars to compress the information.
-   The problem is sometimes known as 
+   The problem is sometimes known as
    `vector quantization <https://en.wikipedia.org/wiki/Vector_quantization>`_.
    For instance, this can be used to posterize an image::

@@ -177,7 +177,7 @@ This can be useful, for instance, to retrieve connected regions (sometimes

 also referred to as connected components) when
 clustering an image:

-.. image:: ../../auto_examples/cluster/images/sphx_glr_plot_face_ward_segmentation_001.png
+.. image:: /auto_examples/cluster/images/sphx_glr_plot_face_ward_segmentation_001.png
    :target: ../../auto_examples/cluster/plot_face_ward_segmentation.html
    :scale: 40
    :align: right

@@ -200,7 +200,7 @@ features: **feature agglomeration**. This approach can be implemented by

 clustering in the feature direction, in other words clustering the
 transposed data.

-.. image:: ../../auto_examples/cluster/images/sphx_glr_plot_digits_agglomeration_001.png
+.. image:: /auto_examples/cluster/images/sphx_glr_plot_digits_agglomeration_001.png
    :target: ../../auto_examples/cluster/plot_digits_agglomeration.html
    :align: right
    :scale: 57

@@ -242,11 +242,11 @@ Principal component analysis: PCA

 :ref:`PCA` selects the successive components that
 explain the maximum variance in the signal.

-.. |pca_3d_axis| image:: ../../auto_examples/decomposition/images/sphx_glr_plot_pca_3d_001.png
+.. |pca_3d_axis| image:: /auto_examples/decomposition/images/sphx_glr_plot_pca_3d_001.png
    :target: ../../auto_examples/decomposition/plot_pca_3d.html
    :scale: 70

-.. |pca_3d_aligned| image:: ../../auto_examples/decomposition/images/sphx_glr_plot_pca_3d_002.png
+.. |pca_3d_aligned| image:: /auto_examples/decomposition/images/sphx_glr_plot_pca_3d_002.png
    :target: ../../auto_examples/decomposition/plot_pca_3d.html
    :scale: 70

@@ -295,7 +295,7 @@ Independent Component Analysis: ICA

 a maximum amount of independent information. It is able to recover
 **non-Gaussian** independent signals:

-.. image:: ../../auto_examples/decomposition/images/sphx_glr_plot_ica_blind_source_separation_001.png
+.. image:: /auto_examples/decomposition/images/sphx_glr_plot_ica_blind_source_separation_001.png
    :target: ../../auto_examples/decomposition/plot_ica_blind_source_separation.html
    :scale: 70
    :align: center
