
Commit a567f79

[tune] Put examples under proper version control (#9427)
Co-authored-by: krfricke <krfricke@users.noreply.github.com>
1 parent 7abf7a0 commit a567f79

32 files changed: +199 -41 lines changed

doc/source/tune/_tutorials/overview.rst

Lines changed: 20 additions & 20 deletions
@@ -148,55 +148,55 @@ If any example is broken, or if you'd like to add an example to this page, feel
 General Examples
 ~~~~~~~~~~~~~~~~

-- `async_hyperband_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/async_hyperband_example.py>`__: Example of using a Trainable class with AsyncHyperBandScheduler.
-- `hyperband_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/hyperband_example.py>`__: Example of using a Trainable class with HyperBandScheduler. Also uses the Experiment class API for specifying the experiment configuration. Also uses the AsyncHyperBandScheduler.
-- `pbt_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_example.py>`__: Example of using a Trainable class with PopulationBasedTraining scheduler.
-- `PBT with Function API <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_function.py>`__: Example of using the function API with a PopulationBasedTraining scheduler.
-- `pbt_ppo_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_ppo_example.py>`__: Example of optimizing a distributed RLlib algorithm (PPO) with the PopulationBasedTraining scheduler.
-- `logging_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/logging_example.py>`__: Example of custom loggers and custom trial directory naming.
+- :doc:`/tune/examples/async_hyperband_example`: Example of using a Trainable class with AsyncHyperBandScheduler.
+- :doc:`/tune/examples/hyperband_example`: Example of using a Trainable class with HyperBandScheduler. Also uses the Experiment class API for specifying the experiment configuration. Also uses the AsyncHyperBandScheduler.
+- :doc:`/tune/examples/pbt_example`: Example of using a Trainable class with the PopulationBasedTraining scheduler.
+- :doc:`/tune/examples/pbt_function`: Example of using the function API with a PopulationBasedTraining scheduler.
+- :doc:`/tune/examples/pbt_ppo_example`: Example of optimizing a distributed RLlib algorithm (PPO) with the PopulationBasedTraining scheduler.
+- :doc:`/tune/examples/logging_example`: Example of custom loggers and custom trial directory naming.

 Search Algorithm Examples
 ~~~~~~~~~~~~~~~~~~~~~~~~~

-- `Ax example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/ax_example.py>`__: Optimize a Hartmann function with `Ax <https://ax.dev>`_ with 4 parallel workers.
-- `HyperOpt Example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/hyperopt_example.py>`__: Optimizes a basic function using the function-based API and the HyperOptSearch (SearchAlgorithm wrapper for HyperOpt TPE).
-- `Nevergrad example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/nevergrad_example.py>`__: Optimize a simple toy function with the gradient-free optimization package `Nevergrad <https://github.com/facebookresearch/nevergrad>`_ with 4 parallel workers.
-- `Bayesian Optimization example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bayesopt_example.py>`__: Optimize a simple toy function using `Bayesian Optimization <https://github.com/fmfn/BayesianOptimization>`_ with 4 parallel workers.
+- :doc:`/tune/examples/ax_example`: Optimize a Hartmann function with `Ax <https://ax.dev>`_ with 4 parallel workers.
+- :doc:`/tune/examples/hyperopt_example`: Optimizes a basic function using the function-based API and the HyperOptSearch (SearchAlgorithm wrapper for HyperOpt TPE).
+- :doc:`/tune/examples/nevergrad_example`: Optimize a simple toy function with the gradient-free optimization package `Nevergrad <https://github.com/facebookresearch/nevergrad>`_ with 4 parallel workers.
+- :doc:`/tune/examples/bayesopt_example`: Optimize a simple toy function using `Bayesian Optimization <https://github.com/fmfn/BayesianOptimization>`_ with 4 parallel workers.

 Tensorflow/Keras Examples
 ~~~~~~~~~~~~~~~~~~~~~~~~~

-- `tune_mnist_keras <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/tune_mnist_keras.py>`__: Converts the Keras MNIST example to use Tune with the function-based API and a Keras callback. Also shows how to easily convert something relying on argparse to use Tune.
-- `pbt_memnn_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_memnn_example.py>`__: Example of training a Memory NN on bAbI with Keras using PBT.
-- `Tensorflow 2 Example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/tf_mnist_example.py>`__: Converts the Advanced TF2.0 MNIST example to use Tune with the Trainable. This uses `tf.function`. Original code from tensorflow: https://www.tensorflow.org/tutorials/quickstart/advanced
+- :doc:`/tune/examples/tune_mnist_keras`: Converts the Keras MNIST example to use Tune with the function-based API and a Keras callback. Also shows how to easily convert something relying on argparse to use Tune.
+- :doc:`/tune/examples/pbt_memnn_example`: Example of training a Memory NN on bAbI with Keras using PBT.
+- :doc:`/tune/examples/tf_mnist_example`: Converts the Advanced TF2.0 MNIST example to use Tune with the Trainable. This uses `tf.function`. Original code from tensorflow: https://www.tensorflow.org/tutorials/quickstart/advanced


 PyTorch Examples
 ~~~~~~~~~~~~~~~~

-- `mnist_pytorch <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/mnist_pytorch.py>`__: Converts the PyTorch MNIST example to use Tune with the function-based API. Also shows how to easily convert something relying on argparse to use Tune.
-- `mnist_pytorch_trainable <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/mnist_pytorch_trainable.py>`__: Converts the PyTorch MNIST example to use Tune with the Trainable API. Also uses the HyperBandScheduler and checkpoints the model at the end.
+- :doc:`/tune/examples/mnist_pytorch`: Converts the PyTorch MNIST example to use Tune with the function-based API. Also shows how to easily convert something relying on argparse to use Tune.
+- :doc:`/tune/examples/mnist_pytorch_trainable`: Converts the PyTorch MNIST example to use Tune with the Trainable API. Also uses the HyperBandScheduler and checkpoints the model at the end.


 XGBoost Example
 ~~~~~~~~~~~~~~~

 - :ref:`XGBoost tutorial <tune-xgboost>`: A guide to tuning XGBoost parameters with Tune.
-- `xgboost_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/xgboost_example.py>`__: Trains a basic XGBoost model with Tune with the function-based API and an XGBoost callback.
+- :doc:`/tune/examples/xgboost_example`: Trains a basic XGBoost model with Tune with the function-based API and an XGBoost callback.


 LightGBM Example
 ~~~~~~~~~~~~~~~~

-- `lightgbm_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/lightgbm_example.py>`__: Trains a basic LightGBM model with Tune with the function-based API and a LightGBM callback.
+- :doc:`/tune/examples/lightgbm_example`: Trains a basic LightGBM model with Tune with the function-based API and a LightGBM callback.


 Contributed Examples
 ~~~~~~~~~~~~~~~~~~~~

-- `pbt_tune_cifar10_with_keras <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_tune_cifar10_with_keras.py>`__: A contributed example of tuning a Keras model on CIFAR10 with the PopulationBasedTraining scheduler.
-- `genetic_example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/genetic_example.py>`__: Optimizing the Michalewicz function using the contributed GeneticSearch algorithm with AsyncHyperBandScheduler.
-- `tune_cifar10_gluon <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/tune_cifar10_gluon.py>`__: MXNet Gluon example to use Tune with the function-based API on the CIFAR-10 dataset.
+- :doc:`/tune/examples/pbt_tune_cifar10_with_keras`: A contributed example of tuning a Keras model on CIFAR10 with the PopulationBasedTraining scheduler.
+- :doc:`/tune/examples/genetic_example`: Optimizing the Michalewicz function using the contributed GeneticSearch algorithm with AsyncHyperBandScheduler.
+- :doc:`/tune/examples/tune_cifar10_gluon`: MXNet Gluon example to use Tune with the function-based API on the CIFAR-10 dataset.

 Open Source Projects using Tune
 -------------------------------

doc/source/tune/_tutorials/tune-usage.rst

Lines changed: 1 addition & 1 deletion
@@ -50,7 +50,7 @@ To leverage GPUs, you must set ``gpu`` in ``resources_per_trial``. This will aut
     # If you have 4 CPUs on your machine and 1 GPU, this will run 1 trial at a time.
     tune.run(trainable, num_samples=10, resources_per_trial={"cpu": 2, "gpu": 1})

-You can find an example of this in the `Keras MNIST example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/tune_mnist_keras.py>`__.
+You can find an example of this in the :doc:`Keras MNIST example </tune/examples/tune_mnist_keras>`.

 .. warning:: If 'gpu' is not set, ``CUDA_VISIBLE_DEVICES`` environment variable will be set as empty, disallowing GPU access.
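To make the snippet above concrete, here is a minimal sketch of a trainable that uses the GPU it was allocated. The model and training loop are illustrative only, and ``tune.report`` is assumed to be the function-API reporting call in this version of Tune:

    import torch

    from ray import tune


    def trainable(config):
        # Tune has already scoped CUDA_VISIBLE_DEVICES to the GPU(s)
        # reserved for this trial, so ordinary CUDA discovery only
        # sees the assigned device(s).
        device = "cuda" if torch.cuda.is_available() else "cpu"
        model = torch.nn.Linear(10, 1).to(device)
        for _ in range(10):
            # Dummy forward pass standing in for a real training step.
            loss = model(torch.randn(8, 10, device=device)).pow(2).mean()
            tune.report(loss=loss.item())


    tune.run(trainable, num_samples=10, resources_per_trial={"cpu": 2, "gpu": 1})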
doc/source/tune/api_docs/logging.rst

Lines changed: 2 additions & 2 deletions
@@ -58,7 +58,7 @@ You can then pass in your own logger as follows:

 These loggers will be called along with the default Tune loggers. You can also check out `logger.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/logger.py>`__ for implementation details.

-An example of creating a custom logger can be found in `logging_example.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/logging_example.py>`__.
+An example of creating a custom logger can be found in :doc:`/tune/examples/logging_example`.

 .. _trainable-logging:

@@ -164,7 +164,7 @@ CSVLogger
 MLFLowLogger
 ------------

-Tune also provides a default logger for `MLFlow <https://mlflow.org>`_. You can install MLFlow via ``pip install mlflow``. An example can be found `mlflow_example.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/mlflow_example.py>`__. Note that this currently does not include artifact logging support. For this, you can use the native MLFlow APIs inside your Trainable definition.
+Tune also provides a default logger for `MLFlow <https://mlflow.org>`_. You can install MLFlow via ``pip install mlflow``. An example can be found in :doc:`/tune/examples/mlflow_example`. Note that this currently does not include artifact logging support. For this, you can use the native MLFlow APIs inside your Trainable definition.

 .. autoclass:: ray.tune.logger.MLFLowLogger
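As a sketch of the custom-logger interface referenced above (the method names follow the ``Logger`` base class in ``logger.py``; the JSON-lines output format and the toy trainable are illustrative assumptions):

    import json
    import os

    from ray import tune
    from ray.tune.logger import Logger


    class JsonlLogger(Logger):
        """Appends every reported result to a JSON-lines file in the trial logdir."""

        def _init(self):
            self._file = open(os.path.join(self.logdir, "results.jsonl"), "a")

        def on_result(self, result):
            self._file.write(json.dumps(result, default=str) + "\n")
            self._file.flush()

        def close(self):
            self._file.close()


    def trainable(config):
        # Minimal stand-in training function.
        tune.report(score=42)


    # Custom loggers are passed alongside the default Tune loggers.
    tune.run(trainable, loggers=[JsonlLogger])

The ``MLFLowLogger`` covered in the second hunk plugs in through the same ``loggers`` argument.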
doc/source/tune/api_docs/schedulers.rst

Lines changed: 7 additions & 7 deletions
@@ -32,23 +32,23 @@ When using schedulers, you may face compatibility issues, as shown in the below
    * - :ref:`ASHA <tune-scheduler-hyperband>`
      - No
      - Yes
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/async_hyperband_example.py>`__
+     - :doc:`Link </tune/examples/async_hyperband_example>`
    * - :ref:`Median Stopping Rule <tune-scheduler-msr>`
      - No
      - Yes
      - :ref:`Link <tune-scheduler-msr>`
    * - :ref:`HyperBand <tune-original-hyperband>`
      - Yes
      - Yes
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/hyperband_example.py>`__
+     - :doc:`Link </tune/examples/hyperband_example>`
    * - :ref:`BOHB <tune-scheduler-bohb>`
      - Yes
      - Only TuneBOHB
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bohb_example.py>`__
+     - :doc:`Link </tune/examples/bohb_example>`
    * - :ref:`Population Based Training <tune-scheduler-pbt>`
      - Yes
      - Not Compatible
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_example.py>`__
+     - :doc:`Link </tune/examples/pbt_example>`

 .. _tune-scheduler-hyperband:

@@ -69,7 +69,7 @@ The `ASHA <https://openreview.net/forum?id=S1Y7OOlRZ>`__ scheduler can be used b
         brackets=1)
     tune.run( ... , scheduler=asha_scheduler)

-Compared to the original version of HyperBand, this implementation provides better parallelism and avoids straggler issues during eliminations. **We recommend using this over the standard HyperBand scheduler.** An example of this can be `found here <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/async_hyperband_example.py>`_.
+Compared to the original version of HyperBand, this implementation provides better parallelism and avoids straggler issues during eliminations. **We recommend using this over the standard HyperBand scheduler.** An example of this can be found here: :doc:`/tune/examples/async_hyperband_example`.

 Even though the original paper mentions a bracket count of 3, discussions with the authors concluded that the value should be left to 1 bracket. This is the default used if no value is provided for the ``brackets`` argument.

@@ -141,7 +141,7 @@ Tune includes a distributed implementation of `Population Based Training (PBT) <

 When the PBT scheduler is enabled, each trial variant is treated as a member of the population. Periodically, top-performing trials are checkpointed (this requires your Trainable to support :ref:`save and restore <tune-checkpoint>`). Low-performing trials clone the checkpoints of top performers and perturb the configurations in the hope of discovering an even better variation.

-You can run this `toy PBT example <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_example.py>`__ to get an idea of how PBT operates. When training in PBT mode, a single trial may see many different hyperparameters over its lifetime, which is recorded in its ``result.json`` file. The following figure generated by the example shows PBT optimizing a LR schedule over the course of a single experiment:
+You can run this :doc:`toy PBT example </tune/examples/pbt_function>` to get an idea of how PBT operates. When training in PBT mode, a single trial may see many different hyperparameters over its lifetime, which is recorded in its ``result.json`` file. The following figure generated by the example shows PBT optimizing a LR schedule over the course of a single experiment:

 .. image:: /pbt.png

@@ -157,7 +157,7 @@ This class is a variant of HyperBand that enables the `BOHB Algorithm <https://a

 This is to be used in conjunction with the Tune BOHB search algorithm. See :ref:`TuneBOHB <suggest-TuneBOHB>` for package requirements, examples, and details.

-An example of this in use can be found in `bohb_example.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bohb_example.py>`_.
+An example of this in use can be found here: :doc:`/tune/examples/bohb_example`.

 .. autoclass:: ray.tune.schedulers.HyperBandForBOHB
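A runnable sketch of the recommended ASHA setup, end to end. The toy objective and the sampled ``lr`` range are assumptions; the scheduler arguments mirror the snippet in the hunk above:

    import random

    from ray import tune
    from ray.tune.schedulers import AsyncHyperBandScheduler


    def trainable(config):
        # Toy objective: loss shrinks with training and scales with "lr".
        for step in range(100):
            tune.report(loss=config["lr"] * (100 - step))


    asha_scheduler = AsyncHyperBandScheduler(
        time_attr="training_iteration",
        metric="loss",
        mode="min",
        max_t=100,        # cap each trial at 100 iterations
        grace_period=10,  # run at least 10 iterations before a trial can be stopped
        brackets=1)       # single bracket, per the recommendation above

    tune.run(
        trainable,
        config={"lr": tune.sample_from(lambda spec: random.uniform(1e-3, 1e-1))},
        scheduler=asha_scheduler,
        num_samples=20)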
doc/source/tune/api_docs/suggestion.rst

Lines changed: 9 additions & 9 deletions
@@ -25,39 +25,39 @@ Summary
    * - :ref:`AxSearch <tune-ax>`
      - Bayesian/Bandit Optimization
      - [`Ax <https://ax.dev/>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/ax_example.py>`__
+     - :doc:`/tune/examples/ax_example`
    * - :ref:`DragonflySearch <Dragonfly>`
      - Scalable Bayesian Optimization
      - [`Dragonfly <https://dragonfly-opt.readthedocs.io/>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/dragonfly_example.py>`__
+     - :doc:`/tune/examples/dragonfly_example`
    * - :ref:`SkoptSearch <skopt>`
      - Bayesian Optimization
      - [`Scikit-Optimize <https://scikit-optimize.github.io>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/skopt_example.py>`__
+     - :doc:`/tune/examples/skopt_example`
    * - :ref:`HyperOptSearch <tune-hyperopt>`
      - Tree-Parzen Estimators
      - [`HyperOpt <http://hyperopt.github.io/hyperopt>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/hyperopt_example.py>`__
+     - :doc:`/tune/examples/hyperopt_example`
    * - :ref:`BayesOptSearch <bayesopt>`
      - Bayesian Optimization
      - [`BayesianOptimization <https://github.com/fmfn/BayesianOptimization>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bayesopt_example.py>`__
+     - :doc:`/tune/examples/bayesopt_example`
    * - :ref:`TuneBOHB <suggest-TuneBOHB>`
      - Bayesian Opt/HyperBand
      - [`BOHB <https://github.com/automl/HpBandSter>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/bohb_example.py>`__
+     - :doc:`/tune/examples/bohb_example`
    * - :ref:`NevergradSearch <nevergrad>`
      - Gradient-free Optimization
      - [`Nevergrad <https://github.com/facebookresearch/nevergrad>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/nevergrad_example.py>`__
+     - :doc:`/tune/examples/nevergrad_example`
    * - :ref:`ZOOptSearch <zoopt>`
      - Zeroth-order Optimization
      - [`ZOOpt <https://github.com/polixir/ZOOpt>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/zoopt_example.py>`__
+     - :doc:`/tune/examples/zoopt_example`
    * - :ref:`SigOptSearch <sigopt>`
      - Closed source
      - [`SigOpt <https://sigopt.com/>`__]
-     - `Link <https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/sigopt_example.py>`__
+     - :doc:`/tune/examples/sigopt_example`


 .. note:: Search algorithms will require a different search space declaration than the default Tune format - meaning that you will not be able to combine ``tune.grid_search`` with the below integrations.
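To make that note concrete, a sketch of HyperOptSearch driven by HyperOpt's native space definition. The objective and the bounds are illustrative; the point is that the space is built from ``hyperopt.hp`` primitives rather than ``tune.grid_search`` or the default Tune sampling format:

    from hyperopt import hp
    from ray import tune
    from ray.tune.suggest.hyperopt import HyperOptSearch


    def trainable(config):
        # Toy objective over the two sampled hyperparameters.
        tune.report(mean_loss=(config["lr"] - 0.01) ** 2 + config["momentum"])


    # HyperOpt's own space format, not the default Tune search space.
    space = {
        "lr": hp.loguniform("lr", -10, -1),
        "momentum": hp.uniform("momentum", 0.1, 0.9),
    }

    hyperopt_search = HyperOptSearch(space, metric="mean_loss", mode="min")
    tune.run(trainable, search_alg=hyperopt_search, num_samples=20)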
doc/source/tune/examples/async_hyperband_example.rst

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+:orphan:
+
+async_hyperband_example
+~~~~~~~~~~~~~~~~~~~~~~~
+
+.. literalinclude:: /../../python/ray/tune/examples/async_hyperband_example.py
doc/source/tune/examples/ax_example.rst

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+:orphan:
+
+ax_example
+~~~~~~~~~~
+
+.. literalinclude:: /../../python/ray/tune/examples/ax_example.py
doc/source/tune/examples/bayesopt_example.rst

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+:orphan:
+
+bayesopt_example
+~~~~~~~~~~~~~~~~
+
+.. literalinclude:: /../../python/ray/tune/examples/bayesopt_example.py
doc/source/tune/examples/bohb_example.rst

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+:orphan:
+
+bohb_example
+~~~~~~~~~~~~
+
+.. literalinclude:: /../../python/ray/tune/examples/bohb_example.py
doc/source/tune/examples/dragonfly_example.rst

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+:orphan:
+
+dragonfly_example
+~~~~~~~~~~~~~~~~~
+
+.. literalinclude:: /../../python/ray/tune/examples/dragonfly_example.py
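Each of these new stub pages follows the same pattern: ``:orphan:`` suppresses the Sphinx warning about a page that is not referenced from any toctree, and ``.. literalinclude::`` renders the example script directly from ``python/ray/tune/examples/``, so the published docs always show the code exactly as it lives under version control.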
