Tune: Scalable Hyperparameter Search

.. image:: images/tune.png

Tune is a scalable framework for hyperparameter search with a focus on deep learning and deep reinforcement learning.

You can find the code for Tune here on GitHub. To get started with Tune, try working through our tutorial on using Tune with Keras.

(Experimental): You can try out the above tutorial on a free hosted server via Binder.

Features

Take a look at the User Guide for a comprehensive overview on how to use Tune's features.

Getting Started

Installation

You'll first need to install ray in order to import Tune.

pip install ray  # also recommended: ray[debug]

Quick Start

This example runs a small grid search over a neural network training function using Tune, reporting status on the command line until the stopping condition of mean_accuracy >= 99 is reached. Tune works with any deep learning framework.
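Conceptually, a grid search with a stopping condition is just "run one trial per grid point, train each until the condition is met, keep the best". The following plain-Python sketch illustrates that idea without Tune or Ray (the `run_trial` function and its fake accuracy curve are purely illustrative, not part of Tune's API):

```python
def run_trial(momentum, stop_value=99):
    """Simulated trial: accuracy improves each step until the
    stopping condition is met. Stands in for a real train function."""
    accuracy = 90.0
    steps = 0
    while accuracy < stop_value:
        accuracy += 10 * momentum  # fake improvement driven by the config
        steps += 1
    return {"momentum": momentum, "mean_accuracy": accuracy, "steps": steps}

# Grid search: run one trial per point in the configuration grid.
grid = {"momentum": [0.1, 0.2]}
trials = [run_trial(m) for m in grid["momentum"]]
best = max(trials, key=lambda t: t["mean_accuracy"])
```

Tune automates exactly this loop, but schedules the trials in parallel across a Ray cluster and handles status reporting, checkpointing, and early stopping for you.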

Tune uses Ray as a backend, so we will first import and initialize Ray.

import ray
from ray import tune

ray.init()

The function you wish to tune must accept a reporter object, which Tune uses to collect metrics:

def train_func(config, reporter):  # add a reporter arg
    model = ( ... )
    optimizer = SGD(model.parameters(),
                    momentum=config["momentum"])
    dataset = ( ... )

    for idx, (data, target) in enumerate(dataset):
        accuracy = model.fit(data, target)
        reporter(mean_accuracy=accuracy)  # report metrics

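From the training function's point of view, the reporter is simply a callable that accepts keyword metrics. A minimal stand-in (a hypothetical helper for local experimentation, not Tune's actual implementation) makes the pattern easy to test without a cluster:

```python
class ListReporter:
    """Stand-in for Tune's reporter: records each metrics dict to a list.
    Any callable accepting keyword metrics fits the same interface."""

    def __init__(self):
        self.metrics = []

    def __call__(self, **kwargs):
        self.metrics.append(kwargs)

def train_func(config, reporter):
    accuracy = 0.0
    for step in range(3):
        accuracy += config["momentum"]  # stand-in for a real training step
        reporter(mean_accuracy=accuracy)

reporter = ListReporter()
train_func({"momentum": 0.2}, reporter)
```

Under Tune, each `reporter(...)` call streams the metrics back to the driver, which uses them to display progress and to evaluate the stopping condition.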
Finally, configure your search and execute it on your Ray cluster:

all_trials = tune.run(
    train_func,
    name="quick-start",
    stop={"mean_accuracy": 99},
    config={"momentum": tune.grid_search([0.1, 0.2])}
)

Tune can be used anywhere Ray can, e.g. on your laptop with ray.init() embedded in a Python script, or in an auto-scaling cluster for massive parallelism.

Citing Tune

If Tune helps you in your academic research, you are encouraged to cite our paper. Here is an example BibTeX entry:

@article{liaw2018tune,
    title={Tune: A Research Platform for Distributed Model Selection and Training},
    author={Liaw, Richard and Liang, Eric and Nishihara, Robert
            and Moritz, Philipp and Gonzalez, Joseph E and Stoica, Ion},
    journal={arXiv preprint arXiv:1807.05118},
    year={2018}
}