[tune] HyperOpt Support (v2) #1763
Conversation
Looking good, a bunch of clarification comments.
python/ray/tune/experiment.py (Outdated)

```python
        self.spec = spec


class JSONExperiment(Experiment):
```
Can this be a function that returns an Experiment object instead, e.g., `Experiment.from_json(name, spec)`?
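A minimal sketch of the suggested factory (the `Experiment` fields and the `"run"` validation here are illustrative assumptions, not the actual `ray.tune` API):

```python
class Experiment:
    def __init__(self, name, spec):
        self.name = name
        self.spec = spec

    @classmethod
    def from_json(cls, name, spec):
        # Hypothetical factory: validate the JSON-style dict and return a
        # plain Experiment, instead of defining a JSONExperiment subclass.
        if "run" not in spec:
            raise ValueError("spec must specify a trainable via 'run'")
        return cls(name, spec)
```

This keeps the class hierarchy flat while still giving JSON specs a dedicated entry point.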
python/ray/tune/hpo_scheduler.py (Outdated)

```python
try:
    import hyperopt as hpo
except Exception as e:
    print("HyperOpt not found; perhaps not installed?")
```
If you want to make this a soft dep, set `hpo = None` and check for that in the `HyperOptScheduler` constructor.
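A sketch of the soft-dependency pattern being suggested (the error message is illustrative, not the one tune ships):

```python
# Soft dependency: swallow the ImportError at module load time and only
# fail when the optional feature is actually constructed.
try:
    import hyperopt as hpo
except ImportError:
    hpo = None


class HyperOptScheduler:
    def __init__(self, max_concurrent=10, loss_attr="mean_loss"):
        if hpo is None:
            # Fail here, with an actionable message, rather than on import.
            raise ValueError(
                "HyperOpt must be installed to use HyperOptScheduler. "
                "Try: pip install hyperopt")
        self._max_concurrent = max_concurrent
        self._loss_attr = loss_attr
```

Importing the module then stays cheap and safe for users who never touch the HyperOpt integration.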
```python
from ray.tune.variant_generator import to_argv


class HyperOptScheduler(FIFOScheduler):
```
class doc
done
python/ray/tune/hpo_scheduler.py (Outdated)

```python
class HyperOptScheduler(FIFOScheduler):

    def __init__(self, max_concurrent=10, loss_attr="mean_loss"):
        self._max_concurrent = max_concurrent  # NOTE: this is modified later
```
Shouldn't we always launch as many trials at once as resources permit?
Not necessarily; the more concurrent trials, the less effective your search. For example, SigOpt only allows 10 concurrent trials.
Now, for this particular case it could make sense to default to as many as resources permit.
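The cap being discussed can be expressed as a simple admission gate in the scheduler. This is a standalone sketch, not tune's actual implementation; the class and method names are made up:

```python
class ConcurrencyLimiter:
    """Hypothetical helper: admits new trials only while fewer than
    max_concurrent are live, regardless of spare cluster resources."""

    def __init__(self, max_concurrent=10):
        self.max_concurrent = max_concurrent
        self.live = set()

    def try_start(self, trial_id):
        # Refuse to launch once the cap is reached, even if the cluster
        # could run more; this keeps the search algorithm effective.
        if len(self.live) >= self.max_concurrent:
            return False
        self.live.add(trial_id)
        return True

    def on_trial_complete(self, trial_id):
        # Completing a trial frees a slot for the next suggestion.
        self.live.discard(trial_id)
```

Defaulting `max_concurrent` to "as many as resources permit" would just mean setting the cap from the cluster size instead of a constant.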
python/ray/tune/hpo_scheduler.py (Outdated)

```python
        self._loss_attr = loss_attr
        self._experiment = None

    def track_experiment(self, experiment, trial_runner):
```
`add_experiment()`
```python
try:
    trial_runner.add_trial(next(generator))
except StopIteration:
    break
```
Move this into TrialScheduler?
No strong opinion about this; is there a benefit to keeping `TrialScheduler` as an interface, as opposed to turning it into an abstract class?
It doesn't seem particularly specific to `FIFOScheduler`. I imagine most schedulers will want to extend this instead of `FIFOScheduler`.
Speaking of which, should `HyperOptScheduler` extend the scheduler interface instead of `FIFOScheduler`?
Moved this method.
As for what HyperOpt extends, I think it's fine to extend FIFO, since it does call `FIFOScheduler` methods for `on_trial_add` and also for `choose_trial_to_run`.
That being said, the `choose_trial_to_run` invocation is done as a weird class-method invocation rather than a `super` invocation and could work either way, so I'm not married to anything...
python/ray/tune/trial_scheduler.py (Outdated)

```diff
@@ -47,11 +48,18 @@ def on_trial_remove(self, trial_runner, trial):
 
         raise NotImplementedError
 
     def choose_trial_to_run(self, trial_runner, trials):
+    def track_experiment(self, experiment, trial_runner):
```
`add_experiment`
python/ray/tune/trial_scheduler.py (Outdated)

```diff
@@ -47,11 +48,18 @@ def on_trial_remove(self, trial_runner, trial):
 
         raise NotImplementedError
 
     def choose_trial_to_run(self, trial_runner, trials):
+    def track_experiment(self, experiment, trial_runner):
+        """Tracks experiment from which trials and generated and queued."""
```
"""Adds an experiment to the scheduler. The scheduler is responsible for adding the trials of the experiment to the runner, which can be done immediately (if there are a finite set of trials), or over time (if there is an infinite stream of trials)."""
python/ray/tune/trial_scheduler.py (Outdated)

```diff
         """Called to choose a new trial to run.
 
         This should return one of the trials in trial_runner that is in
-        the PENDING or PAUSED state."""
+        the PENDING or PAUSED state. This is assumed to be idempotent.
```
This function must be idempotent.
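Idempotent here means repeated calls with no intervening state change must return the same trial and cause no side effects. A toy illustration (a free function rather than the actual scheduler method):

```python
PENDING, RUNNING = "PENDING", "RUNNING"


class Trial:
    def __init__(self, name):
        self.name = name
        self.status = PENDING


def choose_trial_to_run(trials):
    # Idempotent: a pure function of the current trial states. Calling
    # it twice without a state change yields the same trial, and it
    # must not mutate trials or internal counters as a side effect.
    for trial in trials:
        if trial.status == PENDING:
            return trial
    return None
```

Only an external state transition (e.g. the runner marking a trial RUNNING) changes what the next call returns.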
python/ray/tune/tune.py (Outdated)

```python
try:
    _SCHEDULERS[key] = get_scheduler(key)
except ImportError as e:
    print("Could not import scheduler: {}".format(key))
```
If this is just for HyperOpt, let's just have that fail in the constructor instead of on import.
Travis completely (!) passes on the private branch. This also subsumes #1823, so waiting on approval there.
What! This is unheard of :) |
```shell
pip install --upgrade git+git://github.com/hyperopt/hyperopt.git
```