Description
Treating integer-valued parameters as floats can be sub-optimal for the maximization procedure. This is discussed in https://arxiv.org/abs/1706.03673, where the authors show that accounting for the integer nature of some parameters can lead to faster convergence (see for example their figure 3). Given that optimizing machine-learning hyperparameters often involves integer parameters, I think this is a worthwhile problem.
The same paper proposes a simple way of handling integer parameters that requires minimal changes to the code. For backward compatibility, the simplest way to implement the change would be to add an extra dictionary that specifies which variables are integers (if no such dictionary is given, all parameters are assumed to be floats).
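As a rough illustration of the proposed interface, here is a minimal sketch of the rounding approach from the paper. The `integer_params` set and the `make_integer_transform` helper are hypothetical names chosen for this example, not existing library API; the idea is simply that suggested points have their integer-valued entries rounded before being passed to the objective (and, per the paper, the same rounding would also be applied inside the surrogate model):

```python
def make_integer_transform(integer_params):
    """Return a function that rounds the integer-valued entries of a
    parameter dict, leaving the float-valued ones untouched.

    `integer_params` is a hypothetical set naming which parameters are
    integers; it plays the role of the extra dictionary proposed above.
    """
    def transform(params):
        return {
            name: int(round(value)) if name in integer_params else value
            for name, value in params.items()
        }
    return transform


# Example: two hyperparameters, one integer (n_estimators), one float.
transform = make_integer_transform(integer_params={"n_estimators"})

suggestion = {"n_estimators": 117.4, "learning_rate": 0.12}
print(transform(suggestion))  # {'n_estimators': 117, 'learning_rate': 0.12}
```

The key point from the paper is that the rounding should happen consistently, both when evaluating the objective and when the Gaussian process computes covariances, so the surrogate never distinguishes between two continuous points that round to the same integer configuration.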
Is this a change worth considering? If so, I can give it a try and implement it as a pull request.