This repo is outdated and will no longer be maintained.
Unofficial implementation of the lookahead mechanism for optimizers.
```bash
pip install git+https://github.com/cyberzhg/keras-lookahead.git
```

Arguments:

- `optimizer`: Original optimizer.
- `sync_period`: the `k` in the paper; the synchronization period.
- `slow_step`: the `α` in the paper; the step size of the slow weights.
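The mechanism these arguments control can be sketched in plain Python. This is a simplified scalar illustration of the lookahead update rule, not the library's implementation; `inner_step` stands in for whatever update the wrapped optimizer performs:

```python
def train(w0, inner_step, steps, sync_period=5, slow_step=0.5):
    """Scalar sketch of lookahead: fast weights take `sync_period` inner
    steps, then the slow weights move toward them by `slow_step` and the
    fast weights restart from the slow weights."""
    slow = fast = w0
    for t in range(1, steps + 1):
        fast = inner_step(fast)  # one step of the wrapped (fast) optimizer
        if t % sync_period == 0:
            # slow weights interpolate toward the fast weights
            slow = slow + slow_step * (fast - slow)
            fast = slow  # fast weights are reset to the slow weights
    return slow, fast

# Example: an inner optimizer that halves the weight each step.
slow, fast = train(1.0, lambda w: w * 0.5, steps=5)
```

After five inner steps the fast weight is `0.03125`, so the first synchronization leaves both weights at `1.0 + 0.5 * (0.03125 - 1.0) = 0.515625`.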
```python
from keras_lookahead import Lookahead

optimizer = Lookahead('adam', sync_period=5, slow_step=0.5)
```

Custom optimizers can also be used:
```python
from keras_radam import RAdam
from keras_lookahead import Lookahead

optimizer = Lookahead(RAdam())
```