tfp.optimizer.lbfgs_minimize as Keras optimizer? #565
Comments
Thanks for the recommendation, @Hoeze! Are you using stand-alone Keras for your work, or …
@dynamicwebpaige Thanks for your answer. It would be cool if it was implemented as a TensorFlow optimizer similar to Adam. The big problem with …
@Hoeze I was looking for the same functionality, and I found this blog post that shows how to use lbfgs_minimize() with a tf.keras model: https://pychao.com/2019/11/02/optimize-tensorflow-keras-models-with-l-bfgs-from-tensorflow-probability/ However, it would be very useful if the tf.keras development team implemented this as a TensorFlow optimizer in a future update, just as you suggest!
You can also follow tensorflow/tensorflow#48167.
That blog post is a great resource for how to glue the functional form in TFP together with tf.Variables (Keras). I think we should probably fix the requirement that the parameters be a single 1D tensor, as this is inconsistent with other places in TFP, but it is not currently a high priority.
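A minimal sketch of that gluing pattern, loosely following the blog post linked above: the model's trainable variables are flattened into the single 1D tensor that lbfgs_minimize expects, and written back after each evaluation. The names `model`, `loss_fn`, `x_train`, and `y_train` are placeholders for your own Keras model, loss, and data, not part of any existing API.

```python
import tensorflow as tf
import tensorflow_probability as tfp

def make_value_and_gradients_fn(model, loss_fn, x_train, y_train):
    """Wrap a Keras model so L-BFGS can treat its weights as one flat vector."""
    shapes = [v.shape for v in model.trainable_variables]
    sizes = [int(tf.size(v)) for v in model.trainable_variables]

    def assign_flat_params(flat_params):
        # Split the flat parameter vector and copy each piece into its variable.
        for var, part, shape in zip(model.trainable_variables,
                                    tf.split(flat_params, sizes), shapes):
            var.assign(tf.reshape(part, shape))

    @tf.function
    def value_and_gradients(flat_params):
        assign_flat_params(flat_params)
        with tf.GradientTape() as tape:
            loss = loss_fn(y_train, model(x_train, training=True))
        grads = tape.gradient(loss, model.trainable_variables)
        # Flatten the per-variable gradients to match the 1D parameter vector.
        return loss, tf.concat([tf.reshape(g, [-1]) for g in grads], axis=0)

    return value_and_gradients
```

Usage would then look roughly like this, starting from the model's current weights:

```python
value_and_grads = make_value_and_gradients_fn(
    model, tf.keras.losses.MeanSquaredError(), x_train, y_train)
initial = tf.concat([tf.reshape(v, [-1]) for v in model.trainable_variables], axis=0)
results = tfp.optimizer.lbfgs_minimize(
    value_and_grads, initial_position=initial, max_iterations=100)
value_and_grads(results.position)  # writes the optimized weights back into the model
```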
Hi, is there a way to use tfp.optimizer.lbfgs_minimize as a Keras optimizer? This would be quite useful in certain cases where the loss function is approximately quadratic.
A colleague of mine would very much need it, since an autoencoder written in R with a negative-binomial loss converges faster than its Keras counterpart.
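For the approximately-quadratic case mentioned above, lbfgs_minimize can at least be called directly on a functional loss today. A minimal self-contained sketch (the target vector and tolerance are arbitrary choices, not taken from this thread):

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Minimize the quadratic loss ||x - target||^2 with L-BFGS.
target = tf.constant([1.0, 2.0, 3.0])

def loss_and_gradient(x):
    # Returns (loss value, gradient), as lbfgs_minimize expects.
    return tfp.math.value_and_gradient(
        lambda x: tf.reduce_sum((x - target) ** 2), x)

results = tfp.optimizer.lbfgs_minimize(
    loss_and_gradient, initial_position=tf.zeros(3), tolerance=1e-8)
print(results.converged.numpy(), results.position.numpy())  # True [1. 2. 3.]
```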