ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioners, low-rank approximation preconditioner, and more)
An implementation of the PSGD Kron second-order optimizer for PyTorch
Distributed K-FAC preconditioner for PyTorch
FEDL: Federated Learning algorithm using TensorFlow (Transactions on Networking, 2021)
This repository implements FEDL using PyTorch
PyTorch implementation of the Hessian-free optimizer
TensorFlow implementation of preconditioned stochastic gradient descent
Implementation of the PSGD optimizer in JAX
Hessian-based stochastic optimization in TensorFlow and Keras
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
FOSI library for improving first-order optimizers with second-order information
This repository contains code to reproduce the experiments from our paper "Error Feedback Can Accurately Compress Preconditioners"
Newton’s second-order optimization methods in Python
Modular optimization library for PyTorch.
Matrix-multiplication-only KFAC; code for the ICML 2023 paper "Simplifying Momentum-based Positive-definite Submanifold Optimization with Applications to Deep Learning"
NG+: A new second-order optimizer for deep learning
Stochastic Second-Order Methods in JAX
Sophia optimizer further projected towards flat areas of the loss landscape
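Several of the optimizers listed above (ADAHESSIAN, the Hessian-based TensorFlow/Keras package, and the Hutchinson variant of Sophia) precondition gradients with a stochastic estimate of the Hessian diagonal obtained by Hutchinson's method. The sketch below illustrates that estimator in plain PyTorch; it is not code from any of the listed repositories, and the helper name `hutchinson_diag_hessian` and its signature are assumptions made for illustration.

```python
import torch

def hutchinson_diag_hessian(loss, params, n_samples=1):
    """Estimate diag(H) of `loss` w.r.t. `params` via Hutchinson's method.

    For Rademacher probes z (entries +/-1), E[z * (Hz)] equals the Hessian
    diagonal elementwise, so averaging z * Hz over a few probes gives an
    unbiased estimate without ever forming H.
    """
    # First-order gradients, keeping the graph so we can differentiate again.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    estimates = [torch.zeros_like(p) for p in params]
    for _ in range(n_samples):
        # Rademacher probe vectors: +1 or -1 with equal probability.
        zs = [torch.randint_like(p, high=2) * 2.0 - 1.0 for p in params]
        # Hessian-vector products Hz via a second backward pass.
        hvps = torch.autograd.grad(grads, params, grad_outputs=zs, retain_graph=True)
        for est, z, hvp in zip(estimates, zs, hvps):
            est.add_(z * hvp / n_samples)
    return estimates

# Example: probe the Hessian diagonal of a small regression loss.
model = torch.nn.Linear(10, 1)
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = torch.nn.functional.mse_loss(model(x), y)
diag_h = hutchinson_diag_hessian(loss, list(model.parameters()), n_samples=4)
```

Optimizers built on this idea typically scale the gradient by a running root-mean-square of such diagonal estimates, in the spirit of Adam's second moment, which is why most of the packages above can be dropped into a training loop through the standard torch.optim.Optimizer interface; implementations that compute Hessian-vector products internally usually require the backward pass to retain the graph (e.g. loss.backward(create_graph=True)).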