Learning Rate Warmup in PyTorch (a minimal warmup sketch appears after this list)
Gradient-based hyperparameter tuning library in PyTorch
Collections of optimizers, LR schedulers, and loss functions in PyTorch
Polynomial Learning Rate Decay Scheduler for PyTorch (sketched after this list)
A guide that integrates PyTorch DistributedDataParallel, Apex, warmup, and learning rate scheduling, and also covers setting up early stopping and a random seed.
PyTorch cyclic cosine decay learning rate scheduler (a warm-restarts sketch appears after this list)
Automatic learning-rate scheduler
Warmup learning rate wrapper for PyTorch schedulers
A learning-rate recommendation and benchmarking tool.
sharpDARTS: Faster and More Accurate Differentiable Architecture Search
[PENDING] The official repo for the paper: "A Lightweight Multi-Head Attention Transformer for Stock Price Forecasting".
Keras callback to automatically adjust the learning rate when a monitored metric stops improving
PyTorch implementation of arbitrary learning rate and momentum schedules, including the One Cycle Policy (sketched after this list)
Implementation of fluctuation-dissipation relations for automatic learning rate annealing.
Code to reproduce the experiments of the ICLR 2023 paper "How I Learned to Stop Worrying and Love Retraining"
A method for assigning separate learning rate schedulers to different parameter groups in a model (a per-group sketch appears after this list).
Comprehensive image classification code for training multilayer perceptron (MLP), LeNet, LeNet5, conv2, conv4, conv6, VGG11, VGG13, VGG16, VGG19 with batch normalization, ResNet18, ResNet34, ResNet50, and MobileNetV2 on MNIST, CIFAR10, CIFAR100, and ImageNet1K.
(GECCO2023 Best Paper Nomination & ACM TELO) CMA-ES with Learning Rate Adaptation
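Several entries above (the warmup repos and the warmup wrapper) describe the same basic pattern: ramp the learning rate up from a small value before handing control to the main schedule. Below is a minimal sketch using PyTorch's built-in LinearLR and SequentialLR schedulers; the placeholder model, step counts, and factors are illustrative assumptions, not code from any repository listed here.

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import CosineAnnealingLR, LinearLR, SequentialLR

model = nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

warmup_steps = 500    # assumed warmup length
total_steps = 10_000  # assumed training length

# Ramp the LR from 1% to 100% of its base value over warmup_steps,
# then decay it with cosine annealing for the remaining steps.
warmup = LinearLR(optimizer, start_factor=0.01, end_factor=1.0, total_iters=warmup_steps)
decay = CosineAnnealingLR(optimizer, T_max=total_steps - warmup_steps)
scheduler = SequentialLR(optimizer, schedulers=[warmup, decay], milestones=[warmup_steps])

for step in range(total_steps):
    optimizer.step()   # forward/backward elided; stepping lets the scheduler advance
    scheduler.step()
```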
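For the polynomial decay entry, the policy is commonly written as lr(t) = lr_0 * (1 - t / T)^p. A sketch of one way to express it with PyTorch's LambdaLR follows; `power`, `max_steps`, and `end_lr_factor` are assumed names chosen for illustration.

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import LambdaLR

model = nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

max_steps = 10_000    # assumed training length
power = 2.0           # p = 2 gives quadratic decay
end_lr_factor = 0.01  # floor: final LR as a fraction of the base LR

def poly_decay(step: int) -> float:
    """Multiplicative LR factor (1 - step/max_steps)^power, floored at end_lr_factor."""
    progress = min(step, max_steps) / max_steps
    return max((1.0 - progress) ** power, end_lr_factor)

scheduler = LambdaLR(optimizer, lr_lambda=poly_decay)

for step in range(max_steps):
    optimizer.step()   # training step elided
    scheduler.step()
```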
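The cyclic cosine decay entry matches the warm-restarts family of schedules; PyTorch ships CosineAnnealingWarmRestarts, which restarts the cosine curve at the start of each cycle and can lengthen cycles geometrically. The cycle length and LR floor below are assumptions for illustration.

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts

model = nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# First cycle lasts 1,000 steps, each later cycle is twice as long (T_mult=2),
# and within each cycle the LR anneals from the base value down to eta_min.
scheduler = CosineAnnealingWarmRestarts(optimizer, T_0=1000, T_mult=2, eta_min=1e-5)

for step in range(7_000):
    optimizer.step()   # training step elided
    scheduler.step()
```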
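For the One Cycle Policy entry, PyTorch's built-in OneCycleLR implements the schedule directly: the LR ramps up to max_lr and then anneals back down, while momentum (when cycle_momentum=True) moves in the opposite direction. The hyperparameters below are illustrative.

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import OneCycleLR

model = nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

total_steps = 10_000  # assumed training length
scheduler = OneCycleLR(
    optimizer,
    max_lr=0.1,           # peak LR at the top of the cycle
    total_steps=total_steps,
    pct_start=0.3,        # 30% of steps spent ramping up
    cycle_momentum=True,  # momentum is annealed inversely to the LR
)

for step in range(total_steps):
    optimizer.step()   # training step elided
    scheduler.step()
```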
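Finally, for the per-group scheduling entry: PyTorch optimizers already accept multiple parameter groups, and LambdaLR accepts one lambda per group, which gives each group an independent schedule. A sketch under those assumptions; the backbone/head split and warmup length are illustrative.

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import LambdaLR

model = nn.Sequential(nn.Linear(10, 10), nn.Linear(10, 2))
backbone, head = model[0], model[1]   # illustrative split into two groups

optimizer = torch.optim.SGD([
    {"params": backbone.parameters(), "lr": 1e-3},  # slow group
    {"params": head.parameters(), "lr": 1e-2},      # fast group
])

warmup_steps = 100  # assumed warmup length for the head only

def backbone_lambda(step: int) -> float:
    return 1.0  # hold the backbone LR constant

def head_lambda(step: int) -> float:
    return min(1.0, (step + 1) / warmup_steps)  # linear warmup, then hold

# One lambda per param group, applied independently.
scheduler = LambdaLR(optimizer, lr_lambda=[backbone_lambda, head_lambda])

for step in range(1_000):
    optimizer.step()   # training step elided
    scheduler.step()
```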