Collections of efficient tools for deep learning experiments, e.g., experiment control and hyperparameter optimization.
Recommendations or PRs are welcome!
- https://paperswithcode.com/: papers with code
- https://linggle.com/: academic collocations
- https://mathpix.com/: convert pictures of formulas to LaTeX code
- logging: a Python module for logging
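A minimal sketch of how the standard `logging` module is typically set up for experiments, with one handler for the console and one for a log file; the logger name and filename here are illustrative choices, not required by the library:

```python
import logging

# Named logger for the experiment (the name "experiment" is illustrative).
logger = logging.getLogger("experiment")
logger.setLevel(logging.DEBUG)

# Console handler: only INFO and above, to keep the terminal readable.
console = logging.StreamHandler()
console.setLevel(logging.INFO)

# File handler: everything down to DEBUG, for post-hoc inspection.
file_handler = logging.FileHandler("experiment.log")
file_handler.setLevel(logging.DEBUG)

fmt = logging.Formatter("%(asctime)s %(name)s %(levelname)s: %(message)s")
console.setFormatter(fmt)
file_handler.setFormatter(fmt)

logger.addHandler(console)
logger.addHandler(file_handler)

logger.info("epoch %d: loss=%.4f", 1, 0.3421)
```

The lazy `%`-style formatting in the last line is idiomatic for `logging`: the message string is only built if the record actually passes the level filter.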
- tensorboard (for TensorFlow) / tensorboardX (for PyTorch): visualization during experiments (update: PyTorch has officially supported TensorBoard since v1.1.0; please use `from torch.utils.tensorboard import SummaryWriter`)
- pyyaml or ruamel.yaml: Python modules for YAML configuration
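A small sketch of the PyYAML workflow for experiment configs (ruamel.yaml offers a similar API with round-trip preservation of comments); the config keys and values below are illustrative:

```python
import yaml  # PyYAML

# An illustrative experiment configuration, as it might live in config.yaml.
config_text = """
model:
  name: resnet18
  lr: 0.001
train:
  epochs: 10
  batch_size: 32
"""

# safe_load parses YAML into plain Python dicts/lists and refuses
# arbitrary object-construction tags, which is why it is preferred
# over yaml.load for configuration files.
config = yaml.safe_load(config_text)

print(config["model"]["name"])   # resnet18
print(config["train"]["epochs"]) # 10
```

In a real project the text would come from `yaml.safe_load(open("config.yaml"))`, and the resulting dict is passed around in place of hard-coded hyperparameters.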
- Pytorch-Hydra-template: A clean and scalable template to kickstart your deep learning project
- Pytorch-Project-Template: to be released
- PyTorch-Lightning: The lightweight PyTorch wrapper for high-performance AI research.
- apex: mixed-precision training (no longer being maintained)
- Horovod by Uber: a distributed deep learning training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
- hydra (GitHub): A framework for elegantly configuring complex applications
- yacs: Yet Another Configuration System, by Ross Girshick
- alfred: A deep learning utility library for visualization and sensor fusion purposes
- DALI: A library containing both highly optimized building blocks and an execution engine for data pre-processing in deep learning applications
- Project manifest. Part of the Catalyst Ecosystem.
- tensorboard.dev: visualization and tracking
- wandb: A tool for visualizing and tracking your machine learning experiments.
- fitlog by Fudan University: A tool for logging and code management
- runx by NVIDIA: Deep Learning Experiment Management
- NNI (Neural Network Intelligence) by Microsoft: a toolkit to help users design and tune machine learning models (e.g., hyperparameters), neural network architectures, or complex systems' parameters in an efficient and automatic way
- TorchTracer: a tool package for visualization and storage management in PyTorch AI tasks.
- Tune: a Python library for experiment execution and hyperparameter tuning at any scale.
- Bayesian Optimization: A Python implementation of global optimization with gaussian processes.
- adatune: Gradient based Hyperparameter Tuning library in PyTorch
- FAR-HO: Gradient based hyperparameter optimization & meta-learning package for TensorFlow
- optuna: An open source hyperparameter optimization framework to automate hyperparameter search
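The tuning libraries above (Tune, optuna, etc.) automate and generalize a simple baseline: sample hyperparameters, evaluate, keep the best. A dependency-free sketch of that random-search baseline, where the objective function and search space are illustrative stand-ins for a real training run:

```python
import random

def objective(lr, batch_size):
    # Stand-in for "train a model, return validation loss":
    # here we pretend the optimum is lr=0.01, batch_size=64.
    return (lr - 0.01) ** 2 + (batch_size - 64) ** 2 / 1e4

def random_search(n_trials=50, seed=0):
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {
            "lr": 10 ** rng.uniform(-4, -1),           # log-uniform learning rate
            "batch_size": rng.choice([16, 32, 64, 128]),
        }
        loss = objective(**params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best_params, best_loss = random_search()
print(best_params, best_loss)
```

Libraries like optuna keep this loop shape (an objective called once per trial) but replace the uniform sampler with smarter strategies such as TPE or Bayesian optimization, and add pruning of unpromising trials.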
- lightly (GitHub): a computer vision framework for self-supervised learning.
- vissl (GitHub): Vision library for state-of-the-art self-supervised learning research with PyTorch
- DomainBed: a PyTorch suite containing benchmark datasets and algorithms for domain generalization, as introduced in "In Search of Lost Domain Generalization".