This is a collection of PyTorch snippets that have helped me (and others) significantly with doing ML research.
* [Minimal CIFAR-10](minimal_cifar): a very simple training script that reaches 94% accuracy in ~150 lines of code, for an easy but strong baseline.
* [FastMNIST](fast_mnist): a drop-in replacement for the standard MNIST dataset that speeds up training on the **GPU** (by 2x!) by avoiding unnecessary preprocessing, which otherwise pegs the CPU at 100% for small models.
* [Subset of ImageNet](subset_of_imagenet): it's remarkably difficult to train on a subset of the ImageNet classes with the default TorchVision datasets. This snippet makes the minimal changes needed to make it easy.
* [ImageNet dogs vs not dogs](imagenet_dogs_vs_notdogs): a standardized setup for the ImageNet Dogs vs Not Dogs out-of-distribution detection task.
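The FastMNIST entry above rests on one idea: do the normalization once at construction time and keep the whole dataset as a single tensor on the target device, so `__getitem__` does no per-sample CPU work. Here is a minimal sketch of that pattern; the class name, signature, and default MNIST statistics are illustrative, not taken from the linked snippet.

```python
import torch
from torch.utils.data import Dataset


class PreprocessedDataset(Dataset):
    """Sketch of the FastMNIST idea: preprocess everything once in
    __init__ instead of per sample in __getitem__."""

    def __init__(self, data, targets, mean=0.1307, std=0.3081, device="cpu"):
        # data: uint8 tensor of shape (N, H, W); targets: (N,) int tensor.
        data = data.float().div_(255).unsqueeze(1)  # scale to [0, 1], add channel dim
        data = data.sub_(mean).div_(std)            # normalize in place, once
        self.data = data.to(device)                 # keep the whole tensor on-device
        self.targets = targets.to(device)

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        # No transforms here: all work already happened in __init__.
        return self.data[idx], self.targets[idx]


# Tiny demo with fake MNIST-shaped uint8 "images":
images = (torch.arange(8 * 28 * 28) % 256).to(torch.uint8).reshape(8, 28, 28)
labels = torch.arange(8)
train_set = PreprocessedDataset(images, labels)
```

Because every item is just a slice of a resident tensor, the `DataLoader` can even be dropped entirely for small models and replaced by direct batched indexing.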
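The core of restricting a TorchVision-style dataset to a class subset is filtering its list of `(path, class_index)` samples and remapping the kept class indices to a contiguous range. A minimal sketch of that step (the helper name and the example paths are mine, not from the snippet):

```python
def restrict_to_classes(samples, keep_class_indices):
    """Filter a list of (path, class_index) pairs down to a subset of
    classes, remapping the kept class indices to 0..K-1."""
    remap = {c: i for i, c in enumerate(sorted(set(keep_class_indices)))}
    return [(path, remap[c]) for path, c in samples if c in remap]


# Example: keep only classes 3 and 7 from a fake samples list.
samples = [("a.jpg", 3), ("b.jpg", 5), ("c.jpg", 7), ("d.jpg", 3)]
subset = restrict_to_classes(samples, [3, 7])
# subset == [("a.jpg", 0), ("c.jpg", 1), ("d.jpg", 0)]
```

In a subclass of `torchvision.datasets.ImageFolder`, one could apply such a filter to `self.samples` (and the matching `self.targets`) after calling the parent `__init__`.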
In a separate repository, [Slurm for ML](https://github.com/y0ast/slurm-for-ml), I explain how I use `slurm` job arrays painlessly with a simple one-file shell script.