Availability of models trained on only ImageNet1k #1400
Replies: 1 comment
@khawar-islam most models are in1k pretrained; many 21k/22k-pretrained models have a tag in the name. The biggest exceptions are the vit models, which were updated to use 21k pretrain by default (those weights are much, much better and the 1k recipes weren't great), plus some special cases like beit and the swsl/ssl models with large-dataset pretrain or special techniques. A new 'tag' / metadata feature will be added soon for pretrained cfgs that will make it clearer what the pretrain dataset is. You can load all the 1k vit models from this paper https://arxiv.org/abs/2106.10270 by exploring the checkpoints, using this notebook as a template: https://colab.research.google.com/github/google-research/vision_transformer/blob/main/vit_jax_augreg.ipynb
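Until the tag/metadata feature lands, the reply above suggests the pretrain dataset can be guessed from the model name. A minimal sketch of that heuristic, assuming an illustrative hard-coded name list (in practice you would get the real list from `timm.list_models(pretrained=True)`; the `likely_in1k_only` helper and the exception families are assumptions for illustration, not timm API):

```python
# Illustrative model names; real names come from timm.list_models(pretrained=True).
model_names = [
    "resnet50",                    # no dataset tag -> likely in1k pretrain
    "vit_base_patch16_224",        # vit family: defaults to 21k pretrain
    "vit_base_patch16_224_in21k",  # explicit 21k tag in the name
    "beit_base_patch16_224",       # special-case large-dataset pretrain
    "convnext_base_in22k",         # explicit 22k tag in the name
]

def likely_in1k_only(name, exception_families=("vit", "beit")):
    """Heuristic sketch: no 21k/22k tag in the name, and not in a family
    known to default to large-dataset pretrain (vit, beit, etc.)."""
    if any(tag in name for tag in ("in21k", "in22k", "21k", "22k")):
        return False
    return not name.startswith(exception_families)

in1k_candidates = [n for n in model_names if likely_in1k_only(n)]
print(in1k_candidates)  # -> ['resnet50']
```

This is only a name-based guess; the upcoming pretrained-cfg metadata is the reliable way to know the pretrain dataset once released.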
Are there any models that were trained only on the ImageNet-1k dataset?
I have seen a lot of models that are pretrained on the ImageNet-22k dataset and fine-tuned on ImageNet-1k. But I need a model that was trained only on ImageNet-1k.