OpenMMLab Pre-training Toolbox and Benchmark
OpenMMLab Self-Supervised Learning Toolbox and Benchmark
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
[NeurIPS 2022 Spotlight] VideoMAE: Masked Autoencoders are Data-Efficient Learners for Self-Supervised Video Pre-Training
solo-learn: a library of self-supervised methods for visual representation learning powered by PyTorch Lightning
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
ConvMAE: Masked Convolution Meets Masked Autoencoders
[ICCV 2023] You Only Look at One Partial Sequence
[Survey] Masked Modeling for Self-supervised Representation Learning on Vision and Beyond (https://arxiv.org/abs/2401.00897)
PASSL includes image self-supervised learning algorithms such as SimCLR, MoCo v1/v2, BYOL, CLIP, PixPro, SimSiam, SwAV, BEiT, and MAE, as well as foundational vision models such as Vision Transformer, DeiT, Swin Transformer, CvT, T2T-ViT, MLP-Mixer, XCiT, ConvNeXt, and PVTv2
[ECCV 2024] Improving 2D Feature Representations by 3D-Aware Fine-Tuning
Official code for the papers "Reversible Column Networks" and "RevColv2"
[ICCV 2023] Code base for Revisiting Scene Text Recognition: A Data Perspective
PySODEvalToolkit: A Python-based Evaluation Toolbox for Salient Object Detection and Camouflaged Object Detection
Reproduction of semantic segmentation using a masked autoencoder (MAE)
Paddle large-scale classification tools; supports ArcFace, CosFace, PartialFC, and data parallel + model parallel training. Models include ResNet, ViT, Swin, DeiT, CaiT, FaceViT, MoCo, MAE, ConvMAE, and CAE.
Foundation-model-based medical image analysis
Visualization: filters, feature maps, attention maps, image masks, Grad-CAM, human keypoints, guided backpropagation
Efficient Network Traffic Classification via Pre-training Unidirectional Mamba
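Most of the repositories above build on MAE-style masked image modeling. As a quick orientation, here is a minimal sketch of the random patch masking at its core; the function name, ratio, and grid size are illustrative and not taken from any specific repository:

```python
import random

def random_patch_mask(num_patches, mask_ratio=0.75, seed=0):
    """Split patch indices into visible and masked sets, MAE-style.

    The encoder sees only the visible patches; the decoder reconstructs
    the masked ones. mask_ratio=0.75 keeps 25% of patches visible.
    """
    rng = random.Random(seed)
    idx = list(range(num_patches))
    rng.shuffle(idx)  # uniform random masking, as in MAE
    num_visible = int(num_patches * (1 - mask_ratio))
    visible, masked = idx[:num_visible], idx[num_visible:]
    return sorted(visible), sorted(masked)

# A 224x224 image with 16x16 patches yields a 14x14 = 196-patch grid.
visible, masked = random_patch_mask(196)
```

With the default 75% ratio, 49 of the 196 patches remain visible, which is what makes MAE pretraining data- and compute-efficient: the heavy encoder processes only a quarter of the tokens.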