
Impact of the pre-training data distribution on the fine-tuned performance of MAEs

This repo contains the code developed to study the impact of different pre-training data distributions on the downstream fine-tuned performance of Masked Autoencoders. The report can be accessed here.
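As a rough illustration of the pipeline studied in the report, the sketch below pre-trains a toy masked autoencoder (labels are ignored) on one data distribution and then fine-tunes its encoder with a linear classification head on a downstream dataset. Everything here (the `TinyMAE` class, patch size, masking ratio, and hyper-parameters) is an assumption made for illustration only and is not the repository's actual implementation.

```python
# Minimal, hypothetical sketch of the pre-train / fine-tune pipeline.
# All names and hyper-parameters are illustrative assumptions, not the repo's code.
import torch
import torch.nn as nn

PATCH = 4                       # patch size (assumed)
DIM = 64                        # embedding dimension (assumed)
N_PATCH = (32 // PATCH) ** 2    # assumes 32x32 images, e.g. a CIFAR-like dataset
MASK_RATIO = 0.75               # MAE-style masking ratio

class TinyMAE(nn.Module):
    """Toy masked autoencoder: embed patches, mask most of them,
    and reconstruct the masked patches from the visible ones."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(3 * PATCH * PATCH, DIM)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True), num_layers=2)
        self.decoder = nn.Linear(DIM, 3 * PATCH * PATCH)
        self.mask_token = nn.Parameter(torch.zeros(1, 1, DIM))

    def patchify(self, x):
        # (B, 3, 32, 32) -> (B, N_PATCH, 3 * PATCH * PATCH)
        b = x.shape[0]
        x = x.unfold(2, PATCH, PATCH).unfold(3, PATCH, PATCH)
        return x.permute(0, 2, 3, 1, 4, 5).reshape(b, N_PATCH, -1)

    def forward(self, imgs):
        patches = self.patchify(imgs)
        tokens = self.embed(patches)
        b, n, d = tokens.shape
        n_keep = int(n * (1 - MASK_RATIO))
        # Random mask per sample: encode only the visible subset of tokens.
        idx = torch.rand(b, n, device=imgs.device).argsort(dim=1)
        keep = idx[:, :n_keep]
        visible = torch.gather(tokens, 1, keep.unsqueeze(-1).expand(-1, -1, d))
        encoded = self.encoder(visible)
        # Scatter encoded tokens back; masked positions get the mask token.
        full = self.mask_token.expand(b, n, d).clone()
        full.scatter_(1, keep.unsqueeze(-1).expand(-1, -1, d), encoded)
        recon = self.decoder(full)
        mask = torch.ones(b, n, device=imgs.device)
        mask.scatter_(1, keep, 0.0)               # 1 marks a masked position
        loss = ((recon - patches) ** 2).mean(dim=-1)
        return (loss * mask).sum() / mask.sum()   # loss only on masked patches

def pretrain(model, loader, epochs=1):
    # Self-supervised pre-training on the chosen pre-training distribution.
    opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for imgs, _ in loader:                    # labels ignored during pre-training
            opt.zero_grad()
            loss = model(imgs)
            loss.backward()
            opt.step()

def finetune(model, loader, num_classes=10, epochs=1):
    # Reuse the pre-trained encoder and add a linear head for the downstream task.
    head = nn.Linear(DIM, num_classes)
    opt = torch.optim.AdamW(list(model.parameters()) + list(head.parameters()), lr=1e-4)
    ce = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for imgs, labels in loader:
            feats = model.encoder(model.embed(model.patchify(imgs))).mean(dim=1)
            loss = ce(head(feats), labels)
            opt.zero_grad()
            loss.backward()
            opt.step()
```

In this toy version the same pre-trained encoder can be fine-tuned on several downstream datasets, which mirrors the comparison of pre-training distributions described above; the repository's actual scripts and the exact experimental setup are documented in instruction.pdf.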

How to run

The "instruction.pdf" file contains detailed instructions on how to reproduce the results contained in the report.
