
🌟 Welcome to the official code repository for our ICML 2025 paper MindAligner!


MindAligner

Code for our ICML 2025 paper: "MindAligner: Explicit Brain Functional Alignment for Cross-Subject Brain Visual Decoding with Limited Data"

✨ If you find this work useful, please give us a star! ✨

Environment

Use our setup.sh for environment preparation:

bash setup.sh

Data Preparation

In our experiments, we use NSD for both training and evaluation.

  1. Agree to the Natural Scenes Dataset's Terms and Conditions and fill out the NSD Data Access form.

  2. Download the processed dataset from here and unzip it into the ./dataset folder.

  3. Run the following command to automatically obtain similar image pairs used by our MindAligner:

     cd sim_dataset
     python process_dataset.py
    

    The preprocessed features will be automatically generated in the ./sim_dataset/v2subj1257 folder after running the script above.

  4. Please download the pretrained decoding model from here to the directory ./decoding_model. We only use final_multisubject_subj0{args.n_subj}/last.pth, so you only need to download the relevant checkpoints, where n_subj ∈ {1, 2, 5, 7}.
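Before training, it can help to verify the checkpoint layout described in step 4. A minimal sketch, assuming the directory pattern from this README (the loop itself is our addition, not part of the repository):

```shell
# Check that a pretrained decoding checkpoint exists for each of the
# four NSD subjects used in the paper (layout taken from this README).
for sid in 1 2 5 7; do
  ckpt="./decoding_model/final_multisubject_subj0${sid}/last.pth"
  if [ -f "$ckpt" ]; then
    echo "found:   $ckpt"
  else
    echo "missing: $ckpt"
  fi
done
```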

Training

Our models are all trained on a single NVIDIA V100 GPU.

python train.py --n_subj 1 --k_subj 2 --num_sessions 1  
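The command above trains one source/target subject pair. Assuming the same flags apply to every pair of the four NSD subjects (our assumption; the README only shows the 1→2 pair), the full training grid can be printed as a dry run:

```shell
# Hedged sketch: print the cross-subject training grid over the four
# NSD subjects (1, 2, 5, 7). This is a dry run -- drop the leading
# "echo" to actually launch training.
for n in 1 2 5 7; do
  for k in 1 2 5 7; do
    if [ "$n" != "$k" ]; then
      echo python train.py --n_subj "$n" --k_subj "$k" --num_sessions 1
    fi
  done
done
```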

Evaluation

To test with our pretrained models, please download the weights from here (Hugging Face) into the ./ckpts folder.

1. Generate Results

To test the MindAligner and generate results:

python recon.py --n_subj 1 --k_subj 2 

All files used for evaluation will be stored in .evals/1->2. The generated images can be found in .evals/out_plot.

2. Enhance Results

To obtain the final enhanced results:

python enhance.py --n_subj 1 --k_subj 2 

All files used for evaluation will be stored in .evals/1->2.
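The two evaluation steps above can be chained for a given subject pair. A hedged sketch using the flags shown earlier (the `echo` makes it a dry run; remove it to execute):

```shell
# Dry-run sketch of the full evaluation pipeline for subject pair 1 -> 2:
# first reconstruct, then enhance (flags as in this README).
N_SUBJ=1
K_SUBJ=2
echo python recon.py --n_subj "$N_SUBJ" --k_subj "$K_SUBJ"
echo python enhance.py --n_subj "$N_SUBJ" --k_subj "$K_SUBJ"
```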

Contributing

A huge thank you to Zhouheng Yao for her outstanding work on the code! 💖🙌✨

Citation

@article{dai2025mindaligner,
    title={MindAligner: Explicit Brain Functional Alignment for Cross-Subject Visual Decoding from Limited fMRI Data},
    author={Dai, Yuqin and Yao, Zhouheng and Song, Chunfeng and Zheng, Qihao and Mai, Weijian and Peng, Kunyu and Lu, Shuai and Ouyang, Wanli and Yang, Jian and Wu, Jiamin},
    journal={arXiv preprint arXiv:2502.05034},
    year={2025}
}
