Project that aims at improving the robustness and generalization of transformer-based neural decoders for brain–machine interfaces using self-supervised learning (TS-TCC) on ECoG signals.

HugoDemRey/SSL-motor-decoding


Improving Cross-Session Generalization in ECoG-Based Motor Decoding Using Self-Supervised Learning

Abstract—Decoding motor behavior from electrocorticography (ECoG) signals is difficult due to strong variability across recording sessions and the limited availability of labeled data. The goal of this semester project was to improve a supervised baseline model by exploring self-supervised learning (SSL) approaches and alternative training losses. The baseline is a transformer-based architecture that takes wavelet-transformed ECoG signals as input and predicts the wrist position of a monkey. TS-TCC is applied as a self-supervised pretraining method, using contextual and temporal contrastive losses without relying on labels. After pretraining, only the encoder is retained and used for the downstream regression task. The results show that the self-supervised pretrained model generalizes better to future recording sessions than the fully supervised baseline, achieving higher mean R^2 scores across recordings. Label-fraction experiments further demonstrate that SSL allows the model to reach reasonable performance with as little as 1% of the labeled data. Additional losses based on soft temporal and instance-level contrastive learning were also evaluated. While these losses showed promising behavior on smaller datasets, they tended to degrade results when applied to large-scale pretraining, often smoothing predictions and hurting sessions where the baseline already performed well. Overall, this project highlights the potential of self-supervised learning to improve generalization and label efficiency for ECoG-based motor decoding.
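At the core of TS-TCC-style pretraining is a contrastive objective: embeddings of two augmented views of the same window are pulled together while all other windows in the batch act as negatives. The snippet below is a minimal NumPy sketch of such an NT-Xent-style loss, not the project's actual implementation; the function name, shapes, and temperature value are illustrative assumptions.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent-style contrastive loss over two views of a batch.

    z1, z2: [N, D] encoder embeddings of two augmentations of the
    same N windows (hypothetical shapes, for illustration only).
    """
    z = np.concatenate([z1, z2], axis=0)               # [2N, D]
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize
    sim = (z @ z.T) / temperature                      # scaled cosine sims
    np.fill_diagonal(sim, -np.inf)                     # drop self-similarity
    n = len(z1)
    # the positive for row i is its other view, at index i + n (mod 2n)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_denom = np.log(np.exp(sim).sum(axis=1))        # softmax denominator
    loss = -(sim[np.arange(2 * n), pos] - log_denom)
    return loss.mean()
```

Intuitively, the loss is low when each window's two views are more similar to each other than to any other window in the batch, which is what encourages session-invariant features without labels.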

👉 Read the full project report here.


Project by Hugo Demule, supervised by Yuhan Xie and Prof. Shoaran of the Integrated Neurotechnologies Laboratory, Campus Biotech, Geneva, Switzerland.
