A Method for Automatic Modulation Classification Using the Mamba Architecture
A project employing the Selective State Space Model (Mamba) for Automatic Modulation Classification (AMC) in scenarios with extended signal lengths.
Increased sequence length complicates the learning process and diminishes accuracy, while also escalating memory consumption and reducing timeliness.
If our code helped your research, please consider citing the corresponding submission:
@article{zhang2024mamca,
  title={MAMCA -- Optimal on Accuracy and Efficiency for Automatic Modulation Classification with Extended Signal Length},
  author={Yezhuo Zhang and Zinan Zhou and Yichao Cao and Guangyu Li and Xuanpeng Li},
  year={2024},
  journal={arXiv preprint arXiv:2405.11263},
}
We utilize a denoising unit for improved accuracy under noise interference, while using Mamba as the backbone for low GPU memory occupancy and short training/inference times.
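To illustrate this pipeline, here is a minimal sketch of a denoising-unit-plus-Mamba classifier. It is not the repository's implementation: `SoftThresholdDenoiser`, `MambaAMC`, and all hyperparameters are illustrative, and it assumes PyTorch plus the `mamba_ssm` package.

```python
# Illustrative sketch only: a DRSN-style soft-thresholding denoiser
# followed by a Mamba backbone and a classification head.
import torch
import torch.nn as nn
from mamba_ssm import Mamba

class SoftThresholdDenoiser(nn.Module):
    """Channel-wise soft thresholding in the style of DRSN (illustrative)."""
    def __init__(self, channels):
        super().__init__()
        # Learn a per-channel threshold scale from global statistics.
        self.fc = nn.Sequential(
            nn.Linear(channels, channels), nn.ReLU(),
            nn.Linear(channels, channels), nn.Sigmoid(),
        )

    def forward(self, x):                        # x: (batch, length, channels)
        abs_mean = x.abs().mean(dim=1)           # (batch, channels)
        tau = abs_mean * self.fc(abs_mean)       # learned positive thresholds
        tau = tau.unsqueeze(1)                   # broadcast over time
        return torch.sign(x) * torch.relu(x.abs() - tau)

class MambaAMC(nn.Module):
    """Denoise -> Mamba backbone -> pooled classification head."""
    def __init__(self, d_model=64, n_classes=11):
        super().__init__()
        self.embed = nn.Linear(2, d_model)       # raw I/Q pairs -> features
        self.denoise = SoftThresholdDenoiser(d_model)
        self.backbone = Mamba(d_model=d_model)   # linear-time in sequence length
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, iq):                       # iq: (batch, length, 2)
        h = self.denoise(self.embed(iq))
        h = self.backbone(h)                     # (batch, length, d_model)
        return self.head(h.mean(dim=1))          # pool over time, classify
```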
For related AMC works and their source code:
- Deep Learning Based Automatic Modulation Recognition: Models, Datasets, and Challenges
For the denoising method employed in our work and its source code:
- Deep Residual Shrinkage Networks for Fault Diagnosis (IEEE Transactions on Industrial Informatics, 2020)
- Deep-Residual-Shrinkage-Networks-for-intelligent-fault-diagnosis-DRSN
For the Mamba method employed in our work and its source code:
- Mamba: Linear-Time Sequence Modeling with Selective State Spaces (arXiv preprint arXiv:2312.00752)
- mamba (official implementation: https://github.com/state-spaces/mamba)
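For quick orientation, a minimal usage sketch of the `mamba_ssm` layer from the repository above (it requires a CUDA device; the shapes here are illustrative): the layer maps a `(batch, length, d_model)` sequence to an output of the same shape in linear time over the sequence length.

```python
import torch
from mamba_ssm import Mamba

x = torch.randn(4, 4096, 64, device="cuda")  # a long feature sequence
layer = Mamba(d_model=64).to("cuda")
y = layer(x)
assert y.shape == x.shape                    # (4, 4096, 64)
```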
Install the dependencies:

```bash
pip install -r requirements.txt
```

Then cd into code/script and run:

```bash
bash RML2016.10a.sh
```
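For reference, a minimal sketch of loading the RML2016.10a dataset in Python. This assumes the public RML2016.10a_dict.pkl file from DeepSig; the path and variable names are illustrative and not the script's actual interface.

```python
# Illustrative RML2016.10a loading sketch; not the repository's pipeline.
import pickle
import numpy as np

with open("RML2016.10a_dict.pkl", "rb") as f:
    data = pickle.load(f, encoding="latin1")   # keys are (modulation, SNR)

mods = sorted({mod for mod, _ in data})        # 11 modulation classes
X, y = [], []
for (mod, snr), frames in data.items():        # frames: (1000, 2, 128) I/Q
    X.append(frames)
    y += [mods.index(mod)] * len(frames)
X = np.vstack(X).astype(np.float32)            # (220000, 2, 128)
y = np.array(y)
```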
If you have any problems with our code or any suggestions, including discussion on SEI (Specific Emitter Identification), please feel free to contact:
- Yezhuo Zhang (zhang_yezhuo@seu.edu.cn | zhang_yezhuo@outlook.com)
- Zinan Zhou (zhouzinan919@seu.edu.cn)
- Xuanpeng Li (li_xuanpeng@seu.edu.cn)