
Radio modulation recognition with CNN, CLDNN, CGDNN and MCTransformer architectures. Best results were achieved with the CGDNN architecture, which has roughly 50,000 parameters, and the final model has a memory footprint of 636kB. More details can be found in my bachelor thesis linked in the readme file.


Radio Modulation Classification Using Deep Learning Architectures


Author: Kristyna Pijackova


This notebook contains code for my bachelor thesis in the academic year 2020/2021.

Abstract

The bachelor thesis is focused on radio modulation classification with a deep learning approach. Four deep learning architectures are presented in the thesis. Three of them use convolutional and recurrent neural networks, and the fourth uses a transformer architecture. The final number of parameters of each model was considered during the design phase, as it can have a big impact on the memory footprint of a deployed model. The architectures were written in Keras, a software library that provides a Python interface for neural networks. The results of the architectures were additionally compared to results from other research papers on this topic.


The code structure is as follows:

  • Imports - import the needed libraries
  • Defined Functions - functions defined for easier manipulation with the data later on
  • Accessing the Datasets - you may skip this part and download the datasets elsewhere if you prefer
  • Loading Data - load the data and divide them into training, validation, and test sets
  • Deep Learning Part - contains the architectures, which are prepared to be trained and evaluated
  • Load Trained Model - optionally, you can download the CGDNN model and see how it performs on the corresponding dataset
  • Layer Visualization - code written to visualize the activation maps of the convolutional and recurrent layers
  • Plotting - you can plot the confusion matrices in this part

This code was used in the Google Colab environment, utilizing its free GPU. If you'd like to see how it works, feel free to get a copy from the GitHub repository and experiment on your own.
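The "Loading Data" step above divides the samples into training, validation, and test sets. A minimal sketch of such a split (the function name and 60/20/20 fractions are mine, not necessarily what the notebook uses) could look like this:

```python
import numpy as np

def split_data(X, y, train_frac=0.6, val_frac=0.2, seed=42):
    """Shuffle the samples and divide them into training, validation,
    and test sets according to the given fractions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train = int(train_frac * len(X))
    n_val = int(val_frac * len(X))
    train, val, test = np.split(idx, [n_train, n_train + n_val])
    return (X[train], y[train]), (X[val], y[val]), (X[test], y[test])
```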


Datasets used in the thesis

RadioML Datasets

Both datasets are left unchanged; however, the RadioML2016.10b version is not stored as the original data but is already split into X and labels.
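For reference, the original RadioML releases ship as a pickled dictionary keyed by (modulation, SNR) tuples, with each value an array of I/Q frames. A minimal sketch of flattening that dictionary into X and label arrays (the helper name is mine, not part of the repository) might look like:

```python
import numpy as np

def flatten_radioml(data):
    """Flatten a RadioML-style dict keyed by (modulation, SNR) tuples
    into an X array of shape (N, 2, 128) and a matching label list."""
    X, labels = [], []
    for (mod, snr), samples in data.items():  # samples: (n, 2, 128)
        X.append(samples)
        labels.extend([(mod, snr)] * len(samples))
    return np.vstack(X), labels
```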

Migou-Mod Dataset

The following version of the dataset contains only a fraction of the original samples (550,000 samples compared to 8.8 million in the original dataset).

VUT Dataset

This dataset was generated in MATLAB with 1000 samples per SNR value and modulation type. It includes three QAM modulation schemes and further OFDM, GFDM, and FBMC modulations, which are not included in the previous datasets. To mimic the RadioML dataset, the data are represented as 2x128 vectors of I/Q signals in the SNR range from -20 dB to 18 dB.
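As an illustration of that representation (the helper below is mine, not part of the repository), a complex baseband frame of 128 samples maps onto a 2x128 I/Q vector by stacking its real and imaginary parts:

```python
import numpy as np

def to_iq(signal):
    """Stack the real (I) and imaginary (Q) parts of a complex baseband
    frame into the 2xN representation used by the datasets."""
    return np.stack([signal.real, signal.imag]).astype(np.float32)
```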

DL Architectures

CNN

(CNN architecture diagram)

CLDNN

(CLDNN architecture diagram)

CGDNN

(CGDNN architecture diagram)

MCTransformer

(MCTransformer architecture diagram)
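To give a rough idea of the CGDNN family (convolutional + gated recurrent + dense layers), here is a hedged Keras sketch; the layer sizes, kernel widths, and input layout below are illustrative guesses, not the exact configuration from the thesis:

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_cgdnn(n_classes=10, input_shape=(128, 2)):
    """Illustrative CGDNN-style model: convolutional feature extraction,
    a gated recurrent (GRU) layer, and a dense classification head."""
    inputs = keras.Input(shape=input_shape)          # 128 time steps, I/Q channels
    x = layers.Conv1D(32, 7, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling1D(2)(x)
    x = layers.Conv1D(32, 5, padding="same", activation="relu")(x)
    x = layers.MaxPooling1D(2)(x)
    x = layers.GRU(64)(x)                            # the "G" in CGDNN
    x = layers.Dropout(0.5)(x)
    x = layers.Dense(64, activation="relu")(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return keras.Model(inputs, outputs)
```

Keeping the convolutional filters and the GRU small is what pushes the parameter count down toward the tens of thousands; the exact 50,000-parameter configuration is described in the thesis itself.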

Source: [2]

Confusion Matrices - Examples of CNN and CLDNN Matrices

(confusion matrix figures)

Source: [1]

References

[1] Pijackova, Kristyna, and Tomas Gotthans. "Radio Modulation Classification Using Deep Learning Architectures." 2021 31st International Conference Radioelektronika (RADIOELEKTRONIKA). IEEE, 2021.

[2] Pijackova, Kristyna. "Radio Modulation Recognition Networks." Brno: Brno University of Technology, Faculty of Electrical Engineering and Communication, Department of Radio Electronics, 2021, 61 p. Bachelor's Thesis. Advised by doc. Ing. Tomas Gotthans, Ph.D. [Online]: https://www.vutbr.cz/studenti/zav-prace/detail/133594

If you end up using some of the architectures, please consider citing one of the above-mentioned works :)
