CAP-SSM

(paper source) Context-Aware Prototype-Guided State-Space Model for Efficient Wearable Human Activity Recognition in IoT Devices

This repository implements the methodology proposed in the paper "Context-Aware Prototype-Guided State-Space Model for Efficient Wearable Human Activity Recognition in IoT Devices".

Requirements

The following libraries are required:

torch>=1.12.0
numpy>=1.21.0
pandas>=1.3.0
scikit-learn>=1.0.0
matplotlib>=3.5.0
seaborn>=0.11.0
scipy>=1.7.0
requests>=2.25.0
thop>=0.1.1
mlxtend>=0.21.0

You can install all required packages using:

pip install -r requirements.txt
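As a quick sanity check before running the code, the pinned versions above can be verified without importing the heavy packages themselves. The script below is an illustrative sketch, not part of this repository; the `check_packages` helper and `check_env.py` name are hypothetical:

```python
# check_env.py -- illustrative sanity check (not part of the repository)
from importlib import metadata

# Minimum versions taken from the requirements list above.
REQUIRED = {
    "torch": "1.12.0",
    "numpy": "1.21.0",
    "pandas": "1.3.0",
    "scikit-learn": "1.0.0",
    "matplotlib": "3.5.0",
    "seaborn": "0.11.0",
    "scipy": "1.7.0",
    "requests": "2.25.0",
    "thop": "0.1.1",
    "mlxtend": "0.21.0",
}

def check_packages(required):
    """Return {package: installed version or None} for each requirement."""
    found = {}
    for pkg in required:
        try:
            found[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            found[pkg] = None
    return found

if __name__ == "__main__":
    for pkg, ver in check_packages(REQUIRED).items():
        status = ver if ver is not None else "MISSING"
        print(f"{pkg:>14}  {status}  (>= {REQUIRED[pkg]} required)")
```

Packages reported as `MISSING` can then be installed individually or via the `pip install -r requirements.txt` command above.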

Citing this Repository

If you use this code in your research, please cite:

@article{Lim2025-CAP-SSM,
  title     = {Context-Aware Prototype-Guided State-Space Model for Efficient Wearable Human Activity Recognition in IoT Devices},
  author    = {Gyuyeon Lim and Myung-Kyu Yi},
  journal   = {},
  volume    = {},
  number    = {},
  pages     = {},
  year      = {},
  publisher = {}
}

Contact

For questions or issues, please contact:

License

This project is licensed under the MIT License - see the LICENSE file for details.
