FedDSHAR: A Dual-Strategy Federated Learning Approach for Human Activity Recognition under Extremely Noisy Label Conditions
Federated learning (FL) has recently achieved remarkable success in privacy-sensitive healthcare applications such as medical analysis. However, most previous studies assume that collected user data are well annotated, which is a strong assumption in practice. In the human activity recognition (HAR) task, for instance, the goal is to train a model that predicts a person's activity from sensor data collected over a period of time. Because annotation methods are diverse and often incomplete, user-side data invariably contain significant label noise, which can greatly hinder model convergence and degrade performance. To address this issue, we propose FedDSHAR, a novel federated learning framework that partitions user-side data into clean and noisy subsets and applies a distinct strategy to each: strategic time-series augmentation on the clean subset and a semi-supervised learning scheme on the noisy subset. Extensive experiments on three public, real-world HAR datasets demonstrate that FedDSHAR outperforms six state-of-the-art methods, particularly under the extreme label noise encountered in distributed HAR applications.
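At the core of FedDSHAR is a client-side split of each user's local data into a clean subset and a noisy subset, each handled by its own strategy. The snippet below is a minimal sketch of one common way to realize such a split, fitting a two-component Gaussian mixture to per-sample losses and treating the low-loss component as clean; the function name, 0.5 threshold, and GMM choice are illustrative assumptions, not the exact implementation in `train_fed_dshar.py`.

```python
import numpy as np
import torch
import torch.nn.functional as F
from sklearn.mixture import GaussianMixture

@torch.no_grad()
def split_clean_noisy(model, loader, device="cpu"):
    """Sketch of a small-loss clean/noisy partition (not the exact FedDSHAR code).

    Assumes `loader` iterates the local dataset in a fixed order without shuffling.
    """
    model.eval()
    losses = []
    for x, y in loader:
        logits = model(x.to(device))
        losses.append(F.cross_entropy(logits, y.to(device), reduction="none").cpu().numpy())
    losses = np.concatenate(losses).reshape(-1, 1)

    # Fit a two-component GMM over per-sample losses; the component with the
    # smaller mean loss is interpreted as the "clean" cluster.
    gmm = GaussianMixture(n_components=2, random_state=0).fit(losses)
    clean_comp = int(np.argmin(gmm.means_.ravel()))
    p_clean = gmm.predict_proba(losses)[:, clean_comp]

    clean_idx = np.where(p_clean > 0.5)[0]   # kept with their given labels
    noisy_idx = np.where(p_clean <= 0.5)[0]  # labels treated as unreliable
    return clean_idx, noisy_idx
```

In FedDSHAR, the clean subset is then trained with time-series augmentation, while the noisy subset is exploited through a semi-supervised scheme (e.g., pseudo-labeling) rather than being discarded.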
- Create a Python virtual environment (recommended):

  ```
  conda create -n feddshar python=3.7.9
  conda activate feddshar
  ```

- Install dependencies:

  ```
  pip install -r requirements.txt
  ```

- Prepare your dataset by placing it in the `dataset/` directory.

- Run the training script:

  ```
  python train_fed_dshar.py
  ```

- To run all experiments, use:

  ```
  python run_all.py
  ```
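The experiments focus on settings with extreme label noise. If you need to simulate such conditions on your own data, a common recipe is symmetric label flipping, in which a fixed fraction of labels is replaced by a uniformly random different class. The helper below is a hypothetical sketch and not a flag of `run_all.py`; check the training scripts for the noise options they actually expose.

```python
import numpy as np

def flip_labels_symmetric(labels, noise_rate, num_classes, seed=0):
    """Hypothetical helper: replace `noise_rate` of labels with a random other class."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels).copy()
    n_noisy = int(round(noise_rate * len(labels)))
    for i in rng.choice(len(labels), size=n_noisy, replace=False):
        labels[i] = rng.choice([c for c in range(num_classes) if c != labels[i]])
    return labels

# Example: corrupt 80% of labels in a 5-class HAR problem (an "extreme" setting).
# noisy_y = flip_labels_symmetric(y, noise_rate=0.8, num_classes=5)
```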
For benchmark data, please refer to the HARBOX dataset.
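Raw HAR benchmarks such as HARBOX are typically distributed as per-user sensor streams that must be segmented into fixed-length windows before they can be fed to the model. The sketch below shows a generic sliding-window segmentation; the window length, stride, and file layout are placeholder assumptions, not necessarily the preprocessing used by this repository.

```python
import numpy as np

def sliding_windows(stream, window_len=128, stride=64):
    """Segment a (T, C) sensor stream into overlapping (window_len, C) windows.

    window_len and stride are illustrative defaults, not FedDSHAR's settings.
    """
    windows = [stream[s:s + window_len]
               for s in range(0, len(stream) - window_len + 1, stride)]
    return np.stack(windows) if windows else np.empty((0, window_len, stream.shape[1]))

# Example (hypothetical file layout): a 9-axis IMU recording sampled over time.
# stream = np.loadtxt("dataset/user_01.txt")   # shape (T, 9)
# x = sliding_windows(stream)                  # shape (num_windows, 128, 9)
```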
- Python 3.7.9
- PyTorch 1.11.0
- torchvision 0.12.0
- numpy 1.21.5
- pandas 1.4.2
- scikit-learn 1.0.2
- efficientnet-pytorch 0.7.1
- pretrainedmodels 0.7.4
- tensorboardx 2.2
- pillow 9.0.1
Baseline methods compared with FedDSHAR:
- FedAvg (Paper)
- FedProx (Paper)
- RoFL (Paper | Code)
- FedLSR (Paper | Code)
- FedCorr (Paper | Code)
- FedNoRo (Paper | Code)
This work has been accepted for publication. The final published article is available at ScienceDirect.