Data Source - http://archive.ics.uci.edu/ml/datasets/mhealth+dataset
The MHEALTH (Mobile HEALTH) dataset comprises body motion and vital signs recordings from ten volunteers of diverse profiles while performing several physical activities. Sensors placed on each subject's chest, right wrist and left ankle measure the motion experienced by different body parts, namely acceleration, rate of turn and magnetic field orientation. The sensor positioned on the chest also provides 2-lead ECG measurements, which can potentially be used for basic heart monitoring, checking for various arrhythmias or examining the effects of exercise on the ECG.
The activity set is listed in the following:
- L1: Standing still (1 min)
- L2: Sitting and relaxing (1 min)
- L3: Lying down (1 min)
- L4: Walking (1 min)
- L5: Climbing stairs (1 min)
- L6: Waist bends forward (20x)
- L7: Frontal elevation of arms (20x)
- L8: Knees bending (crouching) (20x)
- L9: Cycling (1 min)
- L10: Jogging (1 min)
- L11: Running (1 min)
- L12: Jump front & back (20x)
NOTE: The values in parentheses are the number of repetitions (Nx) or the duration of the exercise (min).
The whole project is split into three parts:
- Fetching the data from the UCI Machine Learning Repository (see the sketch after this list).
- Performing EDA to gain insights about the data and preparing the dataset for deep learning.
- Training a deep learning model and making predictions.
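Below is a minimal sketch of the fetching step. The zip URL, the `mHealth_subjectN.log` file layout (23 space-separated sensor columns plus an activity label in the last column) and the `fetch_mhealth` helper name are assumptions based on the standard UCI distribution of MHEALTH, not necessarily the exact code used in this project.

```python
# Sketch: download the MHEALTH archive and load all subjects into one DataFrame.
# The zip URL and file layout are assumptions based on the standard UCI distribution.
import io
import urllib.request
import zipfile

import pandas as pd

DATA_URL = "http://archive.ics.uci.edu/ml/machine-learning-databases/00319/MHEALTHDATASET.zip"

def fetch_mhealth():
    with urllib.request.urlopen(DATA_URL) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
    frames = []
    for name in archive.namelist():
        # Each subject is stored as a whitespace-separated .log file.
        if name.endswith(".log"):
            with archive.open(name) as f:
                df = pd.read_csv(f, sep=r"\s+", header=None)
            df["subject"] = name  # remember which volunteer the rows belong to
            frames.append(df)
    data = pd.concat(frames, ignore_index=True)
    # The last sensor column (index 23) is the activity label (0 = null class, 1-12 = L1-L12).
    return data.rename(columns={23: "label"})

if __name__ == "__main__":
    df = fetch_mhealth()
    print(df.shape, sorted(df["label"].unique()))
```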
We can easily distinguish between static and dynamic activities just by looking at the data, but to differentiate between activities of the same kind we need a machine learning or deep learning model. Since this is a time-series problem, I've used LSTMs for model training.
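Because the model consumes fixed-length sequences, the continuous 50 Hz recordings have to be segmented before training. The sketch below shows one common sliding-window approach; the window length, step size and the `make_windows` helper are illustrative assumptions rather than the project's exact settings, and it reuses the DataFrame from the fetching sketch above.

```python
# Sketch: segment the continuous recordings into fixed-length windows for the LSTM.
# Window length / step are assumptions (2 s windows with 50% overlap at 50 Hz).
import numpy as np

WINDOW = 100  # 100 samples ~ 2 seconds at the 50 Hz sampling rate
STEP = 50     # 50% overlap between consecutive windows

def make_windows(df, sensor_cols, window=WINDOW, step=STEP):
    X, y = [], []
    for _, subject_df in df.groupby("subject"):  # never mix subjects in one window
        values = subject_df[sensor_cols].to_numpy()
        labels = subject_df["label"].to_numpy()
        for start in range(0, len(values) - window, step):
            end = start + window
            window_labels = labels[start:end]
            # Keep only windows that contain a single, non-null activity.
            if window_labels[0] != 0 and (window_labels == window_labels[0]).all():
                X.append(values[start:end])
                y.append(window_labels[0])
    return np.stack(X), np.array(y)

# Usage (columns 0-22 hold the 23 sensor channels):
# X, y = make_windows(df, sensor_cols=list(range(23)))
# X.shape -> (num_windows, 100, 23)
```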
Model Architecture:
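The exact architecture is not reproduced here, so the following is a minimal Keras sketch of the kind of stacked-LSTM classifier described above; the layer sizes, dropout rate and training settings are assumptions, not the project's exact values.

```python
# Sketch of a stacked-LSTM classifier for the windowed sensor data.
# Layer sizes and hyperparameters are illustrative, not the project's exact values.
from tensorflow.keras import layers, models

NUM_CLASSES = 12          # activities L1-L12
WINDOW, CHANNELS = 100, 23

model = models.Sequential([
    layers.Input(shape=(WINDOW, CHANNELS)),
    layers.LSTM(64, return_sequences=True),  # first LSTM layer returns the full sequence
    layers.LSTM(32),                          # second layer summarises it into one vector
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",  # labels are integers 1-12 (shift to 0-11 first)
    metrics=["accuracy"],
)

# model.fit(X_train, y_train - 1, validation_split=0.2, epochs=20, batch_size=64)
```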
It performs well on almost all activities but confuses a few similar ones with each other.
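To see exactly which activities get mixed up, a confusion matrix over the test windows is useful. This small sketch assumes the `model`, `X_test` and `y_test` names from the earlier sketches, with labels already shifted to the 0-11 range.

```python
# Sketch: inspect which activities the model confuses with each other.
# Assumes `model`, `X_test`, `y_test` from the earlier sketches (labels shifted to 0-11).
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

y_pred = np.argmax(model.predict(X_test), axis=1)

print(classification_report(y_test, y_pred))
print(confusion_matrix(y_test, y_pred))  # rows = true activity, columns = predicted activity
```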