Human Activity Recognition (HAR) is the problem of predicting what kind of activity a person is performing based on signals detected by smartphone sensors worn on their waist.
- 30 volunteers, referred to as subjects, performed the experiment for data collection while wearing a smartphone on the waist.
- The two smartphone sensors (accelerometer and gyroscope) captured the 3-axial linear acceleration and the 3-axial angular velocity of the subject.
- The sensor signals were sampled in fixed-width sliding windows of 2.56 sec with 50% overlap (128 readings per window).
- The data were recorded at a constant frequency of 50 Hz, i.e., 50 readings per second (see the windowing sketch below).
- The data can be downloaded from the following source: https://archive.ics.uci.edu/ml/datasets/human+activity+recognition+using+smartphones
- Feature names are listed in UCI_HAR_dataset/features.txt (a loading sketch follows the file lists below).
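As an illustration of the windowing arithmetic described above, the sketch below (not part of the original notebook; the helper name `sliding_windows` and the demo signal are assumptions) segments a 50 Hz signal into 2.56-second windows with 50% overlap, which yields exactly 128 readings per window.

```python
import numpy as np

fs = 50                    # sampling rate in Hz
window = int(2.56 * fs)    # 2.56 s at 50 Hz -> 128 readings per window
step = window // 2         # 50% overlap -> hop of 64 readings

def sliding_windows(signal: np.ndarray) -> np.ndarray:
    """Split a 1-D signal into fixed-width, half-overlapping windows."""
    n = (len(signal) - window) // step + 1
    return np.stack([signal[i * step : i * step + window] for i in range(n)])

demo = sliding_windows(np.arange(640))   # 640 samples -> 9 windows of 128
print(demo.shape)                        # (9, 128)
```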
- Train Data
UCI_HAR_dataset/train/X_train.txt
UCI_HAR_dataset/train/subject_train.txt
UCI_HAR_dataset/train/y_train.txt
- Test Data
UCI_HAR_dataset/test/X_test.txt
UCI_HAR_dataset/test/subject_test.txt
UCI_HAR_dataset/test/y_test.txt
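The files listed above can be loaded as follows. This is a minimal sketch, assuming pandas is installed and the code runs from the dataset folder's parent directory; the helper name `load_split` is hypothetical, not from the notebook. A few names in features.txt repeat, so duplicates are suffixed to keep the DataFrame columns unique.

```python
import pandas as pd

# features.txt has two whitespace-separated columns: index and feature name.
features = pd.read_csv("UCI_HAR_dataset/features.txt", sep=r"\s+",
                       header=None, names=["idx", "name"])

# Suffix repeated feature names so every DataFrame column stays unique.
seen, cols = {}, []
for name in features["name"]:
    seen[name] = seen.get(name, 0) + 1
    cols.append(name if seen[name] == 1 else f"{name}_{seen[name]}")

def load_split(split):
    """Load the 561-feature matrix, activity labels, and subject ids
    for either 'train' or 'test'."""
    base = f"UCI_HAR_dataset/{split}"
    X = pd.read_csv(f"{base}/X_{split}.txt", sep=r"\s+",
                    header=None, names=cols)
    y = pd.read_csv(f"{base}/y_{split}.txt", header=None, names=["activity"])
    subjects = pd.read_csv(f"{base}/subject_{split}.txt",
                           header=None, names=["subject"])
    return X, y, subjects

X_train, y_train, subject_train = load_split("train")
X_test, y_test, subject_test = load_split("test")
```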
Place the HumanActivityRecognition.ipynb file next to the UCI_HAR_dataset folder before running it: the notebook and the dataset folder must sit side by side under the same parent directory.
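An optional sanity check of this layout, assuming it is run from the notebook's own directory:

```python
from pathlib import Path

# Verify the dataset folder sits next to the notebook before running it.
assert Path("UCI_HAR_dataset/train/X_train.txt").exists(), \
    "UCI_HAR_dataset not found in the notebook's directory"
```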
Predict which of the following six activities a smartphone user is performing in a given 2.56-second time window, using either the 561-feature representation or the raw signals of 128 readings per window (a baseline sketch follows this list).
- Walking
- Walking Upstairs
- Walking Downstairs
- Sitting
- Standing
- Laying
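As one possible baseline for this task (an illustrative choice, not necessarily the model used in the notebook), the sketch below fits scikit-learn's LogisticRegression on the 561-feature data loaded earlier and maps the predicted label ids (1 through 6, as defined in the dataset's activity_labels.txt) back to activity names.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Label ids in y_train.txt / y_test.txt run from 1 to 6, in this order.
ACTIVITIES = {1: "Walking", 2: "Walking Upstairs", 3: "Walking Downstairs",
              4: "Sitting", 5: "Standing", 6: "Laying"}

clf = LogisticRegression(max_iter=1000)   # linear baseline on 561 features
clf.fit(X_train, y_train["activity"])
pred = clf.predict(X_test)

print("test accuracy:", accuracy_score(y_test["activity"], pred))
print("first window predicted as:", ACTIVITIES[pred[0]])
```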
Contributors:
- Neha Kumari
- Kumar Shivam Ranjan
- Madhav Bansal