Human Activity Recognizer

This is an example project that demonstrates the problem of human activity recognition (HAR) using mobile phone sensor data (gyroscope, accelerometer, etc.). The training data are the human-annotated sensor readings of 30 volunteers performing various activities such as sitting, standing, and lying down. Each sample is 561-dimensional; however, we demonstrate that with a technique called random projection we can reduce the dimensionality to 120 without any loss in accuracy. The estimator employed to make the predictions is a Softmax Classifier, which is a multiclass generalization of the Logistic Regression classifier used in the Credit Card Default Predictor example project.

  • Difficulty: Medium
  • Training time: < 5 Minutes
  • Memory needed: < 1G
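
The gist of the approach is a transformer pipeline that projects each 561-dimensional sample down to 120 dimensions before handing it to the Softmax Classifier. Below is a minimal sketch of what such a pipeline might look like with Rubix ML, assuming the training and testing sets have already been loaded into $training and $testing dataset objects; the standardizer and mini-batch size are illustrative choices, not necessarily the project's exact configuration.

use Rubix\ML\Pipeline;
use Rubix\ML\Classifiers\SoftmaxClassifier;
use Rubix\ML\Transformers\GaussianRandomProjector;
use Rubix\ML\Transformers\ZScaleStandardizer;

$estimator = new Pipeline([
    new GaussianRandomProjector(120), // project 561 features down to 120
    new ZScaleStandardizer(),         // center and scale each feature
], new SoftmaxClassifier(256));       // mini-batch size of 256

$estimator->train($training);         // $training is a Labeled dataset object

$predictions = $estimator->predict($testing);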

Installation

Clone the repository locally using Git:

$ git clone https://github.com/RubixML/HAR

Install dependencies using Composer:

$ composer install

Requirements

  • PHP 7.1.3 or above

Project Description

The experiments have been carried out with a group of 30 volunteers within an age bracket of 19-48 years. Each person performed six activities (WALKING, WALKING_UPSTAIRS, WALKING_DOWNSTAIRS, SITTING, STANDING, LAYING) wearing a smartphone (Samsung Galaxy S II) on the waist. Using its embedded accelerometer and gyroscope, we captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz. The experiments have been video-recorded to label the data manually. The obtained dataset has been randomly partitioned into two sets, where 70% of the volunteers were selected for generating the training data and 30% for the test data.
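
For illustration, here is one way the two splits could be loaded with Rubix ML, assuming each split has been consolidated into a single CSV file with the activity label in the last column (hypothetical file names, not necessarily the layout shipped with this repository).

use Rubix\ML\Datasets\Labeled;
use Rubix\ML\Extractors\CSV;
use Rubix\ML\Transformers\NumericStringConverter;

$training = Labeled::fromIterator(new CSV('train.csv', true))
    ->apply(new NumericStringConverter()); // cast numeric strings to numbers

$testing = Labeled::fromIterator(new CSV('test.csv', true))
    ->apply(new NumericStringConverter());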

The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details.
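
To make the windowing concrete: at 50Hz, a 2.56 second window holds 128 readings, and a 50% overlap means each new window starts 64 readings after the previous one. The snippet below is an illustrative sketch of that segmentation in plain PHP; it is not code from this repository or part of the training pipeline, since the dataset ships with the window features already extracted.

$sampleRate = 50;                         // Hz
$windowSize = (int) (2.56 * $sampleRate); // 128 readings per window
$hop = intdiv($windowSize, 2);            // 64 readings, i.e. 50% overlap

function slidingWindows(array $signal, int $size, int $hop) : array
{
    $windows = [];

    for ($i = 0; $i + $size <= count($signal); $i += $hop) {
        $windows[] = array_slice($signal, $i, $size);
    }

    return $windows;
}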

Tutorial

On the map ...

Original Dataset

Contact: Jorge L. Reyes-Ortiz (1,2), Davide Anguita (1), Alessandro Ghio (1), Luca Oneto (1) and Xavier Parra (2)

Institutions:

  • 1 - Smartlab - Non-Linear Complex Systems Laboratory, DITEN - Università degli Studi di Genova, Genoa (I-16145), Italy.
  • 2 - CETpD - Technical Research Centre for Dependency Care and Autonomous Living, Universitat Politècnica de Catalunya (BarcelonaTech), Vilanova i la Geltrú (08800), Spain.

activityrecognition '@' smartlab.ws

References:

[1] Davide Anguita, Alessandro Ghio, Luca Oneto, Xavier Parra and Jorge L. Reyes-Ortiz. A Public Domain Dataset for Human Activity Recognition Using Smartphones. 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN 2013. Bruges, Belgium, 24-26 April 2013.