Mobile Activity Recognition


Presentation on theme: "Mobile Activity Recognition"— Presentation transcript:

1 Data Mining Sample Research: Activity Recognition Classification in Action

2 Mobile Activity Recognition
Mobile devices like smartphones and smartwatches have many sensors
Some sensors measure motion: tri-axial accelerometer, gyroscope, GPS and other location sensors
Activity recognition is now quite common, but it wasn't when this research started
It took a year for our Fitbit order to be delivered

3 What is Activity Recognition?
Identifying a user's activity based on data
In our case, the mobile sensor data from the accelerometer and gyroscope
What type of data mining task is this? Classification
How would you formulate this as a classification task?
Not so obvious if you have not read the paper, since the time dimension complicates things

4 More on Activity Recognition
Examples of activities: walking, jogging, running, jumping, washing dishes, playing basketball, reading, partying, studying, eating, drinking, etc.
Why do we care?
Context-sensitive "smart" devices
Fitness and health applications
Tracking what we do for other purposes

5 The Data
The data is collected at 20 Hz (20 samples per second)
A timestamped sequence of numbers for each of the 3 dimensions of both sensors
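As a rough sketch of what this raw data looks like (the shapes mirror the description above, but the names and the random values are illustrative, not actual WISDM data), ten seconds of one tri-axial sensor at 20 Hz is 200 timestamped rows of (x, y, z) readings:

```python
import numpy as np

# Illustrative sketch only: 20 Hz, tri-axial, 10 seconds of one sensor.
SAMPLE_RATE_HZ = 20
DURATION_S = 10
n = SAMPLE_RATE_HZ * DURATION_S           # 200 samples per axis

# Timestamps 50 ms apart at 20 Hz.
timestamps_ms = np.arange(n) * (1000 // SAMPLE_RATE_HZ)

# One row per sample: synthetic (x, y, z) readings standing in for real data.
accel = np.random.default_rng(0).normal(size=(n, 3))

print(timestamps_ms.shape, accel.shape)   # (200,) (200, 3)
```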

6 Walking Data (plots: watch gyroscope, phone accelerometer)

7 Phone Accelerometer Jogging Data

8 Phone Accelerometer Standing Data

9 WISDM Activity Recognition Studies
2010 study using only smartphones
Good results, but only 6 basic activities (29 subjects)
More refined studies over the next few years, including the impact of personal models
2016 study: smartphones & smartwatches
Good results over 18 activities (17 subjects)
Hand-based activities, including eating
In progress: increasing the number of test subjects and a more thorough evaluation of the four sensors
Phone accel, phone gyro, watch accel, watch gyro, and their fusion

10 The 2016 Smartwatch Activities
General Activities: Walking*, Jogging*, Climbing Stairs*, Sitting*, Standing*, Kicking Soccer Ball
General Activities (hand-oriented): Dribbling Basketball, Playing Catch with Tennis Ball, Typing, Handwriting, Clapping, Brushing Teeth, Folding Clothes
Eating Activities (hand-oriented): Eating Pasta, Eating Soup, Eating Sandwich, Eating Chips, Drinking from a Cup
* These were used in the 2010 smartphone study

11 Formulation as Classification
Take the raw time series sensor data and, for each non-overlapping 10-second chunk, create one example
Use higher-level features to describe the behavior over the 10-second period
This is data transformation: mapping the data to a very different representation
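The chunking step can be sketched as follows (a minimal illustration; the function and variable names are my own, not from the paper):

```python
import numpy as np

WINDOW = 200  # one 10-second chunk at 20 Hz

def to_windows(stream):
    """Split an (n_samples, 3) sensor stream into non-overlapping
    windows, dropping any ragged tail: shape (n_windows, WINDOW, 3)."""
    n_windows = len(stream) // WINDOW
    return stream[:n_windows * WINDOW].reshape(n_windows, WINDOW, 3)

stream = np.zeros((1050, 3))   # ~52 s of synthetic tri-axial data
windows = to_windows(stream)
print(windows.shape)           # (5, 200, 3)
```

Each window then becomes one classification example after feature extraction.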

12 High Level Features: 43 Total
Average [3]: average acceleration, per axis
Standard Deviation [3]: standard deviation, per axis
Average Absolute Difference [3]: average absolute difference from the mean, per axis
Average Resultant Acceleration [1]: average of the square root of the sum of the squares of the 3 axis values
Time Between Peaks [3]: time between successive peaks, per axis
Binned Distribution [30]: for each axis, take max - min, create 10 equal-sized bins, and record the fraction of values in each bin
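A sketch of this transform over one 200-sample window (assumptions: the peak-finding heuristic below is a simplified stand-in for the paper's, and np.histogram already bins over the max - min range of each axis):

```python
import numpy as np

def extract_features(window):
    """Map one (200, 3) window to a 43-element feature vector."""
    feats = []
    feats.extend(window.mean(axis=0))                                # Average [3]
    feats.extend(window.std(axis=0))                                 # Standard Deviation [3]
    feats.extend(np.abs(window - window.mean(axis=0)).mean(axis=0))  # Avg Absolute Difference [3]
    feats.append(np.sqrt((window ** 2).sum(axis=1)).mean())          # Avg Resultant Acceleration [1]
    for axis in range(3):                                            # Time Between Peaks [3]
        v = window[:, axis]
        # Simplified peak finder: indices of local maxima.
        peaks = np.where((v[1:-1] > v[:-2]) & (v[1:-1] > v[2:]))[0] + 1
        feats.append(np.diff(peaks).mean() if len(peaks) > 1 else 0.0)
    for axis in range(3):                                            # Binned Distribution [30]
        counts, _ = np.histogram(window[:, axis], bins=10)           # 10 bins over max - min
        feats.extend(counts / len(window))                           # fraction per bin
    return np.array(feats)

window = np.random.default_rng(1).normal(size=(200, 3))
print(extract_features(window).shape)   # (43,)
```

Note the counts add up: 3 + 3 + 3 + 1 + 3 + 30 = 43 features per example.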

13 Types of Models
Impersonal Models
Generated using data from a panel of other users
Personal Models
Generated using data from the intended user (which must be separate from the test data, as usual)
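The two setups can be sketched as two splitting strategies over labeled examples (the function names and the tiny subject-id array are illustrative, not from the study):

```python
import numpy as np

def impersonal_split(subj, test_subject):
    """Impersonal: train on data from everyone except the target user."""
    test = subj == test_subject
    return ~test, test           # boolean masks over the examples

def personal_split(subj, test_subject, frac=0.5, seed=0):
    """Personal: train and test on disjoint examples from the same user."""
    idx = np.where(subj == test_subject)[0]
    rng = np.random.default_rng(seed)
    rng.shuffle(idx)
    cut = int(len(idx) * frac)
    return idx[:cut], idx[cut:]  # index arrays into the examples

subj = np.array([0, 0, 0, 0, 1, 1, 2, 2])   # toy subject id per example
train_mask, test_mask = impersonal_split(subj, 0)
print(train_mask.sum(), test_mask.sum())     # 4 4
```

The key invariant in both cases is that no test example (and for impersonal models, no test subject) appears in the training set.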

14 Results

15 2010 Study using Impersonal Model (IB3 Method)
72.4% accuracy. Confusion matrix values, with columns the predicted class and rows the actual class, each in the order Walking, Jogging, Stairs, Sitting, Standing, Lying Down (row breaks and zero cells were not preserved in this transcript):
2209 46 789 2 4 45 1656 148 1 412 54 869 3 10 47 553 30 241 8 57 6 448 5 7 301 13 131

16 2010 Study using Personal Model (IB3 Method)
98.4% accuracy. Confusion matrix values, with columns the predicted class and rows the actual class, each in the order Walking, Jogging, Stairs, Sitting, Standing, Lying Down (row breaks and zero cells were not preserved in this transcript):
3033 1 24 4 1788 42 1292 870 2 6 5 11 509 8 7 442
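The accuracy figures on these slides come straight from the confusion matrices: overall accuracy is the diagonal (correctly classified records) divided by the total. A toy 3-class example:

```python
import numpy as np

# Toy 3-class confusion matrix (rows: actual class, columns: predicted).
cm = np.array([[50,  5,  0],
               [ 4, 40,  6],
               [ 1,  3, 41]])

accuracy = np.trace(cm) / cm.sum()   # correct (diagonal) over total
print(round(accuracy, 3))            # 0.873
```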

17 2010 Study Accuracy Results
% of records correctly classified. Each row lists Personal (IB3, J48, NN), then Universal (IB3, J48, NN), then Straw Man; a few values were not preserved in this transcript.
Walking: 99.2 97.5 99.1 72.4 77.3 60.6 37.7
Jogging: 99.6 98.9 99.9 89.5 89.7 89.9 22.8
Stairs: 96.5 91.7 98.0 64.9 56.7 67.6 16.5
Sitting: 98.6 97.6 97.7 62.8 78.0 10.9
Standing: 96.8 96.4 97.3 85.8 92.0 93.6 6.4
Lying Down: 95.9 95.0 96.9 28.6 26.2 60.7 5.7
Overall: 98.4 96.6 98.7 74.9 71.2

18 2016 Study Universal Models
Algorithm   Phone accel (%)   Watch accel (%)   Watch gyro (%)
RF          35.1              70.3              57.5
J48         24.1              59.3              49.6
IB3         22.5              62.0              49.3
NB          26.2              63.8              53.5
MLP         18.9              64.6              57.7
Average     25.3              64.0
Note: based on 18 activities

19 2016 Study Personal Models
Algorithm   Phone accel (%)   Watch accel (%)   Watch gyro (%)
RF          75.5              93.3              79.0
J48         65.5              86.1              73.0
IB3         67.7                                60.1
NB          77.1              92.7              80.2
MLP         77.0              94.2              70.0
Average     72.6              91.9              72.4

20 2016 Detailed Summary Results
Random Forest, % correct per activity. Each row lists Impersonal (Watch accel, Phone accel, Watch gyro), then Personal (Watch accel, Phone accel, Watch gyro); some values were not preserved in this transcript.
Walking: 79.8 60.7 87.0 94.2 88.5 93.5
Jogging: 97.7 93.8 48.6 99.2 68.8 98.1
Stairs: 58.5 66.7 43.1 88.9 80.0
Sitting: 84.9 26.9 70.5 97.5 82.2
Standing: 96.3 65.9 57.9 73.1 68.6
Kicking: 71.3 72.5 41.4 88.7 91.7 67.9
Dribbling: 89.3 26.1 86.0 98.7 84.8 96.9
Catch: 66.0 68.9 93.3 78.3 94.6
Typing: 80.4 76.9 60.8 99.4 72.0 88.6
Handwriting: 85.2 12.9 63.1 100.0 75.9 80.5
Clapping: 76.3 40.9 77.3 95.6
Brush Teeth: 84.5 19.2 66.2 97.3 96.2 89.6
Fold Clothes: 80.8 8.3 37.8 95.0 79.2
Eat Pasta: 47.1 0.0 40.0 72.9
Eat Soup: 52.7 47.7 90.7 82.4 69.8
Eat Sandwich: 29.0 7.1 31.1 63.0 44.2
Eat Chips: 65.0 16.0 50.6 83.4 76.0 52.5
Drink: 62.7 31.8 61.1 78.5
Overall: 70.3 35.1 57.5 75.5 79.0

21 Actitracker
The phone-based research was incorporated into a deployed app/system called Actitracker
The development effort to handle real-time activity recognition was substantial
Actitracker is no longer supported

22 New Directions
My WISDM Lab is finishing work on smartwatch activity recognition
Beginning to consider data mining of static sensors, since cheap Bluetooth sensors are now available
Research related to the Internet of Things (IoT)

23 Data Collection
Collecting the data is quite time intensive
We are still collecting data for the "definitive" set of AR experiments, so if you want to volunteer, please contact me
Data collection is usually at RH
We will provide an Amazon gift card

