Mobile Activity Recognition

Data Mining Sample Research: Activity Recognition Classification in Action

Mobile Activity Recognition
- Mobile devices like smartphones and smartwatches have many sensors
- Some sensors measure motion: tri-axial accelerometer, gyroscope, GPS and other location sensors
- Activity recognition is now pretty common, but it wasn't when this research started (it took a year for our Fitbit order to be delivered)

What is Activity Recognition?
- Identifying a user's activity based on data; in our case, the mobile sensor data from the accelerometer and gyroscope
- What type of data mining task is this? Classification
- How would you formulate this as a classification task? Not so obvious if you have not read the paper, since the time dimension complicates things

More on Activity Recognition
- Examples of activities: walking, jogging, running, jumping, washing dishes, playing basketball, reading, partying, studying, eating, drinking, etc.
- Why do we care?
  - Context-sensitive "smart" devices
  - Fitness and health applications
  - Tracking what we do for other purposes

The Data
- The data are collected at 20 Hz
- A timestamped sequence of numbers for each of the 3 axes of both sensors
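Since the slide describes the raw stream only abstractly, here is a minimal sketch of what such records look like; the field layout and accelerometer values are illustrative assumptions, not the actual WISDM file format.

```python
# At 20 Hz there is one reading every 50 ms; each reading carries a
# timestamp plus one value per axis. These records are made-up
# accelerometer values for illustration only.
SAMPLE_RATE_HZ = 20
PERIOD_MS = 1000 // SAMPLE_RATE_HZ  # 50 ms between consecutive readings

records = [
    # (timestamp_ms, x, y, z) -- accelerometer, m/s^2
    (0,   0.12, 9.71, 0.33),
    (50,  0.15, 9.68, 0.41),
    (100, 0.09, 9.75, 0.28),
]

for t, x, y, z in records:
    print(f"t={t:4d} ms  x={x:+.2f}  y={y:+.2f}  z={z:+.2f}")
```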

Walking Data [plots of watch gyroscope and phone accelerometer readings]

Jogging Data [plot of phone accelerometer readings]

Standing Data [plot of phone accelerometer readings]

WISDM Activity Recognition Studies
- 2010 study using only smartphones: good results, but only 6 basic activities (29 subjects)
- More refined studies over the next few years, including the impact of personal models
- 2016 study using smartphones & smartwatches: good results over 18 activities (17 subjects), including hand-based activities such as eating
- In progress: increasing test subjects to 50-100 and a more thorough evaluation of the four sensors (phone accelerometer, phone gyroscope, watch accelerometer, watch gyroscope) and their fusion

The 2016 Smartwatch Activities
General Activities: Walking*, Jogging*, Climbing Stairs*, Sitting*, Standing*, Kicking Soccer Ball
General Activities (hand-oriented): Dribbling Basketball, Playing Catch with Tennis Ball, Typing, Handwriting, Clapping, Brushing Teeth, Folding Clothes
Eating Activities (hand-oriented): Eating Pasta, Eating Soup, Eating Sandwich, Eating Chips, Drinking from a Cup
* These were used in the 2010 smartphone study

Formulation as Classification
- Take the raw time-series sensor data in non-overlapping 10-second chunks and create one example per chunk
- Use higher-level features to describe the behavior over the 10-second period
- This is data transformation: mapping the data to a very different representation
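The chunking step above can be sketched as follows; at 20 Hz, a 10-second window holds 200 readings, each of which becomes one example. The function name and list-of-tuples representation are illustrative assumptions.

```python
def make_windows(samples, rate_hz=20, window_s=10):
    """Split a time-ordered list of (x, y, z) readings into
    non-overlapping windows; each window becomes one example."""
    n = rate_hz * window_s  # 200 readings per 10-second window at 20 Hz
    return [samples[i:i + n] for i in range(0, len(samples) - n + 1, n)]

# 450 readings -> two full 200-sample windows; the 50-sample tail is dropped
readings = [(0.0, 9.8, 0.0)] * 450
windows = make_windows(readings)
print(len(windows), len(windows[0]))  # 2 200
```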

High-Level Features: 43 Total
- Average[3]: average acceleration (per axis)
- Standard Deviation[3]: standard deviation (per axis)
- Average Absolute Difference[3]: average absolute difference from the mean (per axis)
- Average Resultant Acceleration[1]: average of the square root of the sum of the squares of the 3 axis values
- Time Between Peaks[3]: per axis
- Binned Distribution[30]: for each axis, take the range (max minus min value), create 10 equal-sized bins, and record the fraction of values in each bin
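A sketch of the feature extraction for one window follows. Time Between Peaks is omitted for brevity, so this version yields 40 of the 43 features; function names are illustrative, not the WISDM implementation.

```python
import math
from statistics import mean, pstdev

def binned_distribution(axis, nbins=10):
    # Split [min, max] of this axis into 10 equal-width bins and
    # record the fraction of readings falling in each bin.
    lo, hi = min(axis), max(axis)
    width = (hi - lo) / nbins or 1.0  # guard against a constant signal
    counts = [0] * nbins
    for v in axis:
        i = min(int((v - lo) / width), nbins - 1)
        counts[i] += 1
    return [c / len(axis) for c in counts]

def window_features(window):
    """Features for one 10-second window of (x, y, z) readings.
    Time Between Peaks is omitted, so this sketch yields 40 of 43."""
    feats = []
    for axis in zip(*window):  # the x values, then y values, then z values
        m = mean(axis)
        feats.append(m)                                   # Average
        feats.append(pstdev(axis))                        # Standard Deviation
        feats.append(mean(abs(v - m) for v in axis))      # Avg Absolute Difference
        feats.extend(binned_distribution(axis))           # Binned Distribution (10)
    # Average Resultant Acceleration: mean of sqrt(x^2 + y^2 + z^2)
    feats.append(mean(math.sqrt(x*x + y*y + z*z) for x, y, z in window))
    return feats

w = [(0.0, 9.8, 0.0), (0.2, 9.6, 0.1), (-0.2, 10.0, -0.1)]
print(len(window_features(w)))  # 40
```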

Types of Models
- Impersonal Models: generated using data from a panel of other users
- Personal Models: generated using data from the intended user (which must be kept separate from the test data, as usual)
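The two evaluation setups can be sketched as below, assuming each example is tagged with the subject who generated it. The dictionary keys and the 70/30 split are illustrative assumptions, not the paper's exact protocol.

```python
def impersonal_split(examples, test_subject):
    """Impersonal: train on a panel of other users, test on the target user."""
    train = [e for e in examples if e["subject"] != test_subject]
    test = [e for e in examples if e["subject"] == test_subject]
    return train, test

def personal_split(examples, test_subject, train_frac=0.7):
    """Personal: train and test on the same user's data, keeping the
    test portion separate from the training portion, as usual."""
    own = [e for e in examples if e["subject"] == test_subject]
    cut = int(len(own) * train_frac)
    return own[:cut], own[cut:]

data = [{"subject": s, "features": [], "label": "walking"}
        for s in ("A", "A", "B", "C")]
tr, te = impersonal_split(data, "A")
print(len(tr), len(te))  # 2 2
```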

Results

2010 Study using Impersonal Model (IB3 Method): 72.4% accuracy
[Confusion matrix over the six classes Walking, Jogging, Stairs, Sitting, Standing, Lying Down; cell positions were lost in transcription. Values in original order: 2209, 46, 789, 2, 4, 45, 1656, 148, 1, 412, 54, 869, 3, 10, 47, 553, 30, 241, 8, 57, 6, 448, 5, 7, 301, 13, 131]

2010 Study using Personal Model (IB3 Method): 98.4% accuracy
[Confusion matrix over the same six classes; cell positions were lost in transcription. Values in original order: 3033, 1, 24, 4, 1788, 42, 1292, 870, 2, 6, 5, 11, 509, 8, 7, 442]

2010 Study Accuracy Results: % of Records Correctly Classified
(a dash marks a cell lost in transcription; its position within the row is uncertain)

              Personal                Universal              Straw Man
              IB3     J48     NN      IB3     J48     NN
Walking       99.2    97.5    99.1    72.4    77.3    60.6    37.7
Jogging       99.6    98.9    99.9    89.5    89.7    89.9    22.8
Stairs        96.5    91.7    98.0    64.9    56.7    67.6    16.5
Sitting       98.6    97.6    97.7    62.8    78.0    -       10.9
Standing      96.8    96.4    97.3    85.8    92.0    93.6     6.4
Lying Down    95.9    95.0    96.9    28.6    26.2    60.7     5.7
Overall       98.4    96.6    98.7    74.9    71.2    -        -

2016 Study Universal Models (based on 18 activities)
(a dash marks a cell lost in transcription)

Algorithm    Phone accel (%)    Watch accel (%)    Watch gyro (%)
RF               35.1               70.3               57.5
J48              24.1               59.3               49.6
IB3              22.5               62.0               49.3
NB               26.2               63.8               53.5
MLP              18.9               64.6               57.7
Average          25.3               64.0               -

2016 Study Personal Models
(a dash marks a cell lost in transcription)

Algorithm    Phone accel (%)    Watch accel (%)    Watch gyro (%)
RF               75.5               93.3               79.0
J48              65.5               86.1               73.0
IB3              67.7               -                  60.1
NB               77.1               92.7               80.2
MLP              77.0               94.2               70.0
Average          72.6               91.9               72.4

2016 Detailed Summary Results (Random Forest)
Per-activity accuracy (%), in the order: Impersonal watch accel, phone accel, watch gyro; Personal watch accel, phone accel, watch gyro. Rows with fewer than six values lost cells in transcription, so the alignment of their remaining values is uncertain.

Walking: 79.8, 60.7, 87.0, 94.2, 88.5, 93.5
Jogging: 97.7, 93.8, 48.6, 99.2, 68.8, 98.1
Stairs: 58.5, 66.7, 43.1, 88.9, 80.0
Sitting: 84.9, 26.9, 70.5, 97.5, 82.2
Standing: 96.3, 65.9, 57.9, 73.1, 68.6
Kicking: 71.3, 72.5, 41.4, 88.7, 91.7, 67.9
Dribbling: 89.3, 26.1, 86.0, 98.7, 84.8, 96.9
Catch: 66.0, 68.9, 93.3, 78.3, 94.6
Typing: 80.4, 76.9, 60.8, 99.4, 72.0, 88.6
Handwriting: 85.2, 12.9, 63.1, 100.0, 75.9, 80.5
Clapping: 76.3, 40.9, 77.3, 95.6
Brush Teeth: 84.5, 19.2, 66.2, 97.3, 96.2, 89.6
Fold Clothes: 80.8, 8.3, 37.8, 95.0, 79.2
Eat Pasta: 47.1, 0.0, 40.0, 72.9
Eat Soup: 52.7, 47.7, 90.7, 82.4, 69.8
Eat Sandwich: 29.0, 7.1, 31.1, 63.0, 44.2
Eat Chips: 65.0, 16.0, 50.6, 83.4, 76.0, 52.5
Drink: 62.7, 31.8, 61.1, 78.5
Overall: 70.3, 35.1, 57.5, 75.5, 79.0

Actitracker
- The phone-based research was incorporated into a deployed app/system called Actitracker
- The development effort to handle real-time activity recognition was substantial
- Actitracker is no longer supported

New Directions
- My WISDM Lab is finishing work on the smartwatch activity recognition
- Beginning to consider data mining of static sensors, since cheap Bluetooth sensors are now available
- Research related to the Internet of Things (IoT)

Data Collection
- Collecting the data is quite time intensive
- We are still collecting data for the "definitive" set of AR experiments, so if you want to volunteer, please email me
- Data collection is usually at RH
- We will provide an Amazon gift card