Recognizing Smoking Gestures with an Inertial Measurement Unit (IMU)

Presentation transcript:

Recognizing Smoking Gestures with an Inertial Measurement Unit (IMU). Abhinav Parate, Meng-Chieh Chiu, Chaniel Chadowitz, Deepak Ganesan, Evangelos Kalogerakis. University of Massachusetts, Amherst.

Smoking. According to the CDC, smoking is responsible for 440,000 deaths annually in the United States, $96 billion in medical costs, and $97 billion in lost productivity. There are over a billion smokers worldwide!

Smoking Cessation. 40% of smokers try to quit each year, but most efforts end in relapse: the success rate is less than 10%. Well-timed interventions help, but they require the presence of a ubiquitous agent.

RisQ: A Mobile Solution for Intervention. Smartphone: always with the user, can sense the user's environment, enables real-time intervention. Wristband: equipped with a 9-axis Inertial Measurement Unit (IMU), enables real-time smoking detection.

Hand-to-mouth Gesture Characteristics. [Figure: IMU signals for various hand-to-mouth gestures]

1: Orientation-dependent Characteristics. Signal characteristics change with the user's body orientation. [Figure: characteristics of the same smoking gesture when the user faces in opposite directions]

2: Unknown Gesture Boundaries. Where does a gesture start? How to identify gesture boundaries in a passive manner?

3: Collecting Labels for Training. How to collect fine-grained labels for training a classification model?

Outline: Introduction · Challenges · Data Collection using IMUs · Data Processing Pipeline · Evaluation · Conclusion

IMU Signal Background: Quaternions. A quaternion is a mathematical entity that represents the orientation of an object in 3D space: q = q_s + q_x i + q_y j + q_z k, i.e., one scalar and three imaginary components. For a rotation by angle a about a unit axis (x, y, z): q = cos(a/2) + x sin(a/2) i + y sin(a/2) j + z sin(a/2) k.

3D Coordinates using Quaternions. Given a point p w.r.t. the IMU's local frame of reference, and the IMU device orientation as the quaternion q = cos(a/2) + x sin(a/2) i + y sin(a/2) j + z sin(a/2) k with conjugate q' = cos(a/2) − x sin(a/2) i − y sin(a/2) j − z sin(a/2) k, the coordinates of p w.r.t. the world frame of reference are p' = q.p.q'.
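
To make the rotation concrete, here is a minimal Python sketch (illustrative, not the authors' code) of p' = q.p.q' using the Hamilton product:

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions given as (s, x, y, z)."""
    s1, x1, y1, z1 = q
    s2, x2, y2, z2 = r
    return np.array([
        s1*s2 - x1*x2 - y1*y2 - z1*z2,
        s1*x2 + x1*s2 + y1*z2 - z1*y2,
        s1*y2 - x1*z2 + y1*s2 + z1*x2,
        s1*z2 + x1*y2 - y1*x2 + z1*s2,
    ])

def rotate(p, q):
    """Rotate point p (3-vector in the IMU's local frame) into the
    world frame using the orientation quaternion q: p' = q.p.q'."""
    qc = np.array([q[0], -q[1], -q[2], -q[3]])  # conjugate q'
    p_quat = np.array([0.0, *p])                # embed p as a pure quaternion
    return quat_mul(quat_mul(q, p_quat), qc)[1:]

# Example: rotation by a = 90 degrees about the z-axis
a = np.pi / 2
q = np.array([np.cos(a/2), 0.0, 0.0, np.sin(a/2)])
print(rotate([1.0, 0.0, 0.0], q))  # ~ [0, 1, 0]
```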

Wrist Trajectory using Quaternions. Visualizing gestures using a wristband and an armband equipped with IMUs.

Outline: Introduction · Challenges · Data Collection using IMUs · Data Processing Pipeline · Evaluation · Conclusion

Data Processing Pipeline: Segment Extraction → Feature Extraction → Gesture Classification → Session Detection, all executed on the phone. Segment extraction applies a peak-detection algorithm to the wrist trajectory computed relative to the elbow.
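
The slide names a peak-detection algorithm without details; below is a minimal sketch of one plausible realization, assuming a vertical wrist coordinate wrist_z (relative to the elbow) sampled at fs Hz, with illustrative thresholds:

```python
import numpy as np
from scipy.signal import find_peaks

def extract_segments(wrist_z, fs=50, min_rest_s=1.0):
    """Candidate gesture segments from the vertical wrist coordinate.

    A hand-to-mouth gesture shows up as a peak in the wrist's height;
    we take a fixed window around each peak as one candidate segment.
    All thresholds here are illustrative, not from the paper.
    """
    peaks, _ = find_peaks(wrist_z,
                          height=0.15,                   # wrist raised ~15 cm
                          distance=int(min_rest_s * fs)) # min gap between peaks
    half = int(2.0 * fs)  # +/- 2 s window around each peak
    return [(max(p - half, 0), min(p + half, len(wrist_z)))
            for p in peaks]
```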

Feature Extraction: Orientation-Independent Features. A set of 34 spatio-temporal features (a sketch of a few appears below):
- Duration-based features (4): gesture duration, time to raise arm, etc.
- Velocity-based features (6): maximum wrist speed, etc.
- Displacement-based features (6): vertical displacement, XY displacement, etc.
- Angle-based features (18): angle with the gravity, angular velocity, etc.
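
As an illustration of the feature categories above (the full set of 34 features is defined in the paper, not here), a hypothetical helper computing a few representative features from a segment's wrist trajectory:

```python
import numpy as np

def gesture_features(traj, fs=50):
    """A few illustrative orientation-independent features.

    traj: (N, 3) wrist trajectory, relative to the elbow, for one
    extracted segment sampled at fs Hz.
    """
    t = np.arange(len(traj)) / fs
    vel = np.gradient(traj, 1.0 / fs, axis=0)
    speed = np.linalg.norm(vel, axis=1)
    return {
        # duration-based
        "duration": t[-1],
        "time_to_raise_arm": t[np.argmax(traj[:, 2])],
        # velocity-based
        "max_wrist_speed": speed.max(),
        # displacement-based
        "vertical_displacement": traj[:, 2].max() - traj[:, 2].min(),
        "xy_displacement": np.linalg.norm(traj[-1, :2] - traj[0, :2]),
        # angle-based: mean angle between wrist vector and gravity (-z)
        "angle_with_gravity": np.arccos(
            -traj[:, 2] / (np.linalg.norm(traj, axis=1) + 1e-9)).mean(),
    }
```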

[Pipeline diagram: Segment Extraction → Feature Extraction → Gesture Classification → Session Detection]

Outline: Introduction · Challenges · Data Collection using IMUs · Data Processing Pipeline · Evaluation · Conclusion

Evaluation Dataset. 28 hours of data from 15 volunteers: 17 smoking sessions (369 puffs), 10 eating sessions (252 food bites), and 6 drinking sessions.

Smoking Session Detection (leave-one-session-out cross-validation):
Statistic | Avg ± Std Dev
Duration of smoking sessions | 326.21 ± 19.65 s
Error in estimation | 65.7 ± 30.6 s
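
Leave-one-session-out cross-validation maps naturally onto scikit-learn's LeaveOneGroupOut; a self-contained sketch with synthetic placeholder data (the real feature matrix and labels are assumptions here, not the authors' pipeline):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Placeholder data: 200 segments, 34 features, binary labels,
# spread over 10 recording sessions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 34))
y = rng.integers(0, 2, size=200)
session_id = np.repeat(np.arange(10), 20)  # session each segment came from

clf = RandomForestClassifier(n_estimators=100, random_state=0)
# Each fold holds out every segment of exactly one session.
scores = cross_val_score(clf, X, y, groups=session_id,
                         cv=LeaveOneGroupOut())
print(scores.mean())
```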

Smoking Gesture Recognition (10-fold cross-validation over 369 puffs, 252 food bites, and 4,976 other gestures):
Mechanism | Accuracy | Recall | Precision | FPR
Random Forests | 93.00% | 0.85 | 0.72 | 0.023
CRF | 95.74% | 0.81 | 0.91 | 0.005
The CRF improves precision at the cost of a slight drop in recall.
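
The table's metrics can be recovered from a binary confusion matrix; a small sketch, assuming label 1 marks a smoking puff and label 0 any other gesture:

```python
from sklearn.metrics import confusion_matrix

def gesture_metrics(y_true, y_pred):
    """Accuracy, recall, precision and FPR for the smoking class,
    matching the metrics reported in the table above."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred,
                                      labels=[0, 1]).ravel()
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "recall": tp / (tp + fn),
        "precision": tp / (tp + fp),
        "fpr": fp / (fp + tn),
    }

# Tiny illustrative example, not real study data.
print(gesture_metrics([1, 0, 0, 1, 0, 1], [1, 0, 1, 1, 0, 0]))
```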

User Study. Recruited 4 subjects for 3 days. Used our smoking detection app developed for Android OS.

User Study (results over Day 1, Day 2, Day 3). Rarely missed any smoking session, with fewer than 2 false positives per day!

Conclusion. An algorithm to recognize hand gestures using a wristband; demonstrated an application that detects smoking in real time. Smartphones in conjunction with wearable accessories present a great platform for sensing health-related behaviors such as smoking and eating, and a remarkable opportunity to create effective intervention strategies. Software/code available at: http://people.cs.umass.edu/~aparate/risq.html

Eating Gesture Recognition:
Mechanism | Eating sessions (Recall / Precision) | All data (Recall / Precision)
Bite-Counter | 0.60 / 0.57 | 0.65 / 0.03
Random Forests | 0.92 / 0.78 | 0.69 / 0.64
CRF | N/A | N/A
Note: Bite-Counter detects food bites only when the user explicitly indicates that an eating session is in progress.

System Overhead (measured on a Samsung Galaxy Nexus):
Statistic | Value
Time for segmentation | 92.34 ms
Time for feature extraction | 79.88 ms
Time for CRF inference | 5.89 ms
Memory | 12-20 MB
Binary size | 1.7 MB

Optimizing Performance. Use a cost function during RF classifier training to assign a penalty for missing a smoking gesture, and pick the cost that gives the best overall performance: a high cost results in lower precision, while a low cost results in lower recall and a low FPR.
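
One way to realize such a cost function, assuming a scikit-learn-style Random Forest (the slides do not name a library), is a per-class weight:

```python
from sklearn.ensemble import RandomForestClassifier

# Weight the smoking class more heavily so that missing a puff is
# penalized during training. The value of `cost` is illustrative.
cost = 10  # penalty for missing a smoking gesture
clf = RandomForestClassifier(
    n_estimators=100,
    class_weight={0: 1, 1: cost},  # label 1 = smoking gesture
    random_state=0)
# Sweeping `cost` trades off the metrics as described above: a higher
# cost raises recall but lowers precision; a lower cost lowers recall
# but keeps the FPR low. Fit with clf.fit(X, y) as usual.
```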

3: Concurrent Activities. Concurrent activities can modify the characteristic patterns of gestures. [Figure: IMU signals for the same gesture while the user is stationary vs. while the user is walking]