
1 Survey on Activity Recognition from Acceleration Data

2 Outline
– Research goal of the project
– Contexts of mobile devices
– Activity recognition from acceleration data [1,2]
  – Overview
  – Related works
  – Data collection
  – Feature extraction
  – Experiment results
  – Analysis
– Summary
– Future work

3 Research Goal
– Context-aware computing and interaction for mobile devices
  – Improving the usability of mobile devices
  – Providing adequate services to users based on context
  – By autonomously recognizing the context/situation of users and devices
– Applications
  – Reducing interruptions from mobile devices [5]
  – Health-care systems [6]
  – …

4 Context
– Definition of context
  – An instantaneous, detectable, and relevant property of the environment, the system, or the users
– Examples
  – Location, time, light intensity, noise level, power, user schedule, …

5 Contexts of Mobile Phones [7]
– Where the user is
  – indoors, outdoors, in a meeting, at the desk, …
– Where the device is in relation to the user
  – in hand, in a pocket, on the table, …
– What the user is doing (with the device)
  – walking, talking, sitting, running, …
→ Physical activity recognition

6 Activity Recognition from Acceleration Data [1,2]
– Recognizing physical activities
  – 20 activities, including common household activities
  – using data from 5 biaxial accelerometers
– Under semi-naturalistic conditions
  – No wires; sensors weighed less than 120 g
  – No observation by researchers
  – No restriction on movement and no fear of damaging electronics
  – Minimizing subject awareness of data collection
  – Subjects annotate start and stop times themselves

7 Activity Labels (20 activities)
Walking; Walking carrying items; Sitting & relaxing; Working on computers; Standing still; Eating or drinking; Watching TV; Reading; Running; Bicycling; Stretching; Strength-training; Scrubbing; Vacuuming; Folding laundry; Lying down & relaxing; Brushing teeth; Climbing stairs; Riding elevator; Riding escalator

8 Related Works

| Ref. | Recognition Rate | Activities Recognized | No. of Subjects | Data Type | No. of Sensors | Sensor Placement |
|---|---|---|---|---|---|---|
| [8] | 92.85% ~ 95.91% | Ambulation | 8 | L | 2 | 2 thigh |
| [9] | 83% ~ 90% | Ambulation, posture | 6 | L | 6 | 3 left hip, 3 right hip |
| [10] | 95.8% | Ambulation, posture, typing, talking, bicycling | 24 | L | 4 | Chest, thigh, wrist, forearm |
| [11] | 89.30% | Ambulation, posture | 5 | L | 2 | Chest, thigh |
| [12] | 96.67% | 3 Kung Fu arm movements | 1 | L | 2 | 2 wrist |
| [4] | 42% ~ 96% | Ambulation, posture, bicycling | 1 | L | 2 | 2 lower back |
| [13] | 85% ~ 90% | Ambulation, posture | 10 | L | 2 | 2 knee |
| [10] | 66.7% | Ambulation, posture, typing, talking, bicycling | 24 | N | 4 | Chest, thigh, wrist, forearm |
| [14] | 86% ~ 93% | Ambulation, posture, play | 1 | N | 3 | 2 wrist, 1 thigh |

Data type — L: laboratory setting, N: naturalistic setting

9 Data Collection
– 5 biaxial accelerometers
  – Placement: left thigh, right ankle, left arm, right wrist, right hip
  – Sampling frequency: 76.25 Hz

10 Feature Extraction
– 512-sample windows (6.7 seconds)
– Sliding windows with 50% overlap
– Features (a sketch of their computation follows below)
  – DC feature: mean acceleration over the window
  – Energy feature: sum of squared DFT component magnitudes
  – Frequency-domain entropy: normalized information entropy of the DFT component magnitudes; supports discrimination of activities with similar energy values
  – Correlation: between the two axes of a board and between all pairwise combinations of axes on different boards
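A minimal NumPy sketch of these windowed features. This is not the authors' original code; the axis layout, DC-bin handling, and normalization details are assumptions made for illustration.

```python
import numpy as np

WINDOW = 512         # 512 samples ~ 6.7 s at 76.25 Hz
STEP = WINDOW // 2   # 50% overlap between consecutive windows

def window_features(signal):
    """Compute one feature vector per sliding window of a (samples, axes) array."""
    features = []
    for start in range(0, len(signal) - WINDOW + 1, STEP):
        win = signal[start:start + WINDOW]                # (WINDOW, axes)
        mean = win.mean(axis=0)                           # DC feature: per-axis mean

        mag = np.abs(np.fft.rfft(win, axis=0))[1:]        # DFT magnitudes, DC bin dropped
        energy = (mag ** 2).sum(axis=0) / len(mag)        # energy feature

        p = mag / mag.sum(axis=0)                         # normalized magnitudes
        entropy = -(p * np.log2(p + 1e-12)).sum(axis=0)   # frequency-domain entropy

        corr = np.corrcoef(win.T)                         # pairwise axis correlations
        pairs = corr[np.triu_indices(win.shape[1], k=1)]

        features.append(np.concatenate([mean, energy, entropy, pairs]))
    return np.array(features)

# Hypothetical usage: 30 s of 5-board biaxial data (10 axes) at 76.25 Hz.
fake = np.random.default_rng(0).normal(size=(2288, 10))
print(window_features(fake).shape)
```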

11 Example of Frequency-domain Entropy
Example: bicycling vs. running
– Both produce similar amounts of energy in the hip acceleration
– Bicycling
  – Uniform circular movement of the legs
  – The DFT of vertical hip acceleration shows a single dominant frequency component at about 1 Hz
  – → Low frequency-domain entropy
– Running
  – Complex hip acceleration with many major DFT frequency components between 0.5 Hz and 2 Hz
  – → Higher frequency-domain entropy
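To make the contrast concrete, here is a small hypothetical demonstration with synthetic signals (not the study's data): a nearly pure 1 Hz tone stands in for bicycling, and a mixture of components between 0.5 Hz and 2 Hz stands in for running.

```python
import numpy as np

fs, n = 76.25, 512
t = np.arange(n) / fs

def freq_entropy(x):
    mag = np.abs(np.fft.rfft(x))[1:]        # DFT magnitudes without the DC bin
    p = mag / mag.sum()
    return -(p * np.log2(p + 1e-12)).sum()

bicycling_like = np.sin(2 * np.pi * 1.0 * t)                           # single dominant component
running_like = sum(np.sin(2 * np.pi * f * t) for f in (0.5, 0.9, 1.4, 2.0))

print(freq_entropy(bicycling_like))   # lower entropy: energy concentrated near 1 Hz
print(freq_entropy(running_like))     # higher entropy: energy spread over several components
```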

12 Experiment Results
Overall recognition accuracy (%):

| Classifier | User-specific Training | Leave-one-subject-out Training |
|---|---|---|
| Nearest Neighbor | 69.21 | 82.70 |
| Decision Tree | 71.58 | 84.26 |
| Naïve Bayes | 34.94 | 52.35 |

– Naïve Bayes was unable to adequately model the discriminating rules
  – due to its assumptions of conditional independence and Gaussian distributions
  – and insufficient data
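A rough scikit-learn sketch of the two evaluation protocols. The original work used its own classifiers and data; the `X`, `y`, and `subject` arrays below are synthetic placeholders.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 75))            # placeholder window features
y = rng.integers(0, 5, size=600)          # placeholder activity labels (5 classes for brevity)
subject = rng.integers(0, 6, size=600)    # placeholder subject IDs

def leave_one_subject_out(clf):
    """Train on all subjects but one, test on the held-out subject."""
    return cross_val_score(clf, X, y, groups=subject, cv=LeaveOneGroupOut()).mean()

def user_specific(clf, folds=3):
    """Cross-validate within each subject's own data, then average across subjects."""
    return np.mean([cross_val_score(clf, X[subject == s], y[subject == s], cv=folds).mean()
                    for s in np.unique(subject)])

for clf in (DecisionTreeClassifier(random_state=0), GaussianNB()):
    print(clf.__class__.__name__, leave_one_subject_out(clf), user_specific(clf))
```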

13 Experiment Results
– The decision tree captures conjunctions in feature values well
– Examples
  – Sitting: 1 G downward acceleration, low energy at hip and arm
  – Bicycling: moderate energy and low entropy at hip, low energy at arm
  – Window scrubbing vs. brushing teeth: both show high energy at the arm, but window scrubbing shows more energy at the hip

14 Experiment Results
Per-activity accuracy using the decision tree classifier and leave-one-subject-out validation:

| Activity | Accuracy (%) | Activity | Accuracy (%) |
|---|---|---|---|
| Walking | 89.71 | Walking carrying items | 82.10 |
| Sitting & relaxing | 94.78 | Working on computers | 97.49 |
| Standing still | 95.67 | Eating or drinking | 88.67 |
| Watching TV | 77.29 | Reading | 91.79 |
| Running | 87.68 | Bicycling | 96.29 |
| Stretching | 41.42 | Strength-training | 82.51 |
| Scrubbing | 81.09 | Vacuuming | 96.41 |
| Folding laundry | 95.14 | Lying down & relaxing | 94.96 |
| Brushing teeth | 85.27 | Climbing stairs | 85.61 |
| Riding elevator | 43.58 | Riding escalator | 70.56 |

15 Experiment Results
– Riding elevator
  – Often misclassified as "riding escalator"
  – Both involve the subject standing still with similar vertical acceleration
– Watching TV
  – Often misclassified as "sitting & relaxing" and "reading"
  – All of these activities involve sitting
– Stretching
  – Often misclassified as "folding laundry"
  – Both involve moderate arm movement

16 Experiment Results
– Comparing leave-one-subject-out and user-specific training with equal amounts of training data
– Result: user-specific training gives the more accurate result

| Classifier | User-specific Training | Leave-one-subject-out Training |
|---|---|---|
| Decision Tree | 77.31% | 72.99% |

17 Discrimination Power of Each Accelerometer
– The thigh is the most powerful single location; the hip is the second best
– One accelerometer attached to a cell phone may enable recognition of certain activities
– Two accelerometers may enable effective recognition
– Accuracy change when only the listed accelerometer(s) are used (see the sketch below):

| Accelerometer(s) Left In | Difference in Recognition Accuracy |
|---|---|
| Hip | -34.12 |
| Wrist | -51.99 |
| Arm | -63.65 |
| Ankle | -37.08 |
| Thigh | -29.47 |
| Thigh and Wrist | -3.27 |
| Hip and Wrist | -4.78 |
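The table suggests a "leave-one-accelerometer-in" ablation. Below is a hypothetical sketch of that procedure with synthetic data; the per-sensor feature layout and the use of plain cross-validation are assumptions, not the original experiment code.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
SENSORS = ["hip", "wrist", "arm", "ankle", "thigh"]
FEATS_PER_SENSOR = 8                                  # assumed number of features per board
X = rng.normal(size=(500, FEATS_PER_SENSOR * len(SENSORS)))
y = rng.integers(0, 5, size=500)                      # placeholder activity labels

def accuracy(cols):
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, cols], y, cv=5).mean()

baseline = accuracy(np.arange(X.shape[1]))            # all five accelerometers together
for i, name in enumerate(SENSORS):
    cols = np.arange(i * FEATS_PER_SENSOR, (i + 1) * FEATS_PER_SENSOR)
    print(f"{name}: {accuracy(cols) - baseline:+.3f}")  # accuracy difference vs. baseline
```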

18 Analysis
– Postures such as sitting and standing still
  – recognized by mean acceleration
– Ambulatory activities and bicycling
  – recognized by hip acceleration energy
– Bicycling vs. running
  – show similar hip acceleration mean and energy
  – recognized by entropy and by the correlation between arm and hip acceleration

19 Analysis
– Low recognition rates for stretching, scrubbing, and riding an elevator or escalator
  – Higher-level analysis is required, e.g., duration, time of day, and day of the activity
– Using other sensor data may improve activity recognition

20 Summary
– Prior works show low recognition rates when using naturalistic data
– 84.26% recognition accuracy for 20 everyday activities
  – using 5 biaxial accelerometers
  – DFT-based features
  – decision tree classifier
– The hip is the second best single accelerometer location
– One accelerometer may recognize some activities that do not involve the upper body

21 Future Work
– Data collection
  – Build a wireless board with accelerometers (and other sensors)
  – Collect data under naturalistic conditions
– Activity recognition algorithm for mobile phones
  – Only one accelerometer
  – Consideration of the location and posture of the mobile phone
    – in hand, in a pocket, in a bag, …
    – different device orientations

22 References
[1] L. Bao, S. S. Intille, "Activity recognition from user-annotated acceleration data", Proc. of Pervasive 2004.
[2] L. Bao, "Physical activity recognition from acceleration data under semi-naturalistic conditions", M.Eng. thesis, MIT, 2003.
[3] R. W. DeVaul, S. Dunn, "Real-time motion classification for wearable computing applications", Technical report, MIT Media Lab, 2001.
[4] K. V. Laerhoven, O. Cakmakci, "What shall we teach our pants?", In the 4th International Symposium on Wearable Computers, 2000.
[5] J. Ho, S. S. Intille, "Using context-aware computing to reduce the perceived burden of interruptions from mobile devices", CHI 2005.
[6] S. S. Intille, "A new research challenge: persuasive technology to motivate healthy aging", IEEE Transactions on Information Technology in Biomedicine, 2004.
[7] E. Tuulari, "Methods and technologies for experimenting with ubiquitous computing", VTT Publications 560, Espoo, 2005.
[8] S.-W. Lee, K. Mase, "Activity and location recognition using wearable sensors", IEEE Pervasive Computing, 2002.
[9] J. Mantyjarvi, et al., "Recognizing human motion with multiple acceleration sensors", In Proc. of the IEEE International Conf. on Systems, Man, and Cybernetics, 2001.
[10] F. Foerster, et al., "Detection of posture and motion by accelerometry: a validation in ambulatory monitoring", Computers in Human Behavior, 1999.
[11] K. Aminian, et al., "Physical activity monitoring based on accelerometry: validation and comparison with video observation", Medical & Biological Engineering & Computing, 1999.
[12] G. S. Chambers, et al., "Hierarchical recognition of intentional human gestures for sports video annotation", In Proc. of the 16th International Conf. on Pattern Recognition, 2002.
[13] C. Randell, H. Muller, "Context awareness by analysing accelerometer data", The 4th International Symposium on Wearable Computers, 2000.
[14] M. Uiterwaal, et al., "Ambulatory monitoring of physical activity in working situations, a validation study", Journal of Medical Engineering & Technology, 1998.

23 Appendix: Confusion matrix

