Meta-Activity Recognition: A Wearable Approach for Logic Cognition-based Activity Sensing
INFOCOM 2017
Lei Xie, Xu Dong, Wei Wang, and Dawei Huang
State Key Laboratory for Novel Software Technology, Nanjing University, China
Presenter: Dr. Lei Xie, Associate Professor, Nanjing University
Outline
1. Motivation and Problem
2. Modeling the Human Motion
3. System Design
4. Performance Evaluation
5. Conclusion
Motivation: Background
Activity sensing with wearable devices: the smart watch is a typical wearable device, with applications such as elder care and exercise monitoring.
Motivation: Background
Traditional activity sensing with a smart watch (accelerometer, gyroscope, magnetometer):
1) Continuously collect raw sensor data by tracking human motions.
2) Classify the data into the corresponding activities by matching the waveforms against templates.
Motivation: Complex Activity
Complex activities, such as the dumbbell curl and sit-ups, have a large range of movement and incur rotations on multiple joints of the limbs. They are complex in two aspects: widespread variations in activity details and a large movement range.
Motivation: Complex Activity
User-specific characteristics such as height, limb length, and moving behavior cause obvious deviations in the raw inertial measurements of different human subjects performing the same complex activity (e.g., YAO Ming vs. GUO Jingming).
Motivation: Our Approach
Limitations of traditional approaches:
1) User-dependent recognition: training data must be recorded from the current user to achieve good recognition accuracy.
2) Heavy training: a large quantity of training samples is required to build the templates.
Our approach:
1) User-independent recognition: requires no training from the specific user.
2) Lightweight training: requires only a small quantity of training samples to build the templates.
Motivation: Our Approach
Logic cognition-based activity sensing: use meta-activity recognition at the logical representation level.
Observation: human subjects performing a specified activity experience a very similar sequence of small-range activity units at the logical level, despite the detailed differences in their waveforms.
Motivation: Our Approach
Examples of complex activities: 1) upright barbell row; 2) dumbbell curl; 3) dumbbell flies; 4) dumbbell lateral raise; 5) dumbbell triceps extension; 6) rope skipping; 7) butterfly; 8) cable crossover; 9) ping-pong swing; 10) badminton swing.
Motivation: Challenges
Challenge 1: Realize activity sensing in a user-independent approach, so that the derived recognition model can recognize the activities of any arbitrary human subject.
Challenge 2: Build a consistent scheme to depict human motion from the inertial measurements of the wearable device, since human subjects may perform the activities facing any arbitrary direction.
Coordinate System Transformation
Motivation: The inertial measurements are taken in the watch coordinate system (WCS), which continuously changes with arm/wrist movement and thus cannot serve as a stable reference. Since a human subject may perform an activity facing any arbitrary direction, the movements should be depicted relative to the human body, regardless of the absolute moving direction. We therefore transform the measurements from the watch coordinate system (WCS) to the body coordinate system (BCS), so that activity sensing can be performed in a scalable approach.
Coordinate System Transformation
From the Watch Coordinate System (WCS) to the Global Coordinate System (GCS):
1) Extract the constant gravitational acceleration as a vector g.
2) Extract the magnetic force as a vector m.
3) Build a global coordinate system (GCS) from g and m.
4) Transform the inertial measurements from the WCS to the GCS using the direction cosine representation.
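The steps above can be sketched in a few lines of NumPy. This is a minimal sketch, not the paper's implementation: the function names and the exact axis convention (east/north/up) are our assumptions.

```python
import numpy as np

def wcs_to_gcs_rotation(g, m):
    """Direction-cosine matrix from the watch frame (WCS) to the global
    frame (GCS), built from gravity g and magnetic force m, both measured
    in watch coordinates."""
    z = g / np.linalg.norm(g)        # global "up" axis in WCS coordinates
    x = np.cross(m, z)               # horizontal "east" axis, orthogonal to m and z
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)               # horizontal "north" axis completes the frame
    return np.vstack([x, y, z])      # rows are the GCS axes expressed in WCS

def to_gcs(R, v_wcs):
    """Transform one inertial sample from WCS into GCS."""
    return R @ v_wcs
```

When the watch happens to lie level and facing north, the matrix degenerates to the identity, which is a quick sanity check for the axis convention.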
Coordinate System Transformation
From the Global Coordinate System (GCS) to the Body Coordinate System (BCS):
1) Set the vector along the heading direction of the human subject as the Zb axis.
2) Set the vector parallel to the physical plane of the body as the Xb axis.
3) Set the vector perpendicular to the physical plane of the body as the Yb axis.
4) Transform the inertial measurements from the GCS to the BCS using the direction cosine representation.
Model the Human Motion with Meta-Activity
Intuition: each complex activity, performed with a large range of movement, can be decomposed into a series of small-range movements called meta-activities. Each meta-activity is defined as a unit movement with the logically minimal granularity of moving range. Each complex activity ci in C can then be depicted as a series of meta-activities, i.e., ci = <m_j1, m_j2, ..., m_jk>, where each m_j is in M.
Model the Human Motion with Meta-Activity
An example of a meta-activity series: m1, m2, m3, m4, m5, m6, m7, m8.
Model the Human Motion with Meta-Activity
Question: what is an appropriate metric to measure a meta-activity?
Observation 1: Traditional inertial measurements such as linear accelerations are sensitive to the speeds and amplitudes of limb movements, and thus fail to depict meta-activities in a scalable approach.
Observation 2: During limb movements, the angle variations between the limb and the body are much more stable, regardless of human-specific characteristics such as height and arm length.
Model the Human Motion with Meta-Activity
Angle profiles: use the angle profiles, i.e., the angles between the arm and the three axes of the body coordinate system, to depict the meta-activities of limb movements.
Ideal situation: the angle profiles of all skeletons in the body coordinate system.
Real situation: the angle profiles of the lower arm only, as the lower arm usually moves with a fairly large range in human motion.
Model the Human Motion with Meta-Activity
Angle profiles: the arm direction in the BCS can be determined via <α, β, γ>. Suppose the lower-arm vector in the BCS is v (v = x_w, the watch's X axis) and the X-axis vector of the BCS is u; then α is the angle between v and u, and β and γ are defined analogously for the Y and Z axes.
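The angle profiles reduce to three arccosines of dot products. A minimal sketch (the function name is our assumption):

```python
import numpy as np

def angle_profiles(v):
    """Angles <alpha, beta, gamma> (in degrees) between the lower-arm
    direction v, expressed in the BCS, and the three body axes."""
    v = v / np.linalg.norm(v)
    # dot product with each unit axis, clipped to guard against rounding
    return tuple(np.degrees(np.arccos(np.clip(v @ axis, -1.0, 1.0)))
                 for axis in np.eye(3))
```

For example, an arm aligned with the X axis yields <0°, 90°, 90°>, and an arm diagonal in the X-Y plane yields <45°, 45°, 90°>.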
Model the Human Motion with Meta-Activity
Meta-activity profiles: when performing any meta-activity, the angle profiles <α, β, γ> change continuously. A valid meta-activity satisfies three properties:
Property 1: The variation range of any angle profile is less than a threshold δ, e.g., 30 degrees.
Property 2: The variation trend of any angle profile is monotonic, i.e., monotonically increasing or decreasing.
Property 3: The time duration of the meta-activity is less than a threshold t, e.g., 500 ms.
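The three properties translate directly into a validity check on a candidate segment of one angle profile. A minimal sketch; the 20 ms sample period (50 Hz) is an illustrative assumption, not stated in the slides.

```python
import numpy as np

def is_meta_activity(profile, delta=30.0, max_ms=500, sample_ms=20):
    """Check the three properties on one angle-profile segment:
    variation range <= delta degrees, monotonic trend, duration <= max_ms."""
    p = np.asarray(profile, dtype=float)
    in_range = (p.max() - p.min()) <= delta          # Property 1
    d = np.diff(p)
    monotonic = np.all(d >= 0) or np.all(d <= 0)     # Property 2
    in_time = len(p) * sample_ms <= max_ms           # Property 3
    return bool(in_range and monotonic and in_time)
```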
Model the Human Motion with Meta-Activity
How to depict meta-activity profiles?
Step 1: Use sectors to depict a meta-activity in a specified dimension. For each dimension of the angle profiles <α, β, γ>, uniformly divide the rotation range [0, 360] into multiple sectors with angle ≤ δ.
Step 2: Label each sector according to the rotation trend. For the j-th sector, if the rotation direction is anti-clockwise, label it sj; otherwise, label it Sj.
Key idea: discrete states rather than continuous waveforms; the logic cognition level rather than the raw data level.
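The two steps amount to quantizing an angle into a sector index and casing the label on the rotation direction. A minimal sketch; indexing the sector by the meta-activity's start angle is our reading, not spelled out in the slide.

```python
def sector_label(start_angle, end_angle, delta=30.0):
    """Quantize a meta-activity into a discrete symbol: the sector index of
    its start angle, lower-case s for an anti-clockwise (increasing) trend,
    upper-case S for a clockwise (decreasing) trend."""
    j = int(start_angle % 360 // delta)   # sector index in [0, 360/delta)
    return f"s{j}" if end_angle >= start_angle else f"S{j}"
```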
System Design
Data acquisition and preprocessing: perform the coordinate transformation from the WCS to the BCS, then extract the angle profiles and split them into separate complex activities.
Meta-activity segmentation and classification: segment a single complex activity into a series of meta-activities, and classify the segmented meta-activities into the corresponding categories.
Complex activity recognition: perform activity recognition on the sequences of meta-activities, using a least-edit-distance-based matching scheme.
Data Acquisition and Preprocessing
Coordinate transformation: transform from the WCS to the BCS using the direction cosine method. Two signal gestures are used to figure out the rotation matrix between the BCS and the GCS:
1) Extend the arm to the front: with the arm extended to the front of the body, the arm direction is consistent with the Yb axis in the BCS.
2) Drop the arm downward: with the arm dropped along the legs, the arm direction is opposite to the Zg axis in the GCS.
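The two gestures give two known arm directions, which is enough to pin down the rotation between the frames. A minimal sketch under stated assumptions: the slides only name the gestures, so the orthonormalization details and function name are ours.

```python
import numpy as np

def bcs_rotation_from_gestures(front_dir, down_dir):
    """Estimate the GCS-to-BCS rotation from the two signal gestures:
    front_dir: arm direction in GCS while extended to the front (body front),
    down_dir:  arm direction in GCS while dropped along the legs (body "down",
               so the body's up axis is -down_dir)."""
    z = -down_dir / np.linalg.norm(down_dir)   # body "up"
    y = front_dir / np.linalg.norm(front_dir)  # body "front"
    y = y - (y @ z) * z                        # re-orthogonalize: the two gestures
    y = y / np.linalg.norm(y)                  # are never exactly perpendicular
    x = np.cross(y, z)                         # completes the right-handed frame
    return np.vstack([x, y, z])                # rows are BCS axes in GCS coords
```

A subject facing along the global X axis, for instance, yields a rotation that maps global X onto the body's front axis.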
Data Acquisition and Preprocessing
Angle profile extraction: extract the angle profiles <α, β, γ> over time in the BCS.
Data Acquisition and Preprocessing
Complex activity segmentation.
Objective: split the complex activity series into separate activities.
Observation: there is a short pause between two adjacent complex activities.
Solution: use the angle changes of <α, β, γ> to detect the boundaries. Start: the difference in one or more angle profiles exceeds a certain threshold. End: the differences in all angle profiles fall below the threshold.
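This start/end rule can be sketched as a small state machine over the per-sample angle changes. The 5-degree threshold and the sample-to-sample differencing are illustrative assumptions, not values from the paper.

```python
import numpy as np

def split_activities(profiles, thresh=5.0):
    """Split a (T, 3) stream of angle profiles <alpha, beta, gamma> into
    (start, end) index pairs, one pair per complex activity.
    Start: any dimension changes by more than thresh between samples;
    End:   all dimensions change by less than thresh again."""
    p = np.asarray(profiles, dtype=float)
    moving = np.abs(np.diff(p, axis=0)).max(axis=1) > thresh
    segments, start = [], None
    for t, m in enumerate(moving):
        if m and start is None:            # one or more profiles started changing
            start = t
        elif not m and start is not None:  # all profiles settled: activity ended
            segments.append((start, t + 1))
            start = None
    if start is not None:                  # stream ended mid-activity
        segments.append((start, len(p)))
    return segments
```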
Meta-Activity Segmentation and Classification
Objective: segment a complex activity into a series of meta-activities according to the angle profiles <α, β, γ>.
Straightforward solution: check all three dimensions of <α, β, γ> simultaneously; whenever the condition is satisfied in any dimension, segment the entire series into a meta-activity there.
Weakness: this causes the meta-activities to be too fragmented after segmentation.
Meta-Activity Segmentation and Classification
Solution: perform meta-activity segmentation separately in each dimension of the angle profiles.
Details: for each dimension, use a sliding window to scan the angle profiles; whenever a segmentation condition is satisfied, segment the angle profile series into a meta-activity in that dimension. After segmentation, we obtain a separate segmentation for each dimension of the angle profiles.
Meta-Activity Segmentation and Classification
Meta-activity classification: each meta-activity is further classified into one specific sector based on its moving range and rotation direction. We use Dynamic Time Warping (DTW) to match a test meta-activity to the corresponding sector by referring to the variation trend of its angle profiles; a linear function f(t) is used as the template of each meta-activity.
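A plain DTW matcher against linear templates looks like the sketch below. The DTW recurrence is textbook; the template sequences in the example are made up for illustration.

```python
import numpy as np

def dtw_distance(a, b):
    """Textbook dynamic-time-warping distance between two 1-D sequences."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[len(a), len(b)]

def classify_meta(segment, templates):
    """Assign the segment to the sector whose linear template f(t) is DTW-closest."""
    return min(templates, key=lambda k: dtw_distance(segment, templates[k]))
```

A slightly noisy rising segment still matches the rising template rather than the falling one, which is the point of warping on the trend instead of the raw amplitude.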
Meta-Activity Segmentation and Classification
Meta-activity profiles: each complex activity is decomposed into a sequence of meta-activities in the 3 dimensions of the angle profiles (examples: dumbbell triceps extension, upright barbell row, dumbbell lateral raise, butterfly).
Meta-Activity Segmentation and Classification
Complex activity recognition.
Solution: use least-edit-distance-based matching (LED). LED computes the least edit distance between the test complex activity and each template complex activity, and selects the template with the smallest distance as the matching result.
Details: 1. The distance between two meta-activities accounts for two issues, i.e., the distance between sectors and the distance between rotation directions.
Meta-Activity Segmentation and Classification
Complex activity recognition, details (continued):
2. For any two complex activities, e.g., a and b, compute their distance in each dimension of the angle profiles according to the Levenshtein distance.
3. Add the distances from the 3 dimensions together to obtain the overall distance.
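Steps 2 and 3 are a per-dimension Levenshtein distance summed over the three profiles. A minimal sketch with unit edit costs; the paper additionally weights substitutions by the sector and rotation-direction distances, which is omitted here, and the symbol sequences in the example are made up.

```python
def edit_distance(a, b):
    """Levenshtein distance between two meta-activity symbol sequences,
    with unit insert/delete/substitute costs."""
    m, n = len(a), len(b)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        D[i][0] = i
    for j in range(n + 1):
        D[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            D[i][j] = min(D[i - 1][j] + 1,        # delete
                          D[i][j - 1] + 1,        # insert
                          D[i - 1][j - 1] + sub)  # substitute / match
    return D[m][n]

def recognize(test, templates):
    """Pick the template whose summed 3-dimension edit distance is least."""
    dist = lambda tpl: sum(edit_distance(x, y) for x, y in zip(test, tpl))
    return min(templates, key=lambda k: dist(templates[k]))
```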
Meta-Activity Segmentation and Classification
Complex activity recognition, example: a test complex activity is matched against the template complex activities.
Meta-Activity Segmentation and Classification
Example, template 1 (dumbbell triceps extension): compute the edit distance between the test case and this template.
Meta-Activity Segmentation and Classification
Example, template 2 (butterfly): compute the edit distance between the test case and this template.
Meta-Activity Segmentation and Classification
Example, template 3 (dumbbell curl): this template yields the least edit distance to the test case, so the test activity is recognized as a dumbbell curl.
Performance Evaluation
Experiment settings: three schemes for performance comparison.
1) Acceleration-based Matching (AM): use DTW to perform waveform-based matching on the acceleration measurements.
2) Angle-Profile-based Matching (APM): use DTW to perform waveform-based matching on the angle profiles.
3) Meta-Activity Recognition (MAR): use least-edit-distance matching on the meta-activity profiles.
Evaluate the Recognition Accuracy
Sensitivity to the number of training samples: MAR achieves the best performance, whereas AM achieves the worst. MAR achieves rather good performance for user-independent recognition while requiring only lightweight training. [Fig. 10(c): accuracy for different numbers of training samples]
Evaluate the Recognition Accuracy
Matching ratios among multiple activities: APM removes most of the mismatches caused by AM; MAR further reduces the mismatches and improves the recognition accuracy to a high level. [Fig. 10(d)-(f): confusion matrices for AM, APM, and MAR]
Evaluate the Time Efficiency
MAR achieves the best time efficiency, running in tens of milliseconds. [Fig. 10(g): time delay for the different solutions]
Conclusion
Contributions:
1) The first study to use "meta-activity recognition" for logic cognition-based activity sensing.
2) Meta-activity profiles depict complex activities at the logical level, so the recognition model is scalable enough for activity recognition on any arbitrary human subject.
3) A prototype system implemented to evaluate the real performance: meta-activity recognition achieves an average accuracy of 92% for user-independent activity sensing.
Questions? Thank you!