Dynamic Time Warping and Neural Network
Spring Semester, 2010

J.-Y. Yang, J.-S. Wang, and Y.-P. Chen, "Using acceleration measurements for activity recognition: An effective learning algorithm for constructing neural classifiers," Pattern Recognition Letters, vol. 29, no. 16, pp. 2213-2220, 2008.
Outline
- Background
- Activity Recognition Strategy
- Experiments
- Summary
Background
- Accelerometers can be used as human motion detection and monitoring devices
  – Biomedical engineering, medical nursing, interactive entertainment, ...
  – Exercise intensity/distance, sleep cycle, and calorie consumption
Proposed Method Overview
- One 3-D accelerometer worn on the dominant wrist
- Neural networks (NNs)
  – A pre-classifier routes each window to either a static classifier or a dynamic classifier
- Eight domestic activities
  – Standing, sitting, walking, running, vacuuming, scrubbing, brushing teeth, and working at a computer
Neural Classifier
- Neurons in the brain
  – A neuron receives input from other neurons (generally thousands) through its synapses
  – The inputs are approximately summed
  – When the summed input exceeds a threshold, the neuron sends an electrical spike that travels from the cell body, down the axon, to the next neuron(s)
Neurons in the Brain (cont.)
- The amount of signal passing through a neuron depends on:
  – The intensity of the signal from the feeding neurons
  – Their synaptic strengths
  – The threshold of the receiving neuron
- Hebb rule (plays a key part in learning)
  – A synapse that repeatedly triggers the activation of a postsynaptic neuron grows in strength, while others gradually weaken
  – Learning proceeds by adjusting the magnitudes of the synaptic strengths
Artificial Neurons
- An artificial neuron computes a weighted sum of its inputs and passes it through an activation function:
  y = g(Σᵢ wᵢ xᵢ)
- [Diagram: inputs x₁, x₂, x₃ with weights w₁, w₂, w₃ feeding a summation node Σw·x, followed by g(·), producing output y]
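As a minimal sketch (not taken from the slides), the weighted-sum-and-activation computation of a single artificial neuron could look like the following; the input values, weights, and the sigmoid activation are illustrative assumptions:

```python
import numpy as np

def artificial_neuron(x, w, g=lambda s: 1.0 / (1.0 + np.exp(-s))):
    """Compute y = g(sum_i w_i * x_i) for one artificial neuron.

    x, w : 1-D arrays of inputs and synaptic weights.
    g    : activation function; a sigmoid is assumed here.
    """
    return g(np.dot(w, x))

# Example with three inputs, matching the x1..x3 / w1..w3 diagram.
x = np.array([0.5, -1.2, 0.3])
w = np.array([0.8, 0.1, -0.4])
print(artificial_neuron(x, w))
```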
Neural Classifier (Perceptron)
- Structure
- Learning
  – Weights are changed in proportion to the difference (error) between the target output and the perceptron's output for each example
  – Back-propagation algorithm: gradient descent; suffers from slow convergence and local minima
  – Resilient back-propagation (RPROP): ignores the magnitude of the gradient and uses only its sign
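A hedged sketch of the two update rules mentioned above: the perceptron-style error-proportional update and an RPROP-style step that uses only the sign of the gradient. The learning rate, step bounds, and the 1.2/0.5 scale factors are conventional defaults, not values taken from the paper:

```python
import numpy as np

def perceptron_update(w, x, target, output, lr=0.1):
    """Change weights in proportion to the error (target - output)."""
    return w + lr * (target - output) * x

def rprop_step(grad, prev_grad, step, step_min=1e-6, step_max=50.0):
    """RPROP-style per-weight step adaptation: only the sign of the
    gradient is used; the step grows (x1.2) when the gradient keeps
    its sign and shrinks (x0.5) when it flips."""
    same_sign = grad * prev_grad
    step = np.where(same_sign > 0, np.minimum(step * 1.2, step_max), step)
    step = np.where(same_sign < 0, np.maximum(step * 0.5, step_min), step)
    return -np.sign(grad) * step, step  # weight delta, updated step sizes
```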
Activity Recognition Strategy
- Pre-classifier
- Static/dynamic classifiers
Pre-Classifier (1/2)
- Two components of the acceleration data
  – Gravitational acceleration (GA)
  – Body acceleration (BA): obtained by high-pass filtering to remove GA
- Segmentation with overlapping windows
  – 512 samples per window
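The slides do not give filter details; as an assumption-laden sketch, body acceleration could be extracted with a Butterworth high-pass filter (the 0.25 Hz cutoff and third-order filter are illustrative choices, not from the paper) and the stream cut into overlapping 512-sample windows:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100        # sampling rate (Hz), as stated in the experiments
WINDOW = 512    # samples per window
OVERLAP = 256   # 50% overlap, as used in the experiments

def body_acceleration(acc, cutoff_hz=0.25, order=3):
    """High-pass filter each axis to remove the gravitational component.

    acc: array of shape (n_samples, 3); cutoff and order are assumptions.
    """
    b, a = butter(order, cutoff_hz / (FS / 2), btype="highpass")
    return filtfilt(b, a, acc, axis=0)

def segment(acc):
    """Split the signal into overlapping 512-sample windows."""
    step = WINDOW - OVERLAP
    return [acc[i:i + WINDOW] for i in range(0, len(acc) - WINDOW + 1, step)]
```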
Pre-Classifier (2/2)
- SMA (Signal Magnitude Area)
  – The sum of the acceleration magnitudes over the three axes
- AE (Average Energy)
  – The average of the energy over the three axes
  – Energy: the sum of the squared discrete FFT component magnitudes of the signal in a window
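A minimal sketch of the two pre-classifier features as described above; normalizing by the window length is an assumption rather than something stated on the slide:

```python
import numpy as np

def signal_magnitude_area(window):
    """SMA: sum of absolute acceleration over the three axes of one
    (512 x 3) window, normalized by the window length (assumption)."""
    return np.sum(np.abs(window)) / len(window)

def average_energy(window):
    """AE: per-axis energy (sum of squared FFT component magnitudes,
    normalized by window length), averaged over the three axes."""
    spectra = np.fft.fft(window, axis=0)
    energy_per_axis = np.sum(np.abs(spectra) ** 2, axis=0) / len(window)
    return energy_per_axis.mean()
```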
Feature Extraction
- 8 attributes × 3 axes = 24 features
  – Mean, correlation between axes, energy, interquartile range (IQR), mean absolute deviation, root mean square, standard deviation, variance
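As an illustrative sketch only, the per-axis attributes listed above could be computed per window roughly as follows; the exact definitions used in the paper may differ:

```python
import numpy as np
from scipy.stats import iqr

def window_features(window):
    """Compute the per-axis attributes for one (512 x 3) window.

    Returns a dict of length-3 arrays (one value per axis), plus the
    pairwise correlations between axes.
    """
    feats = {
        "mean": window.mean(axis=0),
        "energy": np.sum(np.abs(np.fft.fft(window, axis=0)) ** 2, axis=0),
        "iqr": iqr(window, axis=0),
        "mad": np.mean(np.abs(window - window.mean(axis=0)), axis=0),
        "rms": np.sqrt(np.mean(window ** 2, axis=0)),
        "std": window.std(axis=0),
        "var": window.var(axis=0),
    }
    corr = np.corrcoef(window.T)  # correlation between axis pairs
    feats["corr"] = np.array([corr[0, 1], corr[0, 2], corr[1, 2]])
    return feats
```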
Feature Selection (1/2)
- Common principal component analysis (CPCA)
- If features are highly correlated, their loading vectors are similar; clustering is used to group similar loadings
Feature Selection (2/2)
- Apply PCA
- Select the first p PCs (cumulative explained variance > 90%)
- Estimate the common principal components (CPCs)
- Support vector clustering
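The first two steps above can be illustrated with ordinary PCA; the CPC estimation and support vector clustering steps are omitted here, and the use of scikit-learn is an assumption rather than the authors' implementation:

```python
import numpy as np
from sklearn.decomposition import PCA

def first_p_components(features, variance_threshold=0.90):
    """Fit PCA and keep the first p components whose cumulative
    explained variance exceeds 90% (threshold taken from the slide)."""
    pca = PCA().fit(features)
    cumulative = np.cumsum(pca.explained_variance_ratio_)
    p = int(np.searchsorted(cumulative, variance_threshold) + 1)
    return pca.components_[:p], p
```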
Verification
Experiments: Environment (1/2)
- MMA7260Q tri-axial accelerometer
  – Measurement range: -4.0 g to +4.0 g; sampling rate: 100 Hz
  – Mounted on the dominant wrist
- Eight activities from seven subjects
  – Standing, sitting, walking, running, vacuuming, scrubbing, brushing teeth, and working at a computer
  – 2 min per activity
Environment (2/2)
- Window size = 512 samples (256-sample overlap)
  – 22 windows in one minute, 45 windows in two minutes
- Leave-one-subject-out cross-validation
  – Training: 1 min per activity = 22 windows × 8 activities × 6 subjects
  – Test: 2 min per activity = 45 windows × 8 activities
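A hedged sketch of the leave-one-subject-out protocol using scikit-learn's LeaveOneGroupOut; the library and the generic classifier argument are assumptions, since the paper trains its own neural classifiers:

```python
import numpy as np
from sklearn.base import clone
from sklearn.model_selection import LeaveOneGroupOut

def leave_one_subject_out(classifier, X, y, subject_ids):
    """Train on six subjects, test on the held-out seventh, for every subject."""
    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subject_ids):
        clf = clone(classifier)
        clf.fit(X[train_idx], y[train_idx])
        scores.append(clf.score(X[test_idx], y[test_idx]))
    return float(np.mean(scores))
```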
FSS Evaluation
- The six selected static features are used
Recognition Result
- Neural network configuration
  – Hidden nodes: pre-classifier 3, static classifier 5, dynamic classifier 7
  – Epochs: 500
- Computational load of FSS
  – Training without FSS = 7.457 s; training with FSS = 8.46 s
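As a rough sketch of the reported network sizes (hidden nodes 3/5/7, 500 epochs), the three classifiers could be configured as below; scikit-learn's MLPClassifier and its default training algorithm are assumptions, since the paper uses RPROP-trained networks. Each model could then be evaluated with the leave_one_subject_out helper sketched earlier:

```python
from sklearn.neural_network import MLPClassifier

# Hidden-node counts and the epoch limit come from the slide; the
# library, solver, and activation are illustrative assumptions.
pre_classifier     = MLPClassifier(hidden_layer_sizes=(3,), max_iter=500)
static_classifier  = MLPClassifier(hidden_layer_sizes=(5,), max_iter=500)
dynamic_classifier = MLPClassifier(hidden_layer_sizes=(7,), max_iter=500)
```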
Summary
- The proposed method (pre-classifier followed by static/dynamic classifiers) achieved 95% recognition accuracy
- Related publication by the authors
  – Yen-Ping Chen, Jhun-Ying Yang, Shun-Nan Liou, Gwo-Yun Lee, and Jeen-Shing Wang, "Online classifier construction algorithm for human activity detection using a tri-axial accelerometer," Applied Mathematics and Computation, vol. 205, no. 2, pp. 849-860, 2008.