J.-Y. Yang, J.-S. Wang and Y.-P. Chen, Using acceleration measurements for activity recognition: An effective learning algorithm for constructing neural classifiers.

Presentation transcript:

J.-Y. Yang, J.-S. Wang and Y.-P. Chen, "Using acceleration measurements for activity recognition: An effective learning algorithm for constructing neural classifiers," Pattern Recognition Letters, vol. 29, no. 16, 2008.
Dynamic Time Warping and Neural Network, Spring Semester, 2010

Outline
- Background
- Activity Recognition Strategy
- Experiments
- Summary

Background
- Accelerometers can be used as human motion detection and monitoring devices
  - Biomedical engineering, medical nursing, interactive entertainment, ...
  - Exercise intensity / distance, sleep cycle, and calorie consumption

Proposed Method Overview
- One 3-D accelerometer on the dominant wrist
- Neural networks: pre-classifier → static classifier or dynamic classifier
- Eight domestic activities
  - Standing, sitting, walking, running, vacuuming, scrubbing, brushing teeth, and working at a computer
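The two-stage scheme described above can be sketched as a simple dispatch: a pre-classifier first decides whether a window is static or dynamic, and the corresponding specialised classifier then names the activity. A minimal sketch; the classifiers are passed in as placeholder callables, and the static/dynamic split of the eight activities shown in the comments is an assumption, not stated on this slide:

```python
# Assumed split of the eight activities (not specified on the slide):
#   static  ~ standing, sitting, working at a computer
#   dynamic ~ walking, running, vacuuming, scrubbing, brushing teeth

def recognize(window, pre_classifier, static_classifier, dynamic_classifier):
    """Two-stage activity recognition: route each window through the
    pre-classifier, then to the static or dynamic classifier."""
    if pre_classifier(window) == "static":
        return static_classifier(window)
    return dynamic_classifier(window)
```

Any model with a predict-style callable interface could plug into the three roles; the slides use neural networks for all three.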

Neural Classifier
- Neurons in the brain
  - A neuron receives input from other neurons (generally thousands) through its synapses
  - Inputs are approximately summed
  - When the input exceeds a threshold, the neuron sends an electrical spike that travels from the body, down the axon, to the next neuron(s)

Neurons in the Brain (cont.)
- The amount of signal passing through a neuron depends on:
  - Intensity of the signal from feeding neurons
  - Their synaptic strengths
  - Threshold of the receiving neuron
- Hebb rule (plays a key part in learning)
  - A synapse that repeatedly triggers the activation of a postsynaptic neuron will grow in strength; others will gradually weaken
  - Learning works by adjusting the magnitudes of synaptic strengths

Artificial Neurons
(Figure: inputs x1, x2, x3 with weights w1, w2, w3 feed a summation Σw·x, followed by an activation g(·), producing output y)
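The figure above shows an artificial neuron: a weighted sum of the inputs passed through an activation function g. A minimal sketch, assuming a sigmoid activation (the slide does not specify g):

```python
import math

def neuron(x, w, b=0.0):
    """One artificial neuron: y = g(sum(w_i * x_i) + b), with g = sigmoid."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid squashes the sum into (0, 1)

y = neuron([1.0, 0.5, -1.0], [0.2, 0.4, 0.1])
```

The weights w play the role of the synaptic strengths from the previous slide; learning adjusts them.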

Neural Classifier (Perceptron)
- Structure
- Learning
  - Weights are changed in proportion to the difference (error) between the target output and the perceptron's output for each example
  - Back-propagation algorithm: gradient descent; slow convergence and local minima
  - Resilient back-propagation (RPROP): ignores the magnitude of the gradient
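The learning rule above ("weights are changed in proportion to the error") is the classic perceptron rule. A minimal sketch on a linearly separable toy problem (the AND function); the learning rate and epoch count are illustrative choices, not taken from the slides:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Perceptron rule: w <- w + lr * (target - output) * x for each example.
    samples: list of (inputs, target) pairs with targets in {0, 1}."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y  # weights move in proportion to the error
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# The AND function is linearly separable, so the perceptron can learn it
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
```

Back-propagation generalises this idea to multilayer networks via gradient descent; RPROP replaces the raw gradient magnitude with per-weight adaptive step sizes, using only the gradient's sign.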

Activity Recognition Strategy
- Pre-Classifier
- Static/Dynamic Classifier

Pre-Classifier (1/2)
- Two components of the acceleration data
  - Gravitational acceleration (GA)
  - Body acceleration (BA): obtained by high-pass filtering to remove GA
- Segmentation with overlapping windows
  - 512 samples per window

Pre-Classifier (2/2)
- SMA (Signal Magnitude Area)
  - The sum of acceleration magnitudes over the three axes
- AE (Average Energy)
  - Average of the energy over the three axes
  - Energy: the sum of the squared discrete FFT component magnitudes of the signal in a window
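The two pre-classifier features above can be sketched directly from their definitions. Normalisation by window length is an assumption (conventions vary between papers), and a naive O(n²) DFT stands in for the FFT for clarity:

```python
import cmath

def sma(ax, ay, az):
    """Signal Magnitude Area: summed |x| + |y| + |z| over the window,
    normalised by window length (normalisation is an assumed convention)."""
    n = len(ax)
    return sum(abs(x) + abs(y) + abs(z) for x, y, z in zip(ax, ay, az)) / n

def energy(axis):
    """Sum of squared DFT component magnitudes (naive O(n^2) DFT;
    a real implementation would use an FFT). Divided by n so that, by
    Parseval's theorem, it equals the sum of squared samples."""
    n = len(axis)
    dft = [sum(axis[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
           for k in range(n)]
    return sum(abs(c) ** 2 for c in dft) / n

def average_energy(ax, ay, az):
    """AE: energy averaged over the three axes."""
    return (energy(ax) + energy(ay) + energy(az)) / 3.0
```

Static activities produce low SMA and AE, dynamic ones high values, which is what makes these two features a workable pre-classifier input.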

Feature Extraction
- 8 attributes × 3 axes = 24 features
  - Mean, correlation between axes, energy, interquartile range (IQR), mean absolute deviation, root mean square, standard deviation, variance
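A sketch of a subset of the per-axis attributes listed above (the quartile positions used for the IQR are a crude approximation; real code would interpolate), plus the between-axes correlation:

```python
import math

def axis_features(a):
    """A subset of the per-axis attributes from one window."""
    n = len(a)
    mean = sum(a) / n
    var = sum((v - mean) ** 2 for v in a) / n          # population variance
    std = math.sqrt(var)
    rms = math.sqrt(sum(v * v for v in a) / n)         # root mean square
    mad = sum(abs(v - mean) for v in a) / n            # mean absolute deviation
    s = sorted(a)
    iqr = s[(3 * n) // 4] - s[n // 4]                  # crude quartile positions
    return {"mean": mean, "var": var, "std": std, "rms": rms, "mad": mad, "iqr": iqr}

def correlation(a, b):
    """Pearson correlation between two axes of the same window."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a) *
                           sum((y - mb) ** 2 for y in b))
```

Applying the eight attributes to each of the three axes yields the 24-dimensional feature vector per window.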

Feature Selection (1/2)
- Common principal component analysis (CPCA)
- If features are highly correlated, the corresponding loading vectors are similar → clustering to group similar loadings

Feature Selection (2/2)
- Apply PCA
- Select the first p PCs (cumulative variance > 90%)
- Estimate the common principal components (CPCs)
- Support vector clustering
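Of the steps above, the "select the first p PCs" rule can be sketched directly: given the PCA eigenvalues (variances along each component), pick the smallest p whose cumulative share exceeds 90%. CPC estimation and support vector clustering are omitted here:

```python
def select_p(eigvals, threshold=0.90):
    """Smallest number p of leading principal components whose cumulative
    explained-variance ratio exceeds the threshold."""
    total = sum(eigvals)
    cum = 0.0
    for p, ev in enumerate(sorted(eigvals, reverse=True), start=1):
        cum += ev
        if cum / total > threshold:
            return p
    return len(eigvals)
```

For example, eigenvalues [5.0, 3.0, 1.5, 0.4, 0.1] sum to 10, and the first three components already explain 95% of the variance, so p = 3.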

Verification

Experiments: Environment (1/2)
- MMA7260Q tri-axial accelerometer
  - Measurement range: -4.0 g to +4.0 g, sampled at 100 Hz
  - Mounted on the dominant wrist
- Eight activities from seven subjects
  - Standing, sitting, walking, running, vacuuming, scrubbing, brushing teeth, and working at a computer
  - 2 min per activity

Environment (2/2)
- Window size = 512 (with 256-sample overlap)
  - 22 windows in one minute, 45 windows in two minutes
- Leave-one-subject-out cross-validation
  - Training: 1 min per activity = 22 windows × 8 activities × 6 subjects
  - Test: 2 min per activity = 45 windows × 8 activities

FSS Evaluation
- Use the six selected features (static)

Recognition Result
- Neural networks
  - Hidden nodes: pre-classifier 3, static classifier 5, dynamic classifier 7
  - Epochs:
- Computational load of FSS
  - Training without FSS = 7.457 s, training with FSS = 8.46 s

Summary
- The proposed method yielded 95% accuracy
  - Pre-classifier → static / dynamic classifiers
- Authors' related publication
  - Yen-Ping Chen, Jhun-Ying Yang, Shun-Nan Liou, Gwo-Yun Lee, Jeen-Shing Wang: Online classifier construction algorithm for human activity detection using a tri-axial accelerometer. Applied Mathematics and Computation 205(2) (2008)