Behavior Recognition Based on Machine Learning Algorithms for a Wireless Canine Machine Interface Students: Avichay Ben Naim, Lucie Levy 14 May, 2014 Ort Braude College – SE Dept.
Agenda Motivation Definitions Description of the research Useful algorithms The classification flow Conclusions
It’s all about canine training Bloodhound Watchdog Guide dog Police dog Search and rescue dog
Definitions Inertial Measurement Unit (IMU) The gyroscope The accelerometer Moving average filter Decision tree
Inertial Measurement Units
The gyroscope Etymology Description and diagram Modern Uses
Moving average filter A statistical tool used to analyze data points by creating a series of averages of successive subsets of the data. Moving averages are commonly used with time series.
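A minimal sketch of the filter; the window width and the sample values below are arbitrary:

```python
def moving_average(samples, window=3):
    """Replace each run of `window` consecutive samples by its mean."""
    if not 1 <= window <= len(samples):
        raise ValueError("window must be between 1 and len(samples)")
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

# Smoothing a short, noisy accelerometer-like trace.
print(moving_average([1.0, 4.0, 1.0, 4.0, 1.0], window=3))  # [2.0, 3.0, 2.0]
```

Each output point averages the current sample with its two predecessors, damping sample-to-sample noise at the cost of a shorter output sequence.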
Decision tree A decision support tool that models decisions and their possible consequences. Decision trees are commonly used in decision analysis.
Useful algorithms Forward algorithm Viterbi algorithm Baum-Welch algorithm C4.5 algorithm Hidden Markov Model (HMM)
Markov chain The state is directly visible to the observer. The state transition probabilities are the only parameters. (Diagram: a two-state chain with states H and C.)
Hidden Markov Model The state is not directly visible, but the output, which depends on the state, is visible. Each state has a probability distribution over the possible output tokens.
Three problems in HMM and their solutions Problem 1: Given a certain Markov model, what is the probability of a certain sequence of observations? Solution 1: Forward algorithm. Problem 2: Given a certain sequence of observations and a certain Markov model, what is the most probable sequence of states that creates this sequence of observations? Solution 2: Viterbi algorithm. Problem 3: Given a certain sequence of observations, what model parameters best explain it? Solution 3: Baum-Welch algorithm.
Forward algorithm Solves the first problem of HMM. Algorithm input: an HMM and a sequence of observations. Algorithm output: the probability of that sequence of observations.
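The forward recursion can be sketched in a few lines; the two-state model below (states H and C, echoing the Markov-chain diagram) and all of its probabilities are invented for illustration:

```python
def forward(observations, states, start_p, trans_p, emit_p):
    """P(observations | model), summing over all possible state paths.

    alpha[t][s] = P(o_1..o_t, state_t = s)
                = emit_p[s][o_t] * sum over r of alpha[t-1][r] * trans_p[r][s]
    """
    alpha = {s: start_p[s] * emit_p[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: emit_p[s][obs] * sum(alpha[r] * trans_p[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

# Invented two-state model, for illustration only.
states = ("H", "C")
start_p = {"H": 0.6, "C": 0.4}
trans_p = {"H": {"H": 0.7, "C": 0.3}, "C": {"H": 0.4, "C": 0.6}}
emit_p = {"H": {"1": 0.2, "2": 0.4, "3": 0.4},
          "C": {"1": 0.5, "2": 0.4, "3": 0.1}}
print(forward(["3"], states, start_p, trans_p, emit_p))  # roughly 0.28
```

Since every length-1 observation sequence is scored, the probabilities of "1", "2" and "3" sum to 1, which is a quick sanity check on the model.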
Forward algorithm - Example Calculate the probability of:
Viterbi algorithm A dynamic programming algorithm that finds the most likely sequence of states from observed events.
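A sketch of the Viterbi recursion; the two-state model (H/C) and its probabilities are invented for illustration:

```python
def viterbi(observations, states, start_p, trans_p, emit_p):
    """Most likely state sequence for the observations (dynamic programming)."""
    # delta[s]: probability of the best path ending in s; path[s]: that path.
    delta = {s: start_p[s] * emit_p[s][observations[0]] for s in states}
    path = {s: [s] for s in states}
    for obs in observations[1:]:
        new_delta, new_path = {}, {}
        for s in states:
            best_r = max(states, key=lambda r: delta[r] * trans_p[r][s])
            new_delta[s] = delta[best_r] * trans_p[best_r][s] * emit_p[s][obs]
            new_path[s] = path[best_r] + [s]
        delta, path = new_delta, new_path
    return path[max(states, key=lambda s: delta[s])]

# Invented two-state model, for illustration only.
states = ("H", "C")
start_p = {"H": 0.6, "C": 0.4}
trans_p = {"H": {"H": 0.7, "C": 0.3}, "C": {"H": 0.4, "C": 0.6}}
emit_p = {"H": {"1": 0.2, "2": 0.4, "3": 0.4},
          "C": {"1": 0.5, "2": 0.4, "3": 0.1}}
print(viterbi(["3", "1"], states, start_p, trans_p, emit_p))  # ['H', 'C']
```

Unlike the forward algorithm, which sums over all paths, Viterbi keeps only the single best path into each state at each step.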
Baum-Welch algorithm Finds the unknown parameters of an HMM (transition and emission probabilities) from an observed sequence.
Baum-Welch algorithm - Example To start, we first guess the transition and emission matrices. The next step is to estimate a new transition matrix:
Baum-Welch algorithm - Example The new estimate for the S1-to-S2 transition is now computed. Then calculate the S2-to-S1, S2-to-S2 and S1-to-S1 probabilities.
Baum-Welch algorithm - Example Next, we want to estimate a new emission matrix: the new estimate for the emission of E from S1 is now computed.
Baum-Welch algorithm - Example Calculate the rest of the emission matrix, then estimate the initial probabilities.
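For reference, these update steps are instances of the standard Baum-Welch re-estimation formulas; the notation below is assumed, with gamma_t(i) the probability of being in state i at time t and xi_t(i,j) the probability of a transition from i to j at time t, both computed from the forward and backward probabilities:

```latex
\hat{\pi}_i = \gamma_1(i), \qquad
\hat{a}_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i,j)}{\sum_{t=1}^{T-1} \gamma_t(i)}, \qquad
\hat{b}_j(k) = \frac{\sum_{t=1}^{T} \gamma_t(j)\,[o_t = k]}{\sum_{t=1}^{T} \gamma_t(j)}
```

Each update is a ratio of expected counts: expected transitions from i to j over expected visits to i, and expected emissions of symbol k from j over expected visits to j.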
C4.5 Algorithm C4.5 is an extension of the ID3 algorithm. The algorithm is used to generate a decision tree with a greedy approach, selecting the best attribute to split the dataset on at each iteration. (Figure: examples of a best split, a good split and a locally worst split.)
Splitting criteria Entropy Normalized information gain Gini coefficient
Splitting criterion - Entropy H(S) is a measure of the amount of uncertainty in the set S: H(S) = -Σ p(x) log2 p(x), summed over the classes x in X. Where: S is the current set for which entropy is being calculated; X is the set of classes in S; p(x) is the proportion of the number of elements in class x to the number of elements in set S. When H(S) = 0, the set S is perfectly classified.
Example
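The entropy measure can be computed directly from a list of class labels; the sit/stand label sets below are invented for illustration:

```python
import math

def entropy(labels):
    """H(S) = -sum over classes x of p(x) * log2 p(x)."""
    n = len(labels)
    counts = {}
    for label in labels:
        counts[label] = counts.get(label, 0) + 1
    # The trailing +0.0 normalizes the -0.0 that arises for a single class.
    return -sum((c / n) * math.log2(c / n) for c in counts.values()) + 0.0

print(entropy(["sit", "sit", "stand", "stand"]))  # 1.0 (maximal uncertainty, two classes)
print(entropy(["sit", "sit", "sit", "sit"]))      # 0.0 (perfectly classified)
```

An even two-class split gives the maximal 1 bit of uncertainty, while a pure set gives 0, matching the "perfectly classified" condition on the previous slide.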
Splitting criterion - Information gain IG(A) is the measure of the difference in entropy from before to after the set S is split on the attribute A: IG(A) = H(S) - Σ p(t) H(t), summed over the subsets t in T. Where: H(S) is the entropy of set S; T is the set of subsets created from splitting S by attribute A, such that S is the union of the subsets in T; p(t) is the proportion of the number of elements in t to the number of elements in set S; H(t) is the entropy of the subset t.
Splitting criterion - Gini coefficient The Gini coefficient measures the inequality among values of a frequency distribution. A Gini coefficient of zero expresses perfect equality, where all values are the same.
C4.5 Algorithm The algorithm iterates through every unused attribute of the set S, calculates the entropy H(S) or the information gain IG(A) for each attribute, and selects the attribute with the smallest entropy or largest information gain. After the tree is built, it is pruned.
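A minimal sketch of the greedy attribute choice using plain information gain, as described above (full C4.5 normalizes this into a gain ratio); the toy features and labels are invented:

```python
import math

def entropy(labels):
    """H(S) = -sum over classes of p * log2 p; +0.0 normalizes -0.0."""
    n = len(labels)
    counts = {}
    for label in labels:
        counts[label] = counts.get(label, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values()) + 0.0

def information_gain(rows, labels, attr):
    """IG(A) = H(S) - sum over subsets t of p(t) * H(t)."""
    n = len(labels)
    subsets = {}
    for row, label in zip(rows, labels):
        subsets.setdefault(row[attr], []).append(label)
    remainder = sum(len(s) / n * entropy(s) for s in subsets.values())
    return entropy(labels) - remainder

def best_attribute(rows, labels, attrs):
    """Greedy choice: split on the attribute with the largest information gain."""
    return max(attrs, key=lambda a: information_gain(rows, labels, a))

# Invented toy dataset: motion level separates the classes, sensor site does not.
rows = [{"motion": "high", "site": "back"},
        {"motion": "high", "site": "chest"},
        {"motion": "low", "site": "back"},
        {"motion": "low", "site": "chest"}]
labels = ["dynamic", "dynamic", "static", "static"]
print(best_attribute(rows, labels, ["motion", "site"]))  # motion
```

Here "motion" yields pure subsets (gain 1 bit) while "site" leaves both classes mixed (gain 0), so the greedy step picks "motion".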
The c’BAN Wireless communication device Wireless sensor platform Remote computational node
Data Collection protocol Five Labrador Retrievers Four different sensor sites: rump chest abdomen back
Description of the research

Type of activity | Repetitions | Behaviors | Comments
static | 5 | sitting, standing, lying down, eating off ground, standing on two legs | The dogs returned to a standing position between repetitions
dynamic | 3 | walk up the stairs, walk across a platform to the ramp, walk down the ramp | The dogs walked back to the starting position between repetitions
The findings A representative sample of acceleration data from the x-axis of the accelerometer at each of the four locations.
The classification flow
Stage 1 1. HMMs, the Viterbi algorithm and the Baum-Welch algorithm were used to identify each of the dynamic activities.
Stage 2 2. Decides whether the activity is dynamic or not.
Stages 3, 4 and 5 3. A moving average filter smooths the signal. 4. A decision tree was used to distinguish between transitions and postures. 5. Decides whether the activity is a posture or a transition.
Stages 6 and 7 6. A decision tree was used to classify the specific postures. 7. Finally, the algorithm finds the specific posture.
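The seven stages can be sketched as one dispatch function; the three callables below stand in for the trained HMMs and decision trees described above, and their names (and the width-3 smoothing window) are hypothetical:

```python
def classify_activity(window, hmm_dynamic_label, posture_or_transition, specific_posture):
    """Sketch of the 7-stage flow; the callables stand in for the trained models."""
    # Stages 1-2: trained HMMs (Viterbi / Baum-Welch) score the window;
    # a returned label means a dynamic activity was recognized.
    label = hmm_dynamic_label(window)
    if label is not None:
        return label
    # Stage 3: smooth the static signal with a moving average filter (width 3).
    smoothed = [sum(window[max(0, i - 2):i + 1]) / (i + 1 - max(0, i - 2))
                for i in range(len(window))]
    # Stages 4-5: a decision tree separates transitions from postures.
    if posture_or_transition(smoothed) == "transition":
        return "transition"
    # Stages 6-7: a second decision tree names the specific posture.
    return specific_posture(smoothed)

# Stub classifiers standing in for the trained models, for illustration only.
print(classify_activity([0.1] * 5,
                        lambda w: None,       # HMMs: not a dynamic activity
                        lambda w: "posture",  # tree 1: not a transition
                        lambda w: "sitting")) # tree 2 names the posture
```

The branch structure mirrors the slides: dynamic activities short-circuit at stage 2, and only static windows reach the smoothing and decision-tree stages.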
Conclusions
THANK YOU!