Processing Sequential Sensor Data
The “John Krumm perspective”
Thomas Plötz, November 29th, 2011
Sequential Data?
Sequential Data!
Sequential Data Analysis – Challenges
– Segmentation vs. classification: a “chicken and egg” problem
– Noise, noise, and noise … … more noise
– [Evaluation – “ground truth”?]
Noise
Filtering is technically trivial, but:
– it introduces lag
– it yields no higher-level variables (e.g., speed)
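A minimal sketch (not from the slides) illustrating the point: a simple exponential smoothing filter is technically trivial, but its output lags behind the true signal and provides no higher-level variables such as speed. The function name and parameter values are illustrative assumptions.

    def exponential_smooth(samples, alpha=0.2):
        """Exponentially smooth a noisy 1-D signal; smaller alpha = smoother but more lag."""
        smoothed = []
        estimate = None
        for z in samples:
            # blend the new measurement with the running estimate
            estimate = z if estimate is None else alpha * z + (1 - alpha) * estimate
            smoothed.append(estimate)
        return smoothed

    # A step in the true signal appears in the filtered output only gradually (lag):
    print(exponential_smooth([0, 0, 0, 10, 10, 10]))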
States vs. Direct Observations
Idea: assume an (internal) state of the “system”
Approach: infer this very state by exploiting measurements / observations
Examples:
– Kalman Filter
– Particle Filter
– Hidden Markov Models
Kalman Filter
State x_i and observations z_i, with explicit consideration of noise:
z_i = H_i x_i + v_i, where v_i is zero-mean Gaussian measurement noise
Kalman Filter – Linear Dynamics
State at time i: a linear function of the state at time i-1, plus noise:
x_i = Φ_i x_{i-1} + w_i, where w_i is zero-mean Gaussian process noise
The system matrix Φ_i describes the linear relationship between time i and i-1.
Kalman Filter – Parameters
– system matrix Φ_i (dynamics)
– measurement matrix H_i
– process noise covariance Q_i and measurement noise covariance R_i
– initial state estimate and its covariance
Kalman Filter – Two-Step Procedure
For every measurement z_i; result: mean and covariance of x_i
Step 1: extrapolate the state and the state error covariance from the previous estimates
Step 2: update the extrapolations with the new measurement
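A hedged sketch of the two-step recursion, using the standard notation reconstructed above (Phi = system matrix, H = measurement matrix, Q and R = process and measurement noise covariances); it is an illustration, not the lecture's implementation.

    import numpy as np

    def kalman_step(x_prev, P_prev, z, Phi, H, Q, R):
        # Step 1: extrapolate state and state error covariance from previous estimates
        x_pred = Phi @ x_prev
        P_pred = Phi @ P_prev @ Phi.T + Q
        # Step 2: update the extrapolations with the new measurement z
        S = H @ P_pred @ H.T + R                        # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)             # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)           # posterior mean of x_i
        P_new = (np.eye(len(x_prev)) - K @ H) @ P_pred  # posterior covariance of x_i
        return x_new, P_new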
Generalization: Particle Filter
No linearity assumption, no Gaussian noise
Sequence of unknown state vectors x_i and measurement vectors z_i
Probabilistic model for the measurements, e.g. (!): p(z_i | x_i)
… and for the dynamics: p(x_i | x_{i-1})
The particle filter samples from the dynamics model, i.e., it generates x_i according to p(x_i | x_{i-1})
Particle Filter: Dynamics
Prediction of the next state:
p(x_i | z_1, …, z_{i-1}) = ∫ p(x_i | x_{i-1}) p(x_{i-1} | z_1, …, z_{i-1}) dx_{i-1}
Particle Filter – Algorithm (sketched below)
– Prediction: generate random x_i from p(x_i | x_{i-1})
– Importance sampling: compute importance weights
– Selection: sample a new set of particles based on the importance weights – filtering
– Original goal: compute an estimate of x_i at any point
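The sketch referenced above: one iteration of a basic bootstrap particle filter for a 1-D state. The Gaussian choices for p(x_i | x_{i-1}) and p(z_i | x_i) and all names are illustrative assumptions, not the models used in the lecture.

    import numpy as np

    def particle_filter_step(particles, z, dyn_std=1.0, meas_std=1.0):
        rng = np.random.default_rng()
        # 1) Prediction: generate random x_i from p(x_i | x_{i-1}) by propagating each particle
        particles = particles + rng.normal(0.0, dyn_std, size=particles.shape)
        # 2) Importance sampling: weight each particle by the measurement likelihood p(z_i | x_i)
        weights = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
        weights /= weights.sum()
        # 3) Selection: resample a new particle set according to the importance weights
        particles = rng.choice(particles, size=len(particles), p=weights)
        # 4) Estimate of x_i at any point, e.g. the particle mean
        return particles, particles.mean()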
Hidden Markov Models
– Kalman Filter: not very accurate
– Particle Filter: computationally demanding
– HMMs: somewhere in between
HMMs
Measurement model: conditional probability p(z_i | x_i)
Dynamic model: limited memory (Markov assumption); transition probabilities p(x_i | x_{i-1})
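A hedged sketch of the HMM forward algorithm for a discrete-state, discrete-observation model, to make the two model components concrete; the matrices A (transition probabilities), B (measurement model) and pi (initial distribution) are illustrative assumptions.

    import numpy as np

    def forward(observations, pi, A, B):
        """Return p(z_1..z_T) under the HMM, summing over all hidden state sequences."""
        alpha = pi * B[:, observations[0]]       # initialise with the first observation
        for z in observations[1:]:
            alpha = (alpha @ A) * B[:, z]        # propagate (limited memory), weight by likelihood
        return alpha.sum()

    # Example with two hidden states and two observation symbols
    pi = np.array([0.6, 0.4])
    A  = np.array([[0.7, 0.3], [0.4, 0.6]])
    B  = np.array([[0.9, 0.1], [0.2, 0.8]])
    print(forward([0, 1, 0], pi, A, B))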
HMMs – a more classical application