Tracking Unknown Dynamics - Combined State and Parameter Estimation. Presenters: Hongwei Long and Nikki Hu.


Tracking Unknown Dynamics - Combined State and Parameter Estimation. Presenters: Hongwei Long and Nikki Hu. Joint work with M. A. Kouritzin, University of Alberta. Supported by NSERC, MITACS, PIMS, Lockheed Martin Naval Electronics and Surveillance Systems, Lockheed Martin Canada, and APR Inc.

Outline
1. Introduction
2. Algorithm for Combined State and Parameter Estimation
3. Simulation Results
4. Convergence of the Algorithm

1. Introduction
– The signal contains unknown parameters.
– Parameter estimation is difficult for nonlinear, partially observed stochastic systems.
– Some typical methods:
  – Least squares methods
  – Methods of moments
  – Maximum likelihood methods
  – Filtering methods

– We develop a novel algorithm: a recursive particle prediction-error identification method.
– A branching particle filter method is used for state estimation.
– Wide applications: financial markets, signal processing, communication networks, target detection and tracking, pollution tracking, search and rescue.

2. Algorithm for Combined State and Parameter Estimation
Particle prediction-error identification method:
– Signal model (1): an n-dimensional process
– Observation model: a sequence of k-dimensional random vectors

Notation:
– the unknown parameter vector, taking values in a compact subset D of Euclidean space
– the state noise and the observation noise; the observation noise is a Gaussian random variable with mean zero and given variance
– the actual observation data
– the “true” value of the parameter vector
We define a least-squares criterion and seek the best estimator of the parameter in the least-squares sense.
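The least-squares criterion described above can be sketched in code. This is a minimal, hypothetical illustration (the function `prediction_error_cost`, the toy AR(1) predictor, and the data are assumptions for illustration, not the presenters' model): the criterion averages squared one-step prediction errors, and the best estimator minimizes it over the compact parameter set D.

```python
import numpy as np

def prediction_error_cost(theta, ys, predict):
    """Empirical least-squares criterion: the mean squared one-step
    prediction error of the model with parameter theta, over the
    observation record ys."""
    errors = [ys[t] - predict(theta, ys[t - 1]) for t in range(1, len(ys))]
    return float(np.mean(np.square(errors)))

# Toy usage: a noise-free AR(1) record generated with parameter 0.5.
# The criterion vanishes at the true parameter and is positive elsewhere.
ys = np.array([1.0, 0.5, 0.25, 0.125])
ar1_predict = lambda th, y_prev: th * y_prev
cost_true = prediction_error_cost(0.5, ys, ar1_predict)  # 0.0 at the true value
cost_off = prediction_error_cost(0.9, ys, ar1_predict)   # positive away from it
```

Minimizing this criterion over D gives the least-squares parameter estimate; the particle method described next makes the predictor computable when the state is only partially observed.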

Algorithm:
(i) Initialization: N particles and an initial guess of the parameter.
(ii) Evolution: each particle evolves according to the law of the signal process.

(iii) Parameter estimation: the parameter update is driven by the prediction errors, with the required conditional quantities approximated by averages over the particle system.

(iv) Selection: calculate a particle weight and compare it with a uniform random variable to determine whether the particle should (1) be branched into two or more particles, (2) stay in its current state, or (3) be removed.
Steps (ii)-(iv) are repeated at each observation time.
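Steps (i)-(iv) can be sketched as a toy implementation. Everything concrete below is an assumption for illustration only: a scalar linear signal with a noisy direct observation of the state, a crude LMS-style stochastic-gradient parameter update standing in for the full prediction-error recursion, and a simple residual-branching rule for step (iv). It is not the presenters' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar model (illustration only):
#   signal      X_{t+1} = theta * X_t + state noise
#   observation Y_t     = X_t + observation noise
def evolve(x, theta):
    return theta * x + 0.1 * rng.standard_normal(np.shape(x))

def simulate(theta_true, T):
    x, ys = 0.5, []
    for _ in range(T):
        x = theta_true * x + 0.1 * rng.standard_normal()
        ys.append(x + 0.2 * rng.standard_normal())
    return np.array(ys)

def branching_particle_estimate(ys, n_particles=200, theta0=0.5, step=0.05):
    # (i) Initialization: N particles and an initial parameter guess.
    particles = 0.5 + 0.1 * rng.standard_normal(n_particles)
    theta = theta0
    for y in ys:
        # (ii) Evolution under the law of the signal with the current theta.
        prev_mean = particles.mean()
        particles = evolve(particles, theta)
        # (iii) Parameter update: gradient step on the squared prediction
        # error; prev_mean approximates d(y_pred)/d(theta).
        y_pred = particles.mean()
        theta += step * (y - y_pred) * prev_mean
        theta = float(np.clip(theta, -2.0, 2.0))  # keep theta in a compact set
        # (iv) Selection: mean-one weights turned into random offspring
        # counts (floor plus a Bernoulli draw on the fractional part), so a
        # particle branches, survives, or is removed.
        w = np.exp(-0.5 * ((y - particles) / 0.2) ** 2)
        w = np.maximum(w, 1e-12) / max(w.mean(), 1e-12)
        offspring = np.floor(w).astype(int)
        offspring += (rng.uniform(size=w.size) < (w - np.floor(w))).astype(int)
        particles = np.repeat(particles, offspring)
        if particles.size == 0:  # guard against extinction
            particles = 0.5 + 0.1 * rng.standard_normal(n_particles)
    return theta, particles
```

The branching rule keeps the expected particle count constant (the weights are normalized to mean one), which mirrors the compare-with-a-uniform-random-variable selection described in step (iv).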

3. Simulation Results
We consider a signal model with an unknown parameter vector.

The observation is a pixel-based measurement over a domain, involving the total number of pixels on the domain. In our simulation, n = 1, and the remaining parameter values are 0.2, 0.4, 0.1, and 10.

Simulation

4. Convergence of the Algorithm
Under certain regularity and stability conditions, the parameter estimates converge almost surely; the limit is the global minimum, and the convergence is associated with the asymptotic stability of an ordinary differential equation.

Proof ideas:
– Establish uniform moment estimates for the signal process and for its gradient with respect to the parameter.
– Use mixing conditions, the ergodic theorem, and the Gronwall lemma (discrete version) to prove the almost sure convergence.
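The ODE connection mentioned above can be illustrated in generic stochastic-approximation form (the symbols theta, J, and the step sizes gamma_t below are standard notation assumed for illustration, not taken from these slides). A recursive estimate driven by prediction errors,

```latex
\theta_{t+1} = \theta_t - \gamma_t \,\nabla_\theta \tfrac{1}{2}\,\varepsilon_t^2(\theta_t),
\qquad \sum_t \gamma_t = \infty, \quad \sum_t \gamma_t^2 < \infty,
```

tracks, in the limit, the trajectories of the associated mean ordinary differential equation

```latex
\dot{\bar\theta}(\tau) = -\nabla_\theta J\bigl(\bar\theta(\tau)\bigr),
```

where J is the limiting least-squares criterion. Almost sure convergence of the estimates then corresponds to asymptotic stability of the equilibrium of this ODE, which is why the slide ties the convergence result to the stability of an ordinary differential equation.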