Overview and Mathematics Bjoern Griesbach


Sensor Fusion Systems
Overview and Mathematics
Bjoern Griesbach (griesbac@in.tum.de)

Sensor Fusion Systems [Content]
Introduction
  Motivation
  Different tracking options
Existing Multi Sensor Fusion Systems
  Fusion of head-mounted and fixed sensor data
  Fusion of magnetic & optical sensor data
  Fusion of gyroscope & optical sensor data
  Open Tracker, an open-source AR software framework
Mathematics of Sensor Fusion
  Kalman Filter
  Particle Filter

Motivation
Each sensor has its strengths and weaknesses, so one sensor alone is never sufficient for reliable tracking. Optimal tracking therefore uses multiple sensors.
[Diagram: several sensors each deliver noisy data to a data fusion component, which produces a precise estimation.]

Motivation
Multi sensor fusion is used in various fields of research: Augmented Reality, Virtual Reality, mobile robots, air traffic control.

Different Tracking Options

By technology:
  Magnetic tracking: reliable, stable, fast
  Optical tracking: precise, but time-consuming
  Gyroscope: suffers from a drift error

By location:
  Fixed trackers: no limit on size or weight; highly precise in tracking objects (e.g. stereo vision); bad for head orientation
  Mobile trackers (i.e. head-mounted trackers): good for head orientation; limited in size & weight; less precise in tracking objects

Fusion of Data from Head-Mounted and Fixed Sensors
Two optical trackers: one mobile (head-mounted), one fixed.
Wanted: fuse the data of the fixed and mobile trackers, a hybrid inside-out & outside-in approach, in order to estimate the pose of a certain object (for example a head's pose).
How to realize this? With a simple Kalman-style weighting!

Fusion of Data from Head-Mounted and Fixed Sensors
The pose of an object is represented as a vector x = (x, y, z, α, β, γ).
By transforming the poses from the two different sensors into the same coordinate system, one gets two (noisy) measurements z1 and z2 for the same object.
Each measurement is weighted differently; the weight depends on the variance of the measurement. Each measurement has its own variance σi², represented by a covariance matrix Pi.
(z and x are random variables.)

Fusion of Data from Head-Mounted and Fixed Sensors
Given: pose measurements z1, z2 with covariance matrices P1, P2 from two different sensors.
Wanted: the optimal weights for an optimal estimate.
Solution: the optimal estimate x with minimal combined covariance matrix P. In the experiment, the inverse-covariance weighting was used:
  P = (P1⁻¹ + P2⁻¹)⁻¹
  x = P (P1⁻¹ z1 + P2⁻¹ z2)
Example: if z1 has a smaller variance than z2, it is more accurate and therefore receives the larger weight!
This is already a simple form of the Kalman Filter. Remember it!
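
The weighting above can be sketched in a few lines of Python. `fuse` is a hypothetical helper and the numbers are made up; the inverse-covariance weighting shown is the standard form of the minimal-covariance estimate:

```python
import numpy as np

def fuse(z1, P1, z2, P2):
    """Fuse two noisy measurements of the same quantity.

    The weights are the inverse covariances: the more certain
    measurement receives the larger weight.
    """
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P = np.linalg.inv(P1_inv + P2_inv)      # combined covariance
    x = P @ (P1_inv @ z1 + P2_inv @ z2)     # optimal estimate
    return x, P

# two 2D position measurements; z1 is four times more certain per axis
z1 = np.array([1.0, 2.0]); P1 = np.eye(2) * 0.25
z2 = np.array([2.0, 3.0]); P2 = np.eye(2) * 1.0
x, P = fuse(z1, P1, z2, P2)
print(x)   # [1.2 2.2], closer to the more certain z1
```

Note that the combined covariance P is smaller than either P1 or P2: fusing two noisy sensors always improves on the better one.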

Fusion of Data from Head-Mounted and Fixed Sensors
Result: the maximum translational error was reduced by 90%.
Experiment by W. Hoff, First International Workshop on AR, San Francisco.

Fusion of Magnetic and Optical Trackers
Example: the Studierstube in Vienna, an HMD with a magnetic tracker and a stereo camera system that estimates pose by landmark tracking.
The pose estimate from the magnetic tracker is used to predict feature locations in the image, so the optical tracking system can work with small search areas.
Result: the entire system is more precise than a magnetic tracker alone, and faster and more reliable than an optical tracker alone.

Fusion of Magnetic and Optical Trackers
Landmark Predictor (LP): keeps track of potentially detectable landmarks and sorts them; improves the head pose after each newly found landmark; tells the Image Analyzer where to search for landmarks.
Image Analyzer (IA): inspects the search area defined by the LP.

Fusion of Gyroscope and Optical Tracker Data
Situation: a gyroscope (head-mounted) and an optical tracker (head-mounted).
Problem to solve: the gyroscope delivers highly precise head-orientation data, but with a drift error.
Solution: a vision-based drift compensation algorithm.

Open Tracker: An Open Software Framework for AR
Most implementations of AR systems were not portable solutions, because the data flow was implemented specifically for each system. Hence there is a need for a standard that handles data flow in AR systems: Open Tracker.
Open source framework; configurable via XML; object-oriented; uses a directed graph to describe the data flow; eases setting up an AR environment (e.g. distributed setups, several trackers, etc.).
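
A configuration in the spirit described above might look as follows. The element names are purely illustrative and do not reproduce the actual Open Tracker XML schema; the point is that the XML nesting expresses the directed dataflow graph (sources feed filters, filters feed sinks):

```xml
<!-- hypothetical dataflow: two source nodes, a fusion filter, one sink -->
<OpenTracker>
  <ConsoleSink name="fused pose">
    <FusionFilter>
      <OpticalSource mode="mobile"/>
      <MagneticSource/>
    </FusionFilter>
  </ConsoleSink>
</OpenTracker>
```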

Open Tracker: Example
[Dataflow graph: source nodes (Optical Tracker [Mobile], Optical Tracker [Fixed], Magnetic Tracker [Mobile]) feed filter nodes (Noise Filter, Transformation Filter, Fusion Filter), whose output goes to sink nodes: a Console, and a Multicast node that sends the information to multiple users on a network.]

Content
Mathematics of Sensor Fusion: Kalman Filter; Particle Filter.
Not only the Kalman and particle filter exist, but other tools as well; the Kalman and particle filter are not primarily meant for sensor fusion, but they have extensions for it.

Kalman Filter
Step-by-step introduction: static KF, basic KF, Extended KF, sensor fusion with the KF.
The first steps have nothing to do with sensor fusion, but they are necessary to understand the Kalman filter.

Kalman Filter
An optimal data processing algorithm. Its major use is to filter the noise out of measurement data (but it can also be applied to other fields, e.g. sensor fusion).
Result: it computes an optimal estimate of the state of an observed system, based on measurements.
It is recursive and optimal: it incorporates all the information (i.e. measurement data) that can be provided to it, yet it does not need to keep all previous measurement data in storage!

Kalman Filter
Conventions. We observe a system with:
  x : state of the system
  z : measurement (approximates x)
  σ² : variance of a measurement
  x⃗ : a vector quantity
  P : covariance matrix
  x̂ : best estimate of the state
  x̂⁻ : best estimate before the measurement was taken

Introduction Kalman Filter
Assumptions: two scalar sensor measurements z1 and z2 with Gaussian noise, i.e. the noise of zi is distributed as N(0, σi²).
Optimal state estimate: x̂ = w·z1 + (1 − w)·z2, with variance σ² = w²σ1² + (1 − w)²σ2².
Example: the 1D location of an object (e.g. a tree at the horizon seen through binoculars, or the distance to a house).
Derivation (on the flipchart): because of the Gaussian noise, find the optimal w by setting the derivative of σ²(w) to zero (a minimum), which gives w = σ2² / (σ1² + σ2²).
Transforming the equations above gives x̂ = z2 + w·(z1 − z2) and σ² = (1 − w)·σ2².
Compare this result to the fusion of fixed and mobile sensors: flip the indices 1 and 2 and set K = w. Voilà: the static Kalman filter, with w the Kalman gain.
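
The scalar derivation can be checked numerically. This sketch uses made-up standard deviations and verifies that the fused variance is smaller than either sensor's variance:

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = 100.0                        # true 1D location (e.g. distance to a house)
s1, s2 = 2.0, 4.0                     # measurement standard deviations
z1 = x_true + rng.normal(0, s1)       # noisy measurement of sensor 1
z2 = x_true + rng.normal(0, s2)       # noisy measurement of sensor 2

w = s2**2 / (s1**2 + s2**2)           # optimal weight from d(sigma^2)/dw = 0
x_hat = w * z1 + (1 - w) * z2         # fused estimate
var = (1 - w) * s2**2                 # fused variance

assert var < s1**2 and var < s2**2    # smaller than either sensor alone
```

Here w = 16/20 = 0.8: the low-noise sensor z1 gets the larger weight, exactly as in the pose-fusion experiment.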

Introduction Kalman Filter
Let's incorporate time! The measurements z1 and z2 were taken sequentially: z(t1), z(t2).
Optimal state estimate at time t2: x̂(t2) = z(t1) + K·[z(t2) − z(t1)], where K is the Kalman gain.
What happens if we take another measurement z(t3)?

Introduction Kalman Filter
Let's take more measurements as time continues, and incorporate the previous knowledge (the last estimate).
Optimal state estimate at time tk: x̂(tk) = x̂(tk−1) + K·[z(tk) − x̂(tk−1)].
New view of the equation: new estimate = old estimate + Kalman gain times the difference between the new measurement and the old estimate.
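
The recursive form above amounts to a running, gain-weighted average. A minimal sketch (the function name and numbers are illustrative); for equal-variance measurements it reduces exactly to the arithmetic mean:

```python
def static_kalman(measurements, sigma_z2):
    """Recursively estimate a constant from noisy measurements."""
    x_hat = measurements[0]
    var = sigma_z2                       # variance of the first estimate
    for z in measurements[1:]:
        K = var / (var + sigma_z2)       # Kalman gain
        x_hat = x_hat + K * (z - x_hat)  # old estimate + K * innovation
        var = (1 - K) * var              # uncertainty shrinks with each z
    return x_hat, var

x_hat, var = static_kalman([9.8, 10.2, 10.1, 9.9], sigma_z2=1.0)
print(x_hat, var)   # 10.0 0.25 (mean of the data; variance 1/4 after 4 samples)
```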

Introduction Kalman Filter
Static Kalman Filter (summary):
  K(tk) = σ̂²(tk−1) / (σ̂²(tk−1) + σz²)
  x̂(tk) = x̂(tk−1) + K(tk)·[z(tk) − x̂(tk−1)]
  σ̂²(tk) = (1 − K(tk))·σ̂²(tk−1)

Introduction Kalman Filter
Previous slides: a static state. Now: a dynamic state x (a stochastic process).
Idea: use knowledge about the process x, in addition to the measurements, to obtain the best estimate. (Example: a car instead of a tree.)
Process model (state transition equation, example): x(tk) = x(tk−1) + u + w, with noise w ~ N(0, σw²).
Measurement model: z(tk) = x(tk) + v, with noise v ~ N(0, σz²).
Note: there is no hat on x, because this is the true state, not an estimate. u is a known input, for example 100 m per time step.

Introduction Kalman Filter
Process & measurement model:
  Measurements (observed): z(tk−1), z(tk), z(tk+1), related to the states by the measurement model (measurement equation).
  States of the system (cannot be observed): x(tk−1), x(tk), x(tk+1), related to each other by the process model (state transition equation).
The Kalman Filter evolves into a two-step algorithm: 1. Predict via the process model. 2. Correct via the measurement model.

Introduction Kalman Filter
Kalman Filter algorithm (simplified). Start with initial values x̂(t0), σ̂²(t0), then loop:
1. Predict with the process model (the measurement is not yet taken; the superminus marks this prior estimate):
  x̂⁻(tk) = x̂(tk−1) + u
  σ̂⁻²(tk) = σ̂²(tk−1) + σw²
2. Correct with the measurement:
  K = σ̂⁻²(tk) / (σ̂⁻²(tk) + σz²)   (Kalman gain)
  x̂(tk) = x̂⁻(tk) + K·[z(tk) − x̂⁻(tk)]
  σ̂²(tk) = (1 − K)·σ̂⁻²(tk)
Then set tk := tk+1 and repeat.
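
The simplified scalar predict/correct loop can be written out directly. All numbers here are made up (a hypothetical object moving about 100 m per step, observed by a noisy ranging sensor):

```python
def kalman_1d(zs, u, sigma_w2, sigma_z2, x0, var0):
    """Scalar KF for x(tk) = x(tk-1) + u + w, z(tk) = x(tk) + v."""
    x_hat, var = x0, var0
    estimates = []
    for z in zs:
        # 1. Predict (superminus: the estimate before the measurement)
        x_minus = x_hat + u
        var_minus = var + sigma_w2
        # 2. Correct with the measurement
        K = var_minus / (var_minus + sigma_z2)   # Kalman gain
        x_hat = x_minus + K * (z - x_minus)
        var = (1 - K) * var_minus
        estimates.append(x_hat)
    return estimates

est = kalman_1d(zs=[105.0, 198.0, 303.0], u=100.0,
                sigma_w2=1.0, sigma_z2=25.0, x0=0.0, var0=100.0)
```

After each step the estimate stays close to the true positions (about 100, 200, 300), even though both the initial guess and the measurements are off.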

Kalman Filter: Possible Extensions
Extending to the vector world: the previous scalar state x becomes a vector x⃗ that contains all relevant information about the state of a certain system, for example the state vector x⃗ = (x, y, z, α, β, γ).
Process model: x⃗(tk) = A·x⃗(tk−1) + B·u⃗(tk−1) + w⃗, where A is the state transition matrix and B the input transformation matrix (u⃗ can be of a different dimension than x⃗). The scalar variance becomes a covariance matrix.
The matrices may be time-dependent, e.g. A(t), B(t), and variables may also be omitted.
Using a non-linear process model leads to the Extended Kalman Filter (EKF).

Basic Kalman Filter
Process model: x(tk) = A·x(tk−1) + B·u(tk−1) + w, with w ~ N(0, Q).
Measurement model: z(tk) = H·x(tk) + v, with v ~ N(0, R).
Algorithm (vector arrows omitted!):
1. Predict:
  x̂⁻(tk) = A·x̂(tk−1) + B·u(tk−1)
  P⁻(tk) = A·P(tk−1)·Aᵀ + Q
2. Correct:
  K = P⁻(tk)·Hᵀ·(H·P⁻(tk)·Hᵀ + R)⁻¹   (Kalman gain)
  x̂(tk) = x̂⁻(tk) + K·[z(tk) − H·x̂⁻(tk)]
  P(tk) = (I − K·H)·P⁻(tk)
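
A minimal matrix version of these predict/correct equations, assuming a standard constant-velocity model with position-only measurements (all matrices and numbers are illustrative):

```python
import numpy as np

def kf_step(x, P, z, A, B, u, H, Q, R):
    """One predict/correct cycle of the basic (linear) Kalman filter."""
    # 1. Predict via the process model
    x_minus = A @ x + B @ u
    P_minus = A @ P @ A.T + Q
    # 2. Correct via the measurement model
    K = P_minus @ H.T @ np.linalg.inv(H @ P_minus @ H.T + R)  # Kalman gain
    x_new = x_minus + K @ (z - H @ x_minus)
    P_new = (np.eye(len(x)) - K @ H) @ P_minus
    return x_new, P_new

# constant-velocity model: state = (position, velocity), position measured
dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.zeros((2, 1)); u = np.zeros(1)       # no control input
H = np.array([[1.0, 0.0]])
Q = np.eye(2) * 0.01
R = np.array([[1.0]])

x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0
for z in [1.0, 2.1, 2.9, 4.2]:
    x, P = kf_step(x, P, np.array([z]), A, B, u, H, Q, R)
```

After four measurements the filter tracks both the position (near 4) and the never-measured velocity (near 1), because the process model couples them.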

Basic Kalman Filter [abstract]
A process model and a measurement model; the algorithm predicts via the process model and corrects via the measurement model. (Every lower-case letter is implicitly a vector!)
This is the idea of the Kalman Filter.

How to Use a Kalman Filter
Find a state representation, a process model, and a measurement model.
There are many ways to apply a Kalman Filter; it depends on the chosen models! So how do we apply the KF to sensor fusion?

Kalman Filter: Sensor Fusion
Examples:
  Static KF: as seen before (not a real KF!).
  Basic KF: the measurement vector incorporates the data of all sensors; the covariance matrix R weights the data of the different sensors according to their strengths.
  "Advanced" KF of G. Welch / G. Bishop: an asynchronous algorithm that uses multiple measurement models.

Kalman Filter: Sensor Fusion [with the Basic Kalman Filter]
Process model and measurement model as before, but the measurement vector incorporates all measurements, e.g. z = (z1, z2, ..., zn)ᵀ, and the covariance matrix R reflects the variances of the different sensors. Then the "normal" Basic KF algorithm is used.
At each time step, the data of all the sensors has to be available. What if this is not the case?
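
Stacking the measurements and using a block-diagonal R can be illustrated with a single weighted least-squares step, which is what the KF correction reduces to when the prior is uninformative (the sensors, H blocks, and noise levels are made up):

```python
import numpy as np

# Two sensors measure the same 2D position; stack them into one
# measurement vector and let a block-diagonal R encode per-sensor trust.
H1 = np.eye(2)                 # sensor 1 observes the full position
H2 = np.eye(2)                 # sensor 2 as well
H = np.vstack([H1, H2])        # stacked measurement matrix (4 x 2)

R1 = np.eye(2) * 0.25          # sensor 1: low noise -> large weight
R2 = np.eye(2) * 4.0           # sensor 2: high noise -> small weight
R = np.block([[R1, np.zeros((2, 2))],
              [np.zeros((2, 2)), R2]])

z = np.concatenate([np.array([1.0, 2.0]),    # z1
                    np.array([3.0, 4.0])])   # z2

# weighted least-squares estimate: (H^T R^-1 H)^-1 H^T R^-1 z
R_inv = np.linalg.inv(R)
x_hat = np.linalg.solve(H.T @ R_inv @ H, H.T @ R_inv @ z)
```

The estimate lands much closer to z1 than to z2, because R gives sensor 1 sixteen times the weight per axis.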

Kalman Filter: Sensor Fusion [Advanced approach, Welch/Bishop]
Process model: no discrete time steps; the state transition via the matrix A relates the state variables over the elapsed time, for example position and velocity.
System noise: w, with covariance matrix Q.

Kalman Filter: Sensor Fusion [Advanced approach, Welch/Bishop]
Individual measurement model for sensor i: a measurement function hi(·) with corresponding Jacobian Hi.
Measurement noise: v, with covariance matrix R.
(How does h(·) work, what are b(·) and c(·), and what are the corresponding Jacobians?)

Kalman Filter: Sensor Fusion [Advanced approach, Welch/Bishop]
An asynchronous algorithm: each time a new measurement z becomes available from any sensor, a new estimate x̂ is computed.
[Diagram: Sensor 1, Sensor 2 and Sensor 3 each feed the Kalman fusion filter independently.]

Kalman Filter: Sensor Fusion [Advanced approach, Welch/Bishop]
Algorithm:
1. Predict via the process model.
2. Correct with the measurement of sensor i, using the predicted measurement ẑi = hi(x̂⁻) and the corresponding Jacobian Hi in the Kalman gain.
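
A sketch of the asynchronous idea: whichever sensor delivers next triggers one predict/correct cycle. For simplicity the hi are linear here, standing in for the nonlinear measurement functions and their Jacobians Hi of the Welch/Bishop approach (all models and numbers are illustrative):

```python
import numpy as np

def predict(x, P, A, Q):
    """Predict via the process model."""
    return A @ x, A @ P @ A.T + Q

def correct(x, P, z, H, R):
    """Correct with whichever single sensor just delivered z."""
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    return x + K @ (z - H @ x), (np.eye(len(x)) - K @ H) @ P

# near-static 2D state; sensor 1 observes the x coordinate, sensor 2
# the y coordinate, and their measurements arrive at different times
A, Q = np.eye(2), np.eye(2) * 1e-4
x, P = np.zeros(2), np.eye(2) * 100.0

arrivals = [(np.array([[1.0, 0.0]]), np.array([[0.5]]), np.array([2.0])),  # sensor 1
            (np.array([[0.0, 1.0]]), np.array([[0.5]]), np.array([5.0])),  # sensor 2
            (np.array([[1.0, 0.0]]), np.array([[0.5]]), np.array([2.2]))]  # sensor 1 again
for H, R, z in arrivals:
    x, P = predict(x, P, A, Q)
    x, P = correct(x, P, z, H, R)
```

Each sensor refines only the part of the state it observes, so no time step ever has to wait for all sensors at once.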

Particle Filters
Handle non-linear processes and non-Gaussian noise. Like the Kalman filter they are based on the Bayes filter and use process and measurement models, but with a different algorithm, which is slower.
Refer to: "Particle Filters: an Overview", M. Muehlich.
Extensions exist, e.g. decentralized sensor fusion with distributed particle filters.
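
A minimal bootstrap particle filter sketch, for a 1D random walk observed in Laplace (non-Gaussian) noise, where the KF assumptions do not hold (all models and numbers are made up):

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter(zs, n=2000):
    """Predict-weight-resample loop approximating the Bayes filter."""
    particles = rng.normal(0.0, 5.0, n)               # initial belief
    for z in zs:
        particles += rng.normal(0.0, 0.5, n)          # predict: process model
        w = np.exp(-np.abs(z - particles))            # weight: Laplace likelihood
        w /= w.sum()
        idx = rng.choice(n, size=n, p=w)              # resample by weight
        particles = particles[idx]
    return particles.mean()

est = particle_filter([1.0, 1.2, 0.9, 1.1])
```

The particle cloud represents the posterior directly, so any likelihood function can be plugged in, at the cost of simulating many particles per step.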

Conclusion: Multi Sensor Fusion
Sensor fusion is of increasing interest due to the rising tracking demands in AR.
Sensor fusion can be complex and therefore has greater computational requirements.
Future work: standardizing AR fusion systems, e.g. with Open Tracker.

Questions?