École Doctorale des Sciences de l'Environnement d'Île-de-France, Année 2009-2010: Modélisation Numérique de l'Écoulement Atmosphérique et Assimilation d'Observations.

École Doctorale des Sciences de l'Environnement d'Île-de-France, Année Universitaire 2009-2010
Modélisation Numérique de l'Écoulement Atmosphérique et Assimilation d'Observations
Olivier Talagrand, Cours 6, 28 Mai 2010

Best Linear Unbiased Estimate

State vector $x$, belonging to state space $S$ (dim $S = n$), to be estimated. Available data in the form of:

• A 'background' estimate (e.g. forecast from the past), belonging to state space, with dimension $n$:
  $x^b = x + \varepsilon^b$
• An additional set of data (e.g. observations), belonging to observation space, with dimension $p$:
  $y = Hx + \varepsilon$

$H$ is a known linear observation operator.

Assume the probability distribution of the couple $(\varepsilon^b, \varepsilon)$ is known. Assume $E(\varepsilon^b) = 0$, $E(\varepsilon) = 0$, $E(\varepsilon^b \varepsilon^T) = 0$ (not restrictive). Set $E(\varepsilon^b (\varepsilon^b)^T) = P^b$ (also often denoted $B$) and $E(\varepsilon \varepsilon^T) = R$.

Best Linear Unbiased Estimate (continuation 1)

$x^b = x + \varepsilon^b$  (1)
$y = Hx + \varepsilon$  (2)

A probability distribution being known for the couple $(\varepsilon^b, \varepsilon)$, eqs (1-2) define a probability distribution for the couple $(x, y)$, with

$E(x) = x^b$, $\quad x' \equiv x - E(x) = -\varepsilon^b$
$E(y) = Hx^b$, $\quad y' \equiv y - E(y) = y - Hx^b = \varepsilon - H\varepsilon^b$

$d \equiv y - Hx^b$ is called the innovation vector.

Best Linear Unbiased Estimate (continuation 2)

Apply the formulæ for Optimal Interpolation:

$x^a = x^b + P^b H^T [H P^b H^T + R]^{-1} (y - Hx^b)$
$P^a = P^b - P^b H^T [H P^b H^T + R]^{-1} H P^b$

$x^a$ is the Best Linear Unbiased Estimate (BLUE) of $x$ from $x^b$ and $y$.

Equivalent set of formulæ:

$x^a = x^b + P^a H^T R^{-1} (y - Hx^b)$
$[P^a]^{-1} = [P^b]^{-1} + H^T R^{-1} H$

The matrix $K = P^b H^T [H P^b H^T + R]^{-1} = P^a H^T R^{-1}$ is the gain matrix.

If the probability distributions are globally Gaussian, the BLUE achieves Bayesian estimation, in the sense that $P(x \mid x^b, y) = \mathcal{N}[x^a, P^a]$.
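
As a minimal sketch, the update above translates directly into a few lines of linear algebra (Python/NumPy). The function name `blue_analysis` and the synthetic values of $x^b$, $P^b$, $H$ and $R$ are made up for illustration, not taken from the course.

```python
import numpy as np

def blue_analysis(xb, Pb, y, H, R):
    """BLUE / Optimal Interpolation update.

    xb : background state (n,), Pb : background error covariance (n, n)
    y  : observations (p,),     R  : observation error covariance (p, p)
    H  : linear observation operator (p, n)
    Returns the analysis xa and its error covariance Pa.
    """
    d = y - H @ xb                          # innovation vector d = y - H x^b
    S = H @ Pb @ H.T + R                    # innovation covariance
    K = Pb @ H.T @ np.linalg.inv(S)         # gain matrix K
    xa = xb + K @ d                         # x^a = x^b + K (y - H x^b)
    Pa = Pb - K @ H @ Pb                    # P^a = P^b - K H P^b
    return xa, Pa

# Tiny synthetic example (all numbers are illustrative)
n, p = 3, 2
rng = np.random.default_rng(0)
xb = np.array([1.0, 2.0, 3.0])
Pb = np.diag([1.0, 0.5, 2.0])
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])
R = 0.1 * np.eye(p)
y = H @ xb + rng.normal(scale=0.3, size=p)
xa, Pa = blue_analysis(xb, Pb, y, H, R)
```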

(Figure, after A. Lorenc.)

Best Linear Unbiased Estimate (continuation 3)

$H$ can be any linear operator.

Example: (scalar) satellite observation
$x = (x_1, \ldots, x_n)^T$ temperature profile
Observation $y = \sum_i h_i x_i + \varepsilon = Hx + \varepsilon$, with $H = (h_1, \ldots, h_n)$, $E(\varepsilon^2) = r$
Background $x^b = (x_1^b, \ldots, x_n^b)^T$, error covariance matrix $P^b = (p_{ij}^b)$

$x^a = x^b + P^b H^T [H P^b H^T + R]^{-1} (y - Hx^b)$

$[H P^b H^T + R]^{-1} (y - Hx^b) = \dfrac{y - \sum_k h_k x_k^b}{\sum_{ij} h_i h_j p_{ij}^b + r} \equiv \mu$  (a scalar!)

• $P^b = p^b I_n$:  $\quad x_i^a = x_i^b + p^b h_i \mu$
• $P^b = \mathrm{diag}(p_{ii}^b)$:  $\quad x_i^a = x_i^b + p_{ii}^b h_i \mu$
• General case:  $\quad x_i^a = x_i^b + \sum_j p_{ij}^b h_j \mu$

Each level $i$ is corrected, not only because of its own contribution to the observation, but also because of the contribution of the other levels to which its background error is correlated.
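
A short sketch comparing the three structures of $P^b$ for this scalar-observation example; the weighting function $h$, the variances and the profile values below are invented for illustration only.

```python
import numpy as np

# Scalar satellite observation y = sum_i h_i x_i + eps acting on a temperature profile.
n = 5
h = np.array([0.1, 0.2, 0.4, 0.2, 0.1])      # weighting function h_i, i.e. H = (h_1, ..., h_n)
r = 0.05                                      # observation error variance
xb = np.linspace(280.0, 250.0, n)             # background temperature profile (K)
x_true = xb + np.array([1.0, 1.5, 2.0, 1.5, 1.0])
y = h @ x_true + np.random.default_rng(1).normal(scale=np.sqrt(r))

def scalar_update(xb, Pb, h, r, y):
    mu = (y - h @ xb) / (h @ Pb @ h + r)      # the scalar factor mu
    return xb + Pb @ h * mu                    # x_i^a = x_i^b + sum_j p_ij^b h_j mu

# Three background covariance structures
Pb_iso  = 1.0 * np.eye(n)                      # P^b = p^b I_n
Pb_diag = np.diag([0.5, 1.0, 2.0, 1.0, 0.5])   # P^b = diag(p_ii^b)
corr = np.exp(-0.5 * (np.subtract.outer(range(n), range(n)) / 1.5) ** 2)  # Gaussian correlation
Pb_full = np.sqrt(np.outer(np.diag(Pb_diag), np.diag(Pb_diag))) * corr    # correlated errors

for Pb in (Pb_iso, Pb_diag, Pb_full):
    print(scalar_update(xb, Pb, h, r, y))      # every level is corrected in the general case
```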

Best Linear Unbiased Estimate (continuation 4)

Variational form of the BLUE

The BLUE $x^a$ minimizes the following scalar objective function, defined on state space $S$:

$\xi \in S \;\mapsto\; J(\xi) = \frac{1}{2}(x^b - \xi)^T [P^b]^{-1} (x^b - \xi) + \frac{1}{2}(y - H\xi)^T R^{-1} (y - H\xi) = J_b + J_o$

'3D-Var'

Can easily, and heuristically, be extended to the case of a nonlinear observation operator $H$. Used operationally in the USA, Australia, China, …
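
The variational form can be checked numerically: minimizing $J$ recovers the same analysis as the explicit BLUE/OI formula. Below is a hedged sketch using SciPy's L-BFGS-B minimizer and the same illustrative toy numbers as before (none of them come from the course).

```python
import numpy as np
from scipy.optimize import minimize

def make_cost(xb, Pb_inv, y, H, R_inv):
    """3D-Var objective J(xi) = Jb + Jo and its gradient (linear H)."""
    def J(xi):
        db = xb - xi
        do = y - H @ xi
        return 0.5 * db @ Pb_inv @ db + 0.5 * do @ R_inv @ do
    def gradJ(xi):
        return -Pb_inv @ (xb - xi) - H.T @ R_inv @ (y - H @ xi)
    return J, gradJ

# Illustrative toy setup
xb = np.array([1.0, 2.0, 3.0])
Pb = np.diag([1.0, 0.5, 2.0])
H = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
R = 0.1 * np.eye(2)
y = np.array([1.4, 2.6])

J, gradJ = make_cost(xb, np.linalg.inv(Pb), y, H, np.linalg.inv(R))
res = minimize(J, xb, jac=gradJ, method="L-BFGS-B")
xa_var = res.x   # matches the explicit BLUE/OI formula to numerical precision
```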

Question: how to introduce the temporal dimension into the estimation process?

• The logic of Optimal Interpolation can be extended to the time dimension.
• But we know much more than just temporal correlations. We know the explicit dynamics.

Real (unknown) state vector at time $k$ (in the format of the assimilating model): $x_k$, belonging to state space $S$ (dim $S = n$).

Evolution equation: $x_{k+1} = M_k(x_k) + \eta_k$

where $M_k$ is the (known) model and $\eta_k$ is the (unknown) model error.

Sequential Assimilation
The assimilating model is integrated over the period of time over which observations are available. Whenever the model time reaches an instant at which observations are available, the state predicted by the model is updated with the new observations.

Variational Assimilation
The assimilating model is globally adjusted to the observations distributed over the observation period. This is achieved by minimization of an appropriate scalar objective function measuring the misfit between the data and the sequence of model states to be estimated.

• Observation vector at time $k$:
$y_k = H_k x_k + \varepsilon_k$, $\quad k = 0, \ldots, K$
$E(\varepsilon_k) = 0$; $\quad E(\varepsilon_k \varepsilon_j^T) = R_k \delta_{kj}$; $\quad H_k$ linear

• Evolution equation:
$x_{k+1} = M_k x_k + \eta_k$, $\quad k = 0, \ldots, K-1$
$E(\eta_k) = 0$; $\quad E(\eta_k \eta_j^T) = Q_k \delta_{kj}$; $\quad M_k$ linear

• $E(\eta_k \varepsilon_j^T) = 0$ (errors uncorrelated in time)
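
For concreteness, a realization of such a linear stochastic system can be simulated as below; the particular $M_k$, $H_k$, $Q_k$ and $R_k$ are illustrative choices of mine, not values from the course.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, K = 2, 1, 10                                     # state dim, obs dim, number of steps
M = np.array([[0.99, 0.1], [-0.1, 0.99]])              # constant model matrix M_k (illustrative)
H = np.array([[1.0, 0.0]])                             # constant observation operator H_k
Q = 0.01 * np.eye(n)                                   # model error covariance Q_k
R = 0.1 * np.eye(p)                                    # observation error covariance R_k

x = np.array([1.0, 0.0])                               # true initial state x_0
truth, obs = [], []
for k in range(K + 1):
    # y_k = H_k x_k + eps_k, with E(eps_k eps_j^T) = R_k delta_kj
    obs.append(H @ x + rng.multivariate_normal(np.zeros(p), R))
    truth.append(x)
    # x_{k+1} = M_k x_k + eta_k, with E(eta_k eta_j^T) = Q_k delta_kj
    x = M @ x + rng.multivariate_normal(np.zeros(n), Q)
```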

At time $k$, the background $x^b_k$ and the associated error covariance matrix $P^b_k$ are known.

• Analysis step
$x^a_k = x^b_k + P^b_k H_k^T [H_k P^b_k H_k^T + R_k]^{-1} (y_k - H_k x^b_k)$
$P^a_k = P^b_k - P^b_k H_k^T [H_k P^b_k H_k^T + R_k]^{-1} H_k P^b_k$

• Forecast step
$x^b_{k+1} = M_k x^a_k$
$P^b_{k+1} = E[(x^b_{k+1} - x_{k+1})(x^b_{k+1} - x_{k+1})^T]$
$\qquad = E[(M_k x^a_k - M_k x_k - \eta_k)(M_k x^a_k - M_k x_k - \eta_k)^T]$
$\qquad = M_k E[(x^a_k - x_k)(x^a_k - x_k)^T] M_k^T - E[\eta_k (x^a_k - x_k)^T] M_k^T - M_k E[(x^a_k - x_k)\eta_k^T] + E[\eta_k \eta_k^T]$
$\qquad = M_k P^a_k M_k^T + Q_k$

(the cross terms vanish because the model error $\eta_k$ is uncorrelated with the analysis error at time $k$)

At time $k$, the background $x^b_k$ and the associated error covariance matrix $P^b_k$ are known.

• Analysis step
$x^a_k = x^b_k + P^b_k H_k^T [H_k P^b_k H_k^T + R_k]^{-1} (y_k - H_k x^b_k)$
$P^a_k = P^b_k - P^b_k H_k^T [H_k P^b_k H_k^T + R_k]^{-1} H_k P^b_k$

• Forecast step
$x^b_{k+1} = M_k x^a_k$
$P^b_{k+1} = M_k P^a_k M_k^T + Q_k$

Kalman filter (KF, Kalman, 1960). Must be started from some initial estimate $(x^b_0, P^b_0)$.
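
One full analysis-plus-forecast cycle, written as a function; `kalman_cycle` is simply a name chosen here, and the matrices in the example call are the illustrative toy values used above.

```python
import numpy as np

def kalman_cycle(xb, Pb, y, H, R, M, Q):
    """One analysis + forecast cycle of the linear Kalman filter (notation as in the slide)."""
    # Analysis step
    S = H @ Pb @ H.T + R                      # innovation covariance H_k P^b_k H_k^T + R_k
    K = Pb @ H.T @ np.linalg.inv(S)           # gain matrix
    xa = xb + K @ (y - H @ xb)
    Pa = Pb - K @ H @ Pb
    # Forecast step
    xb_next = M @ xa
    Pb_next = M @ Pa @ M.T + Q
    return xa, Pa, xb_next, Pb_next

# Started from an initial estimate (x^b_0, P^b_0), with the toy system sketched earlier
M = np.array([[0.99, 0.1], [-0.1, 0.99]])
H = np.array([[1.0, 0.0]])
Q, R = 0.01 * np.eye(2), 0.1 * np.eye(1)
xb, Pb = np.zeros(2), np.eye(2)
for y in (np.array([0.9]), np.array([1.1]), np.array([0.8])):   # three illustrative observations
    xa, Pa, xb, Pb = kalman_cycle(xb, Pb, y, H, R, M, Q)
```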

If all operators are linear, and if the errors are uncorrelated in time, the Kalman filter produces at time $k$ the BLUE $x^b_k$ (resp. $x^a_k$) of the real state $x_k$ from all data prior to (resp. up to) time $k$, together with the associated estimation error covariance matrix $P^b_k$ (resp. $P^a_k$). If in addition the errors are Gaussian, the corresponding conditional probability distributions are the respective Gaussian distributions $\mathcal{N}[x^b_k, P^b_k]$ and $\mathcal{N}[x^a_k, P^a_k]$.

Nonlinearities?

The model is usually nonlinear, and observation operators (e.g. for satellite observations) tend more and more to be nonlinear.

• Analysis step
$x^a_k = x^b_k + P^b_k H_k'^T [H_k' P^b_k H_k'^T + R_k]^{-1} [y_k - H_k(x^b_k)]$
$P^a_k = P^b_k - P^b_k H_k'^T [H_k' P^b_k H_k'^T + R_k]^{-1} H_k' P^b_k$

• Forecast step
$x^b_{k+1} = M_k(x^a_k)$
$P^b_{k+1} = M_k' P^a_k M_k'^T + Q_k$

Extended Kalman Filter (EKF, heuristic!). Here $H_k'$ and $M_k'$ denote the Jacobians (tangent linear operators) of $H_k$ and $M_k$.
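
A sketch of one EKF cycle. The Jacobians $H_k'$ and $M_k'$ are approximated here by finite differences (a simple stand-in for properly coded tangent linear operators), and the logistic-type model and quadratic observation operator are hypothetical examples, not from the course.

```python
import numpy as np

def jacobian_fd(f, x, eps=1e-6):
    """Finite-difference Jacobian of f at x (stand-in for the tangent linear operator)."""
    fx = np.atleast_1d(f(x))
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (np.atleast_1d(f(x + dx)) - fx) / eps
    return J

def ekf_cycle(xb, Pb, y, h, R, m, Q):
    """One EKF cycle with nonlinear observation operator h and nonlinear model m."""
    Hk = jacobian_fd(h, xb)                                   # H_k' evaluated at the background
    S = Hk @ Pb @ Hk.T + R
    K = Pb @ Hk.T @ np.linalg.inv(S)
    xa = xb + K @ (np.atleast_1d(y) - np.atleast_1d(h(xb)))   # uses h(x^b_k), not H x^b_k
    Pa = Pb - K @ Hk @ Pb
    Mk = jacobian_fd(m, xa)                                   # M_k' evaluated at the analysis
    return m(xa), Mk @ Pa @ Mk.T + Q, xa, Pa                  # forecast step uses m(x^a_k)

# Hypothetical nonlinear toy system: logistic-type model, quadratic observation of component 0
m = lambda x: x + 0.1 * x * (1.0 - x)
h = lambda x: np.array([x[0] ** 2])
xb, Pb = np.array([0.5, 0.8]), 0.2 * np.eye(2)
Q, R = 0.01 * np.eye(2), np.array([[0.05]])
xb, Pb, xa, Pa = ekf_cycle(xb, Pb, np.array([0.3]), h, R, m, Q)
```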

Costliest part of the computation: $P^b_{k+1} = M_k P^a_k M_k^T + Q_k$

Multiplication by $M_k$ = one integration of the model between times $k$ and $k+1$. The computation of $M_k P^a_k M_k^T$ therefore requires about $2n$ integrations of the model.

The need to determine the temporal evolution of the uncertainty on the state of the system is the major difficulty in the assimilation of meteorological and oceanographical observations.
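
The cost argument can be made concrete: every product $M_k v$ is one model integration, so forming $M_k P^a_k M_k^T$ column by column takes roughly $2n$ integrations. The sketch below approximates these products by perturbed integrations of a generic `model` function; operational systems instead use explicitly coded tangent linear (and adjoint) models, so this is only an illustration of the operation count.

```python
import numpy as np

def propagate_covariance(model, xa, Pa, Q, eps=1e-5):
    """Compute P^b_{k+1} ≈ M_k P^a_k M_k^T + Q_k without forming M_k explicitly.

    Each matrix-vector product with M_k costs one perturbed model integration, so the
    full product takes about 2n integrations for an n-dimensional state
    (n for M_k P^a_k, n more for the remaining factor, plus one reference run).
    """
    n = xa.size
    x_ref = model(xa)                                    # reference integration
    Mv = lambda v: (model(xa + eps * v) - x_ref) / eps   # one integration per product M_k v

    A = np.column_stack([Mv(Pa[:, j]) for j in range(n)])    # A = M_k P^a_k      (n runs)
    B = np.column_stack([Mv(A.T[:, j]) for j in range(n)])   # B = M_k A^T        (n runs)
    return B.T + Q                                       # B^T = A M_k^T = M_k P^a_k M_k^T

# Example with a trivially linear "model" (illustrative): the result approximates M Pa M^T + Q.
M = np.array([[0.99, 0.1], [-0.1, 0.99]])
Pb_next = propagate_covariance(lambda x: M @ x, np.zeros(2), 0.5 * np.eye(2), 0.01 * np.eye(2))
```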

Analysis of the 500-hPa geopotential for 1 December 1989, 00:00 UTC (ECMWF, spectral truncation T21; unit: m). After F. Bouttier.

Temporal evolution of the 500-hPa geopotential autocorrelation with respect to a point located at 45°N, 35°W. From top to bottom: initial time, 6-hour and 24-hour range. Contour interval 0.1. After F. Bouttier.