Maximum Likelihood Estimation and Simplified Kalman Filter Techniques for Real-Time Data Assimilation

Basic concepts
- State vector x: the atmospheric state of the forecast model.
- True state x_t: the best possible representation of reality.
- Background (or a priori) state x_b: estimate of the true state obtained from the previous analysis.
- Observation vector y: the observational data used for a given analysis.
- Innovation vector d = y - H x_b: the departure of the observations from the background in observation space.
- Observation matrix H (or transition matrix): the linear operator that maps the model state into observation space.
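As a concrete illustration of these quantities, the following minimal NumPy sketch builds a background state, a linear observation operator, and the innovation d = y - H x_b; the dimensions and values are made up for illustration only and are not taken from the presentation.

```python
import numpy as np

# Illustrative sizes: n state variables, p observations (values are arbitrary).
n, p = 4, 2
x_b = np.array([1.0, 0.5, -0.3, 2.0])   # background (a priori) state
H = np.zeros((p, n))                    # linear observation operator
H[0, 0] = 1.0                           # first observation sees state variable 1
H[1, 2] = 1.0                           # second observation sees state variable 3
y = np.array([1.2, -0.1])               # observation vector

d = y - H @ x_b                         # innovation vector
print("innovation:", d)
```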

Modelling of errors
- Background error, assumed unbiased: e_b = x_b - x_t, with E[e_b] = 0.
- Background error covariance matrix: B = E[e_b e_b^T].
- Observation error, assumed unbiased: e_o = y - H x_t, with E[e_o] = 0.
- Observation error covariance matrix: R = E[e_o e_o^T].
- Note: observation and background errors are assumed mutually uncorrelated, E[e_b e_o^T] = 0.
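To make these definitions concrete, here is a hedged sketch that estimates B and R as sample covariances from synthetic, unbiased error samples; it only illustrates the definitions above and is not part of the presented method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, n_samples = 4, 2, 1000

e_b = rng.normal(scale=0.8, size=(n_samples, n))   # unbiased background error samples
e_o = rng.normal(scale=0.3, size=(n_samples, p))   # unbiased observation error samples

B = e_b.T @ e_b / n_samples        # sample estimate of E[e_b e_b^T]
R = e_o.T @ e_o / n_samples        # sample estimate of E[e_o e_o^T]
cross = e_b.T @ e_o / n_samples    # ~0: errors are mutually uncorrelated
```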

The Kalman filter
- Analysis and propagation of the model state at each observation time step:
  x_a(t_k) = x_b(t_k) + K_k d_k,    x_b(t_{k+1}) = M_k x_a(t_k)
- Analysis and propagation of the covariance matrices:
  P_a(t_k) = (I - K_k H) P_b(t_k),    P_b(t_{k+1}) = M_k P_a(t_k) M_k^T + Q_k
- Kalman gain: K_k = P_b(t_k) H^T [H P_b(t_k) H^T + R]^{-1}. The matrix to be inverted, S_k = H P_b(t_k) H^T + R, becomes too large to invert when many observations are used.
- Innovation: d_k = y_k - H x_b(t_k).
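These equations translate directly into code. The sketch below performs one propagation/analysis cycle of the linear Kalman filter; the operators M and H and the covariances Q and R are assumed given, and all names are illustrative.

```python
import numpy as np

def kalman_cycle(x_a, P_a, y, M, H, Q, R):
    """One propagation + analysis step of the linear Kalman filter."""
    # Propagation (forecast) of state and covariance
    x_b = M @ x_a
    P_b = M @ P_a @ M.T + Q
    # Innovation and its covariance S (the matrix that becomes too large
    # to invert when the number of observations is large)
    d = y - H @ x_b
    S = H @ P_b @ H.T + R
    # Kalman gain and analysis update
    K = P_b @ H.T @ np.linalg.inv(S)
    x_a_new = x_b + K @ d
    P_a_new = (np.eye(len(x_b)) - K @ H) @ P_b
    return x_a_new, P_a_new
```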

On-line estimation of covariance parameters
Maximum likelihood approach suggested by Dee for HIRLAM data assimilation:
[1] Dee, D.P., 1991: Simplification of the Kalman filter for meteorological data assimilation. Q.J.R. Meteorol. Soc., 117.
[2] Dee, D.P., 1995: On-line estimation of error covariance parameters for atmospheric data assimilation. Mon. Wea. Rev., 123.
Estimation techniques imposing the optimality condition of the filter on the innovation process, first tested on a 28-variable atmospheric model:
[3] Mehra, R.K., 1970: On the identification of variances and adaptive Kalman filtering. IEEE Trans. Automat. Contr., AC-15.
[4] Todini, E., 1978: Mutually interactive state-parameter (MISP) estimation. In Chao-Lin Chiu (ed.), Applications of Kalman Filter to Hydrology, Hydraulics, and Water Resources. Stochastic Hydraulics Program, Dept. of Civil Engineering, University of Pittsburgh, PA.
[5] Lorenz, E.N., 1965: A study of the predictability of a 28-variable atmospheric model. Tellus, XVII.

Maximum likelihood method
- The expected statistics of the innovation are parametrized through its covariance, S(α) = H B(α) H^T + R(α), where α collects the covariance parameters to be estimated.
- The probability density function of the innovation d is assumed to be Gaussian: p(d; α) ∝ |S(α)|^{-1/2} exp(-½ d^T S(α)^{-1} d).
- The parameters are estimated by maximizing this pdf, i.e. by minimizing its negative log-likelihood f(α) = ln det S(α) + d^T S(α)^{-1} d (up to a constant).
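A hedged sketch of this step: the transcript does not show the exact parametrization, so the code assumes a simple two-parameter form S(alpha) = alpha_b * H B0 H^T + alpha_o * R0, with fixed covariance "shapes" HB0HT and R0, and minimizes the negative log-likelihood of the innovation with SciPy.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(alpha, d, HB0HT, R0):
    """-2 log p(d; alpha) up to a constant, for a Gaussian innovation d."""
    S = alpha[0] * HB0HT + alpha[1] * R0   # assumed two-parameter covariance model
    _, logdet = np.linalg.slogdet(S)
    return logdet + d @ np.linalg.solve(S, d)

# Example call (d, HB0HT and R0 must be provided):
# res = minimize(neg_log_likelihood, x0=np.array([1.0, 1.0]),
#                args=(d, HB0HT, R0), bounds=[(1e-6, None), (1e-6, None)])
# alpha_hat = res.x
```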

Application to HIRVDA data
- Select the observational data for each observation type.
- Choose an appropriate sub-domain, because too many observations make the dimension of S too large to invert.
- Compute the optimal parameters in each sub-domain and use them to estimate the background error standard deviations used for the HIRLAM forecast.
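The loop below sketches how such a per-sub-domain estimation could be organized, reusing the hypothetical neg_log_likelihood function from the previous sketch; the sub-domain data structure and the scaling of the background error standard deviation are illustrative assumptions, not the actual HIRVDA implementation.

```python
import numpy as np
from scipy.optimize import minimize

def estimate_parameters_per_subdomain(subdomains):
    """subdomains: list of dicts with keys 'd', 'HB0HT', 'R0' (one entry per sub-domain)."""
    estimates = []
    for sd in subdomains:
        res = minimize(neg_log_likelihood, x0=np.array([1.0, 1.0]),
                       args=(sd["d"], sd["HB0HT"], sd["R0"]),
                       bounds=[(1e-6, None), (1e-6, None)])
        # sqrt of the background parameter rescales the background error std. dev.
        estimates.append({"alpha": res.x, "sigma_b_scale": float(np.sqrt(res.x[0]))})
    return estimates
```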

Estimation of statistics for observation and model errors
Assuming the following conditions:
- the matrices Q and R have equal rank;
- the model error is stationary (Q is time-independent);
then, applying these conditions at each time step k, a recursive form for R and Q can be derived. Because these conditions are restrictive, the method will first be tested on a 28-variable meteorological model.
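The transcript does not reproduce the recursive formulas themselves, so the sketch below only illustrates the generic innovation-based idea behind such estimates (in the spirit of Mehra, 1970): maintain a running estimate of the innovation covariance C = E[d d^T] = H P_b H^T + R and read off R; estimating Q additionally requires lagged innovation statistics together with the rank and stationarity assumptions listed above. This is an assumed illustration, not the exact recursion of the presentation.

```python
import numpy as np

def update_innovation_covariance(C_hat, d, k):
    """Recursive (running-mean) update of the innovation covariance after step k >= 1."""
    return C_hat + (np.outer(d, d) - C_hat) / k

def estimate_R(C_hat, H, P_b):
    """Innovation-based estimate of the observation error covariance, R = C - H P_b H^T."""
    return C_hat - H @ P_b @ H.T
```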