Non-Gaussian Data Assimilation using Mixture and Kernel
Piyush Tagade, Hansjorg Seybold, Sai Ravela
Earth Systems and Signals Group, Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology

Environmental DDDAS: Mapping localized atmospheric phenomena (caos.mit.edu)
We explore: MEnF, MuIF, model UQ, targeting, DA, planning, and sensing.

Key Areas
– Theory: Generalized Coherent Features (GCF); pattern theory for coherent fluids; statistical inference for GCF
– Methodology: data-driven nowcasting; DBFE; information-theoretic UQ, targeting, estimation, planning, and control
– Application: mapping localized phenomena; field alignment (DA); coalescence (UQ); PAG modes (model reduction)

Dynamically Biorthogonal Field Equations (DBFE)
– KL expansion; biorthogonal expansion
– Dynamic orthogonality: closed-form solution for the mean, eigenvalues, and expansion coefficients
– Faster than gPC with comparable accuracy
Tagade et al., ASME IDETC 2012; AIAA-NDA 2013
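For concreteness, a minimal sketch of the biorthogonal (KL-type) expansion the slide refers to, written in the standard dynamically orthogonal form; the notation (mean field, modes, stochastic coefficients) is assumed here rather than taken from the slides:

\[ u(x, t; \omega) \;\approx\; \bar{u}(x, t) + \sum_{i=1}^{N} Y_i(t; \omega)\, u_i(x, t), \qquad \left\langle \frac{\partial u_i}{\partial t},\, u_j \right\rangle = 0 \;\; \forall\, i, j. \]

Substituting the expansion into the governing stochastic PDE and imposing the dynamic orthogonality constraint yields closed evolution equations for the mean, the modes (and hence the eigenvalues), and the coefficients, which is what makes the approach competitive with gPC.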

Inference Challenges: Nonlinearity and Dimensionality
– System dynamics and measurements; Bayesian inference via smoothing and filtering
– Existing filters (KF, EKF, EnKF, PF) occupy different regions of the dimensionality–nonlinearity plane; the strongly nonlinear, high-dimensional regime remains open (the "??" on the slide)
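As a reference point for the filters named on this slide, the generic Bayesian filtering recursion they all approximate can be written as (a standard result, not specific to this work):

\[ p(x_t \mid y_{1:t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, dx_{t-1}, \qquad p(x_t \mid y_{1:t}) \propto p(y_t \mid x_t)\, p(x_t \mid y_{1:t-1}). \]

The KF solves this recursion exactly only in the linear-Gaussian case; EKF and EnKF approximate it under nonlinearity, and the PF handles non-Gaussianity but scales poorly with dimension, which is the gap the talk targets.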

Two Approaches
– Mutual Information Filter (MuIF): non-Gaussian prior and likelihood; distributions-oriented; quadratic non-Gaussian estimation via an optimization approach; tractable mutual information measures
– Mixture Ensemble Filter (MEnF): GMM prior and Gaussian likelihood; joint state-parameter estimation; direct ensemble update with a compact ensemble transform; fast smoother forms; marginal additional cost; note: not for UQ

Approach 1: Mutual Information Filter
Mutual Information Filter (MuIF): non-Gaussian prior and likelihood; maximization of Kapur's quadratic mutual information; equivalence with RKHS
– Shannon entropy and mutual information; inference as maximization of mutual information. Tractable?
– We use a generalization of Shannon entropy: Renyi entropy and the α-information potential. Others: Havrda-Charvát, Behara-Chawla, Sharma-Mittal. Kapur establishes the equivalence.
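The Renyi family and the α-information potential mentioned above can be sketched as follows (standard definitions, assumed rather than quoted from the slides):

\[ H_\alpha(X) = \frac{1}{1-\alpha} \log \int p^\alpha(x)\, dx, \qquad V_\alpha(X) = \int p^\alpha(x)\, dx, \qquad \alpha > 0,\; \alpha \neq 1, \]

so that \( H_\alpha = \frac{1}{1-\alpha} \log V_\alpha \), Shannon entropy is recovered as \( \alpha \to 1 \), and the quadratic case \( \alpha = 2 \) gives \( H_2(X) = -\log \int p^2(x)\, dx \), the quantity that becomes tractable with kernel density estimates.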

Quadratic Mutual Information
– Special interest: Kapur's result; also Cauchy-Schwarz mutual information via information potentials
– Nonparametric density estimation: kernel density estimate; reinterpretation using RKHS
– Inference: filter parameterization; maximization of MI by steepest gradient ascent
Kapur 63; Torkkola et al. 03; Principe et al.; Ravela et al. 08; Tagade and Ravela, ACC 14
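A sketch of the quadratic mutual information and its kernel plug-in estimate, following the information-potential formulation of Principe et al.; the specific estimator used in MuIF may differ:

\[ I_{Q}(X; Y) = \iint \big( p(x, y) - p(x)\,p(y) \big)^2\, dx\, dy = V_J + V_M - 2 V_C, \]
\[ V_J = \iint p^2(x,y)\,dx\,dy, \quad V_M = \iint p^2(x)\,p^2(y)\,dx\,dy, \quad V_C = \iint p(x,y)\,p(x)\,p(y)\,dx\,dy, \]

with the Cauchy-Schwarz variant \( I_{CS} = \log \frac{V_J V_M}{V_C^2} \). With Gaussian kernel density estimates \( \hat{p}(x) = \frac{1}{N}\sum_i G_\sigma(x - x_i) \), each potential reduces to pairwise sums of Gaussian evaluations over the samples, which is also where the RKHS reinterpretation (kernel inner products) enters.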

Application Results: Lorenz 95
– Lorenz 95 model; MuIF outperforms EnKF; effect of kernel bandwidth
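A minimal Python sketch of the Lorenz 95/96 model used in this experiment; the forcing F = 8, the 40-variable state, and the RK4 step are the conventional choices and are assumed here rather than read off the slides:

import numpy as np

def lorenz96(x, forcing=8.0):
    # Cyclic Lorenz-96 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def integrate(x0, dt=0.05, steps=500, forcing=8.0):
    # Fourth-order Runge-Kutta time stepping; returns the full trajectory
    x, traj = np.asarray(x0, dtype=float).copy(), []
    for _ in range(steps):
        k1 = lorenz96(x, forcing)
        k2 = lorenz96(x + 0.5 * dt * k1, forcing)
        k3 = lorenz96(x + 0.5 * dt * k2, forcing)
        k4 = lorenz96(x + dt * k3, forcing)
        x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(x.copy())
    return np.array(traj)

# Example: perturb one component of the x_i = F equilibrium and integrate
x0 = 8.0 * np.ones(40)
x0[0] += 0.01
truth = integrate(x0)

Such a trajectory typically serves as the synthetic truth from which noisy observations are drawn for the filter comparison.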

Approach 2: Mixture Ensemble Filter
State of the art:
– Means and covariances of GMMs explicitly propagated and updated (Alspach and Sorenson, 1972)
– Ensemble used to propagate uncertainty; mixture moments explicitly calculated (Sondergaard and Lermusiaux, 2013)
– Ensemble members used with balance and membership rules (Frei and Künsch, 2013)
Key contributions:
– Ensemble update equation via Jensen's inequality; update reduces to a compact ensemble transform
– Direct model-state adjustment; resampling not compulsory
– EM for mixture parameter estimation, also in reduced form
– Familiar message-passing framework for smoothing: O(1) fixed-lag, O(N) fixed-interval, in time
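For orientation, the classical Gaussian-sum update (Alspach and Sorenson) that the state of the art above builds on, for a GMM prior and a linear-Gaussian observation y = Hx + v, v ~ N(0, R); MEnF replaces the explicit moment bookkeeping below with an ensemble transform, which is not reproduced here:

\[ p(x) = \sum_m w_m\, \mathcal{N}(x; \mu_m, P_m) \;\;\Rightarrow\;\; p(x \mid y) = \sum_m w_m^a\, \mathcal{N}(x; \mu_m^a, P_m^a), \]
\[ K_m = P_m H^\top (H P_m H^\top + R)^{-1}, \quad \mu_m^a = \mu_m + K_m (y - H \mu_m), \quad P_m^a = (I - K_m H) P_m, \]
\[ w_m^a \propto w_m\, \mathcal{N}(y;\; H \mu_m,\; H P_m H^\top + R). \]

Each component undergoes its own Kalman update and the weights are re-scored by the component evidence; the MEnF contribution is to carry this out directly on ensemble members via Jensen's inequality rather than on explicit means and covariances.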

Mixture Ensemble Filter
– Joint state and parameter estimation
– Ensemble update equation via Jensen's inequality
– Ensemble transform; smoother form
Ravela and McLaughlin, OD 2007; Tagade et al., under review

MEnF: Performance
– 90% confidence bound; comparison of MEnF with EnKF and MCMC on the double well and Lorenz-63 systems

MEnF and Coherent Structures
– Comparison of MEnF with EnKF-FS and EnKF-PS on the KdV equation, which idealizes the coherent structure of solitary waves: non-linear superposition, chaotic

Conclusions and Future Work
– Nonlinearity and dimensionality challenges in inference for large-scale fluids
– Optimization approach to non-Gaussian estimation
– Mixture Ensemble Filter and Mutual Information Filter
– Complexity of the Mutual Information Filter: fast Gauss transform, and variations from RKHS; smoother forms
– Convergence rates of MEnF
– Combine inference with UQ and planning for clouds and volcanoes