Sequential approach to Bayesian linear inverse problems in reservoir modeling using Gaussian mixture models
Dario Grana and Tapan Mukerji
SCRF Annual Meeting, 9-11 May 2012

– 2 Introduction
Many linear inverse problems are solved with a Bayesian approach that assumes a Gaussian distribution of the model. We show the analytical solution of the Bayesian linear inverse problem in the Gaussian mixture case. Applications to reservoir modeling (estimation and simulation of reservoir properties) are presented.

– 3 Introduction
In reservoir modeling we aim to model rock properties: porosity, sand/clay content, and saturations. Rock properties cannot be measured directly away from the wells; the main source of information is seismic data.
[Figure: the inverse problem, mapping seismic data to porosity]

– 4 Introduction
The seismic forward model can be linearized, and the model linking velocities and rock properties is almost linear. Rock properties can be described by a Gaussian mixture (GM) model. In traditional methods, when we observe a significant overlap in the prior distribution it is difficult to choose a cut-off. The goal is to estimate reservoir properties as the solution of a Bayesian GM inverse problem.
[Figure: well data cross-plots of P-wave velocity (m/s) versus porosity (v/v), colored by sand content]

– 5 Gaussian mixture models
A random vector $\mathbf{m}$ is distributed according to a Gaussian mixture model (GMM) with $L$ components when the probability density is given by
$$f(\mathbf{m}) = \sum_{k=1}^{L} \pi_k \, \mathcal{N}(\mathbf{m}; \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k),$$
where each single component is Gaussian,
$$\mathcal{N}(\mathbf{m}; \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k) = \frac{1}{\sqrt{(2\pi)^n |\boldsymbol{\Sigma}_k|}} \exp\!\left(-\tfrac{1}{2}(\mathbf{m}-\boldsymbol{\mu}_k)^T \boldsymbol{\Sigma}_k^{-1} (\mathbf{m}-\boldsymbol{\mu}_k)\right),$$
with the additional conditions $\sum_{k=1}^{L} \pi_k = 1$ and $\pi_k \geq 0$.
[Figure: example of a 1D mixture with L = 2 components (PDF and histogram of N random samples)]
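As a complement to the definition above, here is a minimal Python sketch of a 1D mixture with L = 2 components, evaluating the density and drawing the N random samples shown in the figure; all weights, means, and variances are illustrative values, not taken from the presentation.

```python
import numpy as np
from scipy.stats import norm

weights = np.array([0.4, 0.6])   # pi_k: nonnegative, summing to 1
means = np.array([0.0, 4.0])     # mu_k
stds = np.array([1.0, 1.5])      # sigma_k

def gmm_pdf(x):
    """Mixture density f(x) = sum_k pi_k N(x; mu_k, sigma_k^2)."""
    return sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, stds))

# Draw N samples: pick a component by its weight, then sample its Gaussian.
rng = np.random.default_rng(0)
N = 1000
comp = rng.choice(len(weights), size=N, p=weights)
samples = rng.normal(means[comp], stds[comp])
```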

– 6 Gaussian mixture models
The weights, means, and covariance matrices of the mixture are estimated by the EM method (Hastie, Tibshirani, and Friedman, The Elements of Statistical Learning, 2009).
[Figure: fitted Gaussian mixture distribution]
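The estimation step can be sketched with scikit-learn's GaussianMixture, which implements EM; the synthetic data and settings below are illustrative stand-ins for the well data used in the talk.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic two-component data standing in for well-log measurements.
data = np.concatenate([rng.normal(0.0, 1.0, 500),
                       rng.normal(4.0, 1.5, 500)]).reshape(-1, 1)

gm = GaussianMixture(n_components=2, covariance_type="full").fit(data)
print(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel())
```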

– 7 Linear inverse problems (Gaussian)
Consider the linear inverse problem $\mathbf{d} = \mathbf{G}\mathbf{m} + \boldsymbol{\varepsilon}$, with Gaussian prior $\mathbf{m} \sim \mathcal{N}(\boldsymbol{\mu}_m, \boldsymbol{\Sigma}_m)$ and independent Gaussian noise $\boldsymbol{\varepsilon} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_\varepsilon)$. Then the posterior is Gaussian, $\mathbf{m} \mid \mathbf{d} \sim \mathcal{N}(\boldsymbol{\mu}_{m|d}, \boldsymbol{\Sigma}_{m|d})$, with
$$\boldsymbol{\mu}_{m|d} = \boldsymbol{\mu}_m + \boldsymbol{\Sigma}_m \mathbf{G}^T (\mathbf{G}\boldsymbol{\Sigma}_m \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon)^{-1} (\mathbf{d} - \mathbf{G}\boldsymbol{\mu}_m),$$
$$\boldsymbol{\Sigma}_{m|d} = \boldsymbol{\Sigma}_m - \boldsymbol{\Sigma}_m \mathbf{G}^T (\mathbf{G}\boldsymbol{\Sigma}_m \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon)^{-1} \mathbf{G}\boldsymbol{\Sigma}_m$$
(Tarantola, Inverse Problem Theory, 2005).
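A minimal sketch of this posterior update under the stated assumptions; the function name gaussian_linear_posterior and the toy operator are hypothetical, chosen only for illustration.

```python
import numpy as np

def gaussian_linear_posterior(G, mu_m, Sigma_m, d, Sigma_eps):
    """Posterior mean and covariance of m | d for d = G m + eps."""
    S = G @ Sigma_m @ G.T + Sigma_eps        # covariance of the data d
    K = Sigma_m @ G.T @ np.linalg.inv(S)     # gain matrix
    mu_post = mu_m + K @ (d - G @ mu_m)
    Sigma_post = Sigma_m - K @ G @ Sigma_m
    return mu_post, Sigma_post

# Toy 1x2 forward operator: one datum, two model parameters.
G = np.array([[1.0, 0.5]])
mu_post, Sigma_post = gaussian_linear_posterior(
    G, mu_m=np.zeros(2), Sigma_m=np.eye(2),
    d=np.array([1.2]), Sigma_eps=np.array([[0.1]]))
```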

– 8 Linear inverse problems (Gaussian)
[Figure: the linear operator G applied to the model, with P-wave velocity $V_p$ as data]
This result is based on two well-known properties of Gaussian distributions:
A. the linear transform of a Gaussian distribution is again Gaussian;
B. if the joint distribution of (m, d) is Gaussian, then the conditional distribution m | d is again Gaussian.
These two properties can be extended to the Gaussian mixture case.

– 9 Linear inverse problems (GM)
Proposition 1: If $\mathbf{x} \sim \sum_{k=1}^{L} \pi_k \, \mathcal{N}(\boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k)$ and $\mathbf{G}$ is a linear operator, $\mathbf{y} = \mathbf{G}\mathbf{x}$, then $\mathbf{y}$ is distributed according to a Gaussian mixture:
$$\mathbf{y} \sim \sum_{k=1}^{L} \pi_k \, \mathcal{N}(\mathbf{G}\boldsymbol{\mu}_k, \ \mathbf{G}\boldsymbol{\Sigma}_k \mathbf{G}^T).$$
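A short sketch of Proposition 1: the mixture weights are unchanged and each component's moments are pushed through G; the operator and moments are toy values.

```python
import numpy as np

G = np.array([[2.0, 1.0]])                     # toy 1x2 linear operator
weights = [0.3, 0.7]                           # unchanged by the transform
means = [np.zeros(2), np.array([3.0, 1.0])]
covs = [np.eye(2), 2.0 * np.eye(2)]

# y = G x is again a mixture: same weights, transformed moments.
y_means = [G @ mu for mu in means]             # G mu_k
y_covs = [G @ S @ G.T for S in covs]           # G Sigma_k G^T
```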

– 10 Linear inverse problems (GM)
Proposition 2: If $(\mathbf{x}_1, \mathbf{x}_2)$ is jointly distributed according to a Gaussian mixture $\sum_{k=1}^{L} \pi_k \, \mathcal{N}(\boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k)$, with means and covariances partitioned as
$$\boldsymbol{\mu}_k = \begin{pmatrix} \boldsymbol{\mu}_{1,k} \\ \boldsymbol{\mu}_{2,k} \end{pmatrix}, \qquad \boldsymbol{\Sigma}_k = \begin{pmatrix} \boldsymbol{\Sigma}_{11,k} & \boldsymbol{\Sigma}_{12,k} \\ \boldsymbol{\Sigma}_{21,k} & \boldsymbol{\Sigma}_{22,k} \end{pmatrix},$$
then $\mathbf{x}_2 \mid \mathbf{x}_1$ is distributed according to a Gaussian mixture with component means $\boldsymbol{\mu}_{2,k} + \boldsymbol{\Sigma}_{21,k}\boldsymbol{\Sigma}_{11,k}^{-1}(\mathbf{x}_1 - \boldsymbol{\mu}_{1,k})$, component covariances $\boldsymbol{\Sigma}_{22,k} - \boldsymbol{\Sigma}_{21,k}\boldsymbol{\Sigma}_{11,k}^{-1}\boldsymbol{\Sigma}_{12,k}$, and updated weights proportional to $\pi_k \, \mathcal{N}(\mathbf{x}_1; \boldsymbol{\mu}_{1,k}, \boldsymbol{\Sigma}_{11,k})$.
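A sketch of Proposition 2 for a bivariate, two-component mixture conditioned on an observed x1; all numbers are illustrative.

```python
import numpy as np
from scipy.stats import norm

weights = np.array([0.5, 0.5])
means = [np.array([0.0, 0.0]), np.array([3.0, 2.0])]
covs = [np.array([[1.0, 0.6], [0.6, 1.0]]),
        np.array([[1.0, -0.4], [-0.4, 1.0]])]

x1 = 1.0                                   # observed value of x1
cond_means, cond_vars, new_w = [], [], []
for w, mu, S in zip(weights, means, covs):
    # Standard Gaussian conditioning applied within each component.
    cond_means.append(mu[1] + S[1, 0] / S[0, 0] * (x1 - mu[0]))
    cond_vars.append(S[1, 1] - S[1, 0] * S[0, 1] / S[0, 0])
    # Updated weight: prior weight times marginal likelihood of x1.
    new_w.append(w * norm.pdf(x1, loc=mu[0], scale=np.sqrt(S[0, 0])))
new_w = np.array(new_w) / np.sum(new_w)
```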

– 11 Linear inverse problems (GM)
If $\mathbf{d} = \mathbf{G}\mathbf{m} + \boldsymbol{\varepsilon}$ with prior $\mathbf{m} \sim \sum_{k=1}^{L} \pi_k \, \mathcal{N}(\boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k)$ and Gaussian noise $\boldsymbol{\varepsilon} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_\varepsilon)$, then the posterior is again a Gaussian mixture,
$$\mathbf{m} \mid \mathbf{d} \sim \sum_{k=1}^{L} \pi_{k|d} \, \mathcal{N}(\boldsymbol{\mu}_{k|d}, \boldsymbol{\Sigma}_{k|d}),$$
where each component follows the single-Gaussian update,
$$\boldsymbol{\mu}_{k|d} = \boldsymbol{\mu}_k + \boldsymbol{\Sigma}_k \mathbf{G}^T (\mathbf{G}\boldsymbol{\Sigma}_k \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon)^{-1} (\mathbf{d} - \mathbf{G}\boldsymbol{\mu}_k),$$
$$\boldsymbol{\Sigma}_{k|d} = \boldsymbol{\Sigma}_k - \boldsymbol{\Sigma}_k \mathbf{G}^T (\mathbf{G}\boldsymbol{\Sigma}_k \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon)^{-1} \mathbf{G}\boldsymbol{\Sigma}_k,$$
and the posterior weights are $\pi_{k|d} \propto \pi_k \, \mathcal{N}(\mathbf{d}; \mathbf{G}\boldsymbol{\mu}_k, \ \mathbf{G}\boldsymbol{\Sigma}_k \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon)$.
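The two propositions combine into the sketch below: each component is updated with the single-Gaussian formulas and re-weighted by its evidence for the observed data; gm_linear_posterior is a hypothetical name and the toy usage values are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gm_linear_posterior(G, weights, means, covs, d, Sigma_eps):
    """Posterior weights/means/covariances of m | d for a GM prior on m."""
    post_w, post_means, post_covs = [], [], []
    for w, mu, S in zip(weights, means, covs):
        Sd = G @ S @ G.T + Sigma_eps               # data covariance, component k
        K = S @ G.T @ np.linalg.inv(Sd)
        post_means.append(mu + K @ (d - G @ mu))   # single-Gaussian update
        post_covs.append(S - K @ G @ S)
        # Evidence of d under component k sets its posterior weight.
        post_w.append(w * multivariate_normal.pdf(d, mean=G @ mu, cov=Sd))
    post_w = np.array(post_w) / np.sum(post_w)
    return post_w, post_means, post_covs

# Toy usage with a 1x2 forward operator and a two-component prior.
G = np.array([[1.0, 0.5]])
weights = [0.5, 0.5]
means = [np.zeros(2), np.array([2.0, 1.0])]
covs = [np.eye(2), np.eye(2)]
w_post, m_post, S_post = gm_linear_posterior(
    G, weights, means, covs, d=np.array([1.2]), Sigma_eps=np.array([[0.1]]))
```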

– 12 Introductory example
[Figure: comparison of the reference model with the Bayesian GMM inversion and the Bayesian Gaussian inversion]

– 13 Sequential approach
The sequential approach to linear inverse problems (Gaussian case) was proposed by Hansen et al. (2006). We extended this approach to Gaussian mixture models.

– 14 Sequential approach
1. Randomly visit a location k in the model space;
2. Compute the conditional mean and variance;
3. Draw a random value at location k according to the computed distribution;
4. Use the simulated value as a conditioning datum for the next element simulations;
5. Repeat steps 1-4 until all locations of the model space have been visited.
(A sketch of this loop for the Gaussian case is shown below.)
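This is a minimal sketch of the loop for the Gaussian case on a toy 1D grid, assuming an illustrative exponential covariance; it is a didactic stand-in, not the authors' SGMixSim implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
mu = np.zeros(n)
# Toy exponential covariance on a 1D grid standing in for the model space.
idx = np.arange(n)
Sigma = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 10.0)

path = rng.permutation(n)                  # step 1: random visiting order
sim = np.full(n, np.nan)
visited = []
for k in path:
    if visited:
        v = np.array(visited)
        S_vv = Sigma[np.ix_(v, v)]
        S_kv = Sigma[k, v]
        w = np.linalg.solve(S_vv, S_kv)    # step 2: conditional moments
        m = mu[k] + w @ (sim[v] - mu[v])
        var = Sigma[k, k] - w @ S_kv
    else:
        m, var = mu[k], Sigma[k, k]
    sim[k] = rng.normal(m, np.sqrt(max(var, 0.0)))   # step 3: draw a value
    visited.append(k)                      # step 4: condition on it next time
```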

– 15 Sequential inversion (Gaussian)
Let $m_i$ be the model value at the current location and $\mathbf{m}_s$ the subvector of direct observations of $\mathbf{m}$ (the previously simulated values). If $\mathbf{d} = \mathbf{G}\mathbf{m} + \boldsymbol{\varepsilon}$ with Gaussian prior and noise, then $m_i \mid (\mathbf{d}, \mathbf{m}_s)$ is Gaussian, with conditional mean and variance obtained by conditioning the joint Gaussian distribution of $(m_i, \mathbf{m}_s, \mathbf{d})$ (Hansen et al., Linear inverse Gaussian theory and geostatistics, Geophysics, 2006).

– 16 Main result: Sequential inversion (GM)
With $\mathbf{m}_s$ the subvector of direct observations of $\mathbf{m}$, if the prior on $\mathbf{m}$ is a Gaussian mixture and $\mathbf{d} = \mathbf{G}\mathbf{m} + \boldsymbol{\varepsilon}$, then $m_i \mid (\mathbf{d}, \mathbf{m}_s)$ is again a Gaussian mixture, with analytical formulations for the means, covariance matrices, and weights.

– 17 Applications
Geostatistics (reservoir modeling): simulation of facies (discrete) and porosity (continuous) with Sequential Gaussian Mixture Simulation; the facies probability is derived from the weights of the mixture.
Geophysics: seismic inverse problems.

– 18 SGMixSim: conditional simulations
[Figure: conditional simulations produced by SGMixSim]

– 19 Bayesian GM inversion: example 1
[Figure: inversion results for example 1]

– 20 Bayesian GM inversion: example 2
[Figure: well data and inverted velocities at the top horizon; cross-plots of P-wave velocity (m/s) versus porosity (v/v), colored by sand content]

– 21 Bayesian GM inversion: example 2
[Figure: inversion results for two priors, sand proportion 30% and 40%]

– 22 Bayesian GM inversion: example 3
We used the linearized seismic forward model proposed in Buland and Omre (2003).

– 23 Conclusions
We presented a methodology to estimate reservoir properties as the solution of a Bayesian linear inverse problem using Gaussian mixture models. We proposed a method based on the sequential approach to Gaussian mixture linear inverse problems. The method can be applied to reservoir modeling and seismic reservoir characterization.

Backup

– 37 Advantages
- The method is based on the exact analytical solution of the Bayesian linear inverse problem in the Gaussian mixture case.
- By introducing a mixture model, it is possible to condition the discrete distribution using a continuous parameter. This is particularly useful for discrete-continuous problems where the conditioning data are continuous.
- It does not require a variogram model of the discrete property, a variogram model of the conditioning data, or the cross-variogram.
- SISim is a consistent method only with 2 indicators; with 3 or more indicators it is not guaranteed that SISim provides probabilities between 0 and 1 (see POSTIK in GSLib).
- Multimodal datasets can be difficult to handle with normal score transformations.

– 38 SGMixSim: conditional simulations
[Figure: simulations without post-processing and with post-processing]

– 39 Main result: Sequential inversion (GM)
[Equations: analytical formulation of the conditional means, covariance matrices, and weights for the linear operator G (backup detail of slide 16)]