
1 Dario Grana and Tapan Mukerji. Sequential approach to Bayesian linear inverse problems in reservoir modeling using Gaussian mixture models. SCRF Annual Meeting, 9-11 May 2012.

2 Introduction. Many linear inverse problems are solved with a Bayesian approach under a Gaussian assumption on the model. We show the analytical solution of the Bayesian linear inverse problem in the Gaussian mixture case, and present applications to reservoir modeling (estimation and simulation of reservoir properties).

3 Introduction. In reservoir modeling we aim to model rock properties: porosity, sand/clay content, and saturations. Rock properties cannot be directly measured away from the wells; the main source of information is seismic data. (Diagram: seismic data → inverse problem → porosity.)

4 Introduction. The seismic forward model can be linearized, and the model linking velocities and rock properties is almost linear. Rock properties can be described by a Gaussian mixture (GM) model.

5 Introduction. Well data: in traditional methods, when we observe a significant overlap in the prior distribution, it is difficult to choose a cut-off. (Crossplots of porosity (v/v) vs. P-wave velocity (m/s), colored by sand content.)

6 Introduction. The seismic forward model can be linearized, and the model linking velocities and rock properties is almost linear. Rock properties can be described by a Gaussian mixture (GM) model. The goal is to estimate reservoir properties as the solution of a Bayesian GM inverse problem.

7 Gaussian mixture models. A random vector m is distributed according to a Gaussian mixture model (GMM) with L components when its probability density is

f(m) = \sum_{k=1}^{L} \pi_k \, \mathcal{N}(m; \mu_k, \Sigma_k),

where each component \mathcal{N}(m; \mu_k, \Sigma_k) is Gaussian, and the weights satisfy the additional conditions \pi_k \ge 0 and \sum_{k=1}^{L} \pi_k = 1. (Example of a 1D mixture with L = 2 components: PDF and histogram of N random samples.)
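The density and the 1D two-component example above can be sketched in a few lines of numpy. This is an illustrative sketch with our own variable names and example parameters, not the authors' code:

```python
import numpy as np

def gmm_pdf(x, weights, means, stds):
    """Evaluate a 1D Gaussian mixture density: sum_k pi_k * N(x; mu_k, sigma_k^2)."""
    x = np.asarray(x, dtype=float)
    dens = np.zeros_like(x)
    for w, m, s in zip(weights, means, stds):
        dens += w * np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return dens

def gmm_sample(n, weights, means, stds, seed=None):
    """Draw n samples: pick a component by its weight, then sample that Gaussian."""
    rng = np.random.default_rng(seed)
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(np.asarray(means)[comp], np.asarray(stds)[comp])

# Hypothetical two-component porosity mixture (e.g. shale vs. sand populations)
weights, means, stds = [0.4, 0.6], [0.08, 0.22], [0.02, 0.03]
samples = gmm_sample(10000, weights, means, stds, seed=0)
```

A histogram of `samples` against `gmm_pdf` reproduces the bimodal PDF-vs-histogram comparison shown on the slide.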

8 Gaussian mixture models. The weights, means, and covariance matrices of the mixture are estimated by the EM algorithm (Hastie, Tibshirani, and Friedman, The Elements of Statistical Learning, 2009).
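A minimal 1D expectation-maximization fit can be written directly in numpy. This is a sketch of the generic EM iteration (quantile-based initialization is our own choice), not the specific implementation used by the authors:

```python
import numpy as np

def fit_gmm_em(x, L=2, n_iter=200):
    """Fit a 1D Gaussian mixture by EM: alternate soft assignments (E-step)
    and weighted parameter updates (M-step)."""
    n = len(x)
    # Initialize means at data quantiles so components start spread out
    mu = np.quantile(x, (np.arange(L) + 0.5) / L)
    var = np.full(L, x.var())
    w = np.full(L, 1.0 / L)
    for _ in range(n_iter):
        # E-step: responsibilities g[i, k] proportional to w_k * N(x_i; mu_k, var_k)
        g = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        g /= g.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from responsibilities
        nk = g.sum(axis=0)
        w = nk / n
        mu = (g * x[:, None]).sum(axis=0) / nk
        var = (g * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

In practice a library routine (e.g. scikit-learn's `GaussianMixture`) would be used; the loop above just makes the two EM steps explicit.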

9 Linear inverse problems (Gaussian). If m \sim \mathcal{N}(\mu_m, C_m) and d = G m + \varepsilon, with \varepsilon \sim \mathcal{N}(0, C_\varepsilon) independent of m, then m \mid d \sim \mathcal{N}(\mu_{m|d}, C_{m|d}) with

\mu_{m|d} = \mu_m + C_m G^T (G C_m G^T + C_\varepsilon)^{-1} (d - G \mu_m),
C_{m|d} = C_m - C_m G^T (G C_m G^T + C_\varepsilon)^{-1} G C_m

(Tarantola, Inverse Problem Theory and Methods for Model Parameter Estimation, 2005).
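The standard Gaussian posterior formulas can be computed directly. A minimal numpy sketch (variable names are ours):

```python
import numpy as np

def gaussian_linear_inversion(mu_m, C_m, G, d, C_e):
    """Posterior of m | d for d = G m + e, with m ~ N(mu_m, C_m), e ~ N(0, C_e).
    Standard Bayesian linear-Gaussian result (e.g. Tarantola, 2005)."""
    S = G @ C_m @ G.T + C_e            # covariance of the data d
    K = C_m @ G.T @ np.linalg.inv(S)   # gain matrix
    mu_post = mu_m + K @ (d - G @ mu_m)
    C_post = C_m - K @ G @ C_m
    return mu_post, C_post
```

For example, with a scalar model m ~ N(0, 1), identity operator, unit noise variance, and observation d = 2, the posterior is N(1, 0.5): the prior and the data are averaged with equal precision.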

12 Linear inverse problems (Gaussian). (Figure: the linear operator G; P-wave velocity Vp.)

13 Linear inverse problems (Gaussian). This result is based on two well-known properties of Gaussian distributions:
A. The linear transform of a Gaussian distribution is again Gaussian;
B. If the joint distribution of (m, d) is Gaussian, then the conditional distribution m | d is again Gaussian.
These two properties can be extended to the Gaussian mixture case.

14 Linear inverse problems (GM). Proposition 1: if x \sim \sum_{k=1}^{L} \pi_k \, \mathcal{N}(x; \mu_k, \Sigma_k) and G is a linear operator, y = G x, then y is distributed according to a Gaussian mixture:

y \sim \sum_{k=1}^{L} \pi_k \, \mathcal{N}(y; G \mu_k, G \Sigma_k G^T).

16 Linear inverse problems (GM). Proposition 2: if x = (x_1, x_2) is jointly distributed according to a Gaussian mixture \sum_{k=1}^{L} \pi_k \, \mathcal{N}(x; \mu_k, \Sigma_k), with means \mu_k = (\mu_{1,k}, \mu_{2,k}) and covariances \Sigma_k partitioned into blocks \Sigma_{11,k}, \Sigma_{12,k}, \Sigma_{22,k}, then x_2 \mid x_1 is distributed according to a Gaussian mixture whose components have the usual Gaussian conditional moments,

\mu_{2|1,k} = \mu_{2,k} + \Sigma_{21,k} \Sigma_{11,k}^{-1} (x_1 - \mu_{1,k}),
\Sigma_{2|1,k} = \Sigma_{22,k} - \Sigma_{21,k} \Sigma_{11,k}^{-1} \Sigma_{12,k},

and updated weights \lambda_k \propto \pi_k \, \mathcal{N}(x_1; \mu_{1,k}, \Sigma_{11,k}).
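Proposition 2 can be illustrated for a bivariate mixture: each component is conditioned with the usual Gaussian formulas, and the weights are re-scaled by how well each component explains the observed x_1. A sketch in numpy (our own notation, scalar x_1 and x_2 for simplicity):

```python
import numpy as np

def gm_conditional(x1, weights, mus, Sigmas):
    """Condition a joint GM over (x1, x2) on an observed x1.
    mus: list of (m1, m2); Sigmas: list of 2x2 covariances [[s11, s12], [s12, s22]]."""
    new_w, new_mu, new_var = [], [], []
    for w, (m1, m2), S in zip(weights, mus, Sigmas):
        s11, s12, s22 = S[0][0], S[0][1], S[1][1]
        # Gaussian conditioning within component k
        new_mu.append(m2 + s12 / s11 * (x1 - m1))
        new_var.append(s22 - s12 ** 2 / s11)
        # Updated weight: prior weight times the marginal likelihood of x1
        new_w.append(w * np.exp(-0.5 * (x1 - m1) ** 2 / s11) / np.sqrt(2 * np.pi * s11))
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(new_mu), np.array(new_var)
```

If x_1 falls near one component's marginal mean, that component's posterior weight approaches 1, which is exactly the mechanism used later to derive facies probabilities from the mixture weights.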

18 Linear inverse problems (GM). If m \sim \sum_{k=1}^{L} \pi_k \, \mathcal{N}(m; \mu_k, \Sigma_k) and d = G m + \varepsilon, with \varepsilon \sim \mathcal{N}(0, C_\varepsilon), then m \mid d is again a Gaussian mixture: each component is updated by the Gaussian formulas above, and the posterior weight of component k is proportional to \pi_k \, \mathcal{N}(d; G \mu_k, G \Sigma_k G^T + C_\varepsilon).
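Combining the two propositions, the GM posterior is obtained by running the Gaussian update on every component and re-weighting by each component's evidence for the data. A numpy sketch (our own function and variable names):

```python
import numpy as np

def gm_linear_inversion(weights, mus, Cs, G, d, C_e):
    """Bayesian linear inversion with a Gaussian-mixture prior on m, for d = G m + e.
    Each prior component gets the standard Gaussian posterior; its posterior weight
    is proportional to pi_k times the Gaussian likelihood of d under component k."""
    post_w, post_mu, post_C = [], [], []
    for w, mu, C in zip(weights, mus, Cs):
        S = G @ C @ G.T + C_e           # marginal covariance of d under component k
        Sinv = np.linalg.inv(S)
        r = d - G @ mu                  # residual of d w.r.t. component k
        K = C @ G.T @ Sinv
        post_mu.append(mu + K @ r)
        post_C.append(C - K @ G @ C)
        # Gaussian evidence N(d; G mu_k, S) re-weights the component
        like = np.exp(-0.5 * r @ Sinv @ r) / np.sqrt((2 * np.pi) ** len(d) * np.linalg.det(S))
        post_w.append(w * like)
    post_w = np.array(post_w)
    return post_w / post_w.sum(), post_mu, post_C
```

With two scalar prior components centered at 0 and 5 and an observation d = 5, the second component's posterior weight dominates while its mean stays at 5, matching the re-weighting formula on the slide.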

21 Introductory example. Comparison of the reference model, the Bayesian GMM inversion, and the Bayesian Gaussian inversion.

22 Sequential approach. The sequential approach to linear inverse problems in the Gaussian case was proposed by Hansen et al. (2006). We extend this approach to Gaussian mixture models.

23 Sequential approach.
1. Randomly visit a location k in the model space;
2. Compute the conditional mean and variance at k;
3. Draw a random value at location k from the computed distribution;
4. Use the simulated value as a conditioning datum for the next simulations;
5. Repeat steps 1-4 until all locations of the model space have been visited.
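The steps above can be sketched for the Gaussian case as a short sequential simulation loop. This is a generic illustration of the algorithm under an assumed prior N(mu, C), not the authors' implementation:

```python
import numpy as np

def sequential_gaussian_sim(mu, C, seed=0):
    """Sequential simulation from N(mu, C): visit locations in random order,
    condition each on the values already simulated, and draw."""
    rng = np.random.default_rng(seed)
    n = len(mu)
    order = rng.permutation(n)           # step 1: random visiting path
    sim = np.empty(n)
    done = []                            # indices already simulated
    for k in order:
        if done:
            # step 2: conditional mean/variance of m_k given simulated values
            C_ds = C[np.ix_(done, done)]
            c_ks = C[k, done]
            w = np.linalg.solve(C_ds, c_ks)
            m = mu[k] + w @ (sim[done] - mu[done])
            v = C[k, k] - w @ c_ks
        else:
            m, v = mu[k], C[k, k]
        # step 3: draw at location k; step 4: it conditions later steps
        sim[k] = rng.normal(m, np.sqrt(max(v, 0.0)))
        done.append(k)
    return sim
```

Because each draw uses the exact conditional distribution, the resulting realizations reproduce the full joint covariance C, which is the key property the sequential approach relies on.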

24 Sequential inversion (Gaussian). We simulate each element m_i sequentially, where m_s is the subvector of direct observations of m. If (m_i, m_s, d) is jointly Gaussian, then m_i | (m_s, d) is Gaussian, with mean and variance given by the standard Gaussian conditioning formulas (Hansen et al., Linear inverse Gaussian theory and geostatistics: Geophysics, 2006).

27 Main result: sequential inversion (GM). We simulate each element m_i sequentially, where m_s is the subvector of direct observations of m. If (m_i, m_s, d) is jointly distributed according to a Gaussian mixture, then m_i | (m_s, d) is again a Gaussian mixture, with an analytical formulation for the means, covariance matrices, and weights.

29 Applications.
Geostatistics (reservoir modeling): simulation of facies (discrete) and porosity (continuous) by Sequential Gaussian Mixture Simulation; the facies probability is derived from the weights of the mixture.
Geophysics (seismic inverse problems): inversion of seismic data for reservoir properties.

30 – 18 SGMixSim: conditional simulations

31 – 19 Bayesian GM inversion: example 1

32 Bayesian GM inversion: example 2. Well data and inverted velocities (top horizon). (Crossplots of porosity (v/v) vs. P-wave velocity (m/s), colored by sand content.)

33 Bayesian GM inversion: example 2. (Results for two priors: sand fraction 30% and 40%.)

34 Bayesian GM inversion: example 3. We use the linearized seismic forward model proposed by Buland and Omre (2003).

35 Conclusions. We presented a methodology to estimate reservoir properties as the solution of a Bayesian linear inverse problem with a Gaussian mixture prior. We proposed a sequential approach to Gaussian mixture linear inverse problems. The method can be applied to reservoir modeling and seismic reservoir characterization.

36 Backup

37 Advantages.
- The method is based on the exact analytical solution of the Bayesian linear inverse problem in the Gaussian mixture case.
- By introducing a mixture model, it is possible to condition the discrete distribution using a continuous parameter. This is particularly useful for discrete-continuous problems where the conditioning data are continuous.
- It does not require a variogram model of the discrete property, a variogram model of the conditioning data, or the cross-variogram.
- SISim is a consistent method only with 2 indicators; with 3 or more indicators it is not guaranteed that SISim provides probabilities between 0 and 1 (see POSTIK in GSLib).
- It can be difficult to handle a multimodal dataset with normal-score transformations.

38 SGMixSim: conditional simulations (without and with post-processing).

39 Main result: sequential inversion (GM). Analytical formulation of the posterior means, covariance matrices, and weights under the linear operator G.

