Stochastic inverse modeling under realistic prior model constraints with multiple-point geostatistics Jef Caers Petroleum Engineering Department Stanford Center for Reservoir Forecasting Stanford, California, USA

Acknowledgements I would like to acknowledge the contributions of the SCRF team, in particular Andre Journel and all graduate students who contributed to this presentation

Quote “Theory should be as simple as possible, but not simpler.” — Albert Einstein

Overview
* Multiple-point geostatistics: Why do we need it? How does it work? How do we define prior models with it?
* Data integration: integration of multiple types/scales of data; improvement on traditional Bayesian methods
* Solving general inverse problems: using prior models from multiple-point geostatistics; application to history matching

Part I Multiple-point Geostatistics

Limitations of traditional geostatistics [Figure: EW and NS variograms for three example images] Two-point correlation is not enough to characterize connectivity. A prior geological interpretation is required, and it is NOT multi-Gaussian.

Stochastic sequential simulation Define a multivariate (Gaussian) distribution over the random function Z(u), then decompose that distribution into its conditional (sequential) form, written out below.
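The decomposition referred to is the standard sequential one (written out here for reference):

f(z(u1), z(u2), ..., z(uN)) = f(z(u1)) · f(z(u2) | z(u1)) · ... · f(z(uN) | z(u1), ..., z(uN−1))

Each node is then simulated in sequence by drawing from its conditional distribution given the original data and all previously simulated nodes.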

Practice of sequential simulation [Figure: unknown node A surrounded by data B1, B2, B3] With B = {B1, B2, B3}, P(A|B) = N(m, σ²), where m and σ² are given by kriging and depend on the autocorrelation (variogram) function.
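For reference, the kriging quantities in P(A|B) = N(m, σ²) take the standard simple-kriging form, with λα the kriging weights, m the global mean and C the covariance implied by the variogram:

m_SK(u) = m + Σα λα [ z(uα) − m ]
σ²_SK(u) = C(0) − Σα λα C(u − uα)

so that P(A|B) = N(m_SK(u), σ²_SK(u)).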

Multiple-point geostatistics [Figure: reservoir with well data; multiple-point data event B around an unknown node A] Sequential simulation now requires the conditional probability P(A | B) for a multiple-point data event B.

Extended normal equations [Figure: multiple-point data configuration around the unsampled location u]

Single Normal Equation
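In practice the single normal equation reduces to a proportion of training replicates (written here for reference):

P(A | B) ≈ c(A, B) / c(B)

where c(B) is the number of replicates of the multiple-point data event B found in the training image, and c(A, B) is the number of those replicates whose central node takes the value A.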

The training image module Training image = standardized analog model quantifying geo-patterns. [Figure: example data event B scanned in the training image, giving P(A | B) = 1/4 for A = mud.] The SNESIM algorithm recognizes P(A|B) for all possible A and B.

The SNESIM algorithm From the training image and a data template (data search neighborhood), build a search tree:
* Construction requires scanning the training image one single time
* Minimizes memory demand
* Allows retrieving all training cpdf's for the adopted template
(A brute-force illustration of the underlying counting follows below.)
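To make the single-normal-equation idea concrete, the following minimal Python sketch scans a small 2D training image with a data template and counts replicates of a data event to estimate P(A | B). It is an illustration only, not the SNESIM search-tree implementation; the function name and example values are hypothetical.

import numpy as np

def mps_probability(ti, template, data_event, facies=1):
    """Estimate P(A = facies | B = data_event) by brute-force scanning of a 2D training image.
    ti         : 2D array of facies codes (the training image)
    template   : list of (di, dj) offsets around the central node (the data template)
    data_event : facies codes observed at those offsets (the data event B)
    facies     : facies code of the event A at the central node
    """
    ni, nj = ti.shape
    replicates = 0   # replicates of B found in the training image
    matches = 0      # replicates whose central node equals `facies`
    for i in range(ni):
        for j in range(nj):
            ok = True
            for (di, dj), code in zip(template, data_event):
                ii, jj = i + di, j + dj
                if not (0 <= ii < ni and 0 <= jj < nj) or ti[ii, jj] != code:
                    ok = False
                    break
            if ok:
                replicates += 1
                if ti[i, j] == facies:
                    matches += 1
    # If B is never found, fall back to the marginal proportion of the facies
    return matches / replicates if replicates > 0 else float((ti == facies).mean())

ti = np.array([[0, 0, 1, 1, 0],
               [0, 1, 1, 0, 0],
               [1, 1, 0, 0, 0],
               [1, 0, 0, 1, 1],
               [0, 0, 1, 1, 0]])
template = [(-1, 0), (1, 0), (0, -1), (0, 1)]   # N, S, W, E neighbours
data_event = [1, 0, 1, 0]                        # conditioning facies at those offsets
print(mps_probability(ti, template, data_event))

The search tree built by SNESIM stores these counts for all sub-events of the template after a single scan, which is why the training image only needs to be scanned once.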

Probabilities from a search tree [Figure: search neighborhood, search tree, and training image; tree levels range from level 0 (no conditioning data) to level 4 (4 conditioning data).]

Example [Figure: 400 sample data, a realization, the true image, and the training image.] CPU time: 2 facies, 1 million cells in 4 min 30 s on a 1 GHz PC.

Where do we get a 3D TI? A training image requires "stationarity": only patterns, i.e. repeated multiple-point statistics, can be reproduced. [Figure: a valid training image versus a non-valid one.]

Modular training image Modular?
* No units
* Rotation-invariant
* Affinity-invariant
[Figure: training image and models generated with snesim using the SAME training image.]

Properties of the training image
Required:
* Stationarity: patterns by definition repeat
* Ergodicity: to reproduce long-range features, use a large image
* Limited to 4-5 categories
Not required:
* Univariate statistics need not be the same as in the actual field
* No conditioning to ANY data
* Affinity/rotation need not be the same

Part II Multiple-point Geostatistics and data integration

Simple question, difficult problem… A geologist believes, based on geological data, that there is an 80% chance of having a channel at location X. A geophysicist believes, based on geophysical data, that there is a 75% chance of having a channel at location X. A petroleum engineer believes, based on engineering data, that there is an 85% chance of having a channel at location X. What is the probability of having a channel at X? This is the essential data integration problem: given P(A|B), P(A|C) and P(A|D), what is P(A|B,C,D)?

Combining sources of information

Conditional independence The conditional independence assumption states P(B, C | A) = P(B | A) P(C | A). Does it hold in practice? Not necessarily.

Correcting conditional independence

Permanence of ratios hypothesis
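For reference, the hypothesis can be written with the ratios a = (1 − P(A)) / P(A), b = (1 − P(A|B)) / P(A|B), c = (1 − P(A|C)) / P(A|C) and x = (1 − P(A|B,C)) / P(A|B,C). Permanence of ratios states that the incremental contribution of C is the same before and after knowing B:

x / b = c / a   ⇒   x = b·c / a   ⇒   P(A|B,C) = 1 / (1 + x) = a / (a + b·c)

This is the formulation due to Journel (2002).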

Advantages of using ratios
* No term P(B,C), hence MCMC is not required
* Works with P(A|B) and P(A|C), which are more intuitive than P(B|A) and P(C|A)
* Verifies all consistency conditions by definition
* It is still a form of independence, yet dependence can be reintroduced

Simple problem… P(A|B) = 0.80, P(A|C) = 0.75 => P(A|B,C)? Suppose P(A) = 0.5 => P(A|B,C) = 0.92. Suppose P(A) = 0.3 => P(A|B,C) = 0.95. This is a compounding of events. Lesson learned: if the geologist and the geophysicist agree at about 80%, you can be even more certain that there is a channel! (The first case is worked out below.)
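Working out the first case with the ratios defined above: a = (1 − 0.5)/0.5 = 1, b = (1 − 0.8)/0.8 = 0.25, c = (1 − 0.75)/0.75 ≈ 0.33, so x = b·c/a ≈ 0.083 and P(A|B,C) = 1/(1 + 0.083) ≈ 0.92, as quoted.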

Example reservoir [Figure: seismic-derived probability P(A|C), training image defining P(A|B), and a single realization.]

P(A|C), A = single-point! [Figure: P(A|C) and a realization.] When combining P(A|B) from geology and P(A|C) from seismic into P(A|B,C), ‘A’ is still a single-point event! Certain patterns, such as local rotation, will be ignored. Should seismic be honored only as a single-point probability?

Concept of MODULAR training image Modular?
* Stationary patterns
* Rotation-invariant
* Affinity-invariant
* No units
[Figure: modular training image and models generated with snesim using the SAME training image.]

Local rotation angle from seismic [Figure: P(A|C) and the local angle map.]

Results [Figure: two realizations, with and without the local rotation angle.]

Constrain to local “channel features” [Figure: P(A|C), hard data from seismic, and soft data from seismic.]

Part III Inverse modeling with multiple-point geostatistics Application to history matching

Production data does not inform geological heterogeneity [Figure: the view of a petroleum engineer versus Geologist 1, Geologist 2, and a disagreeing geologist.]

Approach Methodology: define a non-stationary Markov chain that moves a realization towards matching the data, with two properties:
* At each perturbation, maintain geological realism through the term P(A|B)
* Construct a soft data set “P(A|D)” such that the current realization moves as fast as possible towards matching the data => optimization of the Markov chain at each step

Methodology: two facies D = set of historic production data (pressures, flows). Some notation: i^(0)(u) = initial guess realization; i^(ℓ)(u) = realization at iteration ℓ.

Define a Markov chain Define a transition matrix, as detailed on the next slide.

Transition matrix A 2 × 2 transition matrix describes the probability of changing facies at location u; its entries are controlled by a single parameter r_D.

Parameter r_D
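As commonly written for the probability perturbation method (the exact notation may differ slightly from the original slide), the parameter defines the soft probability

P(A(u) | D) = (1 − r_D) · i^(ℓ)(u) + r_D · P(A),   with r_D ∈ [0, 1]

where i^(ℓ)(u) is the indicator value of the current realization at u and P(A) is the marginal facies probability. r_D = 0 reproduces the current realization unchanged; r_D = 1 discards it and draws an equiprobable new realization from the prior.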

Determine r_D Use P(A|D) as a probability model in multiple-point geostatistics:
* Combine P(A|B) (from the training image) with P(A|D) (from the production data D) into P(A|B,D)
* Allows generating iterations that are consistent with the prior geological vision
* Allows combining geological information with production data
* Allows determining an optimal value for r_D, as follows…

r_D determines a “perturbation” [Figure: perturbations of some initial model for r_D = 0.01, 0.1, 0.2, 0.5, and 1.] Find the r_D that best matches the production data = a one-dimensional optimization.

Complete algorithm
1. Construct a training image with the desired geological continuity constraint
2. Use snesim (P(A|B)) to generate an initial guess
3. Until an adequate match to the production data D is obtained:
   a. Define soft data P(A|D) as a function of r_D
   b. Perform snesim with P(A|D) to generate a new guess
   c. Find the value of r_D that best matches the data D
(A schematic code sketch follows.)
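A minimal Python sketch of this loop, under stated assumptions: snesim(), flow_simulation() and mismatch() are hypothetical placeholders for the geostatistical engine, the flow simulator and the production-data objective, and the soft probability follows the perturbation form given earlier. The search over r_D uses SciPy's bounded one-dimensional minimizer.

import numpy as np
from scipy.optimize import minimize_scalar

# snesim(), flow_simulation() and mismatch() are hypothetical placeholders,
# not calls to an actual library.

def perturb_probability(current_real, r_d, p_marginal):
    """Soft probability P(A|D): r_d = 0 reproduces the current realization,
    r_d = 1 ignores it and falls back to the marginal facies probability P(A)."""
    return (1.0 - r_d) * current_real + r_d * p_marginal

def history_match(training_image, hard_data, observed_data, p_marginal,
                  max_outer_iter=20, tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    # Initial guess drawn from the prior P(A|B) defined by the training image
    current = snesim(training_image, hard_data, soft_prob=None, seed=seed)

    for _ in range(max_outer_iter):
        inner_seed = int(rng.integers(1 << 31))  # random seed held fixed during the 1D search

        def objective(r_d):
            # Candidate realization honoring the prior (training image) and P(A|D)
            soft = perturb_probability(current, r_d, p_marginal)
            candidate = snesim(training_image, hard_data, soft_prob=soft, seed=inner_seed)
            return mismatch(flow_simulation(candidate), observed_data)

        # One-dimensional optimization of r_D over [0, 1]
        best = minimize_scalar(objective, bounds=(0.0, 1.0), method="bounded")
        soft = perturb_probability(current, best.x, p_marginal)
        current = snesim(training_image, hard_data, soft_prob=soft, seed=inner_seed)

        if best.fun < tol:
            break
    return current

Holding the random seed fixed within each 1D search is what makes the move gradual: r_D interpolates between the current realization and a fresh one rather than jumping to an unrelated model.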

Examples Generate 10 reservoir models that: 1. Honor the two hard data 2. Honor the fractional flow 3. Have geological continuity similar to the TI [Figure: model domain with injector (I) and producer (P).]

Single model

r_D values, single 1D optimization [Figure: objective function versus r_D value.]

Different geology

More wells

Hierarchical matching
* First, fix the permeability per facies and perturb the facies model
* Then, for a fixed facies model, perturb the permeability within each facies (using traditional methods: sequential self-calibration, gradual deformation)

Example [Figure: model domain with injector (I) and producer (P).]

Results [Figure: per-facies permeabilities of the models — Klow = 50, Khigh = 500; Klow = 12, Khigh = 729; Klow = 11, Khigh = 694; Klow = 150, Khigh = 750.]

Results

More realistic example [Figure: reference, initial model, and matched model.]

Conclusions What can multiple-point statistics provide?
* Large flexibility of prior models, with no need for a mathematical definition
* A fast, robust sampling of the prior
* A more realistic data integration approach than traditional Bayesian methods
* A generic inverse solution method that honors prior information

More on conditional independence Q: Why should the residuals ε_B and ε_C be independent unless they are homoscedastic, i.e. independent of A? Q: Is this not a mere transfer of the independence hypothesis onto ε_B and ε_C?