Uncertainty in petroleum reservoirs (Classification: Internal, 2010-05-25)


Slide 1: Uncertainty in petroleum reservoirs

Slide 2: Finding the reservoir I

Slide 3: Finding the reservoir II
The underground is packed with density gradients; interpreting these is a far cry from hard science. Top and base of reservoir (I think …).

Slide 4: Geological properties
Exploration well: try to infer properties on the km scale from a point measurement.

Slide 5: Porosity and permeability
[Figure: high porosity vs. low porosity; high permeability vs. low permeability.]

Slide 6: OK, what is inside this reservoir?
Internal barriers? Interface depth?

Slide 7: Fluid properties
[Figure: water-wet reservoir vs. oil-wet reservoir.]

Slide 8: Uncertain factors
–The geometry of the reservoir, including internal compartmentalization.
–The spatial distribution of porosity and permeability.
–Depth of fluid interfaces.
–Fluid and fluid-reservoir properties.
–…

Slide 9: What to do with it?
1. Deterministic models: attempts at modelling and quantifying uncertainty are certainly made, but mainly in the form of variable (stochastic) input, not stochastic dynamics.
2. Before production: a range of input values is tried out, and the future production is simulated. These simulations are an important basis for investment decisions.
3. After production start: once the field is producing we have measured values of e.g. produced rates of oil, gas and water, which can be compared with the simulated predictions; a misfit can be evaluated and the models updated.

Slide 10: History matching (or revisionism)
1. Select a set of true observations you want to reproduce in your simulations.
2. Select a (limited) set of parameters to update.
3. Update your parameters as best you can.
4. Simulate your model and compare simulated results with observations.
5. Is the discrepancy below tolerance? If no, return to step 3.
6. If yes: you have an updated model.
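The loop on this slide can be sketched in a few lines. Everything concrete here is an illustrative stand-in: the slide leaves the update scheme of step 3 unspecified, so a crude finite-difference descent step is used, and `simulate` is whatever forward model you plug in.

```python
import numpy as np

def history_match(simulate, d_obs, m0, tol=1e-6, max_iter=100, step=0.1):
    """Toy version of the slide-10 loop: perturb parameters downhill
    until the misfit between simulated and observed data is below tol."""
    m = np.asarray(m0, dtype=float)
    for _ in range(max_iter):
        misfit = np.sum((simulate(m) - d_obs) ** 2)   # step 4: compare
        if misfit < tol:                              # step 5: tolerance check
            return m, misfit                          # step 6: updated model
        # step 3 (stand-in): forward-difference gradient descent
        grad = np.array([(np.sum((simulate(m + h) - d_obs) ** 2) - misfit) / 1e-6
                         for h in np.eye(len(m)) * 1e-6])
        m = m - step * grad
    return m, misfit
```

A real history-matching workflow would replace the gradient step with whatever scheme is available; the point is only the structure of the iterate-simulate-compare loop.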

Slide 11: History matching – it is just plain stupid
Traditionally, history matching is perceived as an optimization problem – a very problematic approach:
–The problem is highly nonlinear, and severely underdetermined.
–The observations we are comparing with can be highly uncertain.
–The choice of parameterization is somewhat arbitrary – we will optimize in the wrong space anyway.

Slide 12: A probabilistic problem – Bayesian setting
{m}: model parameters; {d}: observed data.
P(m|d) ∝ P(d|m) · P(m), i.e. posterior ∝ likelihood × prior.

Slide 13: The objective function
Gaussian likelihood:
P(d|m) ∝ exp( −(S(m) − d)^T C^{-1} (S(m) − d) )
where S(m) is the result from the simulator, d is the observed data, and C is the covariance of the measurement errors. Evaluation of S(m) requires running the simulator and is very costly.
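Given a simulator output, the log of this likelihood is cheap to evaluate; a minimal sketch (function name and the idea of passing a precomputed S(m) are mine, since the expensive part is the simulator run itself):

```python
import numpy as np

def log_likelihood(S_m, d, C):
    """Gaussian log-likelihood from slide 13: -(S(m)-d)^T C^{-1} (S(m)-d).
    S_m : simulator output for parameters m (precomputed)
    d   : observed data
    C   : covariance of the measurement errors"""
    r = np.asarray(S_m) - np.asarray(d)        # data mismatch
    return -float(r @ np.linalg.solve(C, r))   # solve instead of forming C^{-1}
```

Using `np.linalg.solve` rather than explicitly inverting C is the standard numerically safer choice.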

Slide 14: How to find the posterior?
EnKF: data assimilation technique based on resampling of a finite ensemble under a Gaussian approximation. Gives good results when the Gaussian approximation applies, and fails spectacularly when it does not.
BASRA (MCMC with proxy functions): flexible and fully general approach. Guaranteed to converge to the correct posterior, but the convergence rate can be slow.

Slide 15: Kalman filter
Kalman filter: technique for sequential state estimation based on combining measurements with a linear equation of motion. Very simple (scalar) example:
State estimate: x_a = x_f + K (d − x_f), with gain K = σ_f² / (σ_f² + σ_d²)
(Co)variance estimate: σ_a² = (1 − K) σ_f²
where x_f is the forecast, d the measurement, σ_f² the forecast error variance and σ_d² the measurement error variance.
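The scalar update step is small enough to write out directly; this is just the textbook formula from the slide, with variable names of my choosing:

```python
def kalman_update(x_f, var_f, d, var_d):
    """Scalar Kalman analysis step (slide 15's 'very simple example').
    x_f, var_f : forecast state and its error variance
    d, var_d   : measurement and its error variance
    Returns the updated state and variance."""
    K = var_f / (var_f + var_d)    # Kalman gain: weight given to the data
    x_a = x_f + K * (d - x_f)      # state estimate
    var_a = (1.0 - K) * var_f      # (co)variance estimate, always reduced
    return x_a, var_a
```

With equal forecast and measurement error variances the update simply averages the two, which is a useful sanity check.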

Slide 16: EnKF
When the equation of motion is nonlinear, predicting the state covariance becomes difficult. The EnKF approach is to let an ensemble (i.e. a sample) evolve with the equation of motion, and use the sample covariance as a plug-in estimator for the state covariance.
–Gaussian likelihood.
–Gaussian prior.
–A combined parameter and state estimation problem.
–The updated state is a linear combination of the prior states.
Computationally efficient, but limiting.

Slide 17: EnKF – linear combination
[Figure: each ensemble member holds permeability, porosity, relperm and MULTFLT; the members are integrated forward in time to the observation, then updated.]
EnKF update: A^a = A^f X, i.e. the updated ensemble is the forecast ensemble times a matrix of combination coefficients.

Slide 18: EnKF update – sequential
The EnKF method updates the models every time data is available. When new data becomes available we can continue without going back.
[Figure: WOPR vs. time, marking the last historical data point and the future prediction.]

Slide 19: BASRA Workflow
1. Select a limited set (≲ 50) of parameters {m} to update, with an accompanying prior.
2. Perturb the parameter set, {m} → {m} + δ{m}, and evaluate a new misfit O({m}).
3. Accept the new state with probability P = min{1, exp(−δO({m}))}.
4. When this has converged we have one realization of {m} from the posterior, which can be used for uncertainty studies; repeat to get an ensemble of realizations.
The evaluation of the misfit is prohibitively expensive, and advanced proxy modelling is essential.
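The accept/reject rule in steps 2–3 is the Metropolis rule, and it can be sketched generically. This is only the bare chain, under assumptions of my own (a user-supplied `misfit` and `propose`); BASRA itself wraps this in proxy models precisely because each misfit evaluation means a reservoir simulation.

```python
import math
import random

def metropolis(misfit, m0, propose, n_steps, rng=random):
    """Bare Metropolis sampler for the slide-19 accept/reject rule.
    misfit  : O({m})
    propose : returns a perturbed copy {m} + delta{m}"""
    m, O = m0, misfit(m0)
    chain = [m]
    for _ in range(n_steps):
        m_new = propose(m)
        O_new = misfit(m_new)
        # accept with probability min(1, exp(-(O_new - O)))
        if O_new <= O or rng.random() < math.exp(-(O_new - O)):
            m, O = m_new, O_new
        chain.append(m)
    return chain
```

After burn-in, the chain's states are (correlated) draws from the posterior exp(−O({m})), so the tail of the chain plays the role of the "ensemble of realizations" on the slide.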

Slide 20: BASRA Results
[Figures: convergence of the proxies; marginal posteriors; prior and posterior ensembles.]

Slide 21: Current trends
–Reservoir modelling usually involves a chain of weakly coupled models and applications; strive hard to update parameters early in the chain.
–Updating slightly more exotic variables, like surface shapes and the direction of channels.
–The choice of parameterization is somewhat arbitrary – we will optimize in the wrong space anyway. A more systematic approach to choosing the parameterization would be very valuable.