Application of neural tools in geological data analyses Visiting lecture for IAMG student chapter in Szeged, Hungary 14th Nov 2008 Dr. Tomislav Malvić,


Dr. Tomislav Malvić, Grad. in Geol.
INA-Industry of Oil Plc., E&P of Oil and Gas, Reservoir Engineering and Field Engineering Dept. (advisor)
Faculty of Mining, Geology and Petroleum Engineering, Institute of Geology and Geological Engineering (visiting lecturer)

INTRODUCTION TO NEURAL ARCHITECTURE
Generally, neural networks can be divided into:
- Biological (human) networks, and
- Artificial or simulated networks (computer-algorithm-based).
Fig. 1: Biological (human) neurons
Fig. 2: Artificial neurons (schematic)

Fig. 3: The artificial neuron model (extended)
- The input layer collects and distributes data into the network.
- The hidden layer(s) process those data.
- Equation (1) represents the set of operations performed in the neuron.
- Equation (2) detects activation of the neuron.
The activation function: the output value (U) is compared with the condition necessary for hypothesis acceptance (t). The function fires only if this value is appropriate.
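The neuron model above can be sketched in a few lines. This is a minimal illustration, not the exact model of Fig. 3: the symbols U (summed output) and t (acceptance condition) follow the slide, while the step activation, weights and inputs are assumed for the example.

```python
def neuron_output(inputs, weights, t):
    """Equation (1): weighted sum U of the inputs.
    Equation (2): the activation fires (returns 1) only if U reaches
    the hypothesis-acceptance condition t."""
    U = sum(w * x for w, x in zip(weights, inputs))
    return 1 if U >= t else 0

# Example: two inputs, fixed weights, threshold t = 0.5
print(neuron_output([1.0, 0.0], [0.4, 0.3], t=0.5))  # 0 (U = 0.4 < 0.5)
print(neuron_output([1.0, 1.0], [0.4, 0.3], t=0.5))  # 1 (U = 0.7 >= 0.5)
```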

Fig. 4: Schematic organization of a neural network through the layers
Fig. 5: Adaptation of weighting coefficients and error decrease
The basic Equation 1 implies:
- previously determined weighting coefficients,
- the condition of hypothesis acceptance,
- the number of layers,
- the number of neurons in each layer.
Coefficient estimation is the BACK-PROPAGATION process (or back-error procedure).

A simple (basic) neuron architecture recognizes input behaviour by finding linearity (the perceptron concept). A back-propagation network overcomes this limitation by propagating the error backwards through hidden layers and adapting the coefficients; such a network is also called a Multilayer Perceptron network. The error is determined for each neuron and applied to adapt the weighting coefficients and the activation value. This constitutes the learning (training) and validation of the network. The weighting coefficients are calculated through Equations 3 and 4.
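The coefficient-adaptation idea can be illustrated with a single sigmoid neuron trained by gradient descent. This is a hedged sketch of the role Equations 3 and 4 play, not the slide's exact formulas: the error of the neuron is turned into a gradient (delta) and each weight moves against it; the learning rate, inputs and target are assumed for the example.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(weights, inputs, target, rate=0.5):
    """One gradient-descent update: delta combines the output error
    (target - out) with the sigmoid derivative out*(1-out), and each
    weight moves by rate * delta * input."""
    out = sigmoid(sum(w * x for w, x in zip(weights, inputs)))
    delta = (target - out) * out * (1.0 - out)
    return [w + rate * delta * x for w, x in zip(weights, inputs)]

# Repeated steps drive the neuron's output for this input toward the target
inputs, w = [1.0, 1.0], [0.1, -0.2]
for _ in range(200):
    w = train_step(w, inputs, target=1.0)
out = sigmoid(sum(wi * xi for wi, xi in zip(w, inputs)))
print(out)  # approaches 1.0 as training proceeds
```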

Backpropagation (disadvantages) – the most widely used paradigm, but often characterized by long training. This results from the gradient-descent method used in backpropagation, and the problem often appears in geophysical neural applications: with a very large dataset, sending each channel (attribute, input) back through the network can significantly decrease the learning rate (slow processing) and paralyze the network.

Resilient Propagation Algorithm (rProp) – one of the most common improvements of backpropagation. The main difference is that only the sign of the partial derivatives is used in the weighting-coefficient adjustment. It is about 4-5 times faster than the standard backpropagation algorithm.

Radial Basis Function network (RBF) – an artificial network that uses radial basis functions as activation functions. It is very often applied in function approximation, time-series prediction, etc. A radial basis function is a real-valued function whose value depends only on the distance from the origin or, alternatively, on the distance from some other point c, called a center.
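The radial-basis idea can be shown with the common Gaussian form. This is a minimal sketch, assuming a Gaussian basis and an illustrative width parameter sigma; the point is only that the value depends on the distance from the center c, not on direction.

```python
import math

def gaussian_rbf(x, c, sigma=1.0):
    """phi(x) = exp(-||x - c||^2 / (2*sigma^2)): equals 1 at the center c
    and decays with distance from it, regardless of direction."""
    dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
    return math.exp(-dist2 / (2.0 * sigma ** 2))

print(gaussian_rbf([0.0, 0.0], [0.0, 0.0]))        # 1.0 at the center
# Two points at the same distance from c give the same value:
print(gaussian_rbf([1.0, 0.0], [0.0, 0.0]) ==
      gaussian_rbf([0.0, 1.0], [0.0, 0.0]))        # True
```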

Fig. 6: The Multilayer Perceptron (MLP) backpropagation network
Fig. 7: The Radial Basis Function (RBF) network

ARCHITECTURE OVERVIEW
The network architecture includes:
1. Distribution of neurons into different layers;
2. Definition of the connection types among neurons;
3. Definition of how neurons receive inputs and calculate outputs;
4. Setting of the rules for adjusting the weighting coefficients.
The application of a neural network includes:
1. Learning (training) of the network;
2. Testing of the network;
3. Applying the network for prediction.
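The three application steps (learning, testing, applying) can be sketched with a deliberately trivial model. This is an illustration of the workflow only, under assumed synthetic data: a one-parameter threshold "network" is fitted on a training split, evaluated on a held-out split, and then applied to new inputs.

```python
import random

random.seed(0)
# Synthetic labelled data: class 1 if the value exceeds 0.5, else class 0
values = [random.random() for _ in range(100)]
data = [(v, 1 if v > 0.5 else 0) for v in values]
train, test = data[:70], data[70:]

# 1. Learning (training): pick the threshold that best separates the training set
candidates = [t / 100.0 for t in range(101)]
best_t = max(candidates,
             key=lambda t: sum(int(v > t) == y for v, y in train))

# 2. Testing: accuracy on the held-out data
accuracy = sum(int(v > best_t) == y for v, y in test) / len(test)

# 3. Applying: prediction for new, unlabelled inputs
def predict(v):
    return 1 if v > best_t else 0

print(best_t, accuracy)
```

The split between training and testing data is the essential point: the quality of a trained network can only be judged on data it has not seen.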

ANALYSED AREAS (CROATIAN PART OF THE PANNONIAN BASIN)
Fig. 8: Areas analyzed by neural networks in Croatia
- The Okoli field (prediction of facies) in 2006;
- The Beničanci field (porosity) in 2007; and
- The Kloštar field (lithology and saturation) in 2007/08.

OKOLI FIELD The neural analysis was performed using cVISION – Neuro Genetic Solution commercial software. Available at:

The Okoli field, located in the Sava depression, was selected as the example for clastic facies prediction using a neural network. Significant oil and gas reserves are proven in Lower Pontian sandstones. The analysis is based on the rProp algorithm. The network was trained using log data (curves GR, R16", R64", PORE/T/W, SAND & SHALE) from two wells (code names B-1 & B-2). The neural network was trained on a selected part of the input data, with the registered lithology of the c2 reservoir of Lower Pontian age as the analytical target. Positions of facies (sand/marl sequences) were predicted. The results indicate an over-trained network in the case of sandstone-sequence prediction (Figures 10, 11), because the marl sequences at the top and the base are mostly replaced by sandstone. Further neural facies modelling in the Sava depression needs to be expanded with additional logs that characterize lithology and saturation (SP, CN, DEN). Then the rProp algorithm could reach more than 90% probability of true prediction (in the presented analysis this value reached 82.1%).
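The rProp idea used here can be sketched on a toy problem. This is a hedged illustration, not the cVISION implementation: only the sign of the error gradient is used, with a per-weight step size that grows while the sign is stable and shrinks when it flips; the acceleration constants (1.2, 0.5) are the common rProp defaults and the quadratic error function is invented for the example.

```python
def sign(x):
    return (x > 0) - (x < 0)

def rprop_minimize(grad, w=5.0, step=0.1, iters=50):
    """Minimize a 1-D error function given its gradient, rProp-style:
    the magnitude of the gradient is ignored, only its sign is used."""
    prev = 0.0
    for _ in range(iters):
        g = grad(w)
        if g * prev > 0:                    # same sign: accelerate
            step = min(step * 1.2, 50.0)
        elif g * prev < 0:                  # sign flip: overshoot, back off
            step = max(step * 0.5, 1e-6)
        w -= sign(g) * step
        prev = g
    return w

# Minimizing E(w) = (w - 2)^2 via its gradient 2*(w - 2):
w_final = rprop_minimize(lambda w: 2.0 * (w - 2.0))
print(w_final)  # converges near the minimum at w = 2
```

Because the step sizes adapt per weight, large flat gradients do not stall the training the way plain gradient descent can, which is why rProp trains noticeably faster on large log-curve datasets.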

Figure 9: Structural map of the c2 reservoir top with the selected wells' positions

Figure 10: Relations of errors during the training (T), learning (L) and validation (V) periods, and positions of the Face and Best configurations (symbols F, B in the legend) for the B-1 well
Figure 11: Relations of errors during the training (T), learning (L) and validation (V) periods, and positions of the Face and Best configurations (symbols F, B in the legend) for the B-2 well

CONCLUSIONS (Okoli field)
1. This is the first neural analysis of a hydrocarbon reservoir in Croatia;
2. Excellent correlation was obtained between the predicted and true positions of sandstone lithology (reservoir of Lower Pontian age in the Sava depression);
3. On the contrary, the positions of predicted and true marlstones (at the top and bottom) mostly do not correspond;
4. The best prediction (the so-called Face machine) is reached in a relatively early training period: in the B-1 well in the 2186th iteration, and in the B-2 well in the 7626th iteration;
5. This means that similar facies analyses in the Sava depression do not require a large iteration set (about 30,000 iterations were used here);
6. The input dataset should be extended with other log curves that characterize lithology, porosity and saturation, such as SP (spontaneous potential), CN (compensated neutron), DEN (density) and others;
7. The desired true prediction could then reach 90% (the Face machine could be configured with 90% probability).

BENIČANCI FIELD
The neural analysis was performed using NEURO3 – Neural Network Software, freeware from the E&P Tools published by the National Energy Technology Laboratory (NETL), a laboratory owned and operated by the U.S. Department of Energy (DOE).

GENERAL LITHOLOGY AND NETWORK TYPE
The reservoir is represented by carbonate breccia (and conglomerates) of Badenian age. Locally, the thickness of the entire reservoir sequence exceeds 200 m. Three seismic attributes were interpreted from the 3D seismic cube – amplitude, phase and frequency – averaged and correlated with well porosities at the 14 well locations. These 14 seismic-and-porosity point data made up the network training set. The network was of the backpropagation type. It was fitted through iterations, searching for the maximal correlation between the attribute(s) and porosities and the minimal convergence criterion.
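The fitting criterion mentioned above, the correlation between an attribute and porosity, can be computed as the squared Pearson correlation. This is a small sketch of the measure only; the attribute/porosity pairs below are invented for illustration, not the Beničanci field values.

```python
def r_squared(xs, ys):
    """Squared Pearson correlation R^2 between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy)

# Perfectly linear attribute/porosity pairs give R^2 = 1:
print(r_squared([1.0, 2.0, 3.0], [4.0, 6.0, 8.0]))  # 1.0
```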

The best training was reached using all three attributes together, which indicates:
- a tendency of neural networks to favour numerous inputs;
- a physical connection among the seismic attributes.
Results are presented for:
- Kriging (Figure 12a);
- Cokriging (Figure 12b); and
- the Neural network (Figure 12c).
The neural map is based on cell estimation, rarely reaching the hard-data porosity minimum and maximum (its scale is 5-10%, while the geostatistical methods interpolated in 3-11%). This means that the neural estimation is more "conservative" than geostatistics (Figure 12c). The cokriging approach includes one attribute, whereas the neural approach favours using all three; the possible physical connection among attributes warns us to select the network inputs carefully and in a geologically meaningful way.

Figure 12a: Kriging porosity map (colour scale 4-10%) Figure 12b: Cokriging porosity map (colour scale 3-11%) Figure 12c: Neural network porosity map (colour scale 5-10%)

CONCLUSIONS (Beničanci field)
1. The neural network was selected as the tool for handling uncertainties of porosity distribution in the breccia-conglomerate carbonate reservoir of Badenian age;
2. The lateral changes in the averaged reservoir porosities are influenced by the Middle Miocene depositional environments;
3. The best porosity training results were obtained when all three seismic attributes (amplitude, frequency, phase) were used;
4. The reached correlation of the neural results for each attribute is R² = 0.987, with the convergence criterion Σe² = 0.329;
5. These values can differ slightly (a few percent) in every new training, which is a consequence of the stochastic (random-sampling) nature of some processes in the network fitting;
6. The results indicate that the neural network strongly favours numerous inputs, and that it can easily be applied in the Beničanci field for porosity prediction.

KLOŠTAR FIELD
The neural analysis was done with the StatSoft STATISTICA 7 package.

The field is located in the Sava depression. The largest oil reserves are in Upper Miocene sandstones of:
- the I. series (Lower Pontian age),
- the II. series (Upper Pannonian age).
Neural networks were trained on two wells (Klo-A and Klo-B). Inputs were conventional log data (curves SP, R16 and R64). The neural networks were used to predict:
- lithology and
- saturation with hydrocarbons.

DATA ANALYSIS
The network design included:
- the number of hidden layers and neurons in each layer;
- selection of the best training algorithm;
- the number of epochs (iterations);
- the learning rate (here also called the momentum coefficient).

LITHOLOGY PREDICTION
Input data:
- Spontaneous potential (SP) log
- Resistivity logs R16 and R64
- Paper descriptions of available cores
Lithology was defined as a categorical variable: sand (1) or marl (0).

Neural network type and properties   Well    Training error(a)   Selection error(a)
RBF 3-31-1                           Klo-A
MLP 3-4-6-3-1                        Klo-A
RBF 3-13-1                           Klo-B
MLP 3-6-4-2-1                        Klo-B

(a) Error values range from 0 to 1, where 0 represents 100% prediction success, i.e., no error.

LITHOLOGY PREDICTION (example in well Klo-B). The better results were obtained by the RBF network.
Figure 13: RBF network training (II. sandstone series UP, I. sandstone series DOWN)

SATURATION PREDICTION
Input data:
- Spontaneous potential (SP) log
- Resistivity logs R16 and R64
- Paper descriptions of available cores and saturation from DST
Hydrocarbon saturation was defined as a categorical variable: saturated (1) or unsaturated (0).

Neural network type and properties   Training error   Selection error
MLP 5-6-8-

SATURATION PREDICTION (examples from Klo-A and Klo-B). The better results were obtained in both wells by the MLP network.
Figure 14: MLP network training (both series are shown) (Klo-A UP, Klo-B DOWN)

CONCLUSIONS (Kloštar field)
1. Neural networks were trained with the tasks of: analysing the sandstone series of Upper Pannonian and Lower Pontian age; predicting lithology; and predicting hydrocarbon saturation.
2. The RBF network was used for prediction of lithology;
3. The MLP network was used for prediction of hydrocarbon saturation;
4. Results were very good, with small errors;
5. Neural networks could be applied in sandstone reservoir characterisation;
6. In the Sava depression, RBF and MLP networks are good tools for acquiring useful results from well logs and extending properties laterally along the reservoir.

RECOMMENDED REFERENCES
ANDERSON, J.A. and ROSENFELD, E. (1989): Neurocomputing: Foundations of Research. MIT Press, Cambridge, MA.
CHAMBERS, R.L. and YARUS, J.M. (2002): Quantitative Use of Seismic Attributes for Reservoir Characterization. RECORDER, Canadian SEG, Vol. 27, June.
CVETKOVIĆ, M. (2007): Petroleum geology use of neural networks on the example of reservoir in Kloštar field. Graduate thesis, University of Zagreb, Faculty of Mining, Geology and Petroleum Engineering, mentor Prof. Dr. J. Velić, 15 June 2007, 49 p.
MALVIĆ, T. (2006): Clastic facies prediction using neural networks (Case study from Okoli field). Nafta, 57, 10.
MALVIĆ, T. and PRSKALO, S. (2007): Some benefits of the neural approach in porosity prediction (Case study from Beničanci field). Nafta, 58, 9.
McCORMACK, M.D. (1991): Neural Computing in Geophysics. The Leading Edge, 10/1, Society of Exploration Geophysicists.
RIEDMILLER, M. and BRAUN, H. (1993): A direct adaptive method for faster backpropagation learning: The RProp algorithm. Proc. of the IEEE Intl. Conf. on Neural Networks, San Francisco.
ROSENBLATT, F. (1958): The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review, 65.
ZAHEDI, F. (1993): Intelligent Systems for Business: Expert Systems with Neural Networks. Wadsworth Publishing Inc.