
Well Log Data Inversion Using Radial Basis Function Network Kou-Yuan Huang, Li-Sheng Weng Department of Computer Science National Chiao Tung University Hsinchu, Taiwan and Liang-Chi Shen Department of Electrical & Computer Engineering University of Houston Houston, TX

Outline: Introduction; Proposed Methods (modification of two-layer RBF, proposed three-layer RBF); Experiments (simulation using two-layer RBF, simulation using three-layer RBF, application to real well log data inversion); Conclusions and Discussion.

Real well log data: Apparent conductivity vs. depth

Inversion to get the true layer effect?

Review of well log data inversion. Lin, Gianzero, and Strickland used the least-squares technique; Dyos used maximum entropy; Martin, Chen, Hagiwara, Strickland, Gianzero, and Hagan used a 2-layer neural network; Goswami, Mydur, Wu, and Hwliot used a robust technique; Huang, Shen, and Chen used a higher-order perceptron (IEEE IGARSS, 2008).

Review of RBF. Powell (1985) proposed RBF for multivariate interpolation. Hush and Horne (1993) used the RBF network for function approximation. Haykin (2009) summarized RBF networks in his Neural Networks book.

Conventional two-layer RBF (Hush and Horne, 1993)

Training in conventional two-layer RBF

Properties of RBF. The RBF network is a supervised training model. The 1st layer uses the K-means clustering algorithm to determine the K nodes. The activation function of the 2nd layer is linear, f(s)=s, so f'(s)=1. The 2nd layer is trained with the Widrow-Hoff learning rule.
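To make these steps concrete, here is a minimal Python sketch of such a two-layer RBF (not the authors' code): K-means picks the K centers, Gaussian basis functions give the first-layer outputs, and the linear second layer is trained with the Widrow-Hoff (LMS) rule. The Gaussian width heuristic and the learning rate are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def train_two_layer_rbf(X, D, K=27, lr=0.01, epochs=100, seed=0):
    """Two-layer RBF: K-means centers, Gaussian 1st layer, LMS-trained linear 2nd layer."""
    rng = np.random.default_rng(seed)
    # 1st layer: K-means clustering determines the K hidden nodes (centers).
    centers = KMeans(n_clusters=K, n_init=10, random_state=seed).fit(X).cluster_centers_
    # Width heuristic (assumption): sigma = d_max / sqrt(2K), d_max = largest center distance.
    d_max = max(np.linalg.norm(a - b) for a in centers for b in centers)
    sigma = d_max / np.sqrt(2.0 * K)

    def hidden(Z):
        # Gaussian basis: phi_k(x) = exp(-||x - c_k||^2 / (2 sigma^2))
        d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    Phi = hidden(X)                            # shape: (n_patterns, K)
    W = rng.normal(scale=0.1, size=(K, D.shape[1]))
    for _ in range(epochs):
        for phi, d in zip(Phi, D):
            y = phi @ W                        # linear activation: f(s) = s
            W += lr * np.outer(phi, d - y)     # Widrow-Hoff rule
    return centers, sigma, W, hidden
```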

Output of the 1st layer of RBF

Training in the 2nd layer

Outline: Introduction; Proposed Methods (modification of two-layer RBF, proposed three-layer RBF); Experiments (simulation using two-layer RBF, simulation using three-layer RBF, application to real well log data inversion); Conclusions and Discussion.

Modification of two-layer RBF

Training in modified two-layer RBF

Optimal number of nodes in the 1st layer

Perceptron training in the 2nd layer

Outline: Introduction; Proposed Methods (modification of two-layer RBF, proposed three-layer RBF); Experiments (simulation using two-layer RBF, simulation using three-layer RBF, application to real well log data inversion); Conclusions and Discussion.

Proposed three-layer RBF

Training in proposed three-layer RBF
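A hedged sketch of how such a three-layer RBF could be trained, reusing the first-layer outputs Phi from the previous sketch: the RBF layer feeds a two-layer perceptron whose weights are updated with backpropagation (the generalized delta rule on the next slide). The sigmoid activations, hidden size, and learning rate are assumptions, not the paper's reported settings.

```python
import numpy as np

def train_three_layer_rbf(Phi, D, n_hidden=16, lr=0.1, epochs=1000, seed=0):
    """3-layer RBF: RBF-layer outputs Phi feed a 2-layer perceptron trained by backprop."""
    rng = np.random.default_rng(seed)
    K, M = Phi.shape[1], D.shape[1]
    W1 = rng.normal(scale=0.1, size=(K, n_hidden))   # RBF layer -> hidden layer
    W2 = rng.normal(scale=0.1, size=(n_hidden, M))   # hidden layer -> output layer
    sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))
    for _ in range(epochs):
        for phi, d in zip(Phi, D):
            h = sigmoid(phi @ W1)                    # hidden activations
            y = sigmoid(h @ W2)                      # outputs (targets scaled to 0~1)
            delta_o = (d - y) * y * (1.0 - y)        # output deltas: (d - y) f'(net)
            delta_h = h * (1.0 - h) * (W2 @ delta_o) # hidden deltas, backpropagated
            W2 += lr * np.outer(h, delta_o)          # generalized delta rule updates
            W1 += lr * np.outer(phi, delta_h)
    return W1, W2
```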

Generalized delta learning rule (Rumelhart, Hinton, and Williams, 1986)
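For reference, the generalized delta rule in its standard form, with learning rate $\eta$, activation function $f$, node output $o$, and target $d$:

```latex
\Delta w_{ji} = \eta\,\delta_j\,o_i, \qquad
\delta_j =
\begin{cases}
(d_j - o_j)\, f'(\mathrm{net}_j), & j \text{ an output node},\\
f'(\mathrm{net}_j) \sum_k \delta_k w_{kj}, & j \text{ a hidden node}.
\end{cases}
```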

Outline: Introduction; Proposed Methods (modification of two-layer RBF, proposed three-layer RBF); Experiments (simulation using two-layer RBF, simulation using three-layer RBF, application to real well log data inversion); Conclusions and Discussion.

Experiments: system flow in simulation. Apparent resistivity (Ra) → apparent conductivity (Ca) → scale Ca to 0~1 (Ca') → radial basis function network (RBF) → true formation conductivity (Ct') → re-scale Ct' to Ct → true formation resistivity (Rt). The desired output for training is the true formation conductivity (Ct'').
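A small sketch of the pre- and post-processing in this flow, assuming conductivity is the reciprocal of resistivity and that the 0~1 scaling is min-max (the exact scaling used is not given on the slide):

```python
import numpy as np

def preprocess(Ra):
    """Apparent resistivity Ra -> apparent conductivity Ca -> Ca' scaled to 0~1."""
    Ca = 1.0 / np.asarray(Ra)          # conductivity is the reciprocal of resistivity
    lo, hi = Ca.min(), Ca.max()
    Ca_scaled = (Ca - lo) / (hi - lo)  # min-max scaling to [0, 1] (assumption)
    return Ca_scaled, (lo, hi)

def postprocess(Ct_scaled, bounds):
    """Network output Ct' -> re-scaled Ct -> true formation resistivity Rt."""
    lo, hi = bounds
    Ct = Ct_scaled * (hi - lo) + lo    # undo the min-max scaling
    return Ct, 1.0 / Ct                # Ct and Rt = 1/Ct
```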

Experiments on simulated well log data. In the simulation there are 31 well logs, computed theoretically by Professor Shen at the University of Houston. Each well log has the apparent conductivity (Ca) as the input and the true formation conductivity (Ct) as the desired output. Well logs #1~#25 are for training; well logs #26~#31 are for testing.

Simulated well log data examples: well log #7

Simulated well log data #13

Simulated well log data #26

What is the input data length? Output length? There are 200 records in each well log, with 25 well logs for training and 6 for testing. How many inputs to the RBF is best? Cut the 200 records, segment by segment, into lengths of 1, 2, 4, 5, 10, 20, 40, 50, 100, and 200 to find the best input data length for the RBF model. For inversion, the output data length equals the input data length. In testing, input n data to the RBF model to get n output data, then input the n data of the next segment to get the next n output data, repeatedly.
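A sketch of this segmentation and the segment-by-segment testing loop; `model` is a hypothetical callable that maps an n-point input segment to an n-point inverted output:

```python
import numpy as np

def segment_log(log, n):
    """Cut one 200-record log into non-overlapping length-n segments (pattern vectors)."""
    assert len(log) % n == 0
    return np.asarray(log).reshape(-1, n)   # e.g. n=10 gives 20 segments per log

def invert_log(model, log, n):
    """Invert a whole log segment by segment: n inputs give n outputs, repeatedly."""
    return np.concatenate([model(seg) for seg in segment_log(log, n)])
```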

Example of input data length at well log #13. If each segment (pattern vector) has 10 data, the 200 records of each well log are cut into 20 segments (pattern vectors).

Input data length and number of training patterns from the 25 training well logs (25 logs × 200 records = 5,000 records):
Input data length:              1     2     4     5    10    20    40    50   100   200
Number of training patterns: 5000  2500  1250  1000   500   250   125   100    50    25

Optimal cluster number of training patterns. Example for input data length 10: PFS vs. K. For input N=10, the optimal cluster number K is 27.
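The slide does not define PFS; assuming it denotes the pseudo-F statistic (equivalently, the Calinski-Harabasz index), the cluster-number search could look like this sketch:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import calinski_harabasz_score

def select_k_by_pseudo_f(X, k_range=range(2, 51), seed=0):
    """Pick the cluster number K that maximizes the pseudo-F (Calinski-Harabasz) score."""
    best_k, best_score = None, -np.inf
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
        score = calinski_harabasz_score(X, labels)
        if score > best_score:
            best_k, best_score = k, score
    return best_k
```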

Optimal cluster number of training patterns in the 10 cases. Set up 10 two-layer RBF models and compare their testing errors to select the optimal RBF model. (Table: N features, number of training patterns, K clusters; the K values were lost in transcription.)

Experiment: Training in modified two-layer RBF

Parameter setting in the experiment

Testing errors of the 2-layer RBF models in simulation. The selected RBF model gets the smallest testing error. (Table: network size, number of training patterns, MAE at 20,000 iterations, average MAE of the 6 well-log inversions, training CPU time in H:M:S; the numeric entries were lost in transcription.)

Training result: error vs. iteration using two-layer RBF

Inversion testing using two-layer RBF. Inverted Ct of log #26 by the network (MAE= ). Inverted Ct of log #27 by the network (MAE= ).

Inverted Ct of log #28 by network (MAE= ). Inverted Ct of log #29 by network (MAE= ).

Inverted Ct of log #30 by network (MAE= ). Inverted Ct of log #31 by network (MAE= ).

Outline: Introduction; Proposed Methods (modification of two-layer RBF, proposed three-layer RBF); Experiments (simulation using two-layer RBF, simulation using three-layer RBF, application to real well log data inversion); Conclusions and Discussion.

Experiment: Training in modified three-layer RBF. Hidden node number?

Determine the number of hidden nodes in the 2-layer perceptron
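A sketch of this selection step, with hypothetical helpers `train_fn` and `test_mae_fn` standing in for the training and evaluation routines; the candidate hidden sizes are illustrative:

```python
def select_hidden_size(train_fn, test_mae_fn, sizes=(5, 10, 15, 20, 25, 30)):
    """Train a 3-layer RBF per hidden size; keep the one with the lowest average test MAE."""
    results = {}
    for h in sizes:
        model = train_fn(n_hidden=h)     # train to convergence (e.g. 20,000 iterations)
        results[h] = test_mae_fn(model)  # average MAE over the 6 test well logs
    best = min(results, key=results.get)
    return best, results
```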

Hidden node number and optimal 3-layer RBF

Training result: error vs. iteration using three-layer RBF

Inversion testing using three-layer RBF. Inverted Ct of log #26 by the network (MAE= ). Inverted Ct of log #27 by the network (MAE= ).

Inverted Ct of log 28 by network (MAE= ) Inverted Ct of log 29 by network (MAE= )

Inverted Ct of log 30 by network (MAE= ) Inverted Ct of log 31 by network (MAE= )

Testing error of each well log using the three-layer RBF model. (Table: MAE of well log data inversion for logs #26~#31, plus the average error; the numeric values were lost in transcription.)

Average testing error of each three-layer RBF model in simulation. Among the RBFs with different numbers of hidden nodes, the model with the smallest average testing error is selected for the real-data application. (Table: network size, number of training patterns, MAE at 20,000 iterations, training CPU time in H:M:S, average MAE of the 6 well-log inversions; the numeric entries were lost in transcription.)

Outline: Introduction; Proposed Methods (modification of two-layer RBF, proposed three-layer RBF); Experiments (simulation using two-layer RBF, simulation using three-layer RBF, application to real well log data inversion); Conclusions and Discussion.

Real well log data: Apparent conductivity vs. depth

Application to real well log data inversion. Real well log data: depth from 5,577.5 to 6,772 feet, sampling interval 0.5 feet, 2,290 data in total in one well log. Select the optimal RBF model for real data inversion. After convergence in training, input 10 real data to the RBF model to get 10 output data, then input the 10 data of the next segment to get the next 10 output data, repeatedly.

Inversion of real well log data: Inverted Ct vs. depth

Outline: Introduction; Proposed Methods (modification of two-layer RBF, proposed three-layer RBF); Experiments (simulation using two-layer RBF, simulation using three-layer RBF, application to real well log data inversion); Conclusions and Discussion.

We modified the 2-layer RBF and proposed a 3-layer RBF for well log data inversion. The 3-layer RBF gives better inversion than the 2-layer RBF because the additional layer can do more nonlinear mapping. In the simulation, the optimal 3-layer model achieves the smallest average mean absolute error in testing. The trained RBF model is then applied to real well log data inversion; the result is acceptable and good, showing that the RBF model can work on well log data inversion. Errors differ across experiments because the initial weights of the network differ, but the relative order or percentage of the errors can still be used to compare RBF performance.

Thank you for your attention.