Slide 1
USING PROBABILISTIC NEURAL NETWORKS FOR ATTRIBUTE ASSESSMENT AND GENERAL REGRESSION
David Lubo-Robles*, Thang Ha, S. Lakshmivarahan, and Kurt J. Marfurt
The University of Oklahoma
Slide 2
AASPI | Introduction | GRNN | Conclusions | MLFN vs. PNN | Optimize σ | Exhaustive GRNN | Workflow | Case study

The Probabilistic Neural Network (PNN) is an interpolation method in which different mathematical operations are organized into layers, forming a neural network structure (Specht, 1990; Hampson et al., 2001). Whereas Multi-layer Feedforward Networks commonly use a sigmoid activation function, PNN uses a Gaussian distribution as its activation function (Specht, 1990). PNN was designed fundamentally for classification between two or more classes; however, it can be modified into the General Regression Neural Network (GRNN) for mapping/regression tasks (Masters, 1995).

Objectives:
- Predict missing logs in well data.
- Assess seismic attributes.
- Predict reservoir properties using seismic attributes.
Slide 3
Multi-layer Feedforward Network (MLFN):
Input Layer ($i_1, i_2, i_3, i_4$) → Hidden Layer → Output Layer
Linear step: weighted sum, e.g. $net = W_1 i_1 + W_2 i_2$
Nonlinear step: sigmoid activation function, $f(net) = \frac{1}{1 + e^{-net}}$
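The MLFN building block above can be sketched as a single hidden-layer neuron: a linear weighted sum passed through a sigmoid. This is a minimal illustration; the function name `mlfn_neuron` and the example weights are hypothetical.

```python
import math

def mlfn_neuron(inputs, weights):
    """One MLFN hidden-layer neuron: linear weighted sum, then sigmoid."""
    net = sum(w * x for w, x in zip(weights, inputs))  # net = W1*i1 + W2*i2 + ...
    return 1.0 / (1.0 + math.exp(-net))               # f(net) = 1 / (1 + e^(-net))

# Example: two inputs i1, i2 with hypothetical weights W1, W2
out = mlfn_neuron([0.5, -1.0], [0.8, 0.3])
print(out)  # sigmoid output, always between 0 and 1
```

A full MLFN stacks many such neurons and tunes the weights by training, which is what makes its learning slower than PNN's single pass over the training patterns.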
Slide 4
Probabilistic Neural Network (PNN):
Input Layer → Pattern Layer → Summation Layer → Output
Pattern layer: one Gaussian node per training pattern $p_j$ (e.g. $p_{11}, p_{12}$ for Class 1 and $p_{21}, p_{22}$ for Class 2), each computing
$\exp\left(-\sum_{i=1}^{M} \frac{(a_i - p_{ji})^2}{\sigma^2}\right)$
Summation layer: for each class, the averaged pattern-layer responses
$g(a) = \frac{1}{N} \sum_{j=1}^{N} \exp\left(-\sum_{i=1}^{M} \frac{(a_i - p_{ji})^2}{\sigma^2}\right)$
Output: the class with the largest summed response.
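A minimal sketch of the PNN layers described above, assuming a Gaussian kernel with a single smoothing parameter σ; the function name, training points, and class labels are hypothetical.

```python
import math

def pnn_classify(x, patterns_by_class, sigma):
    """Pattern layer: one Gaussian kernel per training pattern.
    Summation layer: average kernel response per class.
    Output: the class with the largest summed response."""
    scores = {}
    for label, patterns in patterns_by_class.items():
        s = 0.0
        for p in patterns:
            d2 = sum((xi - pi) ** 2 for xi, pi in zip(x, p))  # squared distance
            s += math.exp(-d2 / sigma ** 2)                   # Gaussian kernel
        scores[label] = s / len(patterns)
    return max(scores, key=scores.get)

# Two classes in a 2-attribute space (hypothetical training points)
train = {1: [(0.0, 0.0), (0.2, 0.1)], 2: [(1.0, 1.0), (0.9, 1.1)]}
print(pnn_classify((0.1, 0.1), train, sigma=0.5))  # nearest Class 1 patterns dominate
```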
Slide 5
General Regression Neural Network (GRNN):
Input Layer → Pattern Layer → Summation Layer (numerator and denominator) → Output (predicted value)
$\hat{v} = \frac{\sum_{j=1}^{N} v_j \exp\left(-\sum_{i=1}^{M} \frac{(a_i - p_{ji})^2}{\sigma^2}\right)}{\sum_{j=1}^{N} \exp\left(-\sum_{i=1}^{M} \frac{(a_i - p_{ji})^2}{\sigma^2}\right)}$
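The GRNN formula above is a Gaussian-weighted average of the training values $v_j$ and can be sketched directly; `grnn_predict` and the one-attribute sample data are hypothetical.

```python
import math

def grnn_predict(a, patterns, values, sigma):
    """GRNN output: Gaussian-weighted average of training values v_j.
    num and den correspond to the summation-layer numerator and denominator."""
    num = den = 0.0
    for p, v in zip(patterns, values):
        d2 = sum((ai - pi) ** 2 for ai, pi in zip(a, p))
        w = math.exp(-d2 / sigma ** 2)  # pattern-layer Gaussian response
        num += v * w
        den += w
    return num / den

# Hypothetical one-attribute training data with property values 1..3
patterns = [(0.0,), (1.0,), (2.0,)]
values = [1.0, 2.0, 3.0]
print(grnn_predict((1.0,), patterns, values, sigma=0.5))  # -> 2.0 (symmetric neighbors)
```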
Slide 6
GRNN applied to well data:
$\hat{v} = \frac{\sum_{j=1}^{N} v_j \exp\left(-\sum_{i=1}^{M} \frac{(a_i - p_{ji})^2}{\sigma^2}\right)}{\sum_{j=1}^{N} \exp\left(-\sum_{i=1}^{M} \frac{(a_i - p_{ji})^2}{\sigma^2}\right)}$
Training data (pattern): depth/time, attribute #1, attribute #2, property $v$.
Validation data (input): depth/time, attribute #1, attribute #2 → predicted property $\hat{v}$.
Slide 7
Optimizing σ (exhaustive search):
Cross-validation is used to train the neural network (Hampson et al., 2001). For each candidate σ within a fixed search range, part of the training data is left out, the property $v$ is predicted from the remaining data, and the validation error is computed, e.g.
$e = \frac{1}{N} \sum_{j=1}^{N} (v_j - \hat{v}_j)^2$

Leaving out | σ // error
1 | σ1 // e1
2 | σ2 // e2
3 | σ3 // e3
4 | σ4 // e4

The σ that minimizes the validation error is selected.
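The exhaustive σ search with leave-one-out cross-validation can be sketched as follows. This is a minimal illustration assuming a mean-squared validation error; the function names and sample data are hypothetical.

```python
import math

def grnn(a, P, V, sigma):
    """Minimal GRNN predictor (Gaussian-weighted average of training values)."""
    w = [math.exp(-sum((ai - pi) ** 2 for ai, pi in zip(a, p)) / sigma ** 2)
         for p in P]
    return sum(vj * wj for vj, wj in zip(V, w)) / sum(w)

def exhaustive_sigma(P, V, sigmas):
    """For each candidate sigma: leave one sample out, predict it from the
    rest, accumulate the validation error; keep the sigma with lowest error."""
    best_sigma, best_err = None, float("inf")
    for sigma in sigmas:
        err = 0.0
        for k in range(len(P)):
            pred = grnn(P[k], P[:k] + P[k+1:], V[:k] + V[k+1:], sigma)
            err += (V[k] - pred) ** 2
        err /= len(P)  # mean-squared validation error (assumed metric)
        if err < best_err:
            best_sigma, best_err = sigma, err
    return best_sigma, best_err

# Hypothetical linear data: a small sigma interpolates the neighbors best here
P = [(0.0,), (1.0,), (2.0,), (3.0,)]
V = [0.0, 1.0, 2.0, 3.0]
print(exhaustive_sigma(P, V, [0.3, 1.0, 3.0]))
```

Note that, as the conclusions point out, the cost grows with the number of candidate σ values, since each one requires a full cross-validation pass.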
Slide 8
Attribute assessment with training and validation data, leaving out Well #2. Combination list:

Attribute(s) | σ // error
1 | σ1 // e1
2 | σ2 // e2
3 | σ3 // e3
1,2 | σ1,2 // e1,2
1,3 | σ1,3 // e1,3
2,3 | σ2,3 // e2,3
1,2,3 | σ1,2,3 // e1,2,3
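The combination list above enumerates every non-empty subset of the candidate attributes; for three attributes there are $2^3 - 1 = 7$ entries. A sketch of the enumeration (the function name is hypothetical):

```python
from itertools import combinations

def attribute_combinations(n_attrs):
    """Enumerate every non-empty subset of attribute indices 1..n_attrs,
    mirroring the combination list (1; 2; 3; 1,2; 1,3; 2,3; 1,2,3)."""
    combos = []
    for r in range(1, n_attrs + 1):
        combos.extend(combinations(range(1, n_attrs + 1), r))
    return combos

print(attribute_combinations(3))  # 7 subsets for 3 attributes
```

In the Exhaustive GRNN workflow, each subset would then get its own σ search and validation error, and the subset with the lowest error is kept.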
Slide 9
(Figure-only slide; no recoverable text.)
Slide 10
CASE STUDY: PREDICTING POROSITY USING WELL LOGS IN DIAMOND M FIELD, TX
Slide 11
Geologic background:
Wells: Garnet, Emerald, Topaz, Jade. (Map modified from Walker et al., 1995.)
The Diamond M Field is located in Scurry County, TX, approximately 80 mi northeast of Midland, Texas. The trend is part of the Horseshoe Atoll Reef Complex, an arcuate chain of reef mounds made of mixed types of bioclastic debris that accumulated during the Late Paleozoic in the Midland Basin (Vest, 1970). The massive carbonates present in the atoll are separated by correlative shale beds (Davogustto, 2010).
Slide 12
Logs used for prediction: Gamma Ray (GR), Resistivity (RD), Density (ρ), and Photoelectric Factor (PEFZ).
Cross-plots: GR (API), ρ (g/cm³), RD (ohm-ft), and PEFZ, each vs. porosity.
Pearson's correlation coefficient:
$r_{x,y} = \frac{\mathrm{cov}(x, y)}{\sigma_x \sigma_y}$

Input attribute | Correlation with porosity
Gamma Ray (GR) | 0.5202
Density (ρ) | (shown in figure)
Resistivity (RD) | (shown in figure)
Photoelectric Factor (PEFZ) | (shown in figure)
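Pearson's correlation coefficient above can be computed directly from its definition, cov(x, y) divided by the product of the standard deviations; `pearson_r` is a hypothetical helper and the data are illustrative.

```python
import math

def pearson_r(x, y):
    """Pearson's correlation coefficient: cov(x, y) / (sigma_x * sigma_y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)  # population std of x
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)  # population std of y
    return cov / (sx * sy)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # perfectly linear data -> r = 1.0
```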
Slide 13
Combination lists, with validation error $e = \frac{1}{N} \sum_{j=1}^{N} (v_j - \hat{v}_j)^2$:
Leaving out well: Garnet (error values shown in figure).
Leaving out well: Emerald (error values shown in figure).
Slide 14
Combination lists:
Leaving out well: Jade (error values shown in figure).
Leaving out well: Topaz (error values shown in figure).
Slide 15
(Prediction results for the four blind wells; error values shown in figures.)
Slide 16
Cross-plots of measured vs. predicted porosity, with Pearson's correlation $r_{x,y} = \frac{\mathrm{cov}(x, y)}{\sigma_x \sigma_y}$:
Garnet, Emerald, Topaz, and Jade (correlation values shown in figure).
Slide 17
Future work (figure-only slide).
Slide 18
Conclusions:
- PNN provides faster learning than MLFN.
- PNN is insensitive to outliers and can therefore be more accurate than MLFN.
- GRNN is a powerful technique for predicting missing data using known properties as input.
- The novel Exhaustive GRNN technique allows us to determine the best group of attributes to use for a regression task.
- Computation time increases as a larger range of σ is used.

Future work:
- Apply the technique to seismic attributes to obtain a predicted rock-property survey.
- Extend the Exhaustive GRNN technique to classification tasks (Exhaustive PNN).
- Implement simulated annealing and the gradient descent method to obtain a different σ for each attribute and class.
Slide 19
References
Davogustto, O., 2013, Quantitative geophysical investigations at the Diamond M Field, Scurry County, Texas: Ph.D. dissertation, The University of Oklahoma.
Hampson, D. P., J. S. Schuelke, and J. A. Quirein, 2001, Use of multiattribute transforms to predict log properties from seismic data: Geophysics, 66.
Hughes, D., and N. Correll, 2016, Distributed machine learning in materials that couple sensing, actuation, computation and communication: arXiv: v1.
Masters, T., 1995, Advanced algorithms for neural networks.
Specht, D., 1990, Probabilistic neural networks: Neural Networks, 3.
Walker, D. A., J. Golonka, A. M. Reid, and S. T. Reid, 1995, The effects of late Paleozoic paleolatitude and paleogeography on carbonate sedimentation in the Midland Basin, Texas; Permian Basin plays: Society of Economic Paleontologists and Mineralogists, Permian Basin Chapter, Tomorrow's Technology Today.