Presentation transcript:

1 Backpropagation Neural Network for Soil Moisture Retrieval Using NAFE05 Data: A Comparison of Different Training Algorithms. Soo See Chai, Department of Spatial Sciences, Curtin University of Technology

2 CONTENT
- Neural Network and Soil Moisture Retrieval
- Backpropagation Neural Network
- Training of Neural Network
- Testing Results
- Q and A

3 Neural Network for Soil Moisture Retrieval
Radiometric signatures of a vegetation-covered field reflect the integrated response of the soil and vegetation system to the observing microwave system.
Surface parameters and radiometric signatures: [diagram]

4 Backpropagation Neural Network

5 Different Backpropagation Training Algorithms
- Different training algorithms have different computation and storage requirements.
- No single algorithm is best suited to all locations.
- MATLAB provides 11 different training algorithms.
- Reviewed so far: basic gradient descent and the Levenberg-Marquardt (LM) algorithm.
- What about the other algorithms?
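As a rough sketch of how such a comparison can be set up in MATLAB's older Neural Network Toolbox (newff syntax), the same network is retrained with each of the 11 training functions; P (inputs) and T (soil moisture targets) are placeholder names, not from the slides:

    algs = {'traingdm','traingda','traingdx','trainrp','traincgf', ...
            'traincgp','traincgb','trainscg','trainbfg','trainoss','trainlm'};
    for k = 1:numel(algs)
        % 4 sigmoid hidden neurons, 1 linear output (see slide 12)
        net = newff(minmax(P), [4 1], {'logsig','purelin'}, algs{k});
        net.trainParam.epochs = 200;
        net = train(net, P, T);    % P: 2-by-n inputs, T: 1-by-n targets
    end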

6 Data Preparation
- Roscommon area: 1/11, 8/11, 15/11
- Determine the Roscommon area coordinates (top latitude, bottom latitude, left longitude, right longitude)
- MATLAB: cut out the area and extract the fields from the PLMR file
- Copy the latitude, longitude, brightness temperature and altitude data into Excel
- Extract the aircraft altitudes of the medium-resolution mapping, which is around 1050 m to 1270 m ASL
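A minimal sketch of the area cut, assuming lat, lon, tb and alt hold the PLMR latitude, longitude, brightness temperature and altitude fields, and that topLat, bottomLat, leftLon and rightLon hold the Roscommon coordinates listed on the slide:

    inArea = lat <= topLat  & lat >= bottomLat & ...
             lon >= leftLon & lon <= rightLon;
    medRes = alt >= 1050 & alt <= 1270;    % medium-resolution flights, m ASL
    keep   = inArea & medRes;
    roscommon = [lat(keep) lon(keep) tb(keep) alt(keep)];  % export to Excel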

7 [Maps: Roscommon 1/11, Roscommon 8/11, Roscommon 15/11]

8 Example: [figure]

9 A bit of Statistics…
- Find the minimum and maximum of the average Tb for each data set
- Next, find the range (max - min)
- Find the width of each class (3 classes: training, validation and testing): range / 3
- Find the starting and ending point of each class
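The binning arithmetic above is a few lines of MATLAB; avgTb is assumed to be the vector of average Tb values for one data set:

    tbMin = min(avgTb);
    tbMax = max(avgTb);
    width = (tbMax - tbMin) / 3;               % range / 3 classes
    edges = tbMin + width * (0:3);             % start/end point of each class
    cls   = min(floor((avgTb - tbMin) / width) + 1, 3);  % class index 1..3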

10 We now have:
- Group 1 for dates 1/11, 8/11 and 15/11 (combined: GRP1)
- Group 2 for dates 1/11, 8/11 and 15/11 (combined: GRP2)
- Group 3 for dates 1/11, 8/11 and 15/11 (combined: GRP3)
- GRP1: randomly divided into 3 sets: 60% for training, 30% for validation and 10% for testing
- Same for GRP2 and GRP3
- All training data in one file, all validation data in one file, all testing data in one file
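A sketch of the random 60/30/10 split for one group, assuming grp1 is a matrix with one sample per row (the same code would be run for GRP2 and GRP3):

    n   = size(grp1, 1);
    idx = randperm(n);                         % shuffle the rows
    nTr = round(0.6 * n);
    nVa = round(0.3 * n);
    trainSet = grp1(idx(1:nTr), :);
    valSet   = grp1(idx(nTr+1:nTr+nVa), :);
    testSet  = grp1(idx(nTr+nVa+1:end), :);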

11 Training: K-Fold Cross Validation
- The number of data samples is small, so K-fold cross validation is used to obtain a more reliable accuracy estimate.
- Training data + validation data = 112 samples; with 8-fold cross validation, each fold uses 14 samples for validation and 98 for training.
- To make sure the split is random enough, the data are re-randomized on each run.
- E.g. the first and second runs place the validation block at different positions [diagram]
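A sketch of one pass of this scheme (the fold count follows from 98 + 14 = 112 samples); X and Y are assumed to hold the 112 inputs and targets, one sample per column:

    nVal  = 14;
    order = randperm(112);                     % re-randomize on every run
    for f = 1:8
        valIdx   = order((f-1)*nVal+1 : f*nVal);  % 14 validation samples
        trainIdx = setdiff(order, valIdx);        % remaining 98 for training
        % ... train on X(:,trainIdx), Y(:,trainIdx); validate on X(:,valIdx) ...
    end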

12 Training: NN Parameter Determination
- A series of trial-and-error experiments; select the configuration with the lowest RMSE.
- Once the lowest RMSE is found, save the input weights, layer weights and biases of the NN so they can be reused with the other training algorithms.
- Fixed architecture: 3 layers (1 input, 1 hidden, 1 output)
- Input: H-polarized brightness temperature (TbH) and physical soil temperature at 4 cm
- Hidden layer: sigmoid function
- Output layer: linear function; output is soil moisture (%v/v)
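A sketch of freezing one initialization so that every training algorithm starts from identical weights; the .mat file names are those given on the next slide:

    net = newff(minmax(P), [4 1], {'logsig','purelin'});
    iw = net.IW{1,1};  lw = net.LW{2,1};       % input and layer weights
    b1 = net.b{1};     b2 = net.b{2};          % hidden and output biases
    save W2.mat iw;  save LW2.mat lw;  save B2.mat b1 b2;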

13 Experiments carried out. Decision:
- Learning rate, lr =
- Momentum, mc = 0.4
- Input weight, iw = W2.mat
- Layer weight, lw = LW2.mat
- Bias, b = B2.mat
- No. of hidden neurons = 4
- No. of epochs = 200
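Applied before each run, these settings would look roughly as follows; lr is a placeholder, since the learning-rate value is missing above:

    load W2.mat;  load LW2.mat;  load B2.mat;  % reuse saved initialization
    net.IW{1,1} = iw;  net.LW{2,1} = lw;
    net.b{1}    = b1;  net.b{2}    = b2;
    net.trainParam.lr     = lr;                % value not given on the slide
    net.trainParam.mc     = 0.4;               % momentum
    net.trainParam.epochs = 200;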

14 Testing Result I – Roscommon
No.  Backpropagation Algorithm                                                       RMSE (%)
1    Batch gradient descent with momentum (traingdm)
2    Gradient descent with adaptive learning rate (traingda)
3    Gradient descent with momentum and adaptive learning rate (traingdx)
4    Resilient backpropagation (trainrp)
5    Conjugate gradient backpropagation with Fletcher-Reeves updates (traincgf) *
6    Conjugate gradient backpropagation with Polak-Ribière updates (traincgp) *
7    Conjugate gradient backpropagation with Powell-Beale restarts (traincgb) *
8    Scaled conjugate gradient backpropagation (trainscg)
9    Quasi-Newton algorithm: BFGS (trainbfg)
10   Quasi-Newton algorithm: one-step secant (trainoss)
11   Levenberg-Marquardt (trainlm)                                                   4.04
* finished early: epochs =
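For reference, the RMSE (%) column would be computed from the held-out test file like this, assuming Ptest and Ttest hold the test inputs and soil moisture targets:

    yhat = sim(net, Ptest);                    % estimated soil moisture (%v/v)
    rmse = sqrt(mean((yhat - Ttest).^2));      % RMSE in %v/v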

15 Conclusions
- The different backpropagation training algorithms give different but similar accuracy results.
- The training data is representative of the testing data.

16 Questions
- Is the NN architecture transferable?
- Is the amount of data a factor contributing to the accuracy of the retrieval?
- Would adding ancillary data (besides soil temperature), such as vegetation water content and land cover information, help?
- Would adding V-polarized brightness temperature as an input help?
- Should these data be added directly, or should the NN be left to account for them?

17