Artificial Neural Network Techniques for Estimating Heavy Rainfall from Satellite Data
Ming Zhang, Roderick A. Scofield
NOAA/NESDIS/ORA, 5200 Auth Road, Room 601, Camp Springs, MD 20746, USA
ANSER System Interface
ANSER System: Artificial Neural network expert System for Estimation of Rainfall from satellite data
- US$66,000, supported by the US National Research Council & NOAA
- A$11,000, supported by the Australian Research Council & NOAA
- US$62,000, supported by the US National Research Council & NOAA
Why Develop ANSER?
- More than $3.5 billion in property damage and more than 225 deaths are caused by heavy rain and flooding each year
- No rainfall estimation capability exists within GIS systems, and no real-time, operational rainfall estimation system exists anywhere in the world
- Can ANNs be used in weather forecasting? If so, how should ANN techniques be applied in this area?
Why Use Neural Network Techniques?
- Two directions for new-generation computers: quantum computers and artificial neural networks
- Much faster speed?
- Complicated pattern recognition?
- Knowledge bases with unknown rules?
- Self-learning reasoning networks?
- Superposition for multiple choices?
ANN Operator Simulator
Cloud Merger Operator
- MI(i, j) is a black-and-white image that can be used to represent a satellite image.
- Label set: L = {0, 1, 2, ..., M}. Each label corresponds to a different cloud merger.
- Cloud merger recognising operator CMR: MI(i, j) → L
Cloud Merger Operator Set
- The cloud merger recognising operator CMR is the operator set:
  CMR = {CMCI, CMR1, CMR2, CMS1, CMS2, CMS3, CMS4, CMM1, CMM2, CMM3, CMM4}
- where CMCI is the circle-input satellite data cloud merger recognising operator, ...
Ternary Output of Cloud Merger Operator
- The operator output is ternary:
  L = 1 if O(Ns,t) satisfies condition 1 (cloud merger)
  L = 2 if O(Ns,t) satisfies condition 3 (further test needed)
  L = 0 if O(Ns,t) satisfies condition 2 (cloud not merged)
- where the s-th layer is the output layer of the NN.
- All other operators (CMR1, CMR2, CMS1, CMS2, CMS3, CMS4, CMM1, CMM2, CMM3, CMM4) have the same definition as CMCI.
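The ternary mapping can be pictured in code roughly as below. This is a minimal sketch, not the ANSER implementation: the threshold values and the `network.predict` interface are assumptions made only to illustrate how a network's output activation is reduced to a label in L.

```python
import numpy as np

# Assumed cut-off values on the output-layer activation O(Ns,t); the
# thresholds actually used by the ANSER operators are not given here.
THETA_MERGER = 0.8     # at or above this -> cloud merger
THETA_NO_MERGER = 0.2  # at or below this -> cloud not merged


def ternary_label(output_activation: float) -> int:
    """Reduce the output activation of a cloud-merger network to the
    ternary label set: 1 = merger, 0 = no merger, 2 = further test."""
    if output_activation >= THETA_MERGER:
        return 1
    if output_activation <= THETA_NO_MERGER:
        return 0
    return 2


def cmr(binary_image: np.ndarray, network) -> int:
    """CMR-style operator: map a black-and-white image MI(i, j) to a
    label in L by running a trained network and thresholding its output."""
    activation = float(network.predict(binary_image.ravel()[None, :]))
    return ternary_label(activation)
```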
Cloud Merge Using ANN Circle Operator
Cloud Merge Using ANN S-Shape 4 Operator
Results of Cloud Merger Operator
Table of merger vs. no-merger recognition results for each ANN operator: Circle, two Rectangle, four S-Shape, and four Moon-Shape operators.
PT-HONN Data Simulator
Simulating with the PT-HONN Simulator
PT-HONN Model
- The network architecture of PT-HONN combines the characteristics of both PHONN and THONN.
- It is a multi-layer network consisting of an input layer with input units, an output layer with output units, and two hidden layers of intermediate processing units.
Definition of PT-HONN
Structure of PT-HONN
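As an illustration of how a combined polynomial-trigonometric higher-order model could be evaluated, the sketch below builds second-order PHONN-style terms x^i y^j and THONN-style terms sin^i(x) cos^j(y) for two inputs and forms a weighted sum. The definition on the slide above is not reproduced in this extract, so the choice of terms, the order, and the function names here are assumptions.

```python
import numpy as np

def pt_honn_features(x: float, y: float, order: int = 2) -> np.ndarray:
    """Polynomial (PHONN-style) and trigonometric (THONN-style) basis
    terms for a two-input higher-order network; the exact combination
    used by PT-HONN is assumed here."""
    poly = [x**i * y**j for i in range(order + 1) for j in range(order + 1)]
    trig = [np.sin(x)**i * np.cos(y)**j
            for i in range(order + 1) for j in range(order + 1)]
    return np.array(poly + trig)


def pt_honn_output(x: float, y: float, weights: np.ndarray) -> float:
    """Network output: a weighted sum of the higher-order terms."""
    return float(weights @ pt_honn_features(x, y))
```

With order 2 and two inputs there are 18 terms, so `weights` has 18 entries; in a rainfall application they would be fit to (cloud-top temperature, cloud growth, half-hour rainfall) training triples by gradient descent.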
Knowledge of Rainfall
Half-hour rainfall (inches) as a function of cloud-top temperature and cloud growth (2/3, 1/3, or 0 of a latitude degree), for the cloud-top temperature classes:
- > -32 C
- -36 C
- -46 C
- -55 C
- -60 C
- -70 C
- < -80 C
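This knowledge is essentially a lookup table keyed by temperature class and growth class. A hypothetical sketch of such a lookup is shown below; the rainfall amounts themselves are not reproduced from the slide, so the caller must supply the table, and the class boundaries in `temperature_class` are an assumed reading of the temperature categories.

```python
# Temperature class bounds (degrees C), warmest to coldest, taken from
# the slide above; the rainfall values per (class, growth) cell are not
# reproduced here and must be supplied by the caller.
TEMP_CLASSES_C = [-32, -36, -46, -55, -60, -70, -80]


def temperature_class(cloud_top_temp_c: float) -> int:
    """Index of the coldest listed bound that the cloud top reaches."""
    index = 0
    for k, bound in enumerate(TEMP_CLASSES_C):
        if cloud_top_temp_c <= bound:
            index = k
    return index


def half_hour_rainfall(cloud_top_temp_c: float, growth: float,
                       table: dict) -> float:
    """Look up half-hour rainfall (inches); `table` maps
    (temperature class index, growth class) -> rainfall.
    Growth is snapped to the nearest of 2/3, 1/3, or 0 latitude degrees."""
    growth_class = min((2 / 3, 1 / 3, 0.0), key=lambda g: abs(g - growth))
    return table[(temperature_class(cloud_top_temp_c), growth_class)]
```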
PT-HONN Results
|Error| % of PHONN vs. PT-HONN by cloud-top temperature class (> -32 C, -36 C, -46 C, -55 C, -60 C, -70 C, < -80 C) at cloud growth 1/… latitude degree.
- Average |Error|: PHONN 6.36%, PT-HONN 5.68%
Neuron-Adaptive Neural Network Simulator
NANN
- The network architecture of NANN is a multilayer feed-forward network consisting of an input layer with input units, an output layer with output units, and one hidden layer of intermediate processing units.
- There is no activation function in the input layer, and the output neurones are summing units (linear activation).
- The activation function for the hidden-layer processing units is a Neuron-Adaptive Activation Function (NAAF).
NAAF
The activation function for the hidden-layer processing units is a Neuron-Adaptive Activation Function (NAAF), defined with parameters a1, b1, a2, b2, a3, and b3, which are real variables adjusted (along with the weights) during training.
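The NAAF formula itself does not survive in this extract. A common neuron-adaptive form combines sigmoid, sine, and Gaussian terms, each controlled by one (a, b) pair; the sketch below uses that assumed form only to show how the six parameters sit alongside the weights as trainable quantities. It is not presented as the exact definition from the slide.

```python
import numpy as np

class NAAFNeuron:
    """Hidden unit with a Neuron-Adaptive Activation Function.
    The functional form here (sigmoid + sine + Gaussian) is an assumed
    illustration; a1..b3 are free parameters updated during training
    together with the network weights."""

    def __init__(self):
        self.a1, self.b1 = 1.0, 1.0   # sigmoid branch
        self.a2, self.b2 = 1.0, 1.0   # sine branch
        self.a3, self.b3 = 1.0, 1.0   # Gaussian branch

    def activate(self, x: float) -> float:
        sigmoid = self.a1 / (1.0 + np.exp(-self.b1 * x))
        sine = self.a2 * np.sin(self.b2 * x)
        gauss = self.a3 * np.exp(-self.b3 * x * x)
        return sigmoid + sine + gauss
```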
Structure of NANN
NANN Group
- A Neuron-Adaptive Feedforward Neural network Group (NAFNG) is a neural network group in which each element is a neuron-adaptive feedforward neural network Fi. We have:
  NAFNG = {F1, F2, F3, ..., Fi, ..., Fn}
Features of NANN
- Hornik (1991): if the activation function is continuous, bounded, and nonconstant, then a standard FNN can approximate any continuous function.
- Leshno (1993): a standard FNN can approximate any continuous function if the network's activation function is not a polynomial.
- A neuron-adaptive feedforward neural network group with adaptive neurones can approximate any piecewise continuous function.
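One way to picture how a network group approximates a piecewise continuous function is to let each member handle one continuous piece of the target. The sketch below assumes a simple interval-based assignment of inputs to members; this is an illustration rather than the NAFNG construction itself.

```python
from typing import Callable, List, Tuple

# A group member is any trained network exposed as a scalar function.
Member = Callable[[float], float]


def group_predict(x: float,
                  members: List[Tuple[Tuple[float, float], Member]]) -> float:
    """Evaluate a group of networks: each member covers one interval
    (one continuous piece of the target function); pick the member whose
    interval contains x and return its output."""
    for (lo, hi), net in members:
        if lo <= x < hi:
            return net(x)
    raise ValueError("no group member covers this input")
```

For example, `members = [((0.0, 1.0), net_a), ((1.0, 2.0), net_b)]` lets `net_a` and `net_b` each learn a different continuous piece of the target function.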
ANSER Rainfall Estimation Results (May 2000)
- 9 May, 18Z: ANSER min 1.47 mm, max 6.37 mm; observed (NAVY) min 2.0 mm, max 6.0 mm
- 12 May, 07Z: ANSER min 2.45 mm, max 9.31 mm; observed (gage) min 2.0 mm, max 12.0 mm
- 23 May, 06Z: ANSER min 0.98 mm, max 8.82 mm; observed (gage) min 1.8 mm, max 9.0 mm
- 24 May, 06Z: ANSER min 7.10 mm, max 27.69 mm; observed (gage/NAVY) min 6.0 mm, max 33.0 mm
Conclusion: What Was Proved
Artificial neural network techniques can provide:
- Much faster speed: 5-10 times faster
- Complicated pattern recognition: cloud mergers
- Knowledge bases with unknown rules: rainfall
- Reasoning networks: rainfall estimation
Conclusion: Next Steps
- Rebuild the interface and retrain the neural networks
- New neural network models: more complicated pattern recognition
- Self-expanding knowledge base: extract knowledge from real-time cases
- Self-learning reasoning network: towards an automatic system
- Research 15 years ahead: artificial neural networks, one of the two directions for new-generation computers
ANSER Rainfall Estimation Result, 9 May 2000, Time: 18Z
- ANSER: min 1.47 mm, max 6.37 mm
- NAVY: min 2.0 mm, max 6.0 mm
ANSER Rainfall Estimation Result, 12 May 2000, Time: 07Z
- ANSER: min 2.45 mm, max 9.31 mm
- Gage: min 2.0 mm, max 12.0 mm
ANSER Rainfall Estimation Result, 23 May 2000, Time: 06Z
- ANSER: min 0.98 mm, max 8.82 mm
- Gage: min 1.8 mm, max 9.0 mm
ANSER Rainfall Estimation Result, 24 May 2000, Time: 06Z
- ANSER: min 7.10 mm, max 27.69 mm
- Gage: min 6.0 mm, max 23.0 mm
- NAVY: min 7.0 mm, max 33.0 mm