
Canadian Weather Analysis Using Connectionist Learning Paradigms Imran Maqsood*, Muhammad Riaz Khan , Ajith Abraham  * Environmental Systems Engineering Program, Faculty of Engineering, University of Regina, Regina, Saskatchewan S4S 0A2, Canada,  Partner Technologies Incorporated, 1155 Park Street, Regina, Saskatchewan S4N 4Y8, Canada,  Faculty of Information Technology, School of Business Systems, Monash University, Clayton 3800, Australia, 7th Online World Conference on Soft Computing in Industrial Applications (on WWW), September 23 - October 4, 2002

23 Sep - 04 Oct 2002, WSC7

CONTENTS
- Introduction
- MLP, ERNN and RBFN Background
- Experimental Setup of a Case Study
- Test Results
- Conclusions

1. INTRODUCTION
- Weather forecasts provide critical information about future weather conditions.
- Weather forecasting remains a complex business because of the chaotic and unpredictable nature of the atmosphere.
- Combined with the threat of global warming and the greenhouse-gas effect, the impact of extreme weather on society is increasingly costly, causing infrastructure damage, injury and loss of life.

- Accurate weather forecast models are important for countries whose agriculture depends heavily on the weather.
- Several artificial intelligence techniques have previously been used to model the chaotic behavior of weather.
- However, many of them rely on simple feed-forward neural networks trained with the backpropagation algorithm.

Study Objectives
- To develop accurate and reliable predictive models for forecasting the weather of Vancouver, BC, Canada.
- To compare the performance of multi-layered perceptron (MLP) neural networks, Elman recurrent neural networks (ERNN) and radial basis function networks (RBFN) for the weather analysis.

2. ANN BACKGROUND INFORMATION
ANN Advantages
- Ability to solve complex and non-linear problems
- Quick response
- Self-organization
- Real-time operation
- Fault tolerance via redundant information coding
- Adaptability and generalization

(a) Multi-Layered Perceptron (MLP) Networks
- The network is arranged in layers of neurons.
- Every neuron in a layer computes the sum of its inputs and passes this sum through a nonlinear function as its output.
- Each neuron has only one output, but this output is multiplied by a weighting factor when it is used as an input to another neuron in the next higher layer.
- There are no connections among neurons in the same layer.

Figure: Architecture of a 3-layered MLP network (inputs I1…In, hidden layer, outputs O1…Om) for weather forecasting
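The sum-and-squash computation described above can be sketched in Python/NumPy. This is a minimal illustration, not the study's MATLAB code: the weights are random placeholders, only the layer sizes (45 hidden neurons, log-sigmoid hidden activation, pure-linear output) follow the configuration reported later in the slides.

```python
import numpy as np

def logsig(x):
    # Log-sigmoid activation used in the hidden layer
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, W_hidden, b_hidden, W_out, b_out):
    # Each hidden neuron sums its weighted inputs and squashes the sum
    h = logsig(W_hidden @ x + b_hidden)
    # Output neurons are pure linear: a weighted sum, no squashing
    return W_out @ h + b_out

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 45, 1   # 45 hidden neurons, as in the study
W_h = rng.standard_normal((n_hidden, n_in))
b_h = rng.standard_normal(n_hidden)
W_o = rng.standard_normal((n_out, n_hidden))
b_o = rng.standard_normal(n_out)

y = mlp_forward(np.array([0.1, 0.5, -0.2]), W_h, b_h, W_o, b_o)
```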

(b) Elman Recurrent Neural Networks (ERNN)
- ERNN are a subclass of recurrent networks.
- They are multilayer perceptron networks augmented with one or more context layers, which store the output values of one of the layers delayed by one step and feed them back to activate this or some other layer at the next time step.
- This feedback lets the Elman network learn temporal sequences that a purely feed-forward network cannot represent.

Figure: Architecture of a 3-layered ERNN (inputs I1…In, hidden layer with D⁻¹ feedback through a context layer, outputs O1…Om)
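One time step of the context-layer feedback described above can be sketched as follows. Again a toy illustration with random placeholder weights; only the tan-sigmoid hidden activation and the hidden-layer size match the configuration reported later.

```python
import numpy as np

def tansig(x):
    # Tan-sigmoid (tanh) activation, as used in the ERNN hidden layer
    return np.tanh(x)

def elman_step(x, context, W_in, W_ctx, W_out):
    # The hidden layer sees the current input AND the previous hidden
    # state, stored in the context layer (the D^-1 delay in the figure)
    h = tansig(W_in @ x + W_ctx @ context)
    y = W_out @ h            # pure-linear output layer
    return y, h              # h becomes the next step's context

rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 3, 45, 1
W_in = rng.standard_normal((n_hidden, n_in)) * 0.1
W_ctx = rng.standard_normal((n_hidden, n_hidden)) * 0.1
W_out = rng.standard_normal((n_out, n_hidden)) * 0.1

context = np.zeros(n_hidden)     # empty memory before the sequence starts
for x in [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]:
    y, context = elman_step(x, context, W_in, W_ctx, W_out)
```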

(c) Radial Basis Function Network (RBFN)
- The network consists of 3 layers: input layer, hidden layer and output layer.
- The neurons in the hidden layer respond locally to their input and are known as RBF neurons, while the neurons of the output layer only sum their inputs and are called linear neurons.
- The network is inherently well suited for weather prediction, because it naturally uses unsupervised learning to cluster the input data.

Figure: Architecture of the RBFN — pure-linear output = Σ wᵢ φᵢ(x), with adjustable weights wᵢ, bias w₀, adjustable centers cᵢ and adjustable spreads σᵢ
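The output formula in the figure can be sketched directly: each Gaussian RBF neuron responds only near its centre, and the linear output neuron sums the weighted responses. Centres, spreads and weights below are arbitrary placeholders, not values from the study.

```python
import numpy as np

def rbfn_output(x, centers, spreads, weights, bias):
    # Each hidden RBF neuron responds locally: a Gaussian of the
    # distance between the input x and the neuron's centre c_i
    d2 = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-d2 / (2.0 * spreads ** 2))
    # Pure-linear output neuron: bias + sum_i w_i * phi_i(x)
    return bias + weights @ phi

rng = np.random.default_rng(2)
centers = rng.standard_normal((10, 3))   # 10 RBF neurons, 3 inputs (toy sizes)
spreads = np.ones(10)
weights = rng.standard_normal(10)
y = rbfn_output(np.zeros(3), centers, spreads, weights, bias=0.0)
```

An input sitting exactly on a neuron's centre gets the maximal response φ = 1, which is what makes the hidden layer behave like a clustering of the input data.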

3. EXPERIMENTAL SETUP
Weather Data: Vancouver, BC, Canada
- 1-yr data: Sep 2000 – Aug 2001
- Observed parameters (most important):
  - Minimum temperature (°C)
  - Maximum temperature (°C)
  - Wind speed (km/hr)

Training and Testing Datasets
- Dataset 1: MLP and ERNN
  - Testing dataset = January 2001
  - Training dataset = remaining data
- Dataset 2: RBFN, MLP and ERNN
  - Testing dataset = April 2001
  - Training dataset = remaining data
- This split was used to ensure that there is no bias in the training and test datasets.

Simulation System Used
- Pentium-III, 1 GHz processor, 256 MB RAM
- All the experiments were simulated using MATLAB

Steps taken before starting the training process:
- The error goal was set to 10⁻⁴.
- The number of hidden neurons was varied (10–80) and the optimal number for each network was then decided.
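The model-selection step described above — fixing an error goal and sweeping the hidden-layer size — can be sketched generically. The `train_fn` and `score_fn` callables are hypothetical placeholders (the study used MATLAB's training routines); only the sweep range and error goal come from the slides.

```python
def select_hidden_size(train_fn, score_fn, sizes=range(10, 81, 10),
                       error_goal=1e-4):
    # Train one network per candidate hidden-layer size, each run using
    # the fixed error goal (1e-4 in the study), then keep the size with
    # the lowest validation error.
    best_size, best_err = None, float("inf")
    for n in sizes:
        model = train_fn(n, error_goal)
        err = score_fn(model)
        if err < best_err:
            best_size, best_err = n, err
    return best_size, best_err

# Toy stand-ins: pretend validation error is minimised at 40 neurons
size, err = select_hidden_size(
    train_fn=lambda n, goal: n,
    score_fn=lambda n: (n - 40) ** 2,
)
```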

4. TEST RESULTS
Training Convergence of MLP and ERNN
- Convergence of the LM and OSS training algorithms using the MLP network
- Convergence of the LM and OSS training algorithms using the ERNN

Comparison of Actual vs. 10-day-ahead Forecasting using the OSS and LM approaches — MLP network and ERNN

(a) Minimum Temperature (11–20 Jan 2001)
Performance evaluation parameters (min. temperature), reported for the MLP network and the ERNN under both OSS and LM training:
- Mean absolute % error (MAPE)
- Root mean square error (RMSE)
- Mean absolute deviation (MAD)
- Correlation coefficient
- Training time (minutes)
- Number of iterations (epochs)
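The forecast-accuracy metrics used in these evaluation tables can be computed as follows — a minimal NumPy sketch over made-up example arrays, not the study's data. MAPE as written assumes no actual value is zero.

```python
import numpy as np

def forecast_metrics(actual, forecast):
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    err = actual - forecast
    return {
        # Mean absolute % error (undefined where actual == 0)
        "MAPE": 100.0 * np.mean(np.abs(err / actual)),
        # Root mean square error
        "RMSE": np.sqrt(np.mean(err ** 2)),
        # Mean absolute deviation of the forecast errors
        "MAD": np.mean(np.abs(err)),
        # Correlation coefficient between actual and forecast series
        "corr": np.corrcoef(actual, forecast)[0, 1],
    }

m = forecast_metrics([2.0, 4.0, 5.0], [2.5, 3.5, 5.0])
```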

(b) Maximum Temperature (11–20 Jan 2001) — MLP network and ERNN
Performance evaluation parameters, reported for the MLP network and the ERNN under both OSS and LM training:
- Mean absolute % error (MAPE)
- Root mean square error (RMSE)
- Mean absolute deviation (MAD)
- Correlation coefficient
- Training time (minutes)
- Number of iterations (epochs)

(c) Maximum Wind-Speed (11–20 Jan 2001) — MLP network and ERNN
Performance evaluation parameters (wind speed), reported for the MLP network and the ERNN under both OSS and LM training:
- Mean absolute % error (MAPE)
- Root mean square error (RMSE)
- Mean absolute deviation (MAD)
- Correlation coefficient
- Training time (minutes)
- Number of iterations (epochs)

Comparison of Relative Percentage Error between Actual and Forecasted Parameters (MLP network and ERNN): minimum temperature, maximum temperature and wind speed

Comparison of Training of Connectionist Models

Network model | Hidden neurons | Hidden layers | Hidden-layer activation | Output-layer activation
MLP           | 45             | 1             | Log-sigmoid             | Pure linear
ERNN          | 45             | 1             | Tan-sigmoid             | Pure linear
RBFN          | 180            | 2             | Gaussian function       | Pure linear

Comparison among three Neural Network Techniques for 15-day-ahead Forecasting (1–15 Apr 2001)
Plots of maximum temperature, minimum temperature (°C) and wind speed against days of the month: actual value vs. RBFN, MLP and ERNN forecasts

Performance Evaluation of RBFN, MLP and ERNN Techniques
Model performance evaluation parameters (MAPE, MAD and correlation coefficient), reported for each of RBFN, MLP and ERNN, for maximum temperature, minimum temperature and wind speed

5. CONCLUSIONS
- In this paper, we developed and compared the performance of a multi-layered perceptron (MLP) neural network, an Elman recurrent neural network (ERNN) and a radial basis function network (RBFN).
- It can be inferred that the ERNN could yield more accurate results if good data-selection strategies, training paradigms, and network input and output representations are determined properly.

- The Levenberg-Marquardt (LM) approach appears to be the best learning algorithm; however, it requires more memory and is computationally more complex than the one-step-secant (OSS) algorithm.
- Empirical results clearly demonstrate that, compared with the MLP neural network and the ERNN, the RBFN is much faster and more reliable for the weather forecasting problem considered.
- A comparison of the neurocomputing techniques with other statistical techniques would be another future research topic.

THANK YOU!