
Forecasting of Mackey-Glass Time Series Using Elman Recurrent Neural Network
Emna Krichene 1, Youssef Masmoudi 1, Adel M. Alimi 1, Ajith Abraham 2, Habib Chabchoub 3
emna.krichene.tn@ieee.org, youssef.masmoudi@gmail.com, adel.alimi@ieee.org, ajith.abraham@ieee.org, habib.chabchoub@gmail.com
16th International Conference on Intelligent Systems Design and Applications (ISDA)

Agenda
Introduction
Literature review
Contribution
Conclusion

Introduction | Forecasting
Forecasting is an important data analysis technique that studies historical data in order to explore and predict its future values, and it remains the subject of much research today. It sits between information extraction and predictive analytics.
Various fields to forecast:
The behavior of users
Weather
Earthquakes
The stock market


Literature review | Forecasting methods
The forecasting literature contains a large variety of methods that can be classified into two main families:
I. Qualitative methods
Used when the situation is vague and little data exist (new products, new technology)
Involve intuition and experience
e.g., forecasting sales on the Internet
II. Quantitative methods
Used when the situation is stable and historical data exist (existing products, current technology)
Involve mathematical techniques
e.g., forecasting sales of color televisions
Quantitative methods divide in turn into extrapolative methods (time series models) and causal methods.
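As an aside, the simplest extrapolative (time series) method can be illustrated with a moving average, which forecasts the next value as the mean of the last k observations. The function name, window size, and toy sales data below are illustrative assumptions, not taken from the paper.

```python
def moving_average_forecast(series, k=3):
    """Forecast the next value as the mean of the last k observations."""
    if len(series) < k:
        raise ValueError("need at least k observations")
    return sum(series[-k:]) / k

# Toy example: forecast the next sales figure from the last 3 observations.
sales = [10.0, 12.0, 11.0, 13.0, 12.0, 14.0]
print(moving_average_forecast(sales, k=3))  # (13.0 + 12.0 + 14.0) / 3 = 13.0
```

A causal method, by contrast, would regress the target on explanatory variables rather than only on its own past.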


Contribution | Building the architecture
The learning process consists of:
Estimating the number of hidden neurons.
Estimating and adjusting the learning rate alpha.
Adjusting the connection weights.
[Flowchart: the initial parameter system and the normalized training data set feed the learning process (Step 1); the result is evaluated (Step 2) and the loop repeats until the termination criteria are met, yielding the optimized built architecture.]
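The learning steps above can be sketched as a minimal Elman training loop. The layer sizes, the fixed learning rate alpha, the epoch count, and the toy sine series are illustrative assumptions, not the paper's tuned values; in particular, the hidden-neuron count is simply fixed here rather than estimated as in the proposed procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 1, 5, 1   # assumed layer sizes
alpha = 0.05                   # assumed (fixed) learning rate

W_ih = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden (w_ij)
W_ch = rng.normal(scale=0.1, size=(n_hid, n_hid))  # context -> hidden (v_kj)
W_ho = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

series = np.sin(0.3 * np.arange(200))  # toy stand-in for the real series
epoch_err = []
for epoch in range(50):
    context = np.zeros(n_hid)          # context units start at zero
    sse = 0.0
    for t in range(len(series) - 1):
        x = np.array([series[t]])
        target = np.array([series[t + 1]])
        hidden = sigmoid(W_ih @ x + W_ch @ context)
        y = W_ho @ hidden              # linear output unit
        err = y - target
        sse += float(err @ err)
        # back-propagation: every weight matrix is updated,
        # including the context-to-hidden connections v_kj
        delta_h = (W_ho.T @ err) * hidden * (1.0 - hidden)
        W_ho -= alpha * np.outer(err, hidden)
        W_ih -= alpha * np.outer(delta_h, x)
        W_ch -= alpha * np.outer(delta_h, context)
        context = hidden               # copy hidden state into context units
    epoch_err.append(sse)

print(epoch_err[0], epoch_err[-1])     # training error should decrease
```

The one-step copy of the hidden state into the context units is what gives the Elman network its short-term memory of past values.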

Contribution | The optimized RNN architecture
Figure 1. The architecture of the optimized Elman RNN

Contribution
Using the back-propagation learning algorithm, all connection weights in the network are computed and updated, including the context connections. The connection strengths between hidden past values and their corresponding actual ones would normally be fixed to one, on the assumption that past values are retained without modification; however, experiments show that updating the connection strengths between context units and hidden units (vkj) in the same way as the weights wij gives better results than fixing them at 1.
Fig. 2 depicts an example of 500 samples from the Mackey-Glass time series predicted by our network architecture. We observe that the proposed scheme gives accurate results, as the samples are predicted efficiently.
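For reference, the Mackey-Glass benchmark series is generated by the delay differential equation dx/dt = beta * x(t - tau) / (1 + x(t - tau)^n) - gamma * x(t). Below is a minimal Euler-discretised sketch with the commonly used parameters (beta = 0.2, gamma = 0.1, n = 10, tau = 17); the paper's exact generation settings are an assumption here.

```python
def mackey_glass(n_samples=500, beta=0.2, gamma=0.1, n=10, tau=17,
                 x0=1.2, dt=1.0):
    """Generate a Mackey-Glass series via forward-Euler integration."""
    history = [x0] * (tau + 1)         # constant history before t = 0
    series = []
    for _ in range(n_samples):
        x_t = history[-1]
        x_tau = history[-(tau + 1)]    # delayed value x(t - tau)
        x_next = x_t + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x_t)
        history.append(x_next)
        series.append(x_next)
    return series

data = mackey_glass(500)               # 500 samples, as in Fig. 2
```

With tau = 17 the equation behaves chaotically, which is why this series is a standard hard benchmark for time-series forecasting.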


Conclusion
Several techniques have been proposed in the literature to solve the forecasting problem, from statistical methods to those based on artificial intelligence, but none of them can be considered the best. The choice of method cannot be generalized and depends on several factors, including the size of the historical data, the complexity of the proposed model, the variance of the error, and the complexity of the algorithms, i.e., the time needed to analyze the data.
The RNN can be a good alternative model for handling hard time series, as it is well suited to memorizing past values and using them as memory during the training phase.
One probable extension is to forecast error values and combine them to obtain optimized results; another is to use data mining techniques and to choose or combine the ones that give the best results.

Thank you
Emna Krichene, Youssef Masmoudi, Adel M. Alimi, Ajith Abraham, Habib Chabchoub
E-mail: emna.krichene.tn@ieee.org, youssef.masmoudi@gmail.com, adel.alimi@ieee.org, ajith.abraham@ieee.org, habib.chabchoub@gmail.com