Prediction of the Foreign Exchange Market Using Classifying Neural Network – Doug Moll, Chad Zeman

Presentation transcript:

Prediction of the Foreign Exchange Market Using Classifying Neural Network Doug Moll Chad Zeman

The Problem: Using neural networks, can we predict future foreign exchange rates well enough to profit from short-term fluctuations?

Outline: Project History; Project Proposal; Data Set; MLP Results; PNN Results

Project History: senior seminar project; used Trajan for regression networks; attempted to predict both direction and size of movements; results were less than desirable.

Proposal: classify movements as Up or Down; continue to use Trajan; maintain the same biases so results can be compared with the previous research; minimize the time spent learning a new tool.

Data Set – Inputs (1994–2003): percent change of the CA/US exchange rate, the interest differential of short-term interest rates (CA – US), and lagged values of both.
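The slides do not spell out exactly how the input vectors were built, so the following is only a plausible sketch in Python/pandas. The function name build_features, the 8-lag depth, and the column names are illustrative assumptions (8 lags of each of the two inputs would give the 16 inputs quoted in the results slides).

```python
import pandas as pd

def build_features(fx_rate, ca_rate, us_rate, n_lags=8):
    """Construct lagged percent-change and interest-differential inputs.

    fx_rate, ca_rate, us_rate: pandas Series indexed by date.
    n_lags=8 is an assumption; 8 lags of each input gives 16 features.
    """
    pct_change = fx_rate.pct_change()        # percent change of the CA/US exchange rate
    differential = ca_rate - us_rate         # CA minus US short-term interest rates

    features = {}
    for lag in range(1, n_lags + 1):
        features[f"pct_change_lag{lag}"] = pct_change.shift(lag)
        features[f"rate_diff_lag{lag}"] = differential.shift(lag)

    data = pd.DataFrame(features)
    # Target class: 1 ("Up") if the rate rises over the next period, else 0 ("Down").
    data["target"] = (pct_change.shift(-1) > 0).astype(int)
    data = data.iloc[:-1].dropna()           # drop rows with incomplete lags or no next period
    return data.drop(columns="target"), data["target"]
```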

Exchange Rate Data, 1994–2003 [chart]

Percent Change Model

Data Set – Equalize training cases: the same number of Up examples as Down examples within each of the three data periods (training, verification, testing).
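The slides do not say how the cases were equalized; a common approach is to randomly down-sample the majority class within each data period. The sketch below assumes that approach (the function name and pandas types are illustrative).

```python
import numpy as np

def equalize_classes(X, y, seed=0):
    """Down-sample the majority class so there are equally many Up (1) and Down (0) cases.
    X: pandas DataFrame of features; y: pandas Series of 0/1 labels."""
    rng = np.random.default_rng(seed)
    y_arr = np.asarray(y)
    up_pos = np.flatnonzero(y_arr == 1)
    down_pos = np.flatnonzero(y_arr == 0)
    n = min(len(up_pos), len(down_pos))
    keep = np.sort(np.concatenate([
        rng.choice(up_pos, size=n, replace=False),
        rng.choice(down_pos, size=n, replace=False),
    ]))
    return X.iloc[keep], y.iloc[keep]
```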

Trajan Algorithm – User-Defined Settings: inputs and outputs; training, verification, and test data splits; network type.

Trajan Algorithm – Automatically Determined Settings: network complexity (number of hidden nodes).

Trajan Algorithm: randomly builds networks, trains them using backpropagation, utilizes cross-verification techniques, evaluates networks based on verification error, and cross-references the chosen network with out-of-sample test data. A conceptual sketch of this selection procedure follows.
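Trajan is a commercial package and its internals are not shown on these slides; purely as a conceptual sketch of the procedure just described, here is how it might look with scikit-learn's MLPClassifier. The candidate range of hidden-node counts and the number of candidates are assumptions, not Trajan's actual behavior.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def select_network(X_train, y_train, X_verify, y_verify, n_candidates=10, seed=0):
    """Build candidate networks of randomly chosen complexity, train each with
    backpropagation (SGD), and keep the one with the lowest verification error."""
    rng = np.random.default_rng(seed)
    best_net, best_err = None, np.inf
    for _ in range(n_candidates):
        n_hidden = int(rng.integers(5, 31))               # complexity picked automatically
        net = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                            activation="logistic", solver="sgd")
        net.fit(X_train, y_train)
        err = 1.0 - net.score(X_verify, y_verify)         # verification error
        if err < best_err:
            best_net, best_err = net, err
    return best_net, best_err
```

The winning network would then be cross-referenced against the out-of-sample test split, e.g. with best_net.score(X_test, y_test).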

Results – MLP – Daily: 16 inputs, 20 hidden nodes, 1 output, momentum 0.3, learning rate 0.1, 50 epochs.
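To show what "momentum 0.3, learning rate 0.1, 50 epochs" means operationally, here is a minimal single-hidden-layer backpropagation loop in NumPy with those values as defaults. It is an illustration under those assumptions, not Trajan's implementation; the batch-gradient update and weight initialization are choices made for the sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, n_hidden=20, lr=0.1, momentum=0.3, epochs=50, seed=0):
    """Single-hidden-layer MLP trained by batch backpropagation with momentum."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float).reshape(-1, 1)      # 1 output: probability of "Up"

    W1 = rng.normal(scale=0.1, size=(X.shape[1], n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.1, size=(n_hidden, 1));          b2 = np.zeros(1)
    vW1, vb1, vW2, vb2 = (np.zeros_like(a) for a in (W1, b1, W2, b2))

    for _ in range(epochs):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass: cross-entropy gradient at the output, then the chain rule
        d_out = (out - y) / len(X)
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        # Momentum updates
        vW2 = momentum * vW2 - lr * (h.T @ d_out); W2 += vW2
        vb2 = momentum * vb2 - lr * d_out.sum(0);  b2 += vb2
        vW1 = momentum * vW1 - lr * (X.T @ d_h);   W1 += vW1
        vb1 = momentum * vb1 - lr * d_h.sum(0);    b1 += vb1

    def predict(X_new):
        p = sigmoid(sigmoid(np.asarray(X_new, dtype=float) @ W1 + b1) @ W2 + b2)
        return (p.ravel() > 0.5).astype(int)           # 1 = Up, 0 = Down
    return predict
```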

Results – MLP – Daily
Data Set        Performance
Training        58.41%
Verification    53.59%
Testing         53.59%

Results – MLP – Weekly: 16 inputs, 22 hidden nodes, 1 output, momentum 0.3, learning rate 0.1, 4 epochs.

Results – MLP – Weekly
Data Set        Performance
Training        52.73%
Verification    62.50%
Testing         55.47%

Probabilistic Neural Networks (PNN): a finite, deterministic network with three layers (input, pattern, output).

PNN Example – Training Example A [diagram: example A's exchange rate and interest rate values enter the input layer; pattern layer and output layer (Up, Down) shown; target output = Up]

Pattern Layer: receives the input vector and calibrates a Gaussian bell curve around it.

PNN Example – Training Example A [diagram: a pattern unit is created for A in the pattern layer; presenting A again activates it at 100%; target output = Up]

PNN Example – Training Example B [diagram: a second pattern unit is added for B alongside A's; output layer Up/Down; target output = Down]

PNN Example – Out-of-Sample [diagram: a new case is presented; the stored pattern units for A and B respond with partial activations (e.g., A at 80%), which the Up and Down output units use to classify the case]
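A minimal sketch of the kind of PNN walked through above: one Gaussian "bell" pattern unit per training case and one summation unit per class. The class name SimplePNN and the fixed smoothing width sigma are assumptions (Trajan handles smoothing itself). Storing one pattern unit per training case is also consistent with the large hidden-node counts reported on the next slides.

```python
import numpy as np

class SimplePNN:
    """Probabilistic neural network: input layer -> one Gaussian pattern unit per
    training case -> one summation (output) unit per class (Up, Down)."""

    def __init__(self, sigma=0.1):
        self.sigma = sigma                      # kernel width; value is illustrative

    def fit(self, X, y):
        # The pattern layer simply stores every training example with its class.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        self.classes = np.unique(self.y)
        return self

    def predict_proba(self, X_new):
        X_new = np.asarray(X_new, dtype=float)
        # Squared distance from each new case to every stored pattern unit.
        d2 = ((X_new[:, None, :] - self.X[None, :, :]) ** 2).sum(axis=2)
        k = np.exp(-d2 / (2.0 * self.sigma ** 2))       # Gaussian bell per pattern unit
        # Each output unit sums the activations of its own class's pattern units.
        scores = np.stack([k[:, self.y == c].sum(axis=1) for c in self.classes], axis=1)
        return scores / scores.sum(axis=1, keepdims=True)

    def predict(self, X_new):
        return self.classes[np.argmax(self.predict_proba(X_new), axis=1)]
```

Usage would look like pnn = SimplePNN(sigma=0.1).fit(X_train, y_train) followed by pnn.predict(X_test).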

Results – PNN – Daily: 16 inputs, 1224 hidden nodes, 2 outputs.

Results – PNN – Daily
Data Set        Performance
Training        54.82%
Verification    51.14%
Testing         51.96%

Results – PNN – Weekly: 16 inputs, 256 hidden nodes, 2 outputs.

Results – PNN – Weekly
Data Set        Performance
Training        77.73%
Verification    54.69%
Testing         57.03%

Conclusions: predicting the foreign exchange market is a tough problem; PNN vs. MLP; weekly vs. daily data.