Using Artificial Neural Networks and Support Vector Regression to Model the Lyapunov Exponent (Adam Maus)

Presentation transcript:

Using Artificial Neural Networks and Support Vector Regression to Model the Lyapunov Exponent Adam Maus

Nonlinear Prediction
- Largest Lyapunov exponent (LE): a measure of long-term predictability
- Models used to reconstruct the LE: artificial neural network, support vector regression
- Test system: the Hénon map (HM); the Hénon map's largest LE = .42093
[Figure: strange attractor of the Hénon map]
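A minimal numerical sketch of how a largest LE such as the .42093 above can be estimated for the Hénon map, using the standard Jacobian/tangent-vector method (the talk does not say how its LE values were computed, so this is illustrative):

```python
import numpy as np

# Estimate the largest Lyapunov exponent of the Henon map
# x[n+1] = 1 - a*x[n]^2 + b*x[n-1] by tracking the growth of a
# tangent vector under the map's Jacobian (classic parameters a, b).
a, b = 1.4, 0.3
u, v = 0.1, 0.1                      # state: (x[n], x[n-1])
t = np.array([1.0, 0.0])             # tangent vector
le_sum, n = 0.0, 100_000

for _ in range(1000):                # discard the transient
    u, v = 1 - a*u*u + b*v, u

for _ in range(n):
    J = np.array([[-2*a*u, b],       # Jacobian of (u, v) -> (1 - a*u^2 + b*v, u)
                  [1.0,    0.0]])
    t = J @ t
    norm = np.linalg.norm(t)
    le_sum += np.log(norm)
    t /= norm                        # renormalize so the vector stays finite
    u, v = 1 - a*u*u + b*v, u

print("largest LE ~", le_sum / n)    # converges near the slide's .42093
```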

Artificial Neural Network
- Single-layer, feed-forward architecture: 8 neurons and 3 dimensions
- Trained using next-step prediction
- Weights updated using a variant of simulated annealing
- 400 training points, 1 million training epochs
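A rough Python sketch of this setup. The talk does not specify its simulated-annealing variant, so the greedy perturb-and-shrink loop below is an assumed stand-in, and the epoch count is cut well below 1 million:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of the slide's setup: a single-layer feed-forward net with
# 8 tanh neurons and 3 input lags, trained for next-step prediction.
# The perturb-and-anneal loop is an assumed stand-in for the talk's
# unspecified "variant of simulated annealing".
N_NEURONS, N_LAGS, N_TRAIN = 8, 3, 400

def predict(w, X):
    A, b = w                          # A: (N_LAGS+1, N_NEURONS), row 0 = biases
    return np.tanh(X @ A[1:] + A[0]) @ b

def mse(w, X, y):
    return np.mean((predict(w, X) - y) ** 2)

# 400 training points from a Henon map time series
a_h, b_h = 1.4, 0.3
x = [0.1, 0.1]
for _ in range(700):
    x.append(1 - a_h * x[-1] ** 2 + b_h * x[-2])
x = np.array(x[200:])                 # drop the transient
X = np.column_stack([x[i:i + N_TRAIN] for i in range(N_LAGS)])
y = x[N_LAGS:N_LAGS + N_TRAIN]

w = (0.1 * rng.standard_normal((N_LAGS + 1, N_NEURONS)),
     0.1 * rng.standard_normal(N_NEURONS))
best, step = mse(w, X, y), 0.5
for _ in range(20_000):               # the talk used 1 million epochs
    trial = (w[0] + step * rng.standard_normal(w[0].shape),
             w[1] + step * rng.standard_normal(w[1].shape))
    e = mse(trial, X, y)
    if e < best:                      # keep improvements; shrink the step
        w, best = trial, e
    step *= 0.9999
print("training MSE:", best)
```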

Support Vector Regression
- Global solution to a convex programming problem
- Uses only a subset of the training points: points that fall inside the ε-tube incur no loss and drop out of the final solution
- User-defined parameters:
  - C: controls the flatness of the output function
  - ε: controls the size of the tube
  - the kernel function and its parameters
  - training dataset size
- Output function: f(x) = Σ_{i=1}^{s} (α_i − α_i*) K(x_i, x) + b, where s = the length of the w vector and K = the kernel function
- Many toolboxes are available, e.g. the LibSVM toolbox
Toolbox available at: http://tinyurl.com/sxnse
Picture: http://tinyurl.com/6pscdr
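For comparison, a minimal ε-SVR sketch for the same next-step prediction task using scikit-learn's SVR (a wrapper around LibSVM); the C, ε, and RBF-kernel settings here are illustrative choices, not the values used in the talk:

```python
import numpy as np
from sklearn.svm import SVR           # scikit-learn's SVR wraps LibSVM

# Generate a Henon map series and embed it in 3 lags, as on the ANN slide.
a, b = 1.4, 0.3
x = [0.1, 0.1]
for _ in range(700):
    x.append(1 - a * x[-1] ** 2 + b * x[-2])
x = np.array(x[200:])                 # drop the transient

lags, n_train = 3, 400
X = np.column_stack([x[i:i + n_train] for i in range(lags)])
y = x[lags:lags + n_train]

# Illustrative hyperparameters: C trades flatness against training error,
# epsilon sets the tube width, gamma parameterizes the RBF kernel.
model = SVR(kernel="rbf", C=10.0, epsilon=0.001, gamma=1.0)
model.fit(X, y)
print("support vectors:", len(model.support_))   # a subset of the 400 points
print("training MSE:", np.mean((model.predict(X) - y) ** 2))
```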

Dynamic and Attractor Reconstruction Results
- Artificial neural network: LE = .38431, mean square error = 4.65 × 10^-5
- Support vector regression: LE = .41288, weighted mean square error = 4.69 × 10^-5
- MSE = (1/c) Σ_{i=1}^{c} (x_i − x̂_i)^2, where c = the length of the time series; the weighted MSE takes the same form with weights that emphasize the earliest points of the series
[Figure: strange attractor of the neural network]
[Figure: strange attractor of the support vector regression]
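A small sketch of the two error measures; the geometric weight profile below is an assumed form, since the talk only indicates that earlier points count more in the weighted MSE:

```python
import numpy as np

def mse(x, x_hat):
    """Plain mean-square error over a length-c series."""
    return np.mean((x - x_hat) ** 2)

def weighted_mse(x, x_hat, decay=0.99):
    """Weighted MSE; the geometric decay is an assumption, chosen so
    that early points (small i) carry the largest weights."""
    w = decay ** np.arange(len(x))
    return np.sum(w * (x - x_hat) ** 2) / np.sum(w)
```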

Conclusions
- Hypothesis: dynamic reconstruction and attractor reconstruction are correlated
- Neural networks perform a proper reconstruction given adequate training
- Support vector regression performs a proper reconstruction provided the model is weighted to replicate the first points more closely, owing to the chaotic nature of the data

Largest Lyapunov exponents (actual vs. reconstructed):

Map                       Actual    NN        SVR
Logistic map (LM)         .69526    .63744    .68923
Delayed Hénon map (DHM)   .37038    .35030    .35550
Hénon map (HM)            .42093    .38431    .41288

[Figure: strange attractor of the logistic map (LM)]
[Figure: strange attractor of the delayed Hénon map (DHM)]