AN INTERACTIVE TOOL FOR THE STOCK MARKET RESEARCH USING RECURSIVE NEURAL NETWORKS
Master Thesis
Michal Trna, michal.trna@gmail.com
= Overview =
– Introduction to NN & RNN
– Demo of the tool
– Application to the chosen domain
= Introduction to NN & RNN =
Motivation for neural networks: the brain
– Contains 50–100 billion neurons
– Roughly 1,000 trillion synaptic connections
– Solves complex problems: recognizes complex forms, makes well-founded predictions
[Figures: contours of the human brain; drawing of neurons from the cerebellum of a pigeon, Ramón y Cajal (1911)]
= Introduction to NN & RNN =
– Non-local connections
– Plasticity, synaptic learning
– Creation and atrophy of connections
[Figure: neuron anatomy: dendrites, nucleus, axon, axon terminal; the action potential travels at 1–100 m/s]
= Introduction to NN & RNN =
Hebb's law: "When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased." (Donald O. Hebb, 1949)
In other words: cells that fire together, wire together.
– Hebbian learning / synaptic learning
– Anti-Hebbian learning
= Introduction to NN & RNN =
Mathematical model of a neuron
[Figure: neuron j receives inputs x_1 … x_n weighted by synaptic weights; a summing junction adds them together with a bias; an activation function f produces the output, which is passed on to the recipients]
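The diagram reduces to a single formula. A minimal rendering in the usual notation (assumed here: x_i for the inputs, w_{ji} for the synaptic weight of input i into neuron j, b_j for the bias, f for the activation function):

    y_j = f\Bigl(\sum_{i=1}^{n} w_{ji}\, x_i + b_j\Bigr)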
= Introduction to NN & RNN =
Artificial neural networks
"A neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects:
1. Knowledge is acquired by the network through a learning process.
2. Interneuron connection strengths known as synaptic weights are used to store the knowledge." (Simon Haykin)
= Introduction to NN & RNN =
Artificial neural networks: properties
– Adaptability
– Fault tolerance
– Knowledge representation, context
– Non-linearity
– Input/output mapping
= Introduction to NN & RNN =
Hebbian theory
For p patterns of length n:
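The weight formula itself was lost in extraction. The standard Hebbian prescription for storing p bipolar patterns \xi^1, \dots, \xi^p of length n, which a slide in this position conventionally shows, is:

    w_{ij} = \frac{1}{n} \sum_{\mu=1}^{p} \xi_i^{\mu}\, \xi_j^{\mu}, \qquad w_{ii} = 0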
= Introduction to NN & RNN =
– Feed-forward neural networks
– Recursive neural networks
= Introduction to NN & RNN =
Perceptron
[Figure: the single-neuron model: inputs, synaptic weights, bias, summing junction, activation function f, output]
= Introduction to NN & RNN =
Perceptron
– Separability: the perceptron is a linear classifier
– The XOR problem (demonstrated below)
[Figure: linear separation of logical AND, logical OR, and logical XOR; no single line separates the XOR classes]
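A short demonstration, not from the thesis: the classic perceptron learning rule masters AND but can never reach full accuracy on XOR, because no single hyperplane separates the XOR classes. All names here are illustrative.

    import numpy as np

    def train_perceptron(X, y, epochs=100, lr=0.1):
        """Classic perceptron learning rule on inputs X with 0/1 targets y."""
        Xb = np.hstack([X, np.ones((len(X), 1))])   # bias folded in as a constant input
        w = np.zeros(Xb.shape[1])
        for _ in range(epochs):
            for xi, t in zip(Xb, y):
                pred = 1 if xi @ w > 0 else 0
                w += lr * (t - pred) * xi           # update only on mistakes
        return w

    def accuracy(w, X, y):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return float(((Xb @ w > 0).astype(int) == y).mean())

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    AND, XOR = np.array([0, 0, 0, 1]), np.array([0, 1, 1, 0])
    print(accuracy(train_perceptron(X, AND), X, AND))  # 1.0: AND is linearly separable
    print(accuracy(train_perceptron(X, XOR), X, XOR))  # <= 0.75: no line separates XOR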
= Introduction to NN & RNN =
Multilayer perceptron
= Introduction to NN & RNN =
Multilayer perceptron
Learning algorithm: back-propagation
– Generate the output (forward pass)
– Propagate the error back to produce deltas for all output and hidden layers
– Compute the gradient of the weights
– Modify the weights in the direction opposite to the gradient (update rule below)
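In symbols, using standard back-propagation notation (assumed here, not taken verbatim from the slides), with net_j the summed input to neuron j, η the learning rate, and t_j the target at an output neuron:

    \delta_j = f'(\mathrm{net}_j)\,(t_j - y_j) \quad \text{(output layer)}, \qquad
    \delta_j = f'(\mathrm{net}_j) \sum_{k} w_{kj}\, \delta_k \quad \text{(hidden layers)}

    \Delta w_{ij} = \eta\, \delta_j\, x_i

so each weight moves against the gradient of the error, as the last bullet says.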
= Introduction to NN & RNN =
Single-layer and multilayer perceptrons
[Figure: decision regions formed by one, two, and three layers, illustrated on an XOR-like set and an arbitrary set]
= Introduction to NN & RNN =
Recurrent networks (RNN)
– Simple RNN: Elman / Jordan network
– Fully connected: Hopfield network
= Introduction to NN & RNN =
Elman network
[Figure: feed-forward network whose hidden layer is copied into a context layer and fed back to the hidden layer at the next step]
= Introduction to NN & RNN =
Jordan network
[Figure: as the Elman network, but the context layer is fed from the output layer rather than the hidden layer; recurrences for both networks below]
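The two architectures differ only in what feeds the context layer. In the usual notation (assumed here, not taken from the slides), with input x_t, hidden state h_t, output y_t, and activation functions f and g:

    Elman:   h_t = f(W_x x_t + W_c h_{t-1} + b_h),   y_t = g(W_y h_t + b_y)
    Jordan:  h_t = f(W_x x_t + W_c y_{t-1} + b_h),   y_t = g(W_y h_t + b_y)

The context layer thus holds a copy of the previous hidden state (Elman) or of the previous output (Jordan).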
= Introduction to NN & RNN =
Hopfield networks
Dynamic equation (reconstructed below)
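The equation did not survive extraction. The standard asynchronous update rule for a Hopfield network, which this slide presumably showed, is (with s_i the bipolar state of neuron i and θ_i its threshold):

    s_i(t+1) = \operatorname{sgn}\Bigl(\sum_{j} w_{ij}\, s_j(t) - \theta_i\Bigr)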
= Introduction to NN & RNN =
Synaptic potential, threshold
Modes of operation
– Synchronous / asynchronous
– Deterministic / non-deterministic
Energy
Autoassociative memory
– Capacity: roughly 0.15 N patterns for N neurons (see the sketch below)
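None of the following appears in the thesis materials; it is a minimal sketch tying the last few slides together: Hebbian storage, deterministic asynchronous updates, and the energy E(s) = -1/2 Σ_{ij} w_{ij} s_i s_j (thresholds taken as zero for brevity; all names are illustrative).

    import numpy as np

    def store(patterns):
        """Hebbian weights for bipolar (+/-1) patterns: w_ij = (1/n) sum_mu xi_i xi_j."""
        n = patterns.shape[1]
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0)  # no self-connections
        return W

    def energy(W, s):
        """Hopfield energy; it never increases under asynchronous updates."""
        return -0.5 * s @ W @ s

    def recall(W, s, max_sweeps=20, seed=0):
        """Asynchronous deterministic updates until a fixed point (an attractor)."""
        rng = np.random.default_rng(seed)
        s = s.copy()
        for _ in range(max_sweeps):
            changed = False
            for i in rng.permutation(len(s)):
                new = 1 if W[i] @ s >= 0 else -1
                if new != s[i]:
                    s[i] = new
                    changed = True
            if not changed:
                break  # fixed point reached
        return s

    # Store one pattern, corrupt a few bits, and watch the network clean it up.
    rng = np.random.default_rng(1)
    xi = rng.choice([-1, 1], size=64)
    W = store(xi[None, :])
    noisy = xi.copy()
    noisy[:8] *= -1  # flip 8 of 64 bits
    cleaned = recall(W, noisy)
    print(energy(W, noisy) > energy(W, cleaned))  # True: energy decreased
    print(np.array_equal(cleaned, xi))            # True: stored pattern recovered

One stored pattern in a 64-neuron network is well inside the ~0.15 N capacity regime, so recall falls back onto the stored attractor and the energy strictly decreases along the way.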
= Graph Approach =
– Acquiring a pattern ξ: [equation lost in extraction]
– Hopfield network: [equation lost in extraction]
= Graph Approach =
Coloring
[Figure: graph coloring with a red component and a blue component]
= Graph Approach =
Tetrahedral property
= Graph Approach =
Tetrahedral property: four possible configurations
[Figure: the four configurations with edge labels drawn from {−1, 0, 1}]
= Graph Approach =
Parameters
= Graph Approach =
Energy point, projection to 2D
– Energy lines define classes
– Scalar energy
– Control of the convergence
= Graph Approach =
Relative weight of a neuron
– The neuron's contribution to the component I or O
Deviation
– Serves as "a hash function"
= Graph Approach =
Thresholds
= Tool =
Time for a demo: http://msc.michaltrna.info/markers/index.html
[Figure: a typical convergence path]
Outlook, future lines of work
– Use the deviation to discriminate parasitic states
– Quantify the results
– Apply the tool to automatic trading
Thank you for your attention!
Time for your questions.