1
Establishing the Equivalence between Recurrent Neural Networks and Turing Machines
Ritesh Kumar Sinha (02d05005), Kumar Gaurav Bijay (02005013)
2
“No computer has ever been designed that is ever aware of what it’s doing; but most of the time, we aren’t either” – Marvin Minsky.
3
Introduction
Plan:
- Establish the equivalence between recurrent neural networks and Turing machines
- History of neurons
- Definitions
- Constructive proof of equivalence
Approach: conceptual understanding
4
Motivation
- Understanding the learning patterns of the human brain – the concept of the neuron
- Is the Turing machine the ultimate computing machine?
- How powerful are neural networks – DFA, PDA, Turing machine, or still higher?
5
The Brain
- The most complicated human organ
- Senses, perceives, feels, thinks, believes, remembers, utters
- The information-processing centre of the body
- Neurons: its information-processing units
6
The MCP Neuron
- McCulloch and Pitts proposed a model of the neuron in 1943
- It is only a highly simplified model of a real neuron
- Positive weights act as activators, negative weights as inhibitors
7
Artificial Neural Networks (ANN)
- Interconnected units model neurons
- Modifiable weights model synapses
8
Types of ANN
- Feed-forward networks: signals travel one way only; good at computing static functions
- Neuro-fuzzy networks: combine the advantages of both fuzzy reasoning and neural networks; good at modelling real-life data
- Recurrent networks
9
Recurrent Neural Networks
Activation function: f(x) = x if x > 0, and f(x) = 0 otherwise
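To make the activation concrete, here is a minimal Python sketch of this rectifier-style activation together with one synchronous update of a recurrent net; the weight matrix `W`, bias `b`, and the two-node example are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def activation(x):
    # f(x) = x if x > 0, else 0 -- the piecewise-linear function from this slide
    return np.maximum(x, 0.0)

def step(state, W, b):
    # One synchronous update of a fully connected recurrent net (illustrative form)
    return activation(W @ state + b)

# Tiny two-node example (assumed): node 0 outputs a constant 1 via its bias,
# node 1 copies node 0's previous output.
W = np.array([[0.0, 0.0],
              [1.0, 0.0]])
b = np.array([1.0, 0.0])
state = np.zeros(2)
for _ in range(3):
    state = step(state, W, b)
print(state)  # -> [1. 1.]
```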
10
Turing Machines and Turing-Complete Languages
- Turing machine: as powerful as any other computer
- Turing-complete language: a programming language that can compute any function a Turing machine can compute
11
A Language L
Four basic operations (V is any variable holding a non-negative integer value; j stands for a line number):
- No operation: V ← V
- Increment: V ← V + 1
- Decrement: V ← max(0, V - 1)
- Conditional branch: IF V != 0 GOTO j
L is Turing complete.
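As a concrete reading of these four operations, here is a minimal interpreter sketch in Python; the tuple encoding of programs and the 0-based line numbers are my own assumptions.

```python
def run(program, variables):
    """Interpret an L program given as a list of tuples:
    ("nop", V), ("inc", V), ("dec", V), ("brnz", V, j) -- 0-based line numbers."""
    pc = 0
    while pc < len(program):
        op, var, *rest = program[pc]
        if op == "inc":
            variables[var] += 1                          # V <- V + 1
        elif op == "dec":
            variables[var] = max(0, variables[var] - 1)  # V <- max(0, V - 1)
        elif op == "brnz" and variables[var] != 0:       # IF V != 0 GOTO j
            pc = rest[0]
            continue
        # "nop" (V <- V) changes nothing
        pc += 1
    return variables

# The Y <- X program from the example slide, valid for X > 0:
print(run([("dec", "X"), ("inc", "Y"), ("brnz", "X", 0)], {"X": 3, "Y": 0}))
# -> {'X': 0, 'Y': 3}
```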
12
A Turing Machine Can Encode a Neural Network
- A Turing machine can compute any computable function, by definition
- The activation function, in our case, is a simple non-linear function
- A Turing machine can therefore simulate our recurrent neural net
- Intuitively: can't we just write code to simulate our neural net?
13
An Example
A program that computes Y ← X (correct when X > 0):
L1: X ← X - 1
    Y ← Y + 1
    if X != 0 goto L1
14
An Example (continued)
A program that computes Y ← X, now also correct when X = 0:
    if X != 0 goto L1
    X ← X + 1
    if X != 0 goto L2
L1: X ← X - 1
    Y ← Y + 1
    if X != 0 goto L1
L2: Y ← Y
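For readability, here is a direct Python rendering of this corrected program; the gotos are collapsed into an equivalent loop, so this is a paraphrase of the control flow rather than the program itself.

```python
def copy_x_to_y(x):
    # Mirrors the control flow of the corrected Y <- X program:
    # when X = 0 the loop body never runs and Y stays 0.
    y = 0
    while x != 0:            # the L1 block: X <- X - 1, Y <- Y + 1, branch
        x = max(0, x - 1)
        y += 1
    return y

assert copy_x_to_y(0) == 0
assert copy_x_to_y(4) == 4
```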
15
L is Turing Complete: Conceptual Understanding
Idea: don't think of C++, think of 8085 ;)
- Subtraction, Y = X1 - X2? Yes: copy X1 into Y, then decrement both Y and X2 until X2 reaches 0 (a sketch follows below)
- Multiplication, division? Yes: think of the various algorithms you studied in the hardware class :)
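A sketch of the subtraction idea using only L's primitives (increment, decrement floored at zero, branch on non-zero), written here in Python; the function name and loop structure are mine.

```python
def l_subtract(x1, x2):
    # Truncated subtraction Y = max(0, X1 - X2) using only L-style steps
    y = 0
    while x1 != 0:            # copy X1 into Y
        x1 = max(0, x1 - 1)
        y += 1
    while x2 != 0:            # remove X2 units from Y
        x2 = max(0, x2 - 1)
        y = max(0, y - 1)
    return y

assert l_subtract(5, 3) == 2
assert l_subtract(2, 7) == 0   # values in L never go below zero
```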
18
L is Turing Complete: Conceptual Understanding
If: can we simulate "if X = 0 goto L"? Yes:
    if X != 0 goto L2
    Z ← Z + 1      // Z is a dummy variable
    if Z != 0 goto L
L2: ...
20
Constructing a Perceptron Network for the Language L
- For each variable V: an entry node N_V
- For each program row i: an instruction node N_i
- For a conditional branch instruction on row i: two transition nodes N_i' and N_i''
21
Constructions
- Variable V: node N_V
- No operation: N_i → N_i+1
22
Constructions (continued)
- Increment operation: N_i → N_i+1, N_i → N_V
- Decrement operation: N_i → N_i+1, N_i → N_V
23
Constructions (continued)
- Conditional branch operation: N_i → N_i', N_i → N_i'', N_V → N_i'', N_i'' → N_i+1, N_i' → N_j, N_i'' → N_j
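Putting the three construction slides together, here is a rough Python sketch that builds the node and connection skeleton for an L program; the tuple encoding, node naming, and the final "halt" node are my assumptions, and the connection weights shown in the original figures are omitted.

```python
def build_network(program):
    """Node/edge skeleton for an L program, following the constructions above.
    program: list of ("nop"|"inc"|"dec", V) or ("brnz", V, j), 0-based rows."""
    nodes, edges = set(), []
    for _, v, *_ in program:
        nodes.add(f"N_{v}")                         # entry node per variable
    for i, ins in enumerate(program):
        op, v = ins[0], ins[1]
        nodes.add(f"N{i}")                          # instruction node per row
        if op in ("nop", "inc", "dec"):
            edges.append((f"N{i}", f"N{i+1}"))      # control flows to next row
            if op in ("inc", "dec"):
                edges.append((f"N{i}", f"N_{v}"))   # adjusts the variable node
        elif op == "brnz":                          # IF V != 0 GOTO j on row i
            j = ins[2]
            nodes |= {f"N{i}'", f"N{i}''"}          # two transition nodes
            edges += [(f"N{i}", f"N{i}'"), (f"N{i}", f"N{i}''"),
                      (f"N_{v}", f"N{i}''"), (f"N{i}''", f"N{i+1}"),
                      (f"N{i}'", f"N{j}"), (f"N{i}''", f"N{j}")]
    nodes.add(f"N{len(program)}")                   # assumed final "halt" row
    return nodes, edges

nodes, edges = build_network([("dec", "X"), ("inc", "Y"), ("brnz", "X", 0)])
print(len(nodes), len(edges))                       # -> 8 10
```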
24
Definitions
- Legal state: all transition nodes (N_i' and N_i'') have output 0, and exactly one instruction node N_i has output 1
- Final state: all instruction nodes have output 0
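A small sketch of these two definitions as predicates over a network state; the dictionary representation of node outputs is an assumption of mine.

```python
def is_legal(state, instruction_nodes, transition_nodes):
    # Legal state: every transition node (N_i', N_i'') outputs 0,
    # and exactly one instruction node N_i outputs 1.
    if any(state[n] != 0 for n in transition_nodes):
        return False
    return sum(1 for n in instruction_nodes if state[n] == 1) == 1

def is_final(state, instruction_nodes):
    # Final state: all instruction nodes output 0.
    return all(state[n] == 0 for n in instruction_nodes)

state = {"N0": 0, "N1": 1, "N2": 0, "N2'": 0, "N2''": 0, "N_X": 3, "N_Y": 0}
print(is_legal(state, ["N0", "N1", "N2"], ["N2'", "N2''"]))  # -> True
print(is_final(state, ["N0", "N1", "N2"]))                   # -> False
```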
25
Properties
- If y_i = 1, the program counter is on line i
- The value of variable V is the output of N_V
- Changes in the network state are driven by the non-zero nodes
26
Proof
- If row i is V ← V
- If row i is V ← V - 1
- For V ← V + 1, the behaviour is similar
27
Proof (continued)
If row i is: if V != 0 GOTO j
28
Proof (continued)
Thus a legal state leads to another legal state.
29
What About Other Activation Functions?
- A feed-forward net with a binary state function is a DFA
- A recurrent neural network with a sigmoid activation is Turing universal
- A recurrent neural network with a saturated linear activation is Turing universal
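For reference, here are sketches of the two activations named above; these are the standard definitions, so treat this as illustration rather than the constructions used in those results.

```python
import numpy as np

def sigmoid(x):
    # Smooth squashing function: sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def saturated_linear(x):
    # Linear between 0 and 1, clipped outside: 0 for x < 0, x on [0, 1], 1 for x > 1
    return np.clip(x, 0.0, 1.0)

xs = np.array([-2.0, 0.25, 2.0])
print(sigmoid(xs))           # -> approximately [0.119 0.562 0.881]
print(saturated_linear(xs))  # -> [0.   0.25 1.  ]
```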
30
Conclusion
- A Turing machine can encode a recurrent neural network.
- A recurrent neural net can simulate a Turing-complete language.
- So, Turing machines are recurrent neural networks!
31
Project
Implementation of a pushdown automaton (PDA) using neural networks.
32
References
[1] McCulloch, W. S., and Pitts, W.: A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics 5, 1943, pp. 115-133.
[2] Hyötyniemi, H.: Turing machines are recurrent neural networks. In Symposium on Artificial Neural Networks, Vaasa, Finland, Aug. 19-23, 1996, pp. 13-24.
[3] Davis, M. D., and Weyuker, E. J.: Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science. Academic Press, New York, 1983.
33
References (continued)
[4] Hopcroft, J. E., and Ullman, J. D.: Introduction to Automata Theory, Languages and Computation. Addison-Wesley, 1979.
[5] Arbib, M.: Turing machines, finite automata and neural nets. Journal of the ACM 8(4), Oct. 1961, pp. 467-475.