PYTHON Deep Learning
Prof. Muhammad Saeed
Dept. of Computer Science & IT, FUUAST
Deep Learning: Introduction
Biological Neuron
The human brain contains roughly 100 billion neurons (on average) and 100-500 trillion synaptic connections.
Synapse vs. Weight
McCulloch-Pitts Neuron
In 1943, Warren McCulloch and Walter Pitts published the first model of a 'neural network'. Their "neurons" operated under the following assumptions (a minimal sketch in code follows the list):
1. They are binary devices (Vi ∈ {0, 1}).
2. Each neuron has a fixed threshold, theta.
3. The neuron receives inputs from excitatory synapses, all having identical weights.
4. Inhibitory inputs have an absolute veto power over any excitatory inputs.
5. At each time step the neurons are simultaneously (synchronously) updated by summing the weighted excitatory inputs and setting the output (Vi) to 1 iff the sum is greater than or equal to the threshold AND the neuron receives no inhibitory input.
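A minimal sketch of these rules in Python (the function name and interface are illustrative, not from the slides):

```python
def mp_neuron(excitatory, inhibitory, theta):
    """McCulloch-Pitts unit: binary output, fixed threshold theta.
    Any active inhibitory input vetoes the output entirely."""
    if any(inhibitory):                          # rule 4: absolute veto
        return 0
    return 1 if sum(excitatory) >= theta else 0  # rule 5: threshold firing
```

The same unit, wired with different thresholds, realizes the logic gates on the following slides.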
McCulloch-Pitts Neurons for the AND and OR Functions
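The original figures are not recoverable from this copy, but the standard construction uses two excitatory inputs with threshold 2 for AND and threshold 1 for OR; a sketch using the mp_neuron above:

```python
for x1 in (0, 1):
    for x2 in (0, 1):
        and_out = mp_neuron([x1, x2], [], theta=2)  # fires only when both inputs fire
        or_out  = mp_neuron([x1, x2], [], theta=1)  # fires when at least one input fires
        print((x1, x2), and_out, or_out)
```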
McCulloch-Pitts Neuron for the XOR Function
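XOR is not linearly separable, so no single McCulloch-Pitts unit computes it; the slide's network diagram is not recoverable, but one standard composition is XOR(x1, x2) = OR(x1 AND NOT x2, x2 AND NOT x1), using the veto rule for the negations:

```python
def mp_xor(x1, x2):
    h1 = mp_neuron([x1], [x2], theta=1)      # fires iff x1 = 1 and x2 = 0 (x2 vetoes)
    h2 = mp_neuron([x2], [x1], theta=1)      # fires iff x2 = 1 and x1 = 0 (x1 vetoes)
    return mp_neuron([h1, h2], [], theta=1)  # OR of the two hidden units
```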
McCulloch-Pitts Neuron
A NOR gate gives an output of 1 only when all inputs are 0.
McCulloch-Pitts Neuron
A NAND gate gives an output of 0 only when all inputs are 1.
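The slides' exact wirings for these gates are likewise lost; one way to realize them with the same unit (the details are an assumption, not the slides' construction): NOR treats every input as inhibitory, and NAND is an OR over the negated inputs.

```python
def mp_nor(x1, x2):
    # With no excitatory inputs the sum is 0 >= theta, so the unit
    # fires unless some inhibitory input vetoes it.
    return mp_neuron([], [x1, x2], theta=0)

def mp_nand(x1, x2):
    n1 = mp_neuron([], [x1], theta=0)        # NOT x1
    n2 = mp_neuron([], [x2], theta=0)        # NOT x2
    return mp_neuron([n1, n2], [], theta=1)  # OR of the negations
```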
Perceptron
Perceptrons (Rosenblatt 1958; Minsky/Papert 1969) are generalized variants of an earlier, simpler model (McCulloch-Pitts neurons, 1943):
• Inputs are weighted
• Weights are real numbers (positive and negative)
• No special inhibitory inputs
A perceptron is a single-layer neural network; a multi-layer perceptron is what is usually called a neural network.
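A sketch of the perceptron's forward computation: real-valued weights and a bias replace the identical excitatory weights and the inhibitory veto (the weight values below are illustrative):

```python
import numpy as np

def perceptron(x, w, b):
    """Thresholded linear unit with real-valued weights."""
    return 1 if np.dot(w, x) + b >= 0 else 0

# AND with real weights (one of many valid settings)
print(perceptron([1, 1], w=[1.0, 1.0], b=-1.5))  # -> 1
print(perceptron([1, 0], w=[1.0, 1.0], b=-1.5))  # -> 0
```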
Rosenblatt Perceptron
Structure of a Node
Squashing (transfer) functions limit node output. [Figures: squashing functions 1 and 2; images not recoverable]
Squashing (Transfer) Functions
[Figures: squashing functions 3 and 4; images not recoverable]
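The four plotted functions are lost in this copy; assuming the usual candidates, a sketch of common squashing functions:

```python
import numpy as np

def step(z):    return np.where(z >= 0, 1.0, 0.0)  # hard threshold, outputs {0, 1}
def sigmoid(z): return 1.0 / (1.0 + np.exp(-z))    # squashes to (0, 1)
def tanh(z):    return np.tanh(z)                  # squashes to (-1, 1)
def relu(z):    return np.maximum(0.0, z)          # clips negatives to 0
```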
Feeding Data Through the Net
(1 × 0.25) + (0.5 × (−1.5)) = 0.25 − 0.75 = −0.5
Squashing (logistic): 1 / (1 + e^0.5) ≈ 0.3775
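The same computation in code, assuming the logistic function as the squashing step (the slide's plot of the squashing function is not recoverable):

```python
import numpy as np

x = np.array([1.0, 0.5])      # input activations
w = np.array([0.25, -1.5])    # weights into the node
z = float(np.dot(x, w))       # (1)(0.25) + (0.5)(-1.5) = -0.5
a = 1.0 / (1.0 + np.exp(-z))  # logistic squashing -> ~0.3775
print(z, a)
```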
Problems with Perceptrons
Frank Rosenblatt proved mathematically that the perceptron learning rule converges if the two classes can be separated by a linear hyperplane, but problems arise if the classes cannot be separated perfectly by a linear classifier. A sketch of the rule follows.
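A minimal sketch of the perceptron learning rule w ← w + η(t − y)x (function name and constants are illustrative); it converges on separable data such as AND, but never settles on XOR:

```python
import numpy as np

def train_perceptron(X, targets, lr=0.1, epochs=20):
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for x, t in zip(X, targets):
            y = 1 if np.dot(w, x) + b >= 0 else 0
            w += lr * (t - y) * x  # nudge the hyperplane toward misclassified points
            b += lr * (t - y)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
w, b = train_perceptron(X, targets=[0, 0, 0, 1])  # AND: separable, so this converges
```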
Artificial Neural Networks ……
Neural networks are a set of algorithms, modeled loosely after the human brain, that are designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling or clustering raw input. The patterns they recognize are numerical, contained in vectors, into which all real-world data, be it images, sound, text or time series, must be translated.
……. Artificial Neural Networks
Neural networks help us cluster and classify. You can think of them as a clustering and classification layer on top of the data you store and manage. They help to group unlabeled data according to similarities among the example inputs, and they classify data when they have a labeled dataset to train on. (Neural networks can also extract features that are fed to other algorithms for clustering and classification.)
ANNs – The Basics
ANNs incorporate the two fundamental components of biological neural nets: neurons (nodes) and synapses (weights). [Figure: a network with Layers 1-3]
One-Layer Neural Network
Multiple Layers of Neurons
Deep Neural Networks
A deep neural network is a neural network with a certain level of complexity: one with more than two layers. Deep neural networks use sophisticated mathematical modeling to process data in complex ways. A sketch of the layered computation follows.
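A minimal sketch of stacking layers (the layer sizes and the sigmoid choice are assumptions, not from the slides):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    """Feed an input through a stack of (weights, bias) layers."""
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)
    return a

rng = np.random.default_rng(0)
layers = [(rng.normal(size=(4, 3)), np.zeros(4)),  # hidden layer 1
          (rng.normal(size=(4, 4)), np.zeros(4)),  # hidden layer 2
          (rng.normal(size=(2, 4)), np.zeros(2))]  # output layer
print(forward(np.array([0.1, 0.5, -0.2]), layers))
```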
Multiple-Layer Neural Networks
ANN Architectures
• Multilayer Perceptron (MLP)
• Radial Basis Function Networks (RBF)
• Self-Organizing Feature Maps (SOFM)
• Recurrent (feedback) networks
Learning paradigms: supervised learning and unsupervised learning.
Feed-Forward Nets
• Information flow is unidirectional: data is presented to the input layer, passed on to the hidden layer, and passed on to the output layer
• Information is distributed
• Information processing is parallel
• Internal representation (interpretation) of data
Data is presented to the network in the form of activations in the input layer. Examples:
• Pixel intensities (for pictures)
• Molecule concentrations (for an artificial nose)
• Share prices (for stock-market prediction)
Data usually requires preprocessing (analogous to senses in biology). How to represent more abstract data, e.g. a name? Choose a pattern, e.g. 0-0-1 for "Zeeshan" and 0-1-0 for "AleRaza", as written out below.
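The patterns named on the slide, written out directly (the pattern-to-name assignment is the slide's; the dictionary form is just illustration):

```python
import numpy as np

patterns = {"Zeeshan": np.array([0, 0, 1]),  # 0-0-1, as on the slide
            "AleRaza": np.array([0, 1, 0])}  # 0-1-0
print(patterns["Zeeshan"])
```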
Applications of Feed-Forward Nets
• Pattern recognition: character recognition, face recognition
• Sonar mine/rock recognition (Gorman & Sejnowski, 1988)
• Navigation of a car (Pomerleau, 1989)
• Stock-market prediction
• Pronunciation: NETtalk (Sejnowski & Rosenberg, 1987)
Feed-Backward Neural Networks
Feedback (or recurrent, or interactive) networks can have signals traveling in both directions by introducing loops into the network. Feedback networks are powerful and can get extremely complicated. Computations derived from earlier input are fed back into the network, which gives them a kind of memory. Feedback networks are dynamic; their 'state' changes continuously until they reach an equilibrium point. They remain at the equilibrium point until the input changes and a new equilibrium needs to be found. The learning rate is introduced as a constant (usually very small) in order to force the weights to be updated smoothly and slowly (avoiding big steps and chaotic behaviour).
Recurrent Networks
Feed-forward networks:
• Information only flows one way
• One input pattern produces one output
• No sense of time (or memory of previous state)
Recurrency:
• Nodes connect back to other nodes or themselves
• Information flow is multidirectional
• Sense of time and memory of previous state(s)
Biological nervous systems show high levels of recurrency (but feed-forward structures exist too). A minimal recurrent step is sketched below.
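The slides do not name a specific architecture; this is a generic Elman-style recurrent update in which the previous hidden state feeds back in (all sizes illustrative):

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    """One recurrent update: h carries memory of earlier inputs."""
    return np.tanh(Wx @ x + Wh @ h + b)

rng = np.random.default_rng(0)
Wx, Wh, b = rng.normal(size=(3, 2)), rng.normal(size=(3, 3)), np.zeros(3)
h = np.zeros(3)
for x in ([1.0, 0.0], [0.0, 1.0]):
    h = rnn_step(np.array(x), h, Wx, Wh, b)  # state persists across time steps
```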
Training the Network – Learning
Backpropagation:
• Requires a training set (input/output pairs)
• Starts with small random weights
• Error is used to adjust weights (supervised learning)
• Gradient descent on the error landscape
Weight settings determine the behaviour of a network.
The weight update rule:
new weight = old weight - learning rate × derivative of the error with respect to the weight
that is, w ← w − η · ∂E/∂w.
Pros & Cons
Advantages:
• It works!
• Relatively fast
Downsides:
• Requires a training set
• Can be slow
• Probably not biologically realistic
Alternatives to backpropagation:
• Hebbian learning: not successful in feed-forward nets
• Reinforcement learning: only limited success
• Artificial evolution: more general, but can be even slower than backprop
The End