
Neural Networks

Background. Neural networks can be: biological (biological models) or artificial (artificial models). The desire is to produce artificial systems capable of sophisticated computations similar to the human brain.

Biological analogy and some main ideas
- The brain is composed of a mass of interconnected neurons; each neuron is connected to many other neurons.
- Neurons transmit signals to each other.
- Whether a signal is transmitted is an all-or-nothing event (the electrical potential in the cell body of the neuron is thresholded).
- Whether a signal is sent depends on the strength of the bond (synapse) between the two neurons.

How Does the Brain Work? (1)
NEURON: the cell that performs information processing in the brain; the fundamental functional unit of all nervous system tissue.

How Does the Brain Work? (2)
Each neuron consists of: SOMA, DENDRITES, AXON, and SYNAPSE.

Brain vs. Digital Computers (1)
- Computers require hundreds of cycles to simulate the firing of a neuron.
- The brain can fire all its neurons in a single step: parallelism.
- Serial computers require billions of cycles to perform some tasks that the brain does in less than a second, e.g. face recognition.

Definition of a Neural Network. A Neural Network is a system composed of many simple processing elements operating in parallel which can acquire, store, and utilize experiential knowledge.

Artificial Neural Network: Neurons vs. Units (1)
Each element of a NN is a node called a unit. Units are connected by links. Each link has a numeric weight.

Neurons vs. Units (2)
A real neuron is far richer than our simplified model, the unit: its behavior involves chemistry, biochemistry, and quantum effects.

Computing Elements. A typical unit:

Planning in building a Neural Network
Decisions must be made on the following:
- the number of units to use
- the type of units required
- the connections between the units

How a NN learns a task: issues to be discussed
- initializing the weights
- the use of a learning algorithm
- the set of training examples
- encoding the examples as inputs
- converting output into meaningful results

Neural Network Example: a very simple, two-layer, feed-forward network with two inputs, two hidden nodes, and one output node.
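A minimal sketch of this example network in Python (the weight values are made up for illustration; the slides do not specify them):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative weights: two inputs -> two hidden units -> one output unit.
w_hidden = [[0.5, -0.6],   # weights into hidden unit 1 (from input 1, input 2)
            [0.9,  0.2]]   # weights into hidden unit 2
w_output = [0.3, -0.7]     # weights from the hidden units into the output unit

def forward(x1, x2):
    h = [sigmoid(w[0] * x1 + w[1] * x2) for w in w_hidden]
    return sigmoid(w_output[0] * h[0] + w_output[1] * h[1])

print(forward(1.0, 0.0))
```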

Simple Computations in this network
- There are 2 types of components: linear and non-linear.
- Linear: the input function calculates the weighted sum of all inputs.
- Non-linear: the activation function transforms the sum into an activation level.

Calculations
Input function: in = Σ_j W_j * Input_j (the weighted sum of the unit's inputs).
Activation function g: a = g(in).

A Computing Unit: now in more detail, but for one particular model only.

Activation Functions
- Use different functions to obtain different models.
- The 3 most common choices: 1) step function, 2) sign function, 3) sigmoid function.
- An output of 1 represents the firing of a neuron down the axon.
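A minimal sketch of these three functions in Python, with the threshold fixed at 0 (an assumption; the slides leave the threshold implicit):

```python
import math

def step(x):        # step function: output 1 once the input reaches the threshold
    return 1 if x >= 0 else 0

def sign(x):        # sign function: output +1 or -1
    return 1 if x >= 0 else -1

def sigmoid(x):     # sigmoid: smooth, differentiable squashing into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))
```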

Step Function (Perceptrons)

3 Activation Functions

Standard structure of an artificial neural network
- Input units: represent the input as a fixed-length vector of numbers (user defined)
- Hidden units: calculate thresholded weighted sums of the inputs; represent intermediate calculations that the network learns
- Output units: represent the output as a fixed-length vector of numbers

Representations
- Logic rules: if color = red ^ shape = square then +
- Decision trees: a tree
- Nearest neighbor: the training examples
- Probabilities: a table of probabilities
- Neural networks: inputs in [0, 1]
Neural networks can be used for all of these; many variants exist.

Notation

Notation (cont.)

Operation of individual units
Output_i = f(W_i,j * Input_j + W_i,k * Input_k + W_i,l * Input_l)
- where f(x) is a threshold (activation) function
- "sigmoid": f(x) = 1 / (1 + e^-x)
- or f(x) = step function
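In code, a single unit's operation might look like this sketch (the weights and inputs are illustrative values):

```python
import math

def unit_output(weights, inputs):
    # Weighted sum of all inputs, then squashed by the activation function.
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-total))   # sigmoid f(x) = 1 / (1 + e^-x)

print(unit_output([0.4, -0.2, 0.7], [1.0, 0.5, 0.25]))
```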

Artificial Neural Networks

Perceptron Learning Theorem
Recap: a perceptron (threshold unit) can learn anything that it can represent (i.e. anything separable with a hyperplane).
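A sketch of the perceptron learning rule the theorem refers to, trained here on AND, which is linearly separable (the learning rate and epoch count are illustrative):

```python
def train_perceptron(examples, eta=0.1, epochs=20):
    # examples: list of (inputs, target) pairs with target in {0, 1}
    n = len(examples[0][0])
    w = [0.0] * n
    b = 0.0                       # bias treated as an extra weight
    for _ in range(epochs):
        for x, t in examples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
            # Update only when the prediction is wrong.
            w = [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
            b += eta * (t - y)
    return w, b

# Learns AND:
print(train_perceptron([((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]))
```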

The Exclusive OR problem
A perceptron cannot represent Exclusive OR, since it is not linearly separable.
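A multi-layer network, however, can represent XOR. One hand-set solution among many, using step units (the particular weights and thresholds are illustrative):

```python
def step(x):
    return 1 if x >= 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)     # OR:  fires if at least one input is 1
    h2 = step(x1 + x2 - 1.5)     # AND: fires only if both inputs are 1
    return step(h1 - h2 - 0.5)   # OR but not AND = XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```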

Properties of the architecture
- No connections within a layer
- No direct connections between input and output layers
- Fully connected between layers
- Often more than 3 layers
- The number of output units need not equal the number of input units
- The number of hidden units per layer can be more or less than the number of input or output units
- Each unit is a perceptron
- Often a bias is included as an extra weight

Conceptually: Forward Activity – Backward Error

Backpropagation learning algorithm ('BP')
A solution to the credit assignment problem in the MLP. Rumelhart, Hinton and Williams (1986) (though actually invented earlier in a PhD thesis relating to economics). BP has two phases:
Forward pass phase: computes the 'functional signal'; feed-forward propagation of input pattern signals through the network.
Backward pass phase: computes the 'error signal'; propagates the error backwards through the network, starting at the output units (where the error is the difference between actual and desired output values).

Forward Propagation of Activity
Step 1: initialize the weights at random and choose a learning rate η.
Until the network is trained, for each training example (i.e. input pattern and target output(s)):
Step 2: do a forward pass through the net (with fixed weights) to produce the output(s), i.e., in the forward direction, layer by layer:
- inputs are applied
- multiplied by the weights
- summed
- 'squashed' by the sigmoid activation function
- the output is passed to each neuron in the next layer
Repeat the above until the network output(s) are produced. A code sketch of these steps follows.
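A sketch of Steps 1 and 2 for a small fully connected network (the layer sizes and random initialization are assumptions; biases are omitted to keep it short):

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Step 1: initialize weights at random (layers[k][j] holds unit j's incoming weights).
random.seed(0)
layer_sizes = [2, 2, 1]                        # inputs -> hidden -> output
layers = [[[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
          for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]

# Step 2: forward pass, layer by layer, with fixed weights.
def forward_pass(inputs):
    activations = [inputs]
    for layer in layers:
        # Multiply by weights, sum, and 'squash' with the sigmoid.
        a = [sigmoid(sum(w * x for w, x in zip(unit_w, activations[-1])))
             for unit_w in layer]
        activations.append(a)                  # passed on to the next layer
    return activations                         # all kept for the backward pass

print(forward_pass([1.0, 0.0])[-1])            # network output(s)
```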

Step 3: Back-propagation of error
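As a sketch of this step for sigmoid units (the helper names are hypothetical; the slide's own derivation is in the image): the output units compare actual and desired values, and hidden units receive error through their outgoing weights.

```python
# Error signal (delta) at an output unit, for a sigmoid activation:
# sigmoid'(in) = y * (1 - y) when y = sigmoid(in).
def output_delta(target, y):
    return (target - y) * y * (1 - y)

# A hidden unit's error is the weighted sum of the deltas it feeds into,
# scaled by its own activation derivative.
def hidden_delta(y, outgoing_weights, downstream_deltas):
    back = sum(w * d for w, d in zip(outgoing_weights, downstream_deltas))
    return back * y * (1 - y)
```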

‘Back-prop’ algorithm summary (with Maths!)

‘Back-prop’ algorithm summary (with NO Maths!)

MLP/BP: A worked example

Worked example: Forward Pass

Worked example: Forward Pass (cont.)

Worked example: Backward Pass

Worked example: Update Weights Using the Generalized Delta Rule (BP)
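A sketch of the update itself, assuming the generalized delta rule Δw_ij = η · δ_j · o_i (the function name and the value of η are illustrative):

```python
eta = 0.5   # learning rate (illustrative value)

# Generalized delta rule: w_ij <- w_ij + eta * delta_j * o_i,
# where o_i is the activation feeding weight w_ij into unit j.
def update_layer(weights, deltas, activations_in):
    return [[w + eta * d * o for w, o in zip(row, activations_in)]
            for row, d in zip(weights, deltas)]
```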

Similarly for all the weights w_ij:

Verification that it works

Training
- This was a single iteration of back-prop.
- Training requires many iterations with many training examples, or epochs (one epoch is an entire presentation of the complete training set).
- It can be slow!
- Note that computation in the MLP is local (with respect to each neuron).
- A parallel implementation of the computation is also possible.
A complete, minimal training loop is sketched below.
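Putting the pieces together, a minimal end-to-end sketch: a 2-2-1 network trained on XOR with sigmoid units and the delta rule above. The architecture, seed, learning rate, and epoch count are all assumptions; a different seed may need more epochs to converge.

```python
import math, random

random.seed(1)
eta = 0.5                                        # learning rate (illustrative)
sig = lambda x: 1.0 / (1.0 + math.exp(-x))

# 2-2-1 network; bias handled as an extra weight on a constant input of 1.0.
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]   # XOR

for epoch in range(5000):                        # many iterations / epochs
    for x, t in data:
        xi = x + [1.0]                           # inputs plus bias
        h = [sig(sum(w * v for w, v in zip(row, xi))) for row in w_h]
        hi = h + [1.0]
        y = sig(sum(w * v for w, v in zip(w_o, hi)))
        d_o = (t - y) * y * (1 - y)              # output error signal
        d_h = [hj * (1 - hj) * w_o[j] * d_o      # propagated back to hidden units
               for j, hj in enumerate(h)]
        w_o = [w + eta * d_o * v for w, v in zip(w_o, hi)]
        w_h = [[w + eta * d * v for w, v in zip(row, xi)]
               for row, d in zip(w_h, d_h)]

for x, t in data:                                # check what the net learned
    h = [sig(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w_h]
    y = sig(sum(w * v for w, v in zip(w_o, h + [1.0])))
    print(x, "->", round(y, 2), "(target", t, ")")
```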

Training and testing data
- How many examples? The more the merrier!
- Use disjoint training and testing data sets: learn from the training data, but evaluate performance (generalization ability) on unseen test data.
- Aim: minimize the error on the test data.
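A common way to obtain disjoint sets, as a sketch (the split fraction and seed are illustrative):

```python
import random

def train_test_split(examples, test_fraction=0.2, seed=0):
    # Shuffle, then hold out unseen examples to measure generalization.
    ex = list(examples)
    random.Random(seed).shuffle(ex)
    n_test = int(len(ex) * test_fraction)
    return ex[n_test:], ex[:n_test]   # (training set, test set)
```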

More resources
- Binary Logic Unit in an example –
- MultiLayer Perceptron Learning Algorithm –