Learning in neural networks Chapter 19 DRAFT

Biological neuron

[Figure: a biological neuron, labelled with the soma (cell body), dendrites, axon, synapses, and the action potential travelling along the axon]

Perceptron

[Diagram: inputs x_0, x_1, …, x_n with weights w_i feed a summation and an activation function g, producing output y]

y = g(Σ_{i=1,…,n} w_i x_i)
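
As a sketch of this unit in code (the function name and the example numbers below are illustrative, not from the slides), with g taken as the classic hard threshold:

```python
# Minimal perceptron sketch: y = g(sum_i w_i * x_i), g = step function.
def perceptron(x, w):
    """x and w are equal-length sequences of inputs and weights."""
    u = sum(w_i * x_i for w_i, x_i in zip(w, x))
    return 1 if u > 0 else 0          # g: hard threshold at zero

# x[0] = 1 acts as a bias input, so w[0] plays the role of a threshold.
print(perceptron([1, 0, 1], [-0.5, 0.3, 0.6]))   # -0.5 + 0.0 + 0.6 > 0 -> 1
```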

Perceptron
- Synonym for a single-layer, feed-forward network
- First studied in the 1950s
- Other network architectures were known at the time, but the perceptron was the only one with an effective learning procedure, so research concentrated on it

Perceptron
- A single weight affects only one output, so we can restrict our investigation to a model with a single output unit, as shown on the slide
- The notation is then simpler

What can perceptrons represent?

[Figure: the four points (0,0), (0,1), (1,0), (1,1) plotted twice, labelled with the outputs of AND and of XOR; a single straight line separates the two output classes for AND, but no line can for XOR]

Functions whose classes can be separated in this way are called linearly separable. Only linearly separable functions can be represented by a perceptron.
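
To make linear separability concrete, here is a small illustrative check (not from the slides): a threshold unit computes w1*x1 + w2*x2 > t, and a brute-force grid search finds a separating line for AND but none for XOR. The grid only demonstrates the point; XOR's non-separability holds for all weights.

```python
import itertools

def realizes(f, w1, w2, t):
    # Does the threshold unit reproduce f on all four binary inputs?
    return all((w1 * x1 + w2 * x2 > t) == f(x1, x2)
               for x1, x2 in itertools.product([0, 1], repeat=2))

def separable(f):
    grid = [k / 4 for k in range(-8, 9)]      # weights/thresholds in [-2, 2]
    return any(realizes(f, w1, w2, t)
               for w1, w2, t in itertools.product(grid, repeat=3))

print(separable(lambda a, b: bool(a and b)))   # True:  AND is separable
print(separable(lambda a, b: a != b))          # False: XOR is not
```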

What can perceptrons represent? Linear separability also holds in three or more dimensions, where the separating line becomes a plane or, in general, a hyperplane, but it is harder to visualise.

Training a perceptron

Aim: adjust the weights so that the perceptron produces the correct output for every example in the training set.

Training a perceptron

[Diagram: a perceptron with threshold t = 0.0, inputs x and y, and weights W = 0.3, W = -0.4, W = 0.5]
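
A minimal sketch of the perceptron learning rule in code; the learning rate, epoch count, and the bias-input convention are assumptions, not values from the slide. Each weight is nudged by (target - output) * input:

```python
def step(u):
    return 1 if u > 0 else 0

def train(examples, epochs=20, lr=0.1):
    w = [0.0, 0.0, 0.0]                        # [bias weight, w1, w2]
    for _ in range(epochs):
        for inputs, target in examples:
            x = [1] + list(inputs)             # prepend the bias input
            y = step(sum(wi * xi for wi, xi in zip(w, x)))
            # Perceptron rule: w_i <- w_i + lr * (target - y) * x_i
            w = [wi + lr * (target - y) * xi for wi, xi in zip(w, x)]
    return w

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train(AND)
for inputs, target in AND:
    y = step(w[0] + w[1] * inputs[0] + w[2] * inputs[1])
    print(inputs, y, target)                   # learned output matches target
```

Running the same loop on XOR never settles, consistent with the separability argument above.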

Function-Learning Formulation
- Goal function f
- Training set: (x_i, f(x_i)), i = 1, …, n
- Inductive inference: find a function h that fits the points well
- Same keep-it-simple bias
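
As a hedged illustration of the keep-it-simple bias (none of these numbers come from the slides): two hypotheses h fit the same training pairs (x_i, f(x_i)), and the one with fewer parameters extrapolates better away from the training points.

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = 2 * xs + 1 + np.array([0.1, -0.1, 0.05, -0.05, 0.0])  # near-linear data

h_simple = np.polyfit(xs, ys, deg=1)   # 2 parameters
h_flex   = np.polyfit(xs, ys, deg=4)   # 5 parameters: interpolates the noise

x_new = 5.0                            # a point outside the training set
print(np.polyval(h_simple, x_new))     # close to 2*5 + 1 = 11
print(np.polyval(h_flex, x_new))       # drifts away from the true line
```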

Perceptron

[Diagram: the perceptron from before, inputs x_0, …, x_n with weights w_i into the summation and activation g, output y; the slide adds a question mark, asking what g should be]

y = g(Σ_{i=1,…,n} w_i x_i)

Unit (Neuron)

[Diagram: inputs x_0, …, x_n with weights w_i into the summation and activation g, output y]

y = g(Σ_{i=1,…,n} w_i x_i)

g(u) = 1 / [1 + exp(-α u)]
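
In code, the sigmoid unit might look like the following sketch; the steepness parameter, garbled in the transcript, is written here as alpha, and alpha = 1 gives the standard logistic function:

```python
import math

def g(u, alpha=1.0):
    # Sigmoid activation: smooth, differentiable replacement for the step.
    return 1.0 / (1.0 + math.exp(-alpha * u))

def unit(x, w, alpha=1.0):
    return g(sum(w_i * x_i for w_i, x_i in zip(w, x)), alpha)

# x[0] = 1 is the bias input; the numbers are illustrative.
print(unit([1.0, 0.5, -0.2], [0.1, 0.4, 0.3]))   # g(0.24) ~ 0.56
```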

Neural Network

Network of interconnected neurons

[Diagram: two units of the kind above, connected so that the output of one feeds an input of the other]

Acyclic (feed-forward) vs. recurrent networks

Two-Layer Feed-Forward Neural Network

[Diagram: inputs feed a hidden layer, whose outputs feed the output layer]
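
A minimal forward pass for such a network; the layer sizes, random weights, and the omission of bias terms are assumptions made for brevity:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))    # hidden layer: 3 units, each seeing 2 inputs
W2 = rng.normal(size=(1, 3))    # output layer: 1 unit over 3 hidden values

x = np.array([0.0, 1.0])
hidden = sigmoid(W1 @ x)        # hidden-layer activations
y = sigmoid(W2 @ hidden)        # network output
print(y)
```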

Backpropagation (Principle)
- New example: y_k = f(x_k)
- φ_k = output of the network with weights w on input x_k
- Error function: E(w) = ||φ_k − y_k||²
- Gradient step: w_ij(k) = w_ij(k−1) − ε ∂E/∂w_ij
- Backpropagation: update the weights of the inputs to the last layer first, then the weights of the inputs to the previous layer, and so on
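
The update above, applied layer by layer, is enough to train a small network. The sketch below (the architecture, step size ε = 0.5, iteration count, and random seed are all assumptions, not from the slides) trains a two-layer sigmoid network on XOR, the function a single perceptron cannot represent; E(w) = ||φ − y||² as on the slide.

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)     # XOR targets

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)       # hidden layer: 4 units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)       # output layer: 1 unit
eps = 0.5

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                        # forward pass
    phi = sigmoid(h @ W2 + b2)
    # Backward pass: last layer first, then the previous layer,
    # using the sigmoid derivative g'(u) = g(u) * (1 - g(u)).
    d_out = 2 * (phi - Y) * phi * (1 - phi)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= eps * h.T @ d_out; b2 -= eps * d_out.sum(axis=0)
    W1 -= eps * X.T @ d_hid; b1 -= eps * d_hid.sum(axis=0)

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))
```

With these settings the outputs usually approach the XOR targets [0, 1, 1, 0], though gradient descent can occasionally stall in a poor local minimum.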

Comments and Issues
- How to choose the size and structure of the network? If the network is too large, there is a risk of over-fitting (data caching: the network memorises the training examples); if it is too small, the representation may not be rich enough
- Role of representation: e.g., learning the concept of an odd number
- Incremental learning