
Introduction to Neural Networks
Freek Stulp

Slide 2: Overview
- Biological Background
- Artificial Neuron
- Classes of Neural Networks
  1. Perceptrons
  2. Multi-Layer Feed-Forward Networks
  3. Recurrent Networks
- Conclusion

Slide 3: Biological Background
A neuron consists of:
- Cell body
- Dendrites
- Axon
- Synapses
Neural activation travels through the dendrites and axon; synapses have different strengths.

Slide 4: Artificial Neuron
[Figure: a unit with input links, cell body, and output links.]
- Input links (dendrites) deliver the activations $a_j$, weighted by $W_{ji}$
- Unit (cell body) computes $a_i = g(\mathrm{in}_i)$, where $\mathrm{in}_i = \sum_j W_{ji} a_j$
- Output links (axon) pass the activation $a_i$ on to other units
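A minimal sketch of this unit in Python, assuming a sigmoid as the activation function g (the slide leaves g unspecified); the inputs and weights are arbitrary illustrative values:

```python
import math

def g(x):
    """Sigmoid activation: one common choice for g (an assumption; the slide leaves g generic)."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(activations, weights):
    """Compute a_i = g(in_i), where in_i = sum_j W_ji * a_j."""
    in_i = sum(w_ji * a_j for w_ji, a_j in zip(weights, activations))
    return g(in_i)

# Example: three input activations a_j and their weights W_ji (illustrative values).
print(neuron_output([0.5, -1.0, 2.0], [0.1, 0.4, 0.3]))
```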

Slide 5: Class I: Perceptron
[Figure: a perceptron with inputs $a_1, a_2, \ldots$, weights $W_0, W_1, \ldots, W_j$, and a single output unit.]
A single unit computes $a = g(\mathrm{in})$ with $\mathrm{in} = \sum_j W_j a_j$.
With two inputs and a bias weight $W_0$: $a = g(-W_0 + W_1 a_1 + W_2 a_2)$.
The activation $g$ is a hard threshold:
$g(\mathrm{in}) = 0$ for $\mathrm{in} < 0$, and $g(\mathrm{in}) = 1$ for $\mathrm{in} > 0$.
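A sketch of this threshold unit in Python; the weights below are hand-picked so that the unit computes boolean OR (the function used on the next slide) and are not given on this slide:

```python
def g(in_value):
    """Hard threshold: 0 for in < 0, 1 for in > 0 (as on the slide)."""
    return 1 if in_value > 0 else 0

def perceptron(a1, a2, W0=0.5, W1=1.0, W2=1.0):
    """a = g(-W0 + W1*a1 + W2*a2); W0, W1, W2 are illustrative values."""
    return g(-W0 + W1 * a1 + W2 * a2)

for a1 in (0, 1):
    for a2 in (0, 1):
        print((a1, a2), "->", perceptron(a1, a2))   # prints the OR truth table
```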

Slide 6: Learning in Perceptrons
Perceptrons can learn mappings from inputs I to outputs O by changing the weights W.
Training set D:
- Inputs: I_0, I_1, ..., I_n
- Targets: T_0, T_1, ..., T_n
Example: boolean OR
d   I = (a_1, a_2)   T
1   (0, 0)           0
2   (0, 1)           1
3   (1, 0)           1
4   (1, 1)           1
The output O of the network is not necessarily equal to T!

Slide 7: Learning in Perceptrons
The error is often defined as:
$E(W) = \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2$
Go towards the minimum error! Update rules:
$w_i \leftarrow w_i + \Delta w_i$, with $\Delta w_i = -\eta \, \partial E / \partial w_i$
$\partial E / \partial w_i = \frac{\partial}{\partial w_i} \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2 = \sum_{d \in D} (t_d - o_d)(-x_{id})$
This is called gradient descent.
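A sketch of this update rule in Python, applied to a linear (unthresholded) unit on the boolean OR data of Slide 6. The learning rate, the number of epochs, and the use of a linear unit during training are assumptions made for illustration:

```python
# Training set D from Slide 6: boolean OR as (inputs, target) pairs.
D = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
eta = 0.1            # learning rate (assumed value)
w = [0.0, 0.0, 0.0]  # w[0] is the bias weight on a constant input of 1

for epoch in range(200):
    dw = [0.0, 0.0, 0.0]
    for (a1, a2), t in D:
        x = (1, a1, a2)                           # prepend the constant bias input
        o = sum(wi * xi for wi, xi in zip(w, x))  # unthresholded output o_d
        for i in range(3):
            dw[i] += eta * (t - o) * x[i]         # accumulate Delta w_i = eta * sum_d (t_d - o_d) x_id
    w = [wi + dwi for wi, dwi in zip(w, dw)]      # w_i <- w_i + Delta w_i

print("learned weights:", [round(wi, 2) for wi in w])
for (a1, a2), t in D:
    o = w[0] + w[1] * a1 + w[2] * a2
    print((a1, a2), "target", t, "output", round(o, 2))
```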

Slide 8: Class II: Multi-Layer Feed-Forward Networks
Feed-forward: output links connect only to the inputs of units in the next layer.
[Figure: layers of units: input, hidden, output.]
Multiple layers: one or more hidden layers.
Complex non-linear functions can be represented.
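A minimal sketch of one feed-forward pass through a single hidden layer; the layer sizes, weights, and sigmoid activation are illustrative assumptions, not values from the slides:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(activations, weights):
    """One layer: each unit computes g(sum_j W_ji * a_j) over the previous layer."""
    return [sigmoid(sum(w * a for w, a in zip(row, activations))) for row in weights]

W_hidden = [[0.5, -0.6], [0.3, 0.8], [-0.2, 0.4]]  # 2 inputs -> 3 hidden units
W_output = [[1.0, -1.0, 0.5]]                      # 3 hidden units -> 1 output unit

x = [0.7, 0.2]           # input activations
h = layer(x, W_hidden)   # hidden-layer activations
y = layer(h, W_output)   # network output
print(h, y)
```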

Slide 9: Learning in MLFF Networks
For the output layer, weight updating is similar to perceptrons.
Problem: what are the errors in the hidden layer?
Backpropagation algorithm: for each hidden layer (from output to input):
- For each unit in the layer, determine how much it contributed to the errors in the previous layer.
- Adapt the weights according to this contribution.
This is also gradient descent.
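A sketch of backpropagation as gradient descent, assuming NumPy, sigmoid units, a squared-error loss, and the XOR task as a stand-in problem; the network size, learning rate, and epoch count are illustrative choices, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets (assumed example task)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)    # input -> hidden weights
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)    # hidden -> output weights
eta = 0.5                                         # learning rate (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(10000):
    # Forward pass through the layers.
    H = sigmoid(X @ W1 + b1)
    O = sigmoid(H @ W2 + b2)
    # Backward pass: output error, then each hidden unit's contribution to it.
    delta_out = (O - T) * O * (1 - O)
    delta_hid = (delta_out @ W2.T) * H * (1 - H)
    # Gradient-descent weight updates.
    W2 -= eta * H.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid
    b1 -= eta * delta_hid.sum(axis=0)

O = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print("outputs after training:", np.round(O.ravel(), 2))
```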

Slide 10: Class III: Recurrent Networks
[Figure: input, hidden, and output units with unrestricted connections.]
No restrictions on connections.
Behaviour is more difficult to predict/understand.
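A minimal sketch of a recurrent unit, showing why behaviour is harder to predict: the hidden activation at each step feeds back into the next step. The weights and the input sequence are illustrative assumptions:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

w_in, w_rec, w_out = 0.8, 0.5, 1.2   # input, recurrent, and output weights (assumed)

h = 0.0                              # hidden activation, fed back at every step
for t, x in enumerate([1.0, 0.0, 0.0, 1.0]):
    h = sigmoid(w_in * x + w_rec * h)
    y = sigmoid(w_out * h)
    print(f"step {t}: hidden={h:.3f} output={y:.3f}")
```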

Slide 11: Conclusion
- Inspiration from biology, though artificial brains are still very far away.
- Perceptrons are too simple for most problems.
- MLFF networks are good function approximators. Many of your articles use these networks!
- Recurrent networks are complex but useful too.

Slide 12: Literature
- Artificial Intelligence: A Modern Approach, Stuart Russell and Peter Norvig
- Machine Learning, Tom M. Mitchell