
1 Introduction to Neural Networks Jianfeng Feng School of Cognitive and Computing Sciences jianfeng@cogs.susx.ac.uk Spring 2001

2-4 Tutors

Name              Office      Email
Jianfeng Feng     COGS-5C19   jianfeng   (first half term)
Andy Philippides  CCNR        andrewop   (second half term)

Lectures -- 2 per week

Time         Day   Place
2:00 - 2:50  Thu   Arun 401
4:00 - 4:50  Fri   PEV1-1A7

Seminar -- 1 per week, from the first week (Andy)

Please see the school notice board. Lecture notes are available on my homepage (I will modify them before each lecture).

5 Today’s Topics:
 Summary
 A comparison of biological neuron networks with artificial neural networks
 A little bit of math
 The single artificial neuron

6-7 Course Summary

After a short introduction to neurons, synapses, and the concept of learning (the biological and statistical foundations of neural networks), the course covers:
 methods of supervised learning (the perceptron and linear separability, backpropagation, radial basis functions and the problems of generalization, support vector machines)
 methods of unsupervised learning (classification, PCA, Kohonen, vector quantization)
 computational neuroscience and robots

8 Course Topics

1. Introduction to neural networks
2. Formal neurons (2)
3. Learning (1)
4. Single layer perceptron
5. Multilayer perceptron (2)
6. Radial basis function networks (2)
7. Support vector machines
8. Unsupervised learning (2)
9. Preprocessing and PCA
11. Computational neuroscience (2)
(one revision)
(2) = two lectures, (3) = three lectures

9 Why do we need NN?

10 Neural Networks Are For

Applications: character recognition, optimization, financial prediction, automatic driving, ...
Science: neuroscience, physics, mathematics, statistics, computer science, psychology, ...

11-14 History

spiking neural networks (Feng, COGS)
Vapnik (1990): support vector machine
Broomhead & Lowe (1988): radial basis functions (RBF)
Linsker (1988): Infomax principle
Rumelhart, Hinton & Williams (1986): back-propagation
Kohonen (1982): self-organizing maps
Hopfield (1982): Hopfield networks
Minsky & Papert (1969): Perceptrons
Rosenblatt (1960): Perceptron
Minsky (1954): Neural Networks (PhD thesis)
McCulloch & Pitts (1943): neural networks and artificial intelligence were born

15 Today’s Topics:
 Summary
 A comparison of biological neuron networks with artificial neural networks

16 What are biological neuron networks? (see later lectures for more details)

UNITS: nerve cells called neurons; there are many different types and they are extremely complex; around 10^11 neurons in the brain.
INTERACTIONS: signals are conveyed by action potentials; interactions can be chemical (the release and uptake of neurotransmitters and ions) or electrical. Each neuron makes contact with around 10^3 other neurons.
STRUCTURES: feedforward, feedback and self-activating recurrent connections.

17 A cartoon of a neuron: the soma.

18 The complexity of a neuronal system can be partly seen from a picture in a book on computational neuroscience that I am currently editing.


25 What are (artificial) neural networks?

A network of interacting units that attempts to mimic the brain.
UNITS: artificial neurons (linear or nonlinear input-output units); small numbers, typically a few hundred.
INTERACTIONS: simply weights, expressing how strongly one neuron affects another.
STRUCTURES: can be feedforward, feedback or recurrent.
It is still far too naive, and the development of the field relies on all of us.

26 A four-layer network: inputs x_1, x_2, ..., x_n (visual input), two hidden layers, and an output layer (motor output).
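As a concrete illustration of such a four-layer feedforward network, here is a minimal MATLAB sketch of a single forward pass. The layer sizes, the random weights and the use of a sigmoidal activation in every layer are illustrative assumptions, not taken from the slides.

% Forward pass through a four-layer network: input, two hidden layers, output.
% All sizes and parameters below are illustrative.
n_in = 5; n_h1 = 4; n_h2 = 3; n_out = 2;
sigmoid = @(a) 1 ./ (1 + exp(-a));              % sigmoidal activation function
W1 = randn(n_h1, n_in);  b1 = randn(n_h1, 1);   % input -> hidden layer 1
W2 = randn(n_h2, n_h1);  b2 = randn(n_h2, 1);   % hidden layer 1 -> hidden layer 2
W3 = randn(n_out, n_h2); b3 = randn(n_out, 1);  % hidden layer 2 -> output
x  = randn(n_in, 1);                            % input vector (e.g. visual input)
h1 = sigmoid(W1 * x  + b1);                     % first hidden layer activity
h2 = sigmoid(W2 * h1 + b2);                     % second hidden layer activity
y  = sigmoid(W3 * h2 + b3);                     % output vector (e.g. motor output)
disp(y')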

27 Reading list: any book on NN

1. Neural Networks, Haykin S. (1999), Prentice Hall International (nice, but too much)
2. Neural Networks for Pattern Recognition, Bishop C.M. (1995), Oxford: Clarendon Press (too much on statistics, less dynamical theory)
3. Introduction to the Theory of Neural Computation, Hertz J., Krogh A., and Palmer R.G. (nice, but somewhat out of date)
4. Theoretical Neuroscience, Dayan P., and Abbott L.F. (2001)

More references on specific topics will follow.

28 Assessment

Master's students: 3 assignments (50%) and an unseen exam (50%).
Undergraduates: 3 assignments (100%), no exam.
The assignments can be done in MATLAB or any other language of your choice. The seminars run by Andy will help you complete the assignments.

29 Don’t panic: everything is going to be easy. The aim is to understand the main ideas, to implement them in whichever language (MATLAB, C, Java) you are most familiar with, and to apply them to solving practical problems.

For those students who get bored, please come to see me or visit my homepage to get a rough idea of research on neural networks and computational neuroscience.
http://www.cogs.susx.ac.uk/users/jianfeng/
Office: 5C19, COGS

30 Today’s Topics:
 Summary
 A comparison of biological neuron networks with artificial neural networks
 A little bit of math

31 A little bit of math: matrices and vectors.

32 A little bit of math

where f is a function, for example the sigmoidal function; an underlined variable denotes a vector.
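The formula this clause refers to does not survive in the transcript; a plausible form, consistent with the single-neuron model listed on slide 37 (inputs x_i, weights w_i, bias u, activation f, output y) and with the underline-for-vector convention, would be

\[
y = f\Big(\sum_i w_i x_i + u\Big) = f\big(\underline{w}^{\top}\underline{x} + u\big).
\]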

33 A little bit of math: the sigmoidal function, giving the output firing rate of a neuron.
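The slide showed a plot of this function; its standard form (the exact parameterisation used on the slide is an assumption) is

\[
f(a) = \frac{1}{1 + e^{-a}}, \qquad 0 < f(a) < 1,
\]

which maps any net input a to a bounded value that can be read as a firing rate.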

34 A little bit of math

Random variables: X (capital letters); random vectors: underlined X. Discrete case: P(X = a) = p.
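As a brief worked illustration of this discrete-case notation (the example itself is not on the slide), for a random variable X taking the values a_1, ..., a_n:

\[
P(X = a_i) = p_i, \qquad \sum_{i=1}^{n} p_i = 1, \qquad \mathbb{E}[X] = \sum_{i=1}^{n} a_i p_i .
\]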

35-36 More math: http://www-slab.usc.edu/courses/CS599-NCANN.html, and the appendix of Duda, Hart and Stork's book.

37 The general artificial neuron model has five components (the subscript i indicates the i-th input or weight):
1. A set of inputs, x_i
2. A set of weights, w_i
3. A bias, u
4. An activation function, f
5. The neuron output, y
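A minimal MATLAB sketch of this five-component neuron follows; the sign convention for the bias (added before the activation) and the choice of a sigmoidal activation are assumptions, and the numerical values are purely illustrative.

% Single artificial neuron: y = f(sum_i w_i * x_i + u)
x = [0.5; -1.2; 3.0];          % 1. inputs x_i (illustrative values)
w = [0.8;  0.1; -0.4];         % 2. weights w_i
u = 0.2;                       % 3. bias u
f = @(a) 1 ./ (1 + exp(-a));   % 4. activation function (sigmoidal)
y = f(w' * x + u);             % 5. neuron output y
disp(y)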


