
1 Intelligent Learning -- A Brief Introduction to Artificial Neural Networks
Chiung-Yao Fang

2 Learning
What is learning?
The types of learning: incremental learning, active learning.
The types of learning: supervised learning, unsupervised learning, reinforcement learning.

3 Understanding the Brain
Levels of analysis (Marr, 1982): computational theory, representation and algorithm, hardware implementation.
Example: sorting (see the sketch below).
The same computational theory may have multiple representations and algorithms.
A given representation and algorithm may have multiple hardware implementations.
Reverse engineering: from hardware to theory.
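To make the sorting example concrete, here is a small illustrative sketch (added here, not part of the slides): one computational theory, "arrange the items in ascending order", realized by two different representations and algorithms, either of which could in turn run on many hardware implementations.

    # One computational theory ("sort ascending"), two algorithms/representations.

    def insertion_sort(items):
        """Sort by repeatedly inserting each element into a sorted prefix."""
        result = list(items)
        for i in range(1, len(result)):
            key, j = result[i], i - 1
            while j >= 0 and result[j] > key:
                result[j + 1] = result[j]
                j -= 1
            result[j + 1] = key
        return result

    def merge_sort(items):
        """Sort by recursively splitting the list and merging sorted halves."""
        if len(items) <= 1:
            return list(items)
        mid = len(items) // 2
        left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
        merged = []
        while left and right:
            merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
        return merged + left + right

    # Same input-output behavior (computational theory), different algorithms.
    assert insertion_sort([3, 1, 2]) == merge_sort([3, 1, 2]) == [1, 2, 3]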

4 Understanding the Brain
Parallel processing: SIMD vs. MIMD.
SIMD (single instruction, multiple data) machines: all processors execute the same instruction, but on different pieces of data.
MIMD (multiple instruction, multiple data) machines: different processors may execute different instructions on different data.
Neural net: NIMD (neural instruction, multiple data) machines: each processor corresponds to a neuron, its local parameters correspond to its synaptic weights, and the whole structure is a neural network.
Learning: update by training/experience; learning from examples.

5 Biological-Type Neural Networks

6 Application-Driven Neural Networks
Three main characteristics: adaptiveness and self-organization, nonlinear network processing, and parallel processing.

7 Perceptron (Rosenblatt, 1962)

8

9 What a Perceptron Does
Regression: y = wx + w0, where w is the connection weight and x0 = +1 is the bias unit with weight w0.

10 What a Perceptron Does
Classification: y = 1(wx + w0 > 0).
Define s(·) as the threshold function; choose C1 if s(wx + w0) > 0, else choose C2.
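A minimal sketch of both uses of a single perceptron unit, assuming one input x with weight w and a bias unit x0 = +1 with weight w0 (variable names follow the slides; the numeric values are illustrative only):

    def perceptron_regression(x, w, w0):
        """Regression: the output is the weighted sum y = w*x + w0."""
        return w * x + w0

    def threshold(a):
        """s(.): the threshold function, 1 if a > 0, else 0."""
        return 1 if a > 0 else 0

    def perceptron_classify(x, w, w0):
        """Classification: choose C1 if s(w*x + w0) > 0, else choose C2."""
        return "C1" if threshold(w * x + w0) > 0 else "C2"

    print(perceptron_regression(2.0, w=0.5, w0=1.0))   # 2.0
    print(perceptron_classify(2.0, w=0.5, w0=-3.0))    # C2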

11 K Outputs

12 Learning Boolean AND
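The slide shows this only as a picture; one common choice of weights that realizes Boolean AND with a single threshold unit is w1 = w2 = 1 and w0 = -1.5, as in this illustrative sketch (the specific values are an assumption, not taken from the slide):

    def threshold(a):
        return 1 if a > 0 else 0

    def perceptron_and(x1, x2, w1=1.0, w2=1.0, w0=-1.5):
        """y = s(w1*x1 + w2*x2 + w0): fires only when both inputs are 1."""
        return threshold(w1 * x1 + w2 * x2 + w0)

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, perceptron_and(x1, x2))   # reproduces the AND truth table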

13 XOR
No w0, w1, w2 satisfy the constraints below (Minsky and Papert, 1969).
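The constraints appear only as an image in the transcript; with y = s(w1 x1 + w2 x2 + w0) and the XOR truth table they would read (reconstructed here, in LaTeX):

    \begin{aligned}
    (0,0)\mapsto 0 &:\quad w_0 \le 0\\
    (0,1)\mapsto 1 &:\quad w_2 + w_0 > 0\\
    (1,0)\mapsto 1 &:\quad w_1 + w_0 > 0\\
    (1,1)\mapsto 0 &:\quad w_1 + w_2 + w_0 \le 0
    \end{aligned}

Adding the middle two inequalities gives w1 + w2 + 2w0 > 0, while the first and last give w1 + w2 + 2w0 <= 0, so no single perceptron can separate XOR.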

14 Multilayer Perceptrons
(Rumelhart et al., 1986)

15 x1 XOR x2 = (x1 AND ~x2) OR (~x1 AND x2)
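A small sketch of how a two-layer network of threshold units realizes this decomposition (the weights are hand-picked here for illustration; they are not given on the slide): one hidden unit computes x1 AND ~x2, another computes ~x1 AND x2, and the output unit ORs them.

    def threshold(a):
        return 1 if a > 0 else 0

    def xor_mlp(x1, x2):
        """x1 XOR x2 = (x1 AND ~x2) OR (~x1 AND x2), built from threshold units."""
        h1 = threshold(1.0 * x1 - 1.0 * x2 - 0.5)    # x1 AND ~x2
        h2 = threshold(-1.0 * x1 + 1.0 * x2 - 0.5)   # ~x1 AND x2
        return threshold(1.0 * h1 + 1.0 * h2 - 0.5)  # h1 OR h2

    assert [xor_mlp(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 0]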

16 Structures of Neural Networks

17 Connection Structures
Four types of weighted connections: feedforward connections, feedback connections, lateral connections, and time-delay connections.

18 Connection Structures
Single-layer example

19 Taxonomy of Neural Networks
HAM, SOM

20 Supervised and Unsupervised Networks

21 A Top-down Perspective

22

23 Applications: Association
Auto-Association; Hetero-Association.

24 Applications: Classification
Unsupervised classification (clustering); supervised classification.

25 Applications: Pattern Completion
Two kinds of pattern completion problems:
Static pattern completion: multilayer nets, Boltzmann machines, and Hopfield nets.
Temporal pattern completion: Markov models and time-delay dynamic networks.
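As an illustration of static pattern completion, a generic Hopfield-style sketch (not the specific networks discussed in the lecture): a stored +/-1 pattern is recovered from a corrupted probe by Hebbian storage and repeated threshold updates.

    import numpy as np

    def train_hopfield(patterns):
        """Hebbian storage: W is the (zero-diagonal) sum of outer products of +/-1 patterns."""
        n = patterns.shape[1]
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0.0)
        return W

    def complete(W, probe, steps=10):
        """Synchronous threshold updates drive the probe toward a stored pattern."""
        s = probe.copy()
        for _ in range(steps):
            s = np.where(W @ s >= 0, 1, -1)
        return s

    stored = np.array([[1, 1, 1, 1, -1, -1, -1, -1]])
    W = train_hopfield(stored)
    noisy = np.array([1, -1, 1, 1, -1, -1, 1, -1])   # two bits flipped
    print(complete(W, noisy))                        # recovers the stored pattern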

26 Applications: Regression and Generalization

27 Applications: Optimization

28 Examples: A Toy OCR
Optical character recognition (OCR); supervised learning.
The retrieving phase and the training phase.

29 Examples: A Toy OCR

30

31 Supervised Learning Neural Networks
Backpropagation, HAM

32 Backpropagation

33 Regression: the forward and backward passes.
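A compact sketch of these two passes for a one-hidden-layer regression MLP trained with squared error (the architecture, data, and learning rate below are illustrative assumptions, not values from the slides):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, (200, 1))
    y = np.sin(3 * X)                                    # a nonlinear regression target

    W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)     # input -> hidden
    W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)     # hidden -> output
    lr = 0.5

    for epoch in range(3000):
        # Forward pass: compute hidden activations and the network output.
        H = np.tanh(X @ W1 + b1)
        out = H @ W2 + b2
        # Backward pass: propagate the squared-error gradient layer by layer.
        d_out = (out - y) / len(X)
        dW2, db2 = H.T @ d_out, d_out.sum(0)
        d_hidden = (d_out @ W2.T) * (1 - H ** 2)         # tanh'(a) = 1 - tanh(a)^2
        dW1, db1 = X.T @ d_hidden, d_hidden.sum(0)
        # Gradient-descent weight update.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    print("final MSE:", float(np.mean((pred - y) ** 2)))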

34 Hidden Layer
Can we have more hidden layers? Yes, but the network becomes more complicated.
“Long and narrow” network vs. “short and fat” network.
Two-hidden-layer example: for every input region, that region can be delimited on all sides by hyperplanes implemented by hidden units in the first hidden layer; a hidden unit in the second layer then ANDs them together to bound the region.
It has been proven that an MLP with one hidden layer (with enough hidden units) can learn any nonlinear function of the input.

35 HAM (Hetero-Associative Memory) Neural Network
Architecture: an input layer with units x_j, excitatory connections with weights w_ij, and an output (competitive) layer with units v_1, v_2, ..., v_n.
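A rough sketch of how such a network could recall an association under the usual competitive reading of this diagram: each output unit v_i computes the weighted sum of the input x through its excitatory weights w_ij, and the competitive layer keeps only the winner. The storage rule below (one weight row copied from each stored key) is a simplifying assumption for illustration, not the training procedure from the lecture.

    import numpy as np

    # Hypothetical stored associations: binary key patterns -> labels.
    keys = np.array([[1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [1, 1, 0, 0]], dtype=float)
    labels = ["A", "B", "C"]
    W = keys.copy()                     # w_ij: excitatory weights, one row per output unit

    def ham_recall(x):
        """Competitive layer: the output unit with the largest net input wins."""
        net = W @ x                     # v_i = sum_j w_ij * x_j
        return labels[int(np.argmax(net))]

    print(ham_recall(np.array([1, 0, 1, 0.0])))   # -> "A"
    print(ham_recall(np.array([1, 0, 1, 1.0])))   # noisy key, still recalled as "A"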

36 Training Patterns for HAM

37

38 Unsupervised Learning Neural Networks
SOM, ART1, ART2

39 Self-Organizing Feature Maps
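A brief sketch of a generic Kohonen-style self-organizing map update, included for illustration (the map size, learning rate, and neighborhood width are assumptions; the lecture's own settings are not in the transcript): each input is assigned to its best-matching unit, and that unit and its map neighbors are pulled toward the input.

    import numpy as np

    rng = np.random.default_rng(1)
    grid = 10                                   # a 1-D map with 10 units
    weights = rng.uniform(0, 1, (grid, 2))      # each unit holds a 2-D weight vector
    data = rng.uniform(0, 1, (500, 2))

    def train_som(weights, data, epochs=20, lr=0.5, sigma=2.0):
        for _ in range(epochs):
            for x in data:
                bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))  # best-matching unit
                dist = np.abs(np.arange(grid) - bmu)                       # distance on the map
                h = np.exp(-dist ** 2 / (2 * sigma ** 2))                  # neighborhood function
                weights += lr * h[:, None] * (x - weights)                 # pull neighbors toward x
            lr *= 0.9                                                      # shrink over time
            sigma *= 0.9
        return weights

    weights = train_som(weights, data)
    print(np.round(weights, 2))   # neighboring units end up with similar weight vectors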

40

41

42 An Assembly of SSO Neural Networks for Character Recognition

43 An Assembly of SSO Neural Networks for Character Recognition

44 ART1 Neural Networks

45 ART2 Neural Networks
Architecture (from the slide's diagram): the input vector i enters the input representation field F1; the attentional subsystem comprises F1 and the category representation field F2; the orienting subsystem issues a reset signal; the diagram additionally labels the F1 nodes p, q, r, u, v, w, x, a gain control G, and a signal generator S.

46 Road Sign Recognition System

47 Classification Results of ART2
Training Set; Test Set.

48 Conclusions

49 STA Neural Networks

50 STA (Spatial-Temporal Attention) Neural Network
Architecture: an input layer (neurons n_j with inputs x_j) connected by excitatory connections with weights w_ij to an output (attention) layer (neurons n_i, n_k with activations a_i, a_k); neurons within the attention layer are linked by inhibitory connections.

51 STA Neural Network
The input to attention neuron n_i due to the input stimuli x is computed through the linking strengths between the input and the attention layers: the weight w_kj between an input-layer neuron n_j and the corresponding attention-layer neuron n_k is given by a Gaussian function G of the distance r_k.
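Because the equation itself is a figure on the slide, the following is only a plausible illustration of the caption's idea, with the distance measure and Gaussian width chosen arbitrarily: linking strengths fall off as a Gaussian of distance, and the stimulus-driven input to an attention neuron is the correspondingly weighted sum of the stimuli.

    import numpy as np

    def gaussian_link(r, sigma=1.5):
        """G: linking strength as a Gaussian of the input-to-attention distance r."""
        return np.exp(-r ** 2 / (2 * sigma ** 2))

    def stimulus_input(i, x, positions):
        """Input to attention neuron n_i due to the stimuli x (illustrative form only)."""
        r = np.abs(positions - positions[i])         # distances between neuron positions
        return float(np.sum(gaussian_link(r) * x))   # Gaussian-weighted sum of the stimuli

    positions = np.arange(8, dtype=float)            # assumed 1-D neuron coordinates
    x = np.array([0, 0, 1, 1, 1, 0, 0, 0], dtype=float)
    print([round(stimulus_input(i, x, positions), 2) for i in range(8)])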

52 STA Neural Network
The input to attention neuron n_i due to lateral interaction follows a “Mexican-hat” function of the lateral distance: excitatory interaction at small distances and inhibitory interaction farther away.
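The exact lateral-interaction function is likewise shown only as a figure; a common way to model a Mexican-hat profile (an assumption here, not the lecture's formula) is a difference of two Gaussians, positive near zero lateral distance and negative farther out.

    import numpy as np

    def mexican_hat(d, sigma_e=1.0, sigma_i=3.0, a_e=2.0, a_i=1.0):
        """Difference of Gaussians: excitatory (positive) near d = 0, inhibitory (negative) beyond."""
        return (a_e * np.exp(-d ** 2 / (2 * sigma_e ** 2))
                - a_i * np.exp(-d ** 2 / (2 * sigma_i ** 2)))

    d = np.arange(-6, 7, dtype=float)
    print(np.round(mexican_hat(d), 2))   # center > 0, flanks < 0, tails approach 0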

53 STA Neural Network
The net input to attention neuron n_i combines the stimulus-driven input and the lateral-interaction input, minus a threshold that limits the effects of noise, with a constant d where -1 < d < 0.
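The net-input equation is an image in the transcript, so the following LaTeX is only one plausible reading consistent with slides 51-53; the decay term d a_i(t) and the threshold symbol \theta are assumptions about the notation:

    \mathrm{net}_i(t) \;=\; \sum_j w_{ij}\, x_j(t) \;+\; \sum_k \Lambda(i,k)\, a_k(t) \;+\; d\, a_i(t) \;-\; \theta

Here w_ij are the Gaussian linking strengths of slide 51, \Lambda is the Mexican-hat lateral-interaction function of slide 52, a_k are the attention-layer activations, and \theta is the threshold that limits the effect of noise.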

54 STA Neural Network
Figure: the activation of an attention neuron in response to a stimulus (activation plotted over time t).

55 Results of STA Neural Networks

56 Results of STA Neural Networks

