Intelligent Learning -- A Brief Introduction to Artificial Neural Networks Chiung-Yao Fang.

Learning What is learning? Machine learning is programming computers to optimize a performance criterion using example data or past experience. There is no need to “learn” to calculate payroll. Types of learning: supervised learning, unsupervised learning, reinforcement learning; other settings include incremental learning, active learning, … 2018/11/19

Understanding the Brain Levels of analysis (Marr, 1982): computational theory; representation and algorithm; hardware implementation. Example: sorting. The same computational theory may have multiple representations and algorithms, and a given representation and algorithm may have multiple hardware implementations. Reverse engineering: from hardware to theory.

Understanding the Brain Parallel processing: SIMD vs. MIMD. SIMD (single instruction, multiple data): all processors execute the same instruction, each on a different piece of data. MIMD (multiple instruction, multiple data): different processors may execute different instructions on different data. A neural net is, informally, “NIMD” (neural instruction, multiple data): each processor corresponds to a neuron, its local parameters correspond to its synaptic weights, and the whole structure is a neural network. Learning: update by training/experience; learning from examples.

Biological-Type Neural Networks Synapse (the junction between two neurons), dendrites, axon

Application-Driven Neural Networks Three main characteristics: Adaptiveness and self-organization Nonlinear network processing Parallel processing

Perceptron (Rosenblatt, 1962) Bias


What a Perceptron Does Regression: y = wx + w0, where w is the connection weight and w0 is the bias weight carried by the bias unit x0 = +1. [Figure: a fitted line y = wx + w0 and the corresponding one-unit network]

What a Perceptron Does Classification: y = s(wx + w0), where s(·) is the threshold function; choose C1 if wx + w0 > 0, else choose C2.
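Both roles of the perceptron can be sketched in a few lines of Python; the weight values below are arbitrary choices for illustration, not values from these slides:

```python
# A single perceptron unit. With a bias input x0 = +1 weighted by w0,
# it computes the linear map y = w*x + w0 (regression); thresholding
# that value turns it into a two-class classifier.
def perceptron_output(x, w, w0):
    return w * x + w0

def perceptron_classify(x, w, w0):
    # threshold function s(.): choose C1 if w*x + w0 > 0, else C2
    return "C1" if perceptron_output(x, w, w0) > 0 else "C2"

y = perceptron_output(3.0, 2.0, 0.5)      # 2.0*3.0 + 0.5 = 6.5
c = perceptron_classify(-1.0, 2.0, 0.5)   # 2.0*(-1.0) + 0.5 < 0, so "C2"
```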

K Outputs

Learning Boolean AND
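AND is linearly separable, so the perceptron learning rule finds a separating line; a minimal sketch (the learning rate and epoch count are illustrative choices, not values from the slides):

```python
# Train a hard-threshold perceptron on Boolean AND using the
# perceptron learning rule: w <- w + lr * (target - y) * x.
def train_and_perceptron(epochs=20, lr=0.1):
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w0, w1, w2 = 0.0, 0.0, 0.0            # bias weight and input weights
    for _ in range(epochs):
        for (x1, x2), target in data:
            y = 1 if w1 * x1 + w2 * x2 + w0 > 0 else 0
            err = target - y
            w0 += lr * err                # bias input x0 = +1
            w1 += lr * err * x1
            w2 += lr * err * x2
    return w0, w1, w2

w0, w1, w2 = train_and_perceptron()
outputs = [1 if w1 * a + w2 * b + w0 > 0 else 0
           for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]]
# outputs == [0, 0, 0, 1], i.e. the AND truth table
```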

XOR No w0, w1, w2 satisfy all four constraints (Minsky and Papert, 1969): w0 ≤ 0, w2 + w0 > 0, w1 + w0 > 0, and w1 + w2 + w0 ≤ 0. The four XOR cases are jointly unsatisfiable, so a single perceptron with inputs x1, x2 and bias input x0 = 1 cannot compute XOR.
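The impossibility can also be illustrated empirically: a brute-force search over a grid of candidate weights finds no linear threshold unit that reproduces XOR (the grid range is an arbitrary choice; the analytical argument rules out all weights):

```python
import itertools

# Does the threshold unit y = [w1*x1 + w2*x2 + w0 > 0] match XOR on
# all four input cases?
def solves_xor(w0, w1, w2):
    return all((w1 * x1 + w2 * x2 + w0 > 0) == bool(x1 ^ x2)
               for x1, x2 in itertools.product([0, 1], repeat=2))

grid = [i / 4 for i in range(-8, 9)]      # candidate weights in [-2, 2]
found = any(solves_xor(w0, w1, w2)
            for w0, w1, w2 in itertools.product(grid, repeat=3))
# found == False: no weight setting on the grid computes XOR
```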

Multilayer Perceptrons (Rumelhart et al., 1986)

x1 XOR x2 = (x1 AND ~x2) OR (~x1 AND x2)
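This decomposition maps directly onto a two-layer network with hand-chosen weights (the specific values below are one illustrative choice):

```python
def step(z):                      # hard threshold unit
    return 1 if z > 0 else 0

def xor_mlp(x1, x2):
    h1 = step(x1 - x2 - 0.5)      # hidden unit: x1 AND ~x2
    h2 = step(x2 - x1 - 0.5)      # hidden unit: ~x1 AND x2
    return step(h1 + h2 - 0.5)    # output unit: h1 OR h2

# [xor_mlp(a, b) for a, b in [(0,0), (0,1), (1,0), (1,1)]] == [0, 1, 1, 0]
```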

Structures of Neural Networks

Connection Structures Four types of weighted connections: Feedforward connections Feedback connections Lateral connections Time-delay connections

Connection Structures Single-layer example

Taxonomy of Neural Networks [Figure: taxonomy diagram, including HAM and SOM]

Supervised and Unsupervised Networks

A Top-down Perspective


Applications: Association Auto-Association Hetero-Association

Applications: Classification Unsupervised classification (clustering) Supervised classification

Applications: Pattern Completion Two kinds of pattern completion problems: Static pattern completion (multilayer nets, Boltzmann machines, and Hopfield nets) Temporal pattern completion (Markov models and time-delay dynamic networks)

Applications: Regression and Generalization

Applications: Optimization

Examples: A Toy OCR Optical character recognition (OCR) Supervised learning The retrieving phase The training phase

Examples: A Toy OCR


Supervised Learning Neural Networks Backpropagation HAM

Backpropagation

Regression [Figure: forward pass computing the output from input x, and backward pass propagating the error]

Hidden Layer Can we have more hidden layers? Yes, but this complicates the network: a “long and narrow” network vs. a “short and fat” network. Two-hidden-layer example: for every input region, that region can be delimited by hyperplanes on all sides using hidden units in the first hidden layer; a hidden unit in the second layer then ANDs them together to bound the region. It has been proven that an MLP (multilayer perceptron) with one hidden layer can approximate any nonlinear function of the input.
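A minimal backpropagation sketch for a one-hidden-layer MLP on XOR, using sigmoid units and squared error. The layer sizes, learning rate, epoch count, and random seed are illustrative assumptions; gradient descent typically converges to the XOR solution, but that is not guaranteed for every initialization:

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_xor(epochs=5000, lr=0.5, seed=0):
    rng = random.Random(seed)
    # input(2) + bias -> hidden(2), hidden(2) + bias -> output(1)
    W1 = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    W2 = [rng.uniform(-1, 1) for _ in range(3)]
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    for _ in range(epochs):
        for (x1, x2), t in data:
            # forward pass
            x = [x1, x2, 1.0]                     # bias input appended
            h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
            hb = h + [1.0]
            y = sigmoid(sum(w * hi for w, hi in zip(W2, hb)))
            # backward pass: deltas for squared error E = (y - t)^2 / 2
            dy = (y - t) * y * (1 - y)
            dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
            for j in range(3):
                W2[j] -= lr * dy * hb[j]
            for j in range(2):
                for i in range(3):
                    W1[j][i] -= lr * dh[j] * x[i]
    def predict(x1, x2):
        x = [x1, x2, 1.0]
        hb = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1] + [1.0]
        return sigmoid(sum(w * hi for w, hi in zip(W2, hb)))
    return predict
```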

HAM (Hetero-Associative Memory) Neural Network (1) [Figure: an input layer of neurons nj with inputs xj, excitatory connections with weights wij to an output (competitive) layer of neurons ni with outputs v1, v2, …, vn]

HAM Neural Network (2) The input to neuron ni due to input stimulus x is the weighted sum of the stimulus components; nc denotes the winner after the competition.
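The competition step can be sketched as follows; the weight matrix here is a hypothetical example, not one of the HAM training patterns from the slides:

```python
# Winner-take-all competition: each output neuron i receives the
# weighted sum of the input stimulus, and the winner c is the neuron
# with the largest net input.
def competition(W, x):
    nets = [sum(wij * xj for wij, xj in zip(row, x)) for row in W]
    c = max(range(len(nets)), key=nets.__getitem__)
    return c, nets

W = [[1.0, 0.0, 0.0],   # illustrative weight rows, one per output neuron
     [0.0, 1.0, 0.0]]
c, nets = competition(W, [0.2, 0.9, 0.1])
# c == 1, since neuron 1's net input 0.9 exceeds neuron 0's 0.2
```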

Training Patterns for HAM


Unsupervised Learning Neural Networks SOM ART1 ART2

Self-Organizing Feature Maps
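The core of a self-organizing map is: find the best matching unit (BMU) for an input, then pull the BMU and its map neighbors toward that input. A one-dimensional sketch (the learning rate and neighborhood width are illustrative choices):

```python
import math

def som_step(weights, x, lr=0.5, sigma=1.0):
    # BMU = unit whose weight vector is closest to the input x
    dists = [sum((wi - xi) ** 2 for wi, xi in zip(w, x)) for w in weights]
    bmu = min(range(len(weights)), key=dists.__getitem__)
    # move each unit toward x, scaled by a Gaussian neighborhood function
    for k, w in enumerate(weights):
        h = math.exp(-((k - bmu) ** 2) / (2 * sigma ** 2))
        for i in range(len(w)):
            w[i] += lr * h * (x[i] - w[i])
    return bmu

weights = [[0.0, 0.0], [1.0, 1.0]]
bmu = som_step(weights, [0.9, 0.9])
# bmu == 1; unit 1 moves to [0.95, 0.95], and unit 0 moves part-way too
```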



An Assembly of SSO Neural Networks for Character Recognition

An Assembly of SSO Neural Networks for Character Recognition

ART1 Neural Networks

ART2 Neural Networks (1) [Figure: attentional subsystem with input representation field F1 (sublayers w, x, u, v, p, q, r), category representation field F2 with output y, orienting subsystem with gain G and reset signal (+/-), signal generator S, and input vector i]

ART2 Neural Network (2) The activities on each of the six sublayers of F1, where I is an input pattern and the Jth node on F2 is the winner. [Equations shown as figures]

ART2 Neural Network (3) Initial weights: top-down weights and bottom-up weights. Parameters. [Values shown as figures]

Road Sign Recognition System

Classification Results of ART2 Training Set Test Set

Conclusions

STA Neural Networks

STA (Spatial-Temporal Attention) Neural Network [Figure: an input layer of neurons nj with inputs xj, excitatory connections with weights wij to an output (attention) layer of neurons ni, nk with activations ai, ak, and inhibitory lateral connections within the attention layer]

STA Neural Network The input to attention neuron ni due to input stimuli x is computed through the linking strengths wkj between corresponding neurons of the input and attention layers, weighted by a Gaussian function G. [Figure: input neuron nj, attention neurons ni, nk, and the Gaussian weighting rk]

STA Neural Network The input to attention neuron ni due to lateral interaction follows a “Mexican-hat” function of lateral distance: excitatory interaction at short range, inhibitory interaction farther away.
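A common way to model a Mexican-hat interaction profile is a difference of Gaussians, with narrow excitation and broad inhibition; the amplitudes and widths below are illustrative assumptions, not the parameters used in the STA network:

```python
import math

def mexican_hat(d, a_exc=1.0, s_exc=1.0, a_inh=0.5, s_inh=3.0):
    # narrow excitatory Gaussian minus a broader inhibitory Gaussian
    exc = a_exc * math.exp(-(d * d) / (2 * s_exc ** 2))
    inh = a_inh * math.exp(-(d * d) / (2 * s_inh ** 2))
    return exc - inh

# net excitation near zero lateral distance, net inhibition farther out:
# mexican_hat(0.0) == 0.5 and mexican_hat(3.0) < 0
```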

STA Neural Network The net input to attention neuron ni combines the stimulus and lateral inputs, with a threshold to limit the effects of noise, where -1 < d < 0.

STA Neural Network (5) [Figure: the activation of an attention neuron in response to a stimulus, over time t]

Results of STA Neural Networks

Experimental Results

Results of STA Neural Networks