Neural Network I, Week 7


Neural Network I, Week 7

Team Homework Assignment #9
Read pp. 327–334 and the Week 7 slides.
Design a neural network for XOR (Exclusive OR).
Explore neural network tools.
Due at the beginning of the lecture on Friday, March 18th.


Neurons
Components of a neuron: the cell body, dendrites, axon, and synaptic terminals. The electrical potential across the cell membrane exhibits spikes called action potentials. Originating in the cell body, a spike travels down the axon and causes chemical neurotransmitters to be released at the synaptic terminals. These chemicals diffuse across the synapse into the dendrites of neighboring cells.

Neural Speed
Real neuron "switching time" is on the order of milliseconds (10^-3 sec)
– compare to nanoseconds (10^-9 sec) for current transistors: transistors are about a million times faster!
But:
– Biological systems can perform significant cognitive tasks (vision, language understanding) in approximately 10^-1 second, so there is only time for about 100 serial steps to perform such tasks.
– Even with their limited abilities, current machine learning systems require orders of magnitude more serial steps.

ANN (1)
Rosenblatt first applied the single-layer perceptron to pattern-classification learning in the late 1950s.
An ANN is an abstract computational model of the human brain.
The brain is the best example we have of a robust learning system.

ANN (2)
The human brain contains an estimated 10^10 tiny units called neurons.
These neurons are interconnected with an estimated 10^14 links (each neuron makes synapses with approximately 10^4 other neurons).
This massive parallelism allows for computational efficiency.

ANN General Approach (1)
Neural networks are loosely modeled after the biological processes involved in cognition:
Real: Information processing involves a large number of neurons. ANN: A perceptron is used as the artificial neuron.
Real: Each neuron applies an activation function to the input it receives from other neurons, which determines its output. ANN: The perceptron uses a mathematically modeled activation function.

ANN General Approach (2)
Real: Each neuron is connected to many others. Signals are transmitted between neurons using connecting links. ANN: We will use multiple layers of neurons, i.e., the outputs of some neurons will be the inputs to others.

Characteristics of ANN
– Nonlinearity
– Learning from examples
– Adaptivity
– Fault tolerance
– Uniformity of analysis and design

Model of an Artificial Neuron
A model of the k-th artificial neuron (perceptron), with inputs x_1, ..., x_m and output y_k, has three parts:
– A set of connecting links with weights w_k1, ..., w_km
– An adder: net_k = Σ_i w_ki x_i + b_k, where the bias b_k = w_k0 with x_0 = 1
– An activation function: y_k = f(net_k)
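The three parts of the artificial neuron can be sketched in a few lines. This is a minimal illustration, not code from the lecture; the weight and bias values below are arbitrary choices for the example.

```python
# An artificial neuron (perceptron): an adder computing
# net_k = sum_i(w_ki * x_i) + b_k, followed by an activation function f.
def neuron_output(x, w, b, f):
    net = sum(wi * xi for wi, xi in zip(w, x)) + b  # the adder
    return f(net)                                   # the activation function

# Example with a hard-limit activation (1 if net >= 0, else 0).
hard_limit = lambda net: 1 if net >= 0 else 0

# net = 0.5*0.2 + 0.5*0.4 + 0.5*0.6 - 0.5 ≈ 0.1, so y == 1.
y = neuron_output([0.5, 0.5, 0.5], [0.2, 0.4, 0.6], -0.5, hard_limit)
```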


Source: Data Mining: Concepts, Models, Methods, and Algorithms [Kantardzic, 2003]

A Single Node
A single node with inputs x_1 = x_2 = x_3 = 0.5 and output y_1. Common choices for the activation function f(net_1):
1. (Log-)sigmoid
2. Hyperbolic tangent sigmoid
3. Hard limit transfer (threshold)
4. Symmetrical hard limit transfer
5. Saturating linear
6. Linear
...
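The activation functions listed above can be sketched as follows. The names come from the slide; the exact slopes and saturation limits used here are common conventions, not taken from the source.

```python
import math

def log_sigmoid(net):        # (log-)sigmoid: squashes net into (0, 1)
    return 1.0 / (1.0 + math.exp(-net))

def tanh_sigmoid(net):       # hyperbolic tangent sigmoid: squashes into (-1, 1)
    return math.tanh(net)

def hard_limit(net):         # hard limit (threshold): output 0 or 1
    return 1 if net >= 0 else 0

def sym_hard_limit(net):     # symmetrical hard limit: output -1 or +1
    return 1 if net >= 0 else -1

def saturating_linear(net):  # linear on [0, 1], clipped outside that range
    return min(1.0, max(0.0, net))

def linear(net):             # identity: output equals net
    return net
```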


Perceptron with Hard Limit Activation Function
A perceptron with inputs x_1, ..., x_m, weights w_k1, ..., w_km, bias b_k, and a hard limit activation function producing output y_1.

Perceptron Learning Process
The learning process is based on training data from the real world, adjusting the weight vector of inputs to the perceptron. In other words, learning begins with random weights; the perceptron is then applied iteratively to each training example, and the weights are modified whenever the perceptron misclassifies an example.

Backpropagation
A major task of an ANN is to learn a model of the world (environment) and to keep that model sufficiently consistent with the real world so as to achieve the target goals of the application. Backpropagation is a neural network learning algorithm.

Learning Performed through Weight Adjustments
The k-th perceptron computes y_k from inputs x_1, ..., x_m with weights w_k1, ..., w_km and bias b_k; the difference between the target output t_k and the actual output y_k drives the weight adjustment.

Perceptron Learning Rule
For sample k with input x_k0, x_k1, ..., x_km and output y_k, the perceptron learning rule adjusts each weight in proportion to the error between the target t_k and the output y_k:
w_i ← w_i + η (t_k − y_k) x_ki
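One application of the perceptron learning rule, w_i ← w_i + η (t_k − y_k) x_ki, can be sketched as below. The sample values and the initial weights are illustrative assumptions, and x_k0 = 1 is used so that w_0 plays the role of the bias.

```python
# Apply the perceptron learning rule once for one sample:
#   w_i <- w_i + eta * (t - y) * x_i
def update_weights(w, x, t, eta):
    # Hard-limit output of the current perceptron on this sample.
    y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0
    return [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]

# A misclassified sample: target t = 0 but the perceptron outputs 1,
# so each weight moves by -eta * x_i (here eta = 0.1).
w = update_weights([0.0, 0.5, 0.5], [1, 1, 0], 0, 0.1)
```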

Perceptron Learning Process
A worked example: training data with inputs x_1, x_2, x_3 and targets t_k, initial bias b = 0, and learning rate η = 0.1; at each step the error t_k − y_k between the target and the perceptron's output drives the weight adjustment.

Adjustment of the Weight Factors for the Previous Slide's Example

Implementing Primitive Boolean Functions Using a Perceptron
– AND
– OR
– XOR (Exclusive OR)

AND Boolean Function
A perceptron with inputs x_1, x_2, bias b (weight on x_0), and output y_k is trained with learning rate η = 0.05 on the AND truth table:
x_1 x_2 | output
0   0   | 0
0   1   | 0
1   0   | 0
1   1   | 1
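The training process on the AND truth table can be sketched as below. The learning rate η = 0.05 is from the slide; the zero initial weights and the epoch count are assumptions, since the slide's starting values are not recoverable from the source.

```python
# Train a hard-limit perceptron on a truth table with the rule
# w_i <- w_i + eta * (t - y) * x_i, using x_0 = 1 as the bias input.
def train_perceptron(data, eta=0.05, epochs=100):
    w = [0.0, 0.0, 0.0]  # [bias, w1, w2], initialized to zero (an assumption)
    for _ in range(epochs):
        for x1, x2, t in data:
            x = [1, x1, x2]
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0
            w = [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
    return w

# AND truth table as (x1, x2, target) triples.
AND = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
w = train_perceptron(AND)

predict = lambda w, x1, x2: 1 if w[0] + w[1] * x1 + w[2] * x2 >= 0 else 0
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the loop stops making weight changes after finitely many mistakes; the same code trains OR, but not XOR.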

OR Boolean Function
Likewise for OR, with inputs x_1, x_2, bias b, output y_k, and learning rate η = 0.05:
x_1 x_2 | output
0   0   | 0
0   1   | 1
1   0   | 1
1   1   | 1

Exclusive OR (XOR) Function
For XOR, with the same setup (learning rate η = 0.05):
x_1 x_2 | output
0   0   | 0
0   1   | 1
1   0   | 1
1   1   | 0

Exclusive OR (XOR) Problem
A single "linear" perceptron cannot represent XOR(x_1, x_2).
Solutions:
– Multiple linear units: notice that XOR(x_1, x_2) = (x_1 ∧ ¬x_2) ∨ (¬x_1 ∧ x_2).
– Differentiable non-linear threshold units.
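The decomposition XOR(x_1, x_2) = (x_1 ∧ ¬x_2) ∨ (¬x_1 ∧ x_2) can be realized as a two-layer network of hard-limit units. The weights and thresholds below are one hand-picked choice that works, not values from the source.

```python
# Hard-limit (step) activation: 1 if net >= 0, else 0.
step = lambda net: 1 if net >= 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 - x2 - 0.5)    # hidden unit computing x1 AND NOT x2
    h2 = step(-x1 + x2 - 0.5)   # hidden unit computing NOT x1 AND x2
    return step(h1 + h2 - 0.5)  # output unit computing h1 OR h2
```

Each individual unit here is a linear perceptron; only the layering makes XOR representable, which is why the single-perceptron solution fails.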
