Neural Networks.

Presentation transcript:

Neural Networks

Biological neural activity Each neuron has a body, an axon, and many dendrites. A neuron can be in one of two states: firing or at rest; it fires when the total incoming stimulus exceeds a threshold. Synapse: the thin gap between the axon of one neuron and a dendrite of another, where signals are exchanged; synaptic strength/efficiency determines how strongly a signal is passed on.

Artificial neural network A set of nodes (units, neurons, processing elements); each node has inputs and an output and performs a simple computation defined by its node function. Weighted connections between nodes; the connectivity gives the structure/architecture of the net. What a NN can compute is primarily determined by the connections and their weights. A simplified version of the networks of neurons in animal nervous systems.

ANN Neuron Models Each node has one or more inputs from other nodes, and one output to other nodes. Input/output values can be binary {0, 1}, bipolar {-1, 1}, or continuous (bounded or not). All inputs to a node arrive at the same time and remain activated until the output is produced; weights are associated with the links. General neuron model: weighted input summation followed by a node function.
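As a minimal sketch of this neuron model (the function name `node_output` and the particular weights are illustrative, not from the slides): weighted input summation followed by a threshold node function.

```python
def node_output(inputs, weights, bias, threshold=0.0):
    """Weighted input summation followed by a step node function."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if net > threshold else 0

# With binary inputs, this node acts as a 2-input AND gate
print(node_output([1, 1], [0.6, 0.6], bias=-1.0))  # fires: net = 0.2 > 0
print(node_output([1, 0], [0.6, 0.6], bias=-1.0))  # rests: net = -0.4
```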

Node Function Step function Ramp function

Node Function Sigmoid function: S-shaped, continuous and everywhere differentiable; rotationally symmetric about some point (net = c); asymptotically approaches its saturation points a and b. Examples: when y = 0 and z = 0, a = 0, b = 1, c = 0; when y = 0 and z = -0.5, a = -0.5, b = 0.5, c = 0. A larger x gives a steeper curve.
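The three node functions above can be sketched in a few lines. This is an illustrative sketch, not the slides' own code; the steepness parameter `x` and center `c` follow the sigmoid description above, and the ramp endpoints `a`, `b` are assumed to be 0 and 1.

```python
import math

def step(net, threshold=0.0):
    """Step node function: fire (1) above the threshold, rest (0) below."""
    return 1.0 if net >= threshold else 0.0

def ramp(net, a=0.0, b=1.0):
    """Ramp node function: linear between a and b, saturating outside."""
    if net <= a:
        return 0.0
    if net >= b:
        return 1.0
    return (net - a) / (b - a)

def sigmoid(net, x=1.0, c=0.0):
    """S-shaped, everywhere differentiable, symmetric about net = c;
    a larger x gives a steeper curve."""
    return 1.0 / (1.0 + math.exp(-x * (net - c)))
```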

Perceptron A single-layer neural network

Simple architectures

Can we make a two-bit adder? Inputs are bits x1 and x2; outputs are the carry bit (y1) and the sum bit (y2). Two NNs, really, one per output:

x1 x2 | y1 (carry) y2 (sum)
 0  0 |     0         0
 0  1 |     0         1
 1  0 |     0         1
 1  1 |     1         0
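The adder's target functions can be checked directly (a sketch, using logic operators rather than a trained network): the carry is AND of the inputs and the sum is XOR of the inputs.

```python
def adder(x1, x2):
    """One-bit addition: carry is AND(x1, x2); sum is XOR(x1, x2)."""
    y1 = x1 & x2   # carry bit
    y2 = x1 ^ x2   # sum bit
    return y1, y2

# Print the full truth table
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, adder(x1, x2))
```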

Perceptron training rule Adjust the weights slightly to reduce the error between the perceptron output o and the target value t, then repeat: w_i ← w_i + η (t − o) x_i

Not with a perceptron: the training examples are not linearly separable for one of the outputs — sum = 1 iff x1 XOR x2.

Works well on some problems Learning curves: Is a majority of the inputs 1? Restaurant example: WillWait?

Sigmoid Unit
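A sketch of a sigmoid unit (illustrative names, not the slides' code): the weighted input sum passed through the sigmoid, whose derivative takes the convenient form o(1 − o) used later by backpropagation.

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def sigmoid_unit(x, w, b):
    """A sigmoid unit: weighted input summation through the sigmoid."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def sigmoid_derivative(o):
    """Derivative of the sigmoid expressed in terms of its output o."""
    return o * (1.0 - o)
```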

Multilayer Networks Layers: input → hidden → output

Backpropagation Algorithm Forward direction Calculate the network output and the error

Backpropagation Algorithm Backward direction Backpropagate: from output to input, recursively compute the error gradients ∂E/∂w_ij (the components of ∇_w E) and adjust the weights
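A sketch of the backward direction for a tiny 2-2-1 sigmoid network with error E = (t − o)²/2 (the function names, the particular weights, and the gradient-check usage are mine, not from the slides): the output delta is computed first, then the hidden deltas, giving every ∂E/∂w_ij.

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def forward(w, x):
    """w = [hidden unit 1, hidden unit 2, output unit], each [w1, w2, bias]."""
    h = [sigmoid(wi[0] * x[0] + wi[1] * x[1] + wi[2]) for wi in w[:2]]
    o = sigmoid(w[2][0] * h[0] + w[2][1] * h[1] + w[2][2])
    return h, o

def backprop_grads(w, x, t):
    """Backward direction: recursively compute dE/dw_ij for E = (t - o)^2 / 2."""
    h, o = forward(w, x)
    delta_o = (o - t) * o * (1 - o)                            # output delta
    delta_h = [delta_o * w[2][j] * h[j] * (1 - h[j]) for j in range(2)]
    grad_out = [delta_o * h[0], delta_o * h[1], delta_o]
    grad_hid = [[d * x[0], d * x[1], d] for d in delta_h]
    return grad_hid + [grad_out]

# Example: gradients for one training example with arbitrary weights
w = [[0.1, -0.2, 0.05], [0.4, 0.3, -0.1], [0.2, -0.5, 0.3]]
x, t = (1.0, 0.5), 1.0
grads = backprop_grads(w, x, t)
```

A useful sanity check on such code is to compare one backprop gradient against a finite-difference estimate of ∂E/∂w_ij.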

Network Architecture: Feedforward net A connection is allowed from a node in layer i only to nodes in layer i + 1. The most widely used architecture. Conceptually, nodes at higher layers successively abstract features from the preceding layers.
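The layer-to-layer constraint above can be sketched as a forward pass over an arbitrary list of layers (an illustrative sketch with sigmoid nodes; the representation of a layer as (weights, bias) pairs is my assumption):

```python
import math

def feedforward(layers, x):
    """Forward pass through a layered net: each node in layer i + 1
    takes inputs only from layer i (no skips, no cycles)."""
    a = list(x)
    for layer in layers:  # layer = list of (weights, bias), one per node
        a = [1.0 / (1.0 + math.exp(-(sum(w * v for w, v in zip(ws, a)) + b)))
             for ws, b in layer]
    return a
```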

Recurrent neural networks Good for learning from sequences of data, e.g., text. Lots of variations today: convolutional NNs, LSTMs, …
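The recurrent idea can be sketched in one scalar update (a toy illustration, not from the slides; the tanh node function and the particular weights are assumptions): the hidden state carries information from earlier items in the sequence.

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    """One step of a minimal recurrent unit: the new hidden state
    depends on the current input and the previous hidden state."""
    return math.tanh(w_x * x + w_h * h + b)

# Process a short sequence, carrying the hidden state forward
h = 0.0
for x in [1.0, 0.0, 1.0]:
    h = rnn_step(x, h, w_x=0.5, w_h=0.8, b=0.0)
```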

Neural network playground http://playground.tensorflow.org/