Neural Networks.


Neural Networks

Introduction
Artificial Neural Networks (ANN): connectionist computation, parallel distributed processing, biologically inspired computational models.
ANNs belong to machine learning, a branch of artificial intelligence: "the study and design of intelligent agents", where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success.

History
McCulloch and Pitts introduced a simplified mathematical model of the biological neuron in 1943; Rosenblatt built on it with the Perceptron in 1958.
The drawback: in the late 1960s, Minsky and Papert showed the limitations of the single-layer perceptron.
The solution came in the mid 1980s: the multi-layer perceptron trained with back-propagation.

Summary of Applications
Function approximation
Pattern recognition / classification
Signal processing
Modeling
Control
Machine learning

Biologically Inspired
Electro-chemical signals
Threshold output firing
Human brain: about 100 billion (10^11) neurons and 100 trillion (10^14) synapses

The Perceptron
Sum of weighted inputs; threshold activation function.
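The perceptron described above can be sketched in a few lines. This is an illustrative implementation, not code from the slides; the AND-gate weights below are hand-picked for the example.

```python
import numpy as np

def perceptron(x, w, b):
    """Threshold unit: output 1 if the weighted sum of inputs exceeds 0."""
    return 1 if np.dot(w, x) + b > 0 else 0

# Example: an AND gate with hand-picked weights and bias
out = perceptron(np.array([1, 1]), np.array([1.0, 1.0]), -1.5)  # fires: 1*1 + 1*1 - 1.5 > 0
```

The hard threshold is what Minsky and Papert analyzed: a single such unit can only separate classes with a straight line (hyperplane).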

Activation Function
The sigmoid function: logsig (Matlab)

Activation Function
The tanh function: tansig (Matlab)
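The two activation functions named above (Matlab's logsig and tansig) can be written directly; this sketch just restates their standard definitions:

```python
import numpy as np

def logsig(x):
    """Sigmoid: squashes any input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def tansig(x):
    """Hyperbolic tangent: squashes any input into the range (-1, 1)."""
    return np.tanh(x)
```

Both are smooth, differentiable replacements for the perceptron's hard threshold, which is what makes gradient-based training possible.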

The multi-layer perceptron (MLP)
[Diagram: the input zin passes through three weight layers W1, W2, W3, each followed by a layer of activation functions (f) and fed a constant bias input of 1, producing the output zout.]

The multi-layer perceptron (MLP)
[Diagram, compact block view: each layer i takes input Xi, applies weights Wi and activation Fi, and produces output Yi; zin = X1 and zout = Y3.]
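The layered structure in the diagrams can be sketched as a forward pass. This is an illustrative implementation with hypothetical random weights, assuming (as the diagrams suggest) that each layer appends a constant bias input of 1:

```python
import numpy as np

def mlp_forward(zin, weights, activations):
    """Feed zin through successive layers: Yi = Fi(Wi @ [Y(i-1); 1]).
    The appended 1.0 is the constant bias input shown in the diagram."""
    a = np.asarray(zin, dtype=float)
    for W, f in zip(weights, activations):
        a = f(W @ np.append(a, 1.0))  # append bias, apply weights, then activation
    return a

# Hypothetical example: 3 inputs -> 4 hidden -> 3 hidden -> 1 output
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 4)),   # W1: 3 inputs + bias
           rng.normal(size=(3, 5)),   # W2: 4 hidden + bias
           rng.normal(size=(1, 4))]   # W3: 3 hidden + bias
zout = mlp_forward([0.1, -0.2, 0.3], weights, [np.tanh, np.tanh, lambda x: x])
```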

Supervised Learning
Learning a function from supervised training data: a set of input vectors zin and corresponding desired output vectors zout.
The performance function measures how far the network outputs are from the desired outputs.
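The slide does not name the performance function; a common choice, assumed here, is the mean squared error:

```python
import numpy as np

def mse(zout_pred, zout_target):
    """Performance function: mean squared error between the network
    outputs and the desired outputs over the training set."""
    zout_pred = np.asarray(zout_pred, dtype=float)
    zout_target = np.asarray(zout_target, dtype=float)
    return float(np.mean((zout_pred - zout_target) ** 2))
```

Training then means adjusting the weights to drive this number toward zero.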

Supervised Learning
Gradient-descent backpropagation: the back-propagation error (BPE) algorithm.

BPE learning
[Diagram: the output error zout - F(zin) is propagated backwards through the network's summation (S) and activation (f) units to update the weights W1 and W2.]
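Backpropagation for a one-hidden-layer network can be sketched end to end. This is an illustrative implementation on the XOR problem, not code from the slides; the hidden-layer size, learning rate, and iteration count are assumptions chosen to make the example converge:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR problem: 4 training pairs (zin, zout)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)  # input -> hidden
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)  # hidden -> output
lr = 1.0

for _ in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # backward pass: propagate the output error toward the input layer
    dy = (y - T) * y * (1 - y)        # output-layer delta (sigmoid derivative)
    dh = (dy @ W2.T) * h * (1 - h)    # hidden-layer delta
    # gradient-descent weight updates
    W2 -= lr * (h.T @ dy); b2 -= lr * dy.sum(axis=0)
    W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(axis=0)
```

The two delta computations are exactly the backwards error propagation pictured in the diagram: each layer's delta is the next layer's delta pushed back through the weights and scaled by the activation's derivative.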

Neural Networks
0. Collect data
1. Create the network
2. Configure the network
3. Initialize the weights
4. Train the network
5. Validate the network
6. Use the network

Collect data
The main problem: lack of information in the training data!
Use as few neurons in the hidden layer as possible.
Only use the network in working points represented in the training data.
Use validation and test data.
Normalize inputs/targets to fall in the range [-1, 1] or to have zero mean and unit variance.
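The two normalization schemes mentioned above can be sketched as follows. The function names echo Matlab's mapminmax and mapstd, but these are independent illustrative implementations, not the Matlab routines:

```python
import numpy as np

def map_minmax(x):
    """Scale each column (input/target channel) to the range [-1, 1]."""
    lo, hi = x.min(axis=0), x.max(axis=0)
    return 2.0 * (x - lo) / (hi - lo) - 1.0

def map_std(x):
    """Give each column zero mean and unit variance."""
    return (x - x.mean(axis=0)) / x.std(axis=0)
```

In practice the same scaling parameters (min/max or mean/std) computed on the training set must be reused when the trained network is applied to new data.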

Create the network. Configure the network. Initialize the weights.
[Diagram: a network with only one hidden layer; the main design choice is the number of neurons in the hidden layer.]

Train the network. Validate the network.
Divide the data into three subsets: training set (e.g. 70%), validation set (e.g. 15%), test set (e.g. 15%).
Matlab training functions:
trainlm: Levenberg-Marquardt
trainbr: Bayesian Regularization
trainbfg: BFGS Quasi-Newton
trainrp: Resilient Backpropagation
trainscg: Scaled Conjugate Gradient
traincgb: Conjugate Gradient with Powell/Beale Restarts
traincgf: Fletcher-Powell Conjugate Gradient
traincgp: Polak-Ribiére Conjugate Gradient
trainoss: One Step Secant
traingdx: Variable Learning Rate Gradient Descent
traingdm: Gradient Descent with Momentum
traingd: Gradient Descent
Choose the number of iterations.
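The three-way data division described above can be sketched as a random index split; this is an illustrative helper, and the 70/15/15 ratios are the example values from the slide:

```python
import numpy as np

def split_data(n_samples, ratios=(0.70, 0.15, 0.15), seed=0):
    """Randomly divide sample indices into training, validation,
    and test sets according to the given ratios."""
    idx = np.random.default_rng(seed).permutation(n_samples)
    n_train = int(ratios[0] * n_samples)
    n_val = int(ratios[1] * n_samples)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]
```

The validation set is used to stop training before the network overfits; the test set is touched only once, to estimate final performance.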

Other types of neural networks
The RCE net: only for classification.
[Scatter plot: two classes, marked x and o, in the X1-X2 plane.]

Other types of neural networks
The RCE net: only for classification.
[Scatter plot: the same two classes, now with circular influence regions around stored prototypes feeding a summation (S) layer.]

Parzen Estimator
[Diagram: stored training samples (x) in the Xin-Yout plane; Gaussian (G) units weight each stored target, and summation (S) units form the normalized weighted average that produces Yout for a given Xin.]
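The Parzen estimator pictured above can be sketched as a Gaussian-weighted average of the stored targets (the Nadaraya-Watson form); this is an illustrative one-dimensional implementation, and the kernel width sigma is an assumed parameter:

```python
import numpy as np

def parzen_estimate(x_query, X, Y, sigma=1.0):
    """Estimate Yout at x_query: each stored sample (X[i], Y[i]) contributes
    its target Y[i], weighted by a Gaussian (G) kernel centred on X[i];
    the summation (S) units normalize by the total weight."""
    g = np.exp(-((X - x_query) ** 2) / (2.0 * sigma ** 2))  # kernel weights
    return float(np.sum(g * Y) / np.sum(g))
```

Unlike an MLP, nothing is trained here: the network simply stores the data, and the kernel width sigma controls how smooth the estimate is.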