Neural Networks in Electrical Engineering Prof. Howard Silver School of Computer Sciences and Engineering Fairleigh Dickinson University Teaneck, New Jersey

[Figure: a biological neuron, showing the soma, axon, and dendrites; axons from other neurons connect across synaptic gaps to this neuron's dendrites.]

Example of a Supervised Learning Algorithm: the Perceptron

Steps in applying the perceptron:
1. Initialize the weights and bias to 0.
2. Set the learning rate α (0 < α <= 1) and the threshold θ.
3. For each input pattern:
   Compute, for each output,
     y_in = b + x1*w1 + x2*w2 + x3*w3 + ...
   and set
     y = -1 for y_in < -θ
     y =  0 for -θ <= y_in <= θ
     y = +1 for y_in > θ
   If the jth output y_j is not equal to the target t_j, set
     w_ij(new) = w_ij(old) + α * x_i * t_j
     b_j(new)  = b_j(old)  + α * t_j
   (otherwise w_ij and b_j are unchanged).
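As a concrete illustration, a minimal MATLAB sketch that trains a single perceptron on the AND function using the steps above; the data, α, and θ values are illustrative assumptions, not the course's nnet code:

% Minimal perceptron training sketch. The AND data, alpha, and
% theta below are illustrative assumptions.
X = [1 1; 1 -1; -1 1; -1 -1];      % input patterns, one per row (bipolar)
T = [1; -1; -1; -1];               % AND targets
alpha = 1;  theta = 0.2;           % learning rate and threshold
[P, N] = size(X);  M = size(T, 2);
W = zeros(N, M);  b = zeros(1, M); % weights and bias start at 0
for epoch = 1:100
    changed = false;
    for p = 1:P
        y_in = b + X(p, :) * W;             % net input to each output
        y = zeros(1, M);
        y(y_in >  theta) =  1;              % fire +1 above theta
        y(y_in < -theta) = -1;              % fire -1 below -theta
        for j = 1:M
            if y(j) ~= T(p, j)              % update only on a miss
                W(:, j) = W(:, j) + alpha * T(p, j) * X(p, :)';
                b(j) = b(j) + alpha * T(p, j);
                changed = true;
            end
        end
    end
    if ~changed, break; end                 % a full clean pass: converged
end

For this data the loop converges in a few epochs to weights that fire +1 only for the (1, 1) input.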

Perceptron Applied to Character Recognition

Neural net inputs x1 to x25; binary target output string t1 to t26.
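The slide gives only the dimensions, but a natural reading is a 5x5 pixel grid per character and one target output per letter of the alphabet. A hypothetical encoding sketch (the bitmap and its layout are assumptions):

% Hypothetical input/target encoding for the character recognizer.
% A 5x5 bitmap (assumed) flattens to the 25 inputs x1..x25; the
% binary target string t1..t26 marks the letter's position.
A = [0 1 1 1 0;
     1 0 0 0 1;
     1 1 1 1 1;
     1 0 0 0 1;
     1 0 0 0 1];                % assumed bitmap of the letter 'A'
x = reshape(A', 1, 25);         % flatten row by row: inputs x1..x25
t = zeros(1, 26);  t(1) = 1;    % target: 1 in position 1 for 'A'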

Signal Classification with Perceptron

[Figure: example signal with signal-to-noise ratio SN = 5.]

Classification of Three Sinusoids of Different Frequency

>> nnet11i
Enter frequency separation in pct. (del) 100
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 100
Final outputs after training:
Enter signal to noise ratio (SN) - zero to exit 1
Classification of signals embedded in noise

[Figures: Signals / Noisy Signals]
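For context, a sketch of how the three test sinusoids might be generated; nnet11i.m itself is not shown, so the base frequency and the noise scaling below are assumptions:

% Sketch of generating the three signals (assumed setup; nnet11i.m
% itself is not shown). del = frequency separation in percent,
% xtot = samples per cycle, SN = signal-to-noise amplitude ratio.
del = 100;  xtot = 64;  SN = 1;
f0 = 1;                                      % assumed base frequency
f = f0 * [1, 1 + del/100, 1 + 2*del/100];    % three separated frequencies
t = (0:xtot-1) / xtot;                       % one base-frequency cycle
s  = zeros(3, xtot);  xn = zeros(3, xtot);
for k = 1:3
    s(k, :)  = sin(2*pi*f(k)*t);             % clean signal for class k
    xn(k, :) = s(k, :) + randn(1, xtot)/SN;  % noisy version
end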

>> nnet11i
Enter frequency separation in pct. (del) 10
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 100
Final outputs after training:
Enter signal to noise ratio (SN) - zero to exit 10
Classification of signals embedded in noise

[Figures: Signals / Noisy Signals]

>> nnet11i
Enter frequency separation in pct. (del) 5
Number of samples per cycle (xtot) 64
Enter number of training epochs (m) 500
Final outputs after training:
Enter signal to noise ratio (SN) - zero to exit 10
Classification of signals embedded in noise
0?

[Figures: Signals / Noisy Signals]

>> nnet11i
Enter frequency separation in pct. (del) 1
Number of samples per cycle (xtot) 1000
Enter number of training epochs (m)
Final outputs after training:
Enter signal to noise ratio (SN) - zero to exit 100
Classification of signals embedded in noise
1 0 0
0 ? 0
0 0 1

A "?" in the output presumably marks a net input in the indeterminate band -θ <= y_in <= θ, where the activation is 0 and the signal is not classified.

[Figures: Signals / Noisy Signals]

Unsupervised Learning

Kohonen Learning Steps

Initialize the weights (e.g., to random values). Set the neighborhood radius R and a learning rate α. Repeat the steps below until convergence or until a maximum number of epochs is reached:

For each input pattern X = [x1 x2 x3 ...]:
  Compute a "distance" D(j) = (w1j - x1)^2 + (w2j - x2)^2 + (w3j - x3)^2 + ... for each cluster (i.e., all j), and find jmin, the value of j corresponding to the minimum D(j).
  If j is "in the neighborhood of" jmin, then w_ij(new) = w_ij(old) + α[x_i - w_ij(old)] for all i.

Decrease α (linearly or geometrically) and reduce R (at a specified rate) if R > 0.
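A minimal MATLAB sketch of these steps on a line of cluster units; the data, cluster count, and decay schedules are illustrative assumptions, not the NNET19.M code:

% Minimal Kohonen clustering sketch; data and schedules are assumed.
X = rand(50, 2);                    % 50 two-dimensional input patterns
K = 10;                             % number of cluster units
[P, N] = size(X);
W = rand(N, K);                     % random initial weights
alpha = 0.25;  R = 2;               % learning rate and neighborhood radius
for epoch = 1:100
    for p = 1:P
        x = X(p, :)';
        D = sum((W - repmat(x, 1, K)).^2, 1);   % distance to each cluster
        [~, jmin] = min(D);                     % winning cluster
        for j = max(1, jmin-R):min(K, jmin+R)   % winner and its neighbors
            W(:, j) = W(:, j) + alpha * (x - W(:, j));
        end
    end
    alpha = 0.98 * alpha;                       % decrease alpha geometrically
    if R > 0 && mod(epoch, 25) == 0
        R = R - 1;                              % reduce R at a fixed rate
    end
end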

NNET19.M

The Traveling Salesman Problem

Two input neurons, one each for the x and y coordinates of the cities.
Distance function: D(j) = (w1j - x1)^2 + (w2j - x2)^2
Example: 4 cities on the vertices of a square.

The coordinates of the cities are the vertices of the square, (±1, ±1). Weight matrices from three different runs (each column converges to one city, so the column order gives the tour):

W = [ 1  1 -1 -1
      1 -1 -1  1 ]

W = [ 1 -1 … -1 -1 ]

W = [ … -1 -1 -1  1  1 ]
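A sketch of the Kohonen approach to this 4-city example, under assumed parameters (NNET19.M itself is not shown). The units are treated as a ring so that the column order of W traces a closed tour:

% Kohonen net for the 4-city square (illustrative parameters).
cities = [ 1  1 -1 -1;          % x coordinates of the four cities
           1 -1 -1  1 ];        % y coordinates
K = 4;  W = 0.1 * randn(2, K);  % one cluster unit per city
alpha = 0.25;  R = 1;
for epoch = 1:200
    for p = randperm(4)                    % present cities in random order
        x = cities(:, p);
        D = sum((W - repmat(x, 1, K)).^2, 1);
        [~, jmin] = min(D);
        for dj = -R:R                      % ring neighborhood
            j = mod(jmin - 1 + dj, K) + 1;
            W(:, j) = W(:, j) + alpha * (x - W(:, j));
        end
    end
    alpha = 0.99 * alpha;                  % decay the learning rate
    if epoch == 100, R = 0; end            % then update winners only
end
% Each column of W settles near one city; visiting the cities in
% column order of W gives the tour, as in the matrices above.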

100 “Randomly” Located Cities (0.1 < α < 0.25, 100 epochs, initial R = 2)