Neural Networks
Mihir Patel and Nikhil Sardana

Introduction
Neural networks are modeled after the brain.

The Neuron
Each neuron takes in multiple inputs.
Each input has an associated weight.
The weighted inputs are summed.
An activation function is run on the sum.
This yields a standardized output.
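To make the weighted sum and activation concrete, here is a minimal sketch of a single neuron in Python. The sigmoid activation, the example weights, and the bias term (described on the next slide) are illustrative assumptions, not values from the slides.

import math

def sigmoid(z):
    # Squash any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, plus the bias, then the activation.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(total)

print(neuron([0.5, 0.9], [0.4, -0.2], 0.1))  # ~0.53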

Biases
Biases give the default case: they shift the weighted sum so the neuron still produces a meaningful output even when all inputs are zero.

Full Network

Cost Function
The cost function is essentially the squared difference between the actual output and the expected output.
Squaring removes the sign, so errors in either direction count equally.
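A minimal sketch of the squared-error cost described above, assuming the outputs are plain Python lists (averaging over the outputs is a common convention, not stated on the slide):

def cost(actual, expected):
    # Squared error, averaged over the outputs; squaring removes the sign.
    return sum((a - e) ** 2 for a, e in zip(actual, expected)) / len(actual)

print(cost([0.8, 0.2], [1.0, 0.0]))  # 0.04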

Minimizing the Cost Function: Hill Climbing
Try a move in each direction and check whether it lowers the cost; keep the better move.
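A rough sketch of hill climbing over a weight vector, assuming a cost_fn that maps weights to a scalar cost; the step size and iteration count are placeholders:

import random

def hill_climb(weights, cost_fn, step=0.01, iterations=1000):
    # Nudge a random weight up or down; keep any move that lowers the cost.
    best = cost_fn(weights)
    for _ in range(iterations):
        i = random.randrange(len(weights))
        for delta in (step, -step):
            trial = list(weights)
            trial[i] += delta
            c = cost_fn(trial)
            if c < best:
                weights, best = trial, c
                break
    return weights

# e.g. hill_climb([0.0, 0.0], lambda w: (w[0] - 1) ** 2 + (w[1] + 2) ** 2)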

Minimizing the Cost Function: Derivatives
Derivatives tell you instantly which direction is better, without trying each move.
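This derivative-driven update is gradient descent. A one-step sketch, assuming the gradients have already been computed (the learning rate of 0.1 is a placeholder):

def gradient_step(weights, gradients, learning_rate=0.1):
    # Move every weight a small step against its gradient (downhill on the cost).
    return [w - learning_rate * g for w, g in zip(weights, gradients)]

print(gradient_step([0.5, -0.3], [0.2, -0.4]))  # [0.48, -0.26]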

Backpropagation
You only know the error at the output layer.
Backpropagation passes that error backwards through the network so every weight can be adjusted.
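As an illustration, a minimal sketch of backpropagation for a tiny 2-3-1 sigmoid network trained on one example with squared error; the layer sizes, learning rate, and training data are arbitrary assumptions:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)
x, y = np.array([0.5, 0.9]), np.array([1.0])

for _ in range(1000):
    # Forward pass: remember each layer's activations.
    h = sigmoid(W1 @ x + b1)
    out = sigmoid(W2 @ h + b2)
    # Backward pass: error at the output, then pushed back to the hidden layer.
    delta2 = (out - y) * out * (1 - out)    # output-layer error
    delta1 = (W2.T @ delta2) * h * (1 - h)  # hidden-layer error
    # Gradient descent update on every weight and bias.
    W2 -= 0.5 * np.outer(delta2, h); b2 -= 0.5 * delta2
    W1 -= 0.5 * np.outer(delta1, x); b1 -= 0.5 * delta1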

Why Don't Networks Achieve the Same Results?
There are multiple local minima! Different random starting weights can settle into different ones.

Matrix Representation
Forward propagation can be simplified using matrix math: each layer is one matrix multiplication followed by the activation.
Matrix operations can be parallelized for GPU-based acceleration.
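A sketch of a full forward pass as matrix math in numpy; each layer is evaluated with a single matrix product, which is exactly the operation GPUs parallelize well (the layer sizes here are arbitrary):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    # layers is a list of (W, b) pairs; each step is one matrix multiply.
    for W, b in layers:
        x = sigmoid(W @ x + b)
    return x

rng = np.random.default_rng(0)
layers = [(rng.normal(size=(4, 3)), np.zeros(4)),
          (rng.normal(size=(2, 4)), np.zeros(2))]
print(forward(np.array([1.0, 0.5, -0.3]), layers))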

Pros and Cons
Pros:
Very powerful: they can identify nearly any pattern, even ones a human cannot.
With GPUs, they can now be trained in reasonable amounts of time.
Flexible and adaptable to different problem situations.
Cons:
They require large amounts of data.
These networks require a human to pre-identify the correct answer for every training example.