Artificial Neural Networks


Artificial Neural Networks
Torsten Reil (torsten.reil@zoo.ox.ac.uk)

Example: Voice Recognition
Task: learn to discriminate between two different voices saying "Hello"
Data sources: Steve Simpson, David Raubenheimer
Format: frequency distribution (60 bins); analogy: the cochlea
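The slides do not specify how the 60-bin frequency distribution was computed; as one hypothetical sketch of such a preprocessing step, a waveform's magnitude spectrum can be collapsed into 60 bins and normalised:

```python
import numpy as np

def spectrum_60_bins(signal, n_bins=60):
    """Collapse a waveform's magnitude spectrum into n_bins frequency bins
    (a crude stand-in for the cochlea-like preprocessing on the slide)."""
    mags = np.abs(np.fft.rfft(signal))          # magnitude spectrum
    bins = np.array_split(mags, n_bins)         # group into 60 bands
    feat = np.array([b.mean() for b in bins])   # mean energy per band
    return feat / (feat.sum() + 1e-12)          # normalise to a distribution

# Example on a synthetic tone (placeholder for a recorded "Hello").
feat = spectrum_60_bins(np.sin(np.linspace(0, 100, 8000)))
```

The resulting 60-element vector is what gets presented to the network's input layer.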

Network architecture
Feed-forward network:
60 inputs (one for each frequency bin)
6 hidden units
2 outputs (0-1 for "Steve", 1-0 for "David")
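A minimal sketch of this 60-6-2 feed-forward network with sigmoid units (the weight initialisation and the input vector here are hypothetical placeholders, not the values used in the talk):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Layer sizes from the slide: 60 inputs, 6 hidden, 2 outputs.
W1 = rng.normal(0, 0.1, (6, 60))   # input -> hidden weights
b1 = np.zeros(6)
W2 = rng.normal(0, 0.1, (2, 6))    # hidden -> output weights
b2 = np.zeros(2)

def forward(x):
    """Feed one 60-bin frequency distribution through the network."""
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)

x = rng.random(60)   # one placeholder spectrum
y = forward(x)       # two outputs, each in (0, 1)
```

The two outputs are read as a pattern: (0, 1) codes "Steve" and (1, 0) codes "David".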

Presenting the data (input frequency distributions for Steve and David)

Presenting the data (untrained network)
Steve: 0.43, 0.26
David: 0.73, 0.55

Calculate error (output minus target)
Steve: 0.43 - 0 = 0.43; 0.26 - 1 = -0.74
David: 0.73 - 1 = -0.27; 0.55 - 0 = 0.55
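These per-output errors follow from the target patterns on the architecture slide ((0, 1) for "Steve", (1, 0) for "David"); as a quick check in code:

```python
import numpy as np

# Untrained outputs from the slide, and the target patterns
# ((0, 1) codes "Steve", (1, 0) codes "David").
steve_out, steve_target = np.array([0.43, 0.26]), np.array([0.0, 1.0])
david_out, david_target = np.array([0.73, 0.55]), np.array([1.0, 0.0])

steve_err = steve_out - steve_target   # [ 0.43, -0.74]
david_err = david_out - david_target   # [-0.27,  0.55]

# Summed absolute errors, matching the totals on the next slide.
steve_total = np.abs(steve_err).sum()  # 1.17
david_total = np.abs(david_err).sum()  # 0.82
```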

Backpropagate error and adjust weights
Steve: |0.43| + |-0.74| = 1.17 (total absolute error)
David: |-0.27| + |0.55| = 0.82 (total absolute error)

Repeat process (sweep) for all training pairs:
Present data
Calculate error
Backpropagate error
Adjust weights
Repeat the process multiple times
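The sweep above can be sketched as a standard backpropagation loop for the 60-6-2 sigmoid network (the training spectra, initialisation, learning rate, and sweep count here are all hypothetical stand-ins, not the talk's actual settings):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
W1, b1 = rng.normal(0, 0.5, (6, 60)), np.zeros(6)
W2, b2 = rng.normal(0, 0.5, (2, 6)), np.zeros(2)

# Two placeholder spectra standing in for the "Steve" and "David" examples.
X = rng.random((2, 60))
T = np.array([[0.0, 1.0],    # target for "Steve"
              [1.0, 0.0]])   # target for "David"

lr = 0.5
for sweep in range(2000):                       # repeat the process many times
    for x, t in zip(X, T):
        h = sigmoid(W1 @ x + b1)                # present data
        y = sigmoid(W2 @ h + b2)
        d_out = (y - t) * y * (1.0 - y)         # calculate error at the outputs
        d_hid = (W2.T @ d_out) * h * (1.0 - h)  # backpropagate error
        W2 -= lr * np.outer(d_out, h)           # adjust weights
        b2 -= lr * d_out
        W1 -= lr * np.outer(d_hid, x)
        b1 -= lr * d_hid

def predict(x):
    return sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2)
```

After enough sweeps the outputs for each training pair approach their target patterns, mirroring the 0.01/0.99 values on the trained-network slide.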

Presenting the data (trained network)
Steve: 0.01, 0.99
David: 0.99, 0.01

Results – Voice Recognition
Performance of trained network:
Discrimination accuracy between known "Hello"s: 100%
Discrimination accuracy between new "Hello"s
Demo

Results – Voice Recognition (contd.)
The network has learnt to generalise from the original data
Networks with different weight settings can have the same functionality
Trained networks 'concentrate' on lower frequencies
The network is robust against non-functioning nodes