Neural Nets: How the brain achieves intelligence with 10^11 1 MHz CPUs.

Presentation transcript:

Neural Nets: How the brain achieves intelligence with 10^11 1 MHz CPUs

Concerns
– Representation: What is it? What can it do?
– Learnability: How can it be trained?
– Efficiency: Of learning, and of the learned concept.

Weka’s Neural Net Output on Iris (numeric weights partially lost in transcription; “…” marks lost values)

Output nodes (inputs from hidden Nodes 3-5):
Node 0 (Iris-setosa):     Threshold …     Node 3 …  Node 4 …  Node 5 …
Node 1 (Iris-versicolor): Threshold 1.06  Node 3 …  Node 4 …  Node 5 …
Node 2 (Iris-virginica):  Threshold …     Node 3 …  Node 4 …  Node 5 …

Hidden nodes (one weight per input attribute):
Node 3: Threshold 3.38  sepallength 0.90  sepalwidth 1.56  petallength -5.0  petalwidth …
Node 4: Threshold …     sepallength …    sepalwidth 3.12  petallength …    petalwidth …
Node 5: Threshold …     sepallength …    sepalwidth …     petallength 8.40  petalwidth 9.46
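
For reference, output in this shape comes from Weka’s MultilayerPerceptron. A minimal sketch that reproduces it, assuming weka.jar is on the classpath and an iris.arff file is available locally (both assumptions, not part of the slides):

    import weka.classifiers.functions.MultilayerPerceptron;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class IrisNet {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("iris.arff");  // load the iris dataset
            data.setClassIndex(data.numAttributes() - 1);   // class is the last attribute
            MultilayerPerceptron net = new MultilayerPerceptron();
            net.setHiddenLayers("3");    // three hidden nodes, matching Nodes 3-5 above
            net.buildClassifier(data);   // trains the weights by backpropagation
            System.out.println(net);     // prints per-node thresholds and weights
        }
    }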

Representation: Feed-Forward Neural Net
– A DAG of perceptrons.
– Input nodes take the features; output nodes yield the decisions.
– Architecture: no one knows how to choose it.
– Weights: trained by “hill-climbing” (gradient descent); slow, with a guarantee of only a local optimum.
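
To make the “DAG of perceptrons” concrete, here is a minimal sketch of what a single node computes: a weighted sum of its inputs plus a threshold (bias), squashed by a sigmoid. The weights and threshold in main are hypothetical, not from a trained net:

    public class Unit {
        // One node of a feed-forward net: weighted sum of the inputs plus a
        // threshold, squashed by a sigmoid into the range (0, 1).
        static double unit(double[] inputs, double[] weights, double threshold) {
            double sum = threshold;
            for (int i = 0; i < inputs.length; i++) {
                sum += weights[i] * inputs[i];
            }
            return 1.0 / (1.0 + Math.exp(-sum));  // sigmoid activation
        }

        public static void main(String[] args) {
            double[] x = {5.1, 3.5, 1.4, 0.2};    // one iris example (4 attributes)
            double[] w = {0.9, 1.6, -5.0, -4.0};  // hypothetical weights
            System.out.println(unit(x, w, 3.4));  // hypothetical threshold
        }
    }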

Representational Power
– Any boolean function can be represented in disjunctive or conjunctive normal form.
– Disjunctive normal form = an “or” of “anded” features.
– Since a perceptron can represent “or” and “and”, a 2-layer network can represent any boolean function (see the XOR sketch below).
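
A worked example of the 2-layer claim: a single perceptron cannot compute XOR, but XOR(x, y) = (x OR y) AND NOT (x AND y), so two threshold units in the first layer and one in the second suffice. A minimal sketch with hand-set weights:

    public class Xor {
        // A threshold ("step") unit: fires iff the weighted sum clears zero.
        static int step(double x) { return x >= 0 ? 1 : 0; }

        // XOR(x, y) = (x OR y) AND NOT (x AND y).
        static int xor(int x, int y) {
            int or  = step(x + y - 0.5);  // layer 1: x OR y
            int and = step(x + y - 1.5);  // layer 1: x AND y
            return step(or - and - 0.5);  // layer 2: or AND NOT and
        }

        public static void main(String[] args) {
            for (int x = 0; x <= 1; x++)
                for (int y = 0; y <= 1; y++)
                    System.out.println(x + " XOR " + y + " = " + xor(x, y));
        }
    }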

Neural Nets Work
– Disease diagnosis: 90% accurate on prostate cancer prediction.
– Handwritten character recognition (5-layer net): 99% accurate.
– NetTalk: 28 inputs, 80 hidden units; 78% accuracy. Sounds like a child learning to talk.

Summary
– Neural nets can do multiple classes and regression.
– Training is slow.
– Performance is fast and high quality.
– No one knows how to create the architecture.
– Neural nets tend to be incomprehensible.