Supplemental slides for CSE 327 Prof. Jeff Heflin

Ch. 18 – Learning

Decision Tree Learning

function Dec-Tree-Learn(examples, attribs, parent_examples) returns a decision tree
  if examples is empty then return Plurality-Value(parent_examples)
  else if all examples have the same classification then return the classification
  else if attribs is empty then return Plurality-Value(examples)
  else
    A ← argmax_{a ∈ attribs} Importance(a, examples)
    tree ← a new decision tree with root test A
    for each value v_k of A do
      exs ← {e : e ∈ examples and e.A = v_k}
      subtree ← Dec-Tree-Learn(exs, attribs − A, examples)
      add a branch to tree with label (A = v_k) and subtree subtree
    return tree

From Figure 18.5, p. 702
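The pseudocode can be sketched in Python. This is a minimal sketch, not the course's reference implementation: it assumes each example is a dict with a 'goal' key holding the classification, takes Importance to be information gain, and (unlike the pseudocode, which branches over every value in A's domain) only creates branches for values actually seen in the examples.

```python
# Minimal sketch of Dec-Tree-Learn. Assumptions: examples are dicts mapping
# attribute -> value plus a 'goal' key; Importance is information gain.
from collections import Counter
import math

def plurality_value(examples):
    # Most common goal classification among the examples.
    return Counter(e['goal'] for e in examples).most_common(1)[0][0]

def entropy(examples):
    # Entropy of the goal classifications in a set of examples.
    counts = Counter(e['goal'] for e in examples)
    total = len(examples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def importance(a, examples):
    # Information gain from splitting the examples on attribute a.
    remainder = 0.0
    for v in {e[a] for e in examples}:
        subset = [e for e in examples if e[a] == v]
        remainder += len(subset) / len(examples) * entropy(subset)
    return entropy(examples) - remainder

def dec_tree_learn(examples, attribs, parent_examples):
    if not examples:
        return plurality_value(parent_examples)
    if len({e['goal'] for e in examples}) == 1:
        return examples[0]['goal']
    if not attribs:
        return plurality_value(examples)
    a = max(attribs, key=lambda attr: importance(attr, examples))
    # Represent an internal node as (attribute, {value: subtree}).
    branches = {}
    for v in {e[a] for e in examples}:
        exs = [e for e in examples if e[a] == v]
        branches[v] = dec_tree_learn(exs, attribs - {a}, examples)
    return (a, branches)
```

A leaf is just a classification string; an internal node is a pair of the tested attribute and a dict of branches.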

Decision Tree Data Set

Example | Color  | Size  | Shape    | Goal Predicate
X1      | blue   | small | square   | no
X2      | green  | large | triangle | no
X3      | red    | small | circle   | yes
X4      | green  | large | square   | no
X5      | yellow | large | circle   | no
X6      | red    | large | circle   | yes
X7      |        | small | triangle | no
X8      | red    | large | square   | no

Decision Tree Result

Starting set: +: X3,X6   −: X1,X2,X4,X5,X7,X8

Shape?
- circle (+: X3,X6  −: X5) → Color?
  - red (+: X3,X6) → Yes
  - yellow (−: X5) → No
  - green (no examples) → Yes
  - blue (no examples) → Yes
- square (−: X1,X4,X8) → No
- triangle (−: X2,X7) → No

The empty green and blue branches default to Yes, the plurality value of the circle examples.

Alternate Decision Tree

What if Size was the first attribute? Starting set: +: X3,X6   −: X1,X2,X4,X5,X7,X8

Size?
- small (+: X3  −: X2,X7) → Shape?
  - circle (+: X3) → Yes
  - square (no examples) → No
  - triangle (−: X2,X7) → No
- large (+: X6  −: X1,X4,X5,X8) → Color?
  - red (+: X6  −: X8) → Shape?
    - circle (+: X6) → Yes
    - square (−: X8) → No
    - triangle (no examples) → No
  - yellow (−: X5) → No
  - blue (−: X1) → No
  - green (−: X4) → No
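The branch counts on the two trees explain why Shape, not Size, is chosen at the root. The sketch below computes information gain from (positive, negative) counts per branch; the numeric values at the end are derived from the example sets shown above, not from the original slides.

```python
# Information gain from per-branch (positives, negatives) counts.
import math

def entropy(p, n):
    # Entropy of a set with p positive and n negative examples.
    total = p + n
    result = 0.0
    for c in (p, n):
        if c:
            q = c / total
            result -= q * math.log2(q)
    return result

def gain(splits):
    # splits: list of (pos, neg) counts, one per branch of the candidate test.
    p = sum(s[0] for s in splits)
    n = sum(s[1] for s in splits)
    remainder = sum((sp + sn) / (p + n) * entropy(sp, sn) for sp, sn in splits)
    return entropy(p, n) - remainder

# Counts read off the trees above (2 positives, 6 negatives overall):
gain_shape = gain([(2, 1), (0, 3), (0, 2)])   # circle, square, triangle
gain_size  = gain([(1, 2), (1, 4)])           # small, large
```

Gain(Shape) ≈ 0.47 bits while Gain(Size) ≈ 0.02 bits, so Importance picks Shape first.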

A Neuron

Perceptron Learning

function Perceptron-Learning(examples, network) returns a perceptron hypothesis
  inputs: examples, a set of examples, each with input x and output y
          network, a perceptron with weights W_j and activation function g
  repeat
    for each example (x, y) in examples do
      in ← Σ_j W_j x_j
      Err ← y − g(in)
      for each j in 0..n do
        W_j ← W_j + α × Err × g′(in) × x_j
  until some stopping criterion is satisfied
  return Neural-Net-Hypothesis(network)
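A minimal Python sketch of this loop, assuming a hard-threshold activation: for a step function the g′(in) factor is dropped (its derivative is zero almost everywhere), which gives the classic perceptron update rule. The fixed epoch count and α = 0.1 are illustrative choices, not part of the pseudocode.

```python
# Minimal perceptron trainer with a hard-threshold activation.
def g(x):
    return 1 if x >= 0 else 0          # threshold activation

def perceptron_learning(examples, weights, alpha=0.1, epochs=10):
    # examples: list of (x, y) pairs, where x includes the bias input x0 = -1
    w = list(weights)
    for _ in range(epochs):            # simple stopping criterion
        for x, y in examples:
            in_ = sum(wj * xj for wj, xj in zip(w, x))
            err = y - g(in_)
            # g'(in) is omitted for the step function (perceptron rule)
            w = [wj + alpha * err * xj for wj, xj in zip(w, x)]
    return w
```

For a linearly separable training set such as logical AND, this converges to weights that classify every example correctly.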

Perceptron Training Example

A single perceptron with bias input X0 = −1, initial weights W0 = 0.2, W1 = −0.2, W2 = 0.3, and learning rate α = 0.1, trained on:

Training Set
X1 X2 | Y
 1  1 | 1
 0  1 | 0
 0  0 | 0
 1  0 | 0

Per-example computations:
  in  = W1·X1 + W2·X2 − W0
  out = f(in)
  Err = Y − out
  ΔWi = α·Err·Xi

Epoch 1:
Ex | W0   W1   W2  |  in  out  Y  Err | ΔW0            ΔW1           ΔW2
1  | 0.2 −0.2  0.3 | −0.1  0   1   1  | 0.1·1·(−1)     0.1·1·1       0.1·1·1
2  | 0.1 −0.1  0.4 |  0.3  1   0  −1  | 0.1·(−1)·(−1)  0.1·(−1)·0    0.1·(−1)·1
3  | 0.2 −0.1  0.3 | −0.2  0   0   0  | 0.1·0·(−1)     0.1·0·0       0.1·0·0
4  | 0.2 −0.1  0.3 | −0.3  0   0   0  | 0.1·0·(−1)     0.1·0·1       0.1·0·0

The trace continues into epoch 2 with the updated weights.
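The first row of the trace can be checked directly. This snippet replays one update step, assuming the first training example is X1 = 1, X2 = 1, Y = 1 and the slide's formulas above.

```python
# One hand-checkable update step (epoch 1, example 1), assuming the
# first training example is x1 = 1, x2 = 1, y = 1.
alpha = 0.1
w0, w1, w2 = 0.2, -0.2, 0.3
x1, x2, y = 1, 1, 1

in_ = w1 * x1 + w2 * x2 - w0          # weighted sum with bias input x0 = -1
out = 1 if in_ >= 0 else 0            # threshold unit
err = y - out
w0 += alpha * err * (-1)              # bias weight update uses x0 = -1
w1 += alpha * err * x1
w2 += alpha * err * x2
```

This reproduces in = −0.1, Err = 1, and new weights W0 = 0.1, W1 = −0.1, W2 = 0.4, matching row 2 of the trace.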

NETTalk

[Network diagram] 7×29 input units encoding a seven-character text window (e.g. "O _ A R E _ Y"), one layer of 80 hidden units, and 26 output units; for this window the network outputs the phoneme /r/.

ALVINN

[Network diagram] Input is a 30×32 pixel image (960 values), feeding 5 hidden units and 30 output units that range from sharp left through straight ahead to sharp right. Pictures from Tom Mitchell's Machine Learning book slides.

SVM Kernels

[Figure] A non-linear separator in 2 dimensions becomes a linear (planar) separator when the data are mapped to 3 dimensions.