Ch. 18 – Learning: Supplemental slides for CSE 327, Prof. Jeff Heflin

Slides from: Doug Gray, David Poole

Decision Tree Learning

function DEC-TREE-LEARN(examples, attribs, parent_examples) returns a decision tree
  if examples is empty then return PLURALITY-VALUE(parent_examples)
  else if all examples have the same classification then return the classification
  else if attribs is empty then return PLURALITY-VALUE(examples)
  else
    A ← argmax_{a ∈ attribs} IMPORTANCE(a, examples)
    tree ← a new decision tree with root test A
    for each value v_k of A do
      exs ← {e : e ∈ examples and e.A = v_k}
      subtree ← DEC-TREE-LEARN(exs, attribs − A, examples)
      add a branch to tree with label (A = v_k) and subtree subtree
    return tree

From Figure 18.5, p. 702
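A minimal Python sketch of this procedure. The dict-based example representation, the (attribute, branches) tuple encoding of a tree, and passing IMPORTANCE in as a function are choices of this sketch, not from the slides:

```python
from collections import Counter

def plurality_value(examples):
    # PLURALITY-VALUE: the most common goal value (ties broken arbitrarily)
    return Counter(e["goal"] for e in examples).most_common(1)[0][0]

def dec_tree_learn(examples, attribs, parent_examples, importance):
    # attribs maps each remaining attribute name to its set of possible values;
    # a tree is either a leaf label or a pair (attribute, {value: subtree}).
    if not examples:
        return plurality_value(parent_examples)
    if len({e["goal"] for e in examples}) == 1:
        return examples[0]["goal"]
    if not attribs:
        return plurality_value(examples)
    a = max(attribs, key=lambda attr: importance(attr, examples))
    rest = {k: v for k, v in attribs.items() if k != a}
    branches = {}
    for vk in attribs[a]:                      # every value in a's domain, even if unseen
        exs = [e for e in examples if e[a] == vk]
        branches[vk] = dec_tree_learn(exs, rest, examples, importance)
    return (a, branches)
```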

Decision Tree Data Set

Example  Color   Size   Shape     Goal Predicate
X1       blue    small  square    no
X2       green   large  triangle  no
X3       red     large  circle    yes
X4       green   small  square    no
X5       yellow  small  circle    no
X6       red     small  circle    yes
X7       blue    large  triangle  no
X8       red     small  square    no
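The same table as data for the sketch above (reusing Counter and dec_tree_learn from it), with IMPORTANCE filled in as information gain, one standard choice from the text. Note that Color and Shape tie on gain here (about 0.47 bits each); listing shape first lets max() keep it, matching the tree on the next slide:

```python
from math import log2

examples = [
    {"shape": "square",   "color": "blue",   "size": "small", "goal": "no"},
    {"shape": "triangle", "color": "green",  "size": "large", "goal": "no"},
    {"shape": "circle",   "color": "red",    "size": "large", "goal": "yes"},
    {"shape": "square",   "color": "green",  "size": "small", "goal": "no"},
    {"shape": "circle",   "color": "yellow", "size": "small", "goal": "no"},
    {"shape": "circle",   "color": "red",    "size": "small", "goal": "yes"},
    {"shape": "triangle", "color": "blue",   "size": "large", "goal": "no"},
    {"shape": "square",   "color": "red",    "size": "small", "goal": "no"},
]

def entropy(exs):
    # entropy of the goal attribute over a set of examples
    n = len(exs)
    return -sum(c / n * log2(c / n)
                for c in Counter(e["goal"] for e in exs).values())

def info_gain(attr, exs):
    # IMPORTANCE as information gain: entropy minus expected entropy after the split
    remainder = 0.0
    for v in {e[attr] for e in exs}:
        sub = [e for e in exs if e[attr] == v]
        remainder += len(sub) / len(exs) * entropy(sub)
    return entropy(exs) - remainder

attribs = {a: {e[a] for e in examples} for a in ("shape", "color", "size")}
tree = dec_tree_learn(examples, attribs, examples, info_gain)
```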

Decision Tree Result

Shape?  [+: X3,X6  -: X1,X2,X4,X5,X7,X8]
  square   -> No      [+: (none)  -: X1,X4,X8]
  triangle -> No      [+: (none)  -: X2,X7]
  circle   -> Color?  [+: X3,X6   -: X5]
    red    -> Yes     [+: X3,X6   -: (none)]
    yellow -> No      [+: (none)  -: X5]
    blue, green: no examples reach these branches

A Neuron
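The neuron figure itself is an image; what a single unit j computes from the activations a_i feeding into it, in the book's notation, is:

```latex
\mathit{in}_j = \sum_{i=0}^{n} w_{i,j}\, a_i ,
\qquad
a_j = g(\mathit{in}_j)
```

where a_0 = 1 is a fixed bias input and g is the activation function (a hard threshold for a perceptron, a sigmoid for the networks later in the chapter).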

Perceptron Learning

function PERCEPTRON-LEARNING(examples, network) returns a perceptron hypothesis
  inputs: examples, a set of examples, each with input x and output y
          network, a perceptron with weights W_j, j = 0..n, and activation function g
  repeat
    for each example (x, y) in examples do
      in ← Σ_{j=0}^{n} W_j x_j
      Err ← y − g(in)
      for each j in 0..n do
        W_j ← W_j + α × Err × g′(in) × x_j
  until some stopping criterion is satisfied
  return NEURAL-NET-HYPOTHESIS(network)
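A runnable Python reading of this rule. The sigmoid g, the learning rate α = 0.1, and a fixed epoch count as the stopping criterion are choices of this sketch; with a hard-threshold g, the g′(in) factor is simply dropped:

```python
from math import exp

def g(z):
    return 1 / (1 + exp(-z))        # sigmoid activation

def g_prime(z):
    return g(z) * (1 - g(z))        # derivative of the sigmoid

def perceptron_learning(examples, weights, alpha=0.1, epochs=1000):
    # examples: list of (x, y) pairs where x[0] == 1 is the fixed bias input
    for _ in range(epochs):         # "until some stopping criterion is satisfied"
        for x, y in examples:
            in_ = sum(w * xj for w, xj in zip(weights, x))
            err = y - g(in_)
            for j in range(len(weights)):
                weights[j] += alpha * err * g_prime(in_) * x[j]
    return weights

# Example: learning OR of two inputs (x[0] is the bias input)
data = [((1, 0, 0), 0), ((1, 0, 1), 1), ((1, 1, 0), 1), ((1, 1, 1), 1)]
w = perceptron_learning(data, [0.0, 0.0, 0.0])
```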

NETTalk

(figure: a network that learns to pronounce English text)
7×29 input units (a sliding 7-character text window, 29 possible symbols per position),
one layer of 80 hidden units, and 26 output units. In the figure, the input window
"OE_Y_AR" yields the output phoneme /r/.

ALVINN

(figure: a network that steers a vehicle)
Input is a 30×32 pixel image = 960 values (each input unit is one pixel), feeding
5 hidden units and then 30 output units that span steering directions from sharp left
through straight ahead to sharp right.
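A shape-only sketch of that architecture in Python/NumPy. The random weights, tanh activations, and argmax readout are placeholders just to make the layer sizes concrete; they are not ALVINN's trained weights or its actual output encoding:

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((30, 32))               # 30x32 camera image
x = image.reshape(960)                     # 960 input values, one per pixel
W1 = rng.standard_normal((5, 960)) * 0.01  # input layer -> 5 hidden units
W2 = rng.standard_normal((30, 5))          # hidden layer -> 30 output units
hidden = np.tanh(W1 @ x)                   # hidden activations, shape (5,)
outputs = np.tanh(W2 @ hidden)             # 30 outputs: sharp left ... straight ... sharp right
steer = int(outputs.argmax())              # index of the most active steering direction
```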

SVM Kernels

A separator that is non-linear in the original 2 dimensions becomes linear when the
data are mapped to 3 dimensions.
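One standard mapping behind this picture (the one AIMA uses for its circular-separator example) sends each 2-D point to three features:

```latex
F(x_1, x_2) = \left( x_1^2,\; x_2^2,\; \sqrt{2}\,x_1 x_2 \right),
\qquad
F(\mathbf{a}) \cdot F(\mathbf{b}) = (\mathbf{a} \cdot \mathbf{b})^2
```

In the new space the circle x_1^2 + x_2^2 = 1 becomes the plane f_1 + f_2 = 1, a linear separator. And because dot products in the new space equal (a·b)^2 in the old one, a learner never needs to compute F explicitly; it only needs the kernel K(a, b) = (a·b)^2 (the kernel trick).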