Ch. 19 – Knowledge in Learning


Ch. 19 – Knowledge in Learning Supplemental slides for CSE 327 Prof. Jeff Heflin

Current Best Hypothesis Search

function CURRENT-BEST-LEARNING(examples) returns a hypothesis
  H ← any hypothesis consistent with the first example in examples
  for each remaining example e in examples do
    if e is a false positive for H then
      H ← choose a specialization of H consistent with examples
    else if e is a false negative for H then
      H ← choose a generalization of H consistent with examples
    if no consistent specialization/generalization can be found then fail
  return H

Note: here choose is a special operator that allows you to backtrack to a previous choice and select another option when the search fails. An actual implementation would probably use depth-first search instead.

From Figure 19.2, p. 681
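To make the pseudocode concrete, here is a minimal runnable sketch in Python. It assumes a simple representation that is not in the slides: a hypothesis is a frozenset of (attribute, value) conjuncts (the empty set is the hypothesis True), and an example is a pair of an attribute-to-value dict and a boolean classification. The nondeterministic choose is realized as depth-first search with backtracking, as the note above suggests; generalization drops violated conjuncts one at a time, which suffices for purely conjunctive hypotheses. All function names here (predicts_positive, consistent, all_pairs, current_best_learning) are my own.

def predicts_positive(h, desc):
    # A conjunctive hypothesis covers an example iff every conjunct holds.
    return all(desc.get(attr) == val for attr, val in h)

def consistent(h, examples):
    return all(predicts_positive(h, desc) == label for desc, label in examples)

def all_pairs(examples):
    # Every (attribute, value) conjunct mentioned anywhere in the data.
    return {(a, v) for desc, _ in examples for a, v in desc.items()}

def current_best_learning(examples, pairs, h=frozenset(), i=0):
    if i == len(examples):
        return h
    desc, label = examples[i]
    if predicts_positive(h, desc) == label:        # e already classified correctly
        return current_best_learning(examples, pairs, h, i + 1)
    if label:                                      # false negative: generalize
        children = [h - {p} for p in h if desc.get(p[0]) != p[1]]
    else:                                          # false positive: specialize
        children = [h | {p} for p in pairs
                    if p not in h and desc.get(p[0]) != p[1]]
    for h2 in children:
        if consistent(h2, examples[:i + 1]):       # must fit all examples seen so far
            result = current_best_learning(examples, pairs, h2, i + 1)
            if result is not None:
                return result
    return None                                    # fail: caller backtracks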

Example Learning Problem (current best hypothesis search)

Training Set:

Example   Description                                              Classification
X1        Color(X1,Red) ∧ Size(X1,Large) ∧ Shape(X1,Circle)        Q(X1)
X2        Color(X2,Blue) ∧ Size(X2,Large) ∧ Shape(X2,Square)       ¬Q(X2)
X3        Color(X3,Red) ∧ Size(X3,Small) ∧ Shape(X3,Square)        ¬Q(X3)
X4        Color(X4,Green) ∧ Size(X4,Large) ∧ Shape(X4,Triangle)    ¬Q(X4)
X5        Color(X5,Red) ∧ Size(X5,Small) ∧ Shape(X5,Circle)        Q(X5)

Only consider candidate definitions that are positive conjunctive sentences.
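This training set can be fed directly to the sketch above, encoding the negated classifications as label False (the negative labels are the ones implied by the FP/FN annotations in the search trace on the next slide):

examples = [
    ({"Color": "Red",   "Size": "Large", "Shape": "Circle"},   True),   # Q(X1)
    ({"Color": "Blue",  "Size": "Large", "Shape": "Square"},   False),  # ¬Q(X2)
    ({"Color": "Red",   "Size": "Small", "Shape": "Square"},   False),  # ¬Q(X3)
    ({"Color": "Green", "Size": "Large", "Shape": "Triangle"}, False),  # ¬Q(X4)
    ({"Color": "Red",   "Size": "Small", "Shape": "Circle"},   True),   # Q(X5)
]
print(current_best_learning(examples, all_pairs(examples)))
# Prints one hypothesis consistent with all five examples, e.g.
# frozenset({('Color', 'Red'), ('Shape', 'Circle')}); which consistent
# hypothesis is found depends on the order the choose steps are explored.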

Current-Best Hypothesis Search

Search trace (each step shows the current hypothesis and the next example's status; FP = false positive, FN = false negative):

1. True                               X1: ok
2. True                               X2: FP, specialize
3. Color(x,Red)                       X3: FP, specialize
   (sibling choice: Shape(x,Circle)   X3: ?, not explored)
4. Color(x,Red) ∧ Size(x,Large)       X4: ok
5. Color(x,Red) ∧ Size(x,Large)       X5: FN, backtrack to step 3's other specialization
6. Color(x,Red) ∧ Shape(x,Circle)     X4: ok
7. Color(x,Red) ∧ Shape(x,Circle)     X5: ok

Version Space Learning

function VERSION-SPACE-LEARNING(examples) returns a version space
  local variables: V, the version space (the set of all hypotheses)
  V ← the set of all hypotheses
  for each example e in examples do
    if V is not empty then V ← VERSION-SPACE-UPDATE(V,e)
  return V

function VERSION-SPACE-UPDATE(V,e) returns an updated version space
  V ← {h ∈ V : h is consistent with e}
  return V

From Figure 19.3, p. 683
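A brute-force transcription of this function, reusing predicts_positive and all_pairs from the earlier sketch. The hypothesis space is assumed to be the positive conjunctive sentences over the conjuncts seen in the data, so it is small enough to enumerate outright; practical implementations instead keep the version space implicit via boundary sets, as the next slides show.

from itertools import combinations

def all_hypotheses(pairs):
    # Every positive conjunctive hypothesis over the given conjuncts,
    # with at most one value per attribute; the empty conjunction is True.
    hs = []
    for r in range(len(pairs) + 1):
        for combo in combinations(sorted(pairs), r):
            attrs = [a for a, _ in combo]
            if len(attrs) == len(set(attrs)):
                hs.append(frozenset(combo))
    return hs

def version_space_learning(examples):
    V = all_hypotheses(all_pairs(examples))
    for desc, label in examples:
        if V:   # VERSION-SPACE-UPDATE: keep only hypotheses consistent with e
            V = [h for h in V if predicts_positive(h, desc) == label]
    return V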

Ordering on Hypothesis Space

(hypotheses higher in the lattice are more general)

              P(x)          Q(x)          R(x)

      P(x) ∧ Q(x)      P(x) ∧ R(x)      Q(x) ∧ R(x)

                  P(x) ∧ Q(x) ∧ R(x)
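For purely conjunctive hypotheses this ordering has a simple set-theoretic reading, sketched below with each conjunct represented as a single token (a simplification of the (attribute, value) representation used in the earlier snippets):

def more_general_or_equal(h1, h2):
    # Dropping conjuncts can only enlarge the set of covered instances,
    # so h1 is at least as general as h2 iff h1's conjuncts are a
    # subset of h2's.
    return h1 <= h2

# P(x) is more general than P(x) ∧ Q(x):
assert more_general_or_equal(frozenset({"P"}), frozenset({"P", "Q"}))
assert not more_general_or_equal(frozenset({"P", "R"}), frozenset({"P", "Q"}))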

Version Space Update Details

function VERSION-SPACE-UPDATE(G,S,e) returns an updated G-set and S-set (version space)
  for each g in G
    if e is a false positive for g
      G ← G – g
      G ← G ∪ {h : h is the most general specialization of g that is consistent
                   with e and h is more general than some member of S}
    else if e is a false negative for g
      G ← G – g
  for each s in S
    if e is a false positive for s
      S ← S – s
    else if e is a false negative for s
      S ← S – s
      S ← S ∪ {h : h is the most specific generalization of s that is consistent
                   with e and h is more specific than some member of G}
  return G,S
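Below is a sketch of this boundary-set update in the (attribute, value) representation used above; it is illustrative, not the slides' own implementation. Since the example's classification determines which cases can arise, the code branches on the label: on a positive example a hypothesis can only be consistent or a false negative, and on a negative example only consistent or a false positive. For conjunctive hypotheses the most specific generalization is unique (drop the violated conjuncts), while the most general specializations each add one conjunct that the example violates.

def version_space_update(G, S, desc, label, pairs):
    if label:                                  # positive example
        # Members of G for which e is a false negative are removed outright.
        G = [g for g in G if predicts_positive(g, desc)]
        new_S = []
        for s in S:
            if predicts_positive(s, desc):
                new_S.append(s)                # consistent: keep
            else:                              # false negative: generalize s
                h = frozenset(p for p in s if desc.get(p[0]) == p[1])
                if any(g <= h for g in G):     # still more specific than some g
                    new_S.append(h)
        S = new_S
    else:                                      # negative example
        # Members of S for which e is a false positive are removed outright.
        S = [s for s in S if not predicts_positive(s, desc)]
        new_G = []
        for g in G:
            if not predicts_positive(g, desc):
                new_G.append(g)                # consistent: keep
            else:                              # false positive: specialize g
                for p in pairs:
                    if (p[0] not in {a for a, _ in g}
                            and desc.get(p[0]) != p[1]
                            and any(g | {p} <= s for s in S)):
                        new_G.append(g | {p})  # still more general than some s
        G = new_G
    return G, S

Starting from G = [frozenset()] (the hypothesis True) and S initialized to the full description of the first positive example, repeated calls shrink the two boundary sets toward each other as examples arrive.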

Example Learning Problem (version space learning)

Training Set:

Description                                              Classification
Size(X1,Large) ∧ Shape(X1,Circle) ∧ Color(X1,Red)        Q(X1)
Size(X2,Large) ∧ Shape(X2,Square) ∧ Color(X2,Blue)       Q(X2)
Size(X3,Small) ∧ Shape(X3,Circle) ∧ Color(X3,Red)        Q(X3)
Size(X4,Small) ∧ Shape(X4,Circle) ∧ Color(X4,Blue)       Q(X4)
Size(X5,Large) ∧ Shape(X5,Square) ∧ Color(X5,Red)        Q(X5)

Only consider candidate definitions that are positive conjunctive sentences.