Kansas State University Department of Computing and Information Sciences CIS 830: Advanced Topics in Artificial Intelligence Wednesday, February 7, 2001.


Lecture 10
KDD Presentation (3 of 3): Rule Induction
William H. Hsu, Department of Computing and Information Sciences, KSU
Readings: "Using Inductive Learning to Generate Rules for Semantic Query Optimization", Hsu and Knoblock

Presentation Outline

Paper
- "Using Inductive Learning to Generate Rules for Semantic Query Optimization"
- Authors: C.-N. Hsu and C. A. Knoblock
- In Advances in Knowledge Discovery and Data Mining (Fayyad, Piatetsky-Shapiro, Smyth, Uthurusamy, eds.)

Overview
- Learning semantic knowledge
  - Rule induction
  - Purpose: semantic query optimization (SQO)
- Analogue: inductive logic programming (ILP)
  - Knowledge representation: Horn clauses
  - Idea: use reformulation of queries to learn (induce) rules

Application of Machine Learning to KDD: Issues
- Rules: a good hypothesis language for the performance element (SQO)?
- How are the goals of database query speedup achieved?
- Key strengths: straightforward induction method; can use a domain theory

Induction as Inverted Deduction: Design Principles

Recall: Definition of Induction
- Induction: finding h such that (∀⟨xi, f(xi)⟩ ∈ D) (B ∧ h ∧ xi) ⊢ f(xi)
  - A ⊢ B means "A logically entails B"
  - xi: the ith target instance
  - f(xi): the target function value for example xi (data set D = {⟨xi, f(xi)⟩})
  - B: background knowledge (e.g., inductive bias in inductive learning)

Idea
- Design an inductive algorithm by inverting the operators for automated deduction
- Same deductive operators as used in theorem proving

[Diagram: a deductive system for inductive learning. Inputs: training examples, a new instance, and the assertion {c ∈ H} (inductive bias made explicit); a theorem prover outputs the classification of the new instance (or "don't know").]

Induction as Inverted Deduction: Example

Deductive Query
- "Pairs of people ⟨u, v⟩ such that u is a child of v"
- Relations (predicates): Child (target predicate); Father, Mother, Parent, Male, Female

Learning Problem
- Formulation
  - Concept learning: target function f is Boolean-valued (i.e., a target predicate)
- Components
  - Target function value f(xi): Child(Bob, Sharon)
  - xi: Male(Bob), Female(Sharon), Father(Sharon, Bob)
  - B: {Parent(x, y) ← Father(x, y). Parent(x, y) ← Mother(x, y).}
- What satisfies (∀⟨xi, f(xi)⟩ ∈ D) (B ∧ h ∧ xi) ⊢ f(xi)?
  - h1: Child(u, v) ← Father(v, u). (doesn't use B)
  - h2: Child(u, v) ← Parent(v, u). (uses B)
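The entailment check in this example can be sketched as naive forward chaining over ground facts. A minimal sketch, not the paper's code: the names mirror the slide's example, but the tuple encoding of predicates is an illustrative assumption.

```python
# Minimal sketch: check that hypothesis h2 plus background knowledge B
# entails Child(Bob, Sharon), via naive forward chaining over ground facts.
# The tuple encoding of predicates is an assumed, illustrative representation.

facts = {("Male", "Bob"), ("Female", "Sharon"), ("Father", "Sharon", "Bob")}

def forward_chain(facts):
    """Apply B (Parent <- Father / Parent <- Mother) and h2
    (Child(u, v) <- Parent(v, u)) to a fixed point."""
    derived = set(facts)
    while True:
        new = set()
        for f in derived:
            if f[0] in ("Father", "Mother"):   # B: Parent(x, y) <- Father/Mother(x, y)
                new.add(("Parent", f[1], f[2]))
            elif f[0] == "Parent":             # h2: Child(u, v) <- Parent(v, u)
                new.add(("Child", f[2], f[1]))
        if new <= derived:                     # fixed point reached
            return derived
        derived |= new

print(("Child", "Bob", "Sharon") in forward_chain(facts))  # True
```

Note that h2 needs B to fire: Child(Bob, Sharon) is only derivable because B first turns Father(Sharon, Bob) into Parent(Sharon, Bob).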

Induction as Inverted Deduction: Advantages and Disadvantages

Advantages (Pros)
- Subsumes the earlier idea of finding h that "fits" the training data
- The domain theory B helps define the meaning of "fitting" the data: (B ∧ h ∧ xi) ⊢ f(xi)
- Suggests algorithms that search H guided by B
  - Theory-guided constructive induction [Donoho and Rendell, 1995], aka knowledge-guided constructive induction [Donoho, 1996]

Disadvantages (Cons)
- Doesn't allow for noisy data
  - Q: Why not? A: Consider what (∀⟨xi, f(xi)⟩ ∈ D) (B ∧ h ∧ xi) ⊢ f(xi) stipulates: the entailment must hold for every example, so a single mislabeled example rules out otherwise acceptable hypotheses
- First-order logic gives a huge hypothesis space H
  - Overfitting
  - Intractability of calculating all acceptable h's

Inverting Resolution: Example

Resolution (deduction): from
- C1: PassExam ∨ ¬KnowMaterial
- C2: KnowMaterial ∨ ¬Study
derive the resolvent
- C: PassExam ∨ ¬Study

Inverse resolution (induction): from
- C: PassExam ∨ ¬Study
- C1: PassExam ∨ ¬KnowMaterial
recover
- C2: KnowMaterial ∨ ¬Study
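The resolution step above and one way of running it backwards can be sketched in propositional form. The encoding (clauses as sets of string literals, "~" for negation) is an illustrative assumption; in general several C2's are consistent with a given C and C1, and this sketch returns only the smallest one.

```python
# Minimal sketch of one resolution step and its inversion, using the slide's
# exam clauses. Clauses are sets of string literals; "~" marks negation.
# This encoding is illustrative, not the paper's representation.

def neg(lit):
    """Negate a literal: P <-> ~P."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2, lit):
    """Resolve c1 (containing lit) against c2 (containing ~lit)."""
    return (c1 - {lit}) | (c2 - {neg(lit)})

def inverse_resolve(c, c1, lit):
    """Recover one C2 such that resolving c1 (on lit) with C2 yields c.
    Several C2's work in general; this is the smallest such clause."""
    return (c - (c1 - {lit})) | {neg(lit)}

C1 = {"PassExam", "~KnowMaterial"}
C2 = {"KnowMaterial", "~Study"}

C = resolve(C1, C2, "~KnowMaterial")
print(C == {"PassExam", "~Study"})                    # True: PassExam v ~Study

# Induction runs the step backwards: from C and C1, recover C2.
print(inverse_resolve(C, C1, "~KnowMaterial") == C2)  # True
```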

Semantic Query Optimization (SQO) Methodology

Goals (Section 17.1)
- Use semantic rules to find "shortcuts" for queries
  - Example rule: all CIS 864 students have studied basic probability
  - Query: "Find all CIS 864 students who have had courses in probability and stochastic processes"
  - The probability condition can be dropped
- Learn rules from data
  - Observe when a query can be simplified
  - Generalize over these "training cases"

Background (Section 17.2)
- Queries: Datalog, corresponding to the select-from-where subset of Structured Query Language (SQL)
- Semantic rules: Horn clauses (cf. Prolog)

Learning Framework (Section 17.3)
- Concept: SatisfyInputQuery (positive iff an instance, i.e., a tuple, satisfies the query)
- Algorithm for dropping constraints (generalization): greedy min-set-cover
- Heuristic (preference bias): gain/cost ratio
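The "drop a redundant condition" payoff can be sketched as follows. A minimal sketch with invented attribute names and rule format, not the paper's Datalog machinery: a learned semantic rule "premise implies conclusion" lets the optimizer delete a query condition the rule makes redundant.

```python
# Minimal sketch of the SQO shortcut: if a learned semantic rule says one
# query condition implies another, the implied condition can be dropped
# without changing the answer set. Names and rule format are invented.

# Learned rule: every CIS 864 student has studied basic probability.
rules = [("course=CIS864", "studied=probability")]

def optimize(conditions, rules):
    """Drop any condition that a semantic rule derives from another one."""
    conds = list(conditions)
    for premise, conclusion in rules:
        if premise in conds and conclusion in conds:
            conds.remove(conclusion)   # redundant: implied by the premise
    return conds

query = ["course=CIS864", "studied=probability", "studied=stochastic"]
print(optimize(query, rules))
# ['course=CIS864', 'studied=stochastic']
```

The shortened query scans one fewer condition per tuple, which is where the speedup reported in the paper comes from.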

Learning Framework and Algorithm

Given: a few example queries; data set D (many tuples)

Methodology (Sections )
- Step 1 (Optimizer): optimize queries by dropping constraints where possible
  - Use the Greedy-Min-Set-Cover algorithm
  - Call the learning module to add rules to the rule base
- Step 2 (Find Alternative Queries):
  - 2a (Construct Candidate Constraints): use the gain/cost ratio (number of negative cases excluded / syntactic length of the constraint)
    - Rationale: Occam's Razor bias; min-set-cover (ratio-bounded approximation)
  - 2b (Search for Constraints): build on newly-introduced relations
- Step 3 (Update Rule Bank): apply newly discovered rules
  - Put newly-induced rules into the rule base
  - Use an inference engine (Prolog) to generate facts that will shorten the query search
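Step 2a's preference bias can be sketched as a gain/cost-driven greedy set cover: repeatedly pick the candidate constraint that excludes the most still-uncovered negative tuples per unit of syntactic length. The candidate constraints, costs, and tuple IDs below are invented for illustration.

```python
# Minimal sketch of the greedy constraint selection: choose candidates by
# gain/cost ratio (negative tuples excluded / syntactic length) until all
# negative tuples are excluded. All data below is invented for illustration.

def greedy_min_set_cover(negatives, candidates):
    """candidates: list of (name, excluded_negatives, syntactic_cost)."""
    uncovered = set(negatives)
    chosen = []
    while uncovered:
        # Best remaining constraint by gain/cost on the still-uncovered set.
        best = max(candidates, key=lambda c: len(c[1] & uncovered) / c[2])
        name, excluded, _cost = best
        if not excluded & uncovered:
            break                      # no constraint excludes anything more
        chosen.append(name)
        uncovered -= excluded
    return chosen

negatives = {1, 2, 3, 4, 5}
candidates = [("c1", {1, 2, 3}, 2),    # gain 3 / cost 2 = 1.5
              ("c2", {3, 4}, 1),       # gain 2 / cost 1 = 2.0
              ("c3", {4, 5}, 1)]       # gain 2 / cost 1 = 2.0
print(greedy_min_set_cover(negatives, candidates))
```

The greedy choice is what makes the cover ratio-bounded: each pick is locally optimal in excluded-negatives per unit length, mirroring the Occam's Razor preference for short constraints.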

Design Rationale

Problem (Sections )
- How to generalize well over reformulable queries?
- Want to ensure the inducer does not overfit the observed pattern of training examples

Solution Approach (Section )
- Idea: Occam's Razor bias
  - Prefer shorter hypotheses, all other things being equal
  - Why does this work?

Types of Bias
- Preference bias
  - Captured ("encoded") in the learning algorithm
  - Compare: a search heuristic
- Language bias
  - Captured ("encoded") in the knowledge (hypothesis) representation
  - Compare: restriction of the search space, aka restriction bias

Experimental Method

Experimental Results (Section 17.5)
- Improvement using SQO by rule induction (Table 17.4)
  - Reformulation using induced rules improves both short and long queries (about uniformly)
  - Speedup
  - Breakdown of savings: NIL queries vs. overall

Claims (Section 17.5)
- SQO is scalable: rule induction can be used on large databases
- SQO is general: other search techniques and heuristics can be applied

Summary: Content Critique

Key Contribution
- Simple, direct integration of inductive rule learning with SQO
- Significance to KDD: a good way to apply ILP-like learning in database optimization
- Applications: inference; decision support systems (DSS)

Strengths
- Somewhat generalizable approach
  - Significant for KDD
  - Applies to other learning-for-optimization inducers
- Formal analysis of SQO complexity
- Experiments measure
  - Speedup learning: % time saved
  - How wasted time is saved (NIL queries, short vs. long queries); cf. performance profiling

Weaknesses, Tradeoffs, and Questionable Issues
- Insufficient comparison of alternative heuristics (MDL, etc.)
- Empirical performance of exhaustive search?

Summary: Presentation Critique

Audience: researchers and practitioners of
- AI (machine learning, intelligent database optimization)
- Database management systems
- Applied logic

Positive and Exemplary Points
- Good, abstract examples illustrating the roles of SQO and ILP
- A real database optimization example (3 Oracle databases)

Negative Points and Possible Improvements
- Insufficient description of the analytical hypothesis representations
- Semantics: unclear how to apply other rule-induction algorithms
  - Decision trees
  - First-order ILP (e.g., FOIL)