

For Friday Finish Chapter 18 Homework: –Chapter 18, exercises 1-2

Program 3 Any questions?

Example
Op( Action: Go(there); Precond: At(here); Effects: At(there), ¬At(here) )
Op( Action: Buy(x); Precond: At(store), Sells(store,x); Effects: Have(x) )
Initial state (A0): At(Home), Sells(SM,Banana), Sells(SM,Milk), Sells(HWS,Drill)
Goal (A∞): Have(Drill), Have(Milk), Have(Banana), At(Home)

Example Steps Add three Buy actions to achieve the goals. Use the initial state to achieve the Sells preconditions. Then add Go actions to achieve the new preconditions.

Handling Threats Cannot resolve the threat to the At(Home) preconditions of both Go(HWS) and Go(SM). Must backtrack to the At(x) precondition of Go(SM), currently supported by At(Home) from the initial state, and support it instead from the At(HWS) effect of Go(HWS). Since Go(SM) still threatens the At(HWS) precondition of Buy(Drill), we must promote Go(SM) to come after Buy(Drill); demotion is not possible because of the causal link supporting the At(HWS) precondition of Go(SM).

Example Continued Add Go(Home) action to achieve At(Home) Use At(SM) to achieve its precondition Order it after Buy(Milk) and Buy(Banana) to resolve threats to At(SM)
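The threat bookkeeping in the walkthrough above can be sketched in code. This is a minimal illustration, not an implementation from the slides: the `Start`/`Finish` step names, the data structures, and the `threats` helper are assumptions. It records the causal links and ordering constraints of the finished shopping plan (including the promotion of Go(SM) after Buy(Drill)) and checks that no unresolved threats remain.

```python
# Causal links of the final plan: (producer, condition, consumer).
# Hypothetical encoding; Start/Finish stand in for the initial/goal steps.
links = [
    ("Start", "At(Home)", "Go(HWS)"),
    ("Go(HWS)", "At(HWS)", "Buy(Drill)"),
    ("Go(HWS)", "At(HWS)", "Go(SM)"),
    ("Go(SM)", "At(SM)", "Buy(Milk)"),
    ("Go(SM)", "At(SM)", "Buy(Banana)"),
    ("Go(SM)", "At(SM)", "Go(Home)"),
    ("Go(Home)", "At(Home)", "Finish"),
]

# Delete effects from the Go operator (¬At(here)).
deletes = {
    "Go(HWS)": {"At(Home)"},
    "Go(SM)": {"At(HWS)"},
    "Go(Home)": {"At(SM)"},
}

# Ordering constraints, including the promotion that resolves the threat.
order = [
    ("Start", "Go(HWS)"), ("Go(HWS)", "Buy(Drill)"),
    ("Buy(Drill)", "Go(SM)"),             # promotion: Go(SM) after Buy(Drill)
    ("Go(SM)", "Buy(Milk)"), ("Go(SM)", "Buy(Banana)"),
    ("Buy(Milk)", "Go(Home)"), ("Buy(Banana)", "Go(Home)"),
    ("Go(Home)", "Finish"),
]

def before(a, b, order):
    """True if a must precede b under the transitive closure of order."""
    frontier, seen = [a], set()
    while frontier:
        s = frontier.pop()
        for (x, y) in order:
            if x == s and y not in seen:
                if y == b:
                    return True
                seen.add(y)
                frontier.append(y)
    return False

def threats(links, deletes, order):
    """Steps that could clobber a causal link without being ordered out."""
    found = []
    for step, dels in deletes.items():
        for (p, cond, c) in links:
            if cond in dels and step not in (p, c):
                # Resolved only if step is demoted before p or promoted after c.
                if not (before(step, p, order) or before(c, step, order)):
                    found.append((step, p, cond, c))
    return found

print(threats(links, deletes, order))  # [] once Go(SM) is promoted
```

Removing the `("Buy(Drill)", "Go(SM)")` constraint makes `threats` report the clobbering of the At(HWS) link, which is exactly the situation the slide's promotion step fixes.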

Machine Learning What do you think it is?

Machine Learning Definition by Herb Simon: “Any process by which a system improves performance.”

Tasks Classification: –medical diagnosis, credit-card applications or transactions, investments, DNA sequences, spoken words, handwritten letters, astronomical images Problem solving, planning, and acting: –solving calculus problems; playing checkers, chess, or backgammon; balancing a pole; driving a car

Performance How can we measure performance? That is, what kinds of things do we want to get out of the learning process, and how do we tell whether we’re getting them?

Performance Measures Classification accuracy Solution correctness and quality Speed of performance
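The first measure in the list, classification accuracy, is easy to make concrete. A minimal sketch (the function name and the toy data are illustrative, not from the slides):

```python
# Classification accuracy: the fraction of examples the learned
# hypothesis h labels correctly.
def accuracy(h, examples):
    """examples: list of (instance, label) pairs; h: instance -> label."""
    correct = sum(1 for x, label in examples if h(x) == label)
    return correct / len(examples)

# A trivial hypothesis that always predicts True gets 2 of 3 right here.
print(accuracy(lambda x: True, [(1, True), (2, False), (3, True)]))  # 0.6666666666666666
```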

Why Study Learning? (Other than your professor’s interest in it)

Study Learning Because... We want computer systems with new capabilities. –Develop systems that are too difficult or impossible to construct manually because they require specific detailed knowledge or skills tuned to a particular complex task (the knowledge acquisition bottleneck). –Develop systems that can automatically adapt and customize themselves to the needs of individual users through experience, e.g. a personalized news or mail filter, personalized tutoring. –Discover knowledge and patterns in databases (data mining), e.g. discovering purchasing patterns for marketing purposes.

Study Learning Because... Understand human and biological learning and teaching better. –Power law of practice. –Relative difficulty of learning disjunctive concepts. The time is right: –Initial algorithms and theory in place. –Growing amounts of online data. –Computational power available.

Designing a Learning System Choose the training experience. Choose what exactly is to be learned, i.e. the target function. Choose how to represent the target function. Choose a learning algorithm to learn the target function from the experience. Must distinguish between the learner and the performance element.

Architecture of a Learner Performance System → (trace of behavior) → Critic → (training instances) → Generalizer → (learned function) → Experiment Generator → (new problem) → back to Performance System

Training Experience Issues Direct or Indirect Experience –Direct: Chess boards labeled with the correct move, extracted from records of expert play. –Indirect: Potentially arbitrary sequences of moves and final game results. Credit/Blame assignment: –How do we assign credit or blame to individual choices or moves when given only indirect feedback?

More on Training Experience Source of training data: –“Random” examples outside of the learner’s control (negative examples available?) –Selected examples chosen by a benevolent teacher (near misses available?) –Ability to query an oracle about correct classifications. –Ability to design and run experiments to collect one's own data. Distribution of training data: –Generally assume the training data is representative of the examples on which final performance will be tested.

Supervision of Learning Supervised Unsupervised Reinforcement

Concept Learning The most studied task in machine learning is inferring a function that classifies examples represented in some language as members or non-members of a concept from pre-classified training examples. This is called concept learning, or classification.

Simple Example

Concept Learning Definitions An instance is a description of a specific item. X is the space of all instances (instance space). The target concept, c(x), is a binary function over instances. A training example is an instance labeled with its correct value for c(x) (positive or negative). D is the set of all training examples. The hypothesis space, H, is the set of functions, h(x), that the learner can consider as possible definitions of c(x). The goal of concept learning is to find an h in H such that for all <x, c(x)> in D, h(x) = c(x).

Sample Hypothesis Space Consider a hypothesis language defined by a conjunction of constraints. For instances described by n features, consider a vector of n constraints, <c1, c2, …, cn>, where each ci is either: – ?, indicating that any value is possible for the ith feature –A specific value from the domain of the ith feature – ∅, indicating no value is acceptable Sample hypotheses in this language: – <?, ?, …, ?> (most general hypothesis) – <∅, ∅, …, ∅> (most specific hypothesis)
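This hypothesis language is small enough to encode directly. A minimal sketch, in which `"?"` is the any-value constraint, `None` stands in for ∅, and the weather-style feature values are illustrative assumptions, not from the slides:

```python
ANY = "?"    # any value is possible for this feature
NONE = None  # stands in for the ∅ (no value acceptable) constraint

def covers(h, x):
    """True iff conjunctive hypothesis h classifies instance x as positive."""
    return all(c == ANY or (c is not NONE and c == v) for c, v in zip(h, x))

# Hypothetical instances over three features:
print(covers(("?", "Warm", "?"), ("Sunny", "Warm", "Strong")))   # True
print(covers((NONE, NONE, NONE), ("Sunny", "Warm", "Strong")))   # False: ∅ matches nothing
```

Note that any hypothesis containing a single ∅ constraint rejects every instance, which is why the slide below counts all such hypotheses as one semantically distinct (empty) concept.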

Inductive Learning Hypothesis Any hypothesis that is found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples. –Assumes that the training and test examples are drawn from the same general distribution. –This is fundamentally an unprovable hypothesis unless additional assumptions are made about the target concept.

Concept Learning As Search Concept learning can be viewed as searching the space of hypotheses for one (or more) consistent with the training instances. Consider an instance space consisting of n binary features, which therefore has 2^n instances. For conjunctive hypotheses, there are 4 choices for each feature: T, F, ∅, ?, so there are 4^n syntactically distinct hypotheses; but any hypothesis with a ∅ is the empty hypothesis, so there are 3^n + 1 semantically distinct hypotheses.

Search cont. The target concept could in principle be any of the 2^(2^n) possible binary functions on n binary inputs. Frequently, the hypothesis space is very large or even infinite and intractable to search exhaustively.
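The 3^n + 1 count can be verified by brute force for small n: enumerate all 4^n syntactic hypotheses, compute the set of instances each one covers (its extension), and count the distinct extensions. A sketch under the assumptions above (T/F values, `"?"` and `"∅"` as the wildcard and empty constraints):

```python
from itertools import product

def semantically_distinct(n):
    """Count distinct extensions of conjunctive hypotheses over n binary features."""
    instances = list(product([True, False], repeat=n))  # the 2^n instances

    def covers(h, x):
        # '∅' equals neither True nor False, so it rejects every instance.
        return all(c == "?" or c == v for c, v in zip(h, x))

    extensions = set()
    for h in product([True, False, "?", "∅"], repeat=n):
        extensions.add(tuple(covers(h, x) for x in instances))
    return len(extensions)

for n in range(1, 5):
    print(n, semantically_distinct(n), 3**n + 1)  # the two counts agree
```

Every hypothesis containing a ∅ collapses onto the single all-rejecting extension, which is where the "+ 1" comes from.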

Learning by Enumeration For any finite or countably infinite hypothesis space, one can simply enumerate and test hypotheses one by one until one is found that is consistent with the training data.
For each h in H do
  Initialize consistent to true
  For each <x, c(x)> in D do
    If h(x) ≠ c(x) then set consistent to false
  If consistent then return h
This algorithm is guaranteed to terminate with a consistent hypothesis if there is one; however, it is obviously intractable for most practical hypothesis spaces, which are at least exponentially large.
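The enumerate-and-test loop translates directly into code. A minimal sketch in which the hypothesis space is a list of Python functions; the threshold hypotheses and the toy data are illustrative assumptions:

```python
def enumerate_and_test(H, D):
    """Return the first h in H consistent with every (x, c_x) in D, else None."""
    for h in H:
        if all(h(x) == c_x for x, c_x in D):
            return h
    return None

# Toy finite hypothesis space over a single number:
H = [lambda x: x > 0, lambda x: x > 5, lambda x: x % 2 == 0]
D = [(6, True), (3, False), (10, True)]

h = enumerate_and_test(H, D)
print(h(7), h(2))  # the x > 5 hypothesis is the first consistent one: True False
```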

Finding a Maximally Specific Hypothesis (FIND-S) Can use the generality ordering to find a most specific hypothesis consistent with a set of positive training examples by starting with the most specific hypothesis in H and generalizing it just enough each time it fails to cover a positive example.

Initialize h = <∅, ∅, …, ∅>
For each positive training instance x
  For each attribute ai
    If the constraint on ai in h is satisfied by x
      Then do nothing
    Else if ai = ∅
      Then set ai in h to its value in x
    Else set ai to "?"
Initialize consistent := true
For each negative training instance x
  If h(x) = 1 then set consistent := false
If consistent then return h
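The pseudocode above is short enough to run. A sketch of the generalization loop, using `"∅"` and `"?"` as constraint markers; the weather-style feature values in the example data are illustrative assumptions, not from the slides:

```python
def find_s(examples, n):
    """FIND-S for conjunctive feature vectors: most specific h covering the positives."""
    h = ["∅"] * n
    for x, label in examples:
        if not label:
            continue          # negatives are ignored while generalizing
        for i, v in enumerate(x):
            if h[i] == "∅":
                h[i] = v      # first positive: adopt its exact value
            elif h[i] != v:
                h[i] = "?"    # conflicting values: generalize to the wildcard
    return h

# Hypothetical training data over three features:
data = [
    (("Sunny", "Warm", "Strong"), True),
    (("Sunny", "Warm", "Light"), True),
    (("Rainy", "Cold", "Strong"), False),
]
print(find_s(data, 3))  # ['Sunny', 'Warm', '?']
```

The final consistency pass against the negatives (the last three pseudocode lines) is omitted here for brevity; it amounts to checking that the returned h covers no negative example.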

Example Trace Start with h = <∅, ∅, …, ∅>. On encountering the first positive instance, h becomes that instance's exact description. On encountering a second positive instance, each attribute on which the two instances disagree is generalized to ?. Finally, check that the resulting h is consistent with every negative example.

Comments on FIND-S For conjunctive feature vectors, the most specific hypothesis that covers a set of positives is unique and found by FIND-S. If the most specific hypothesis consistent with the positives is inconsistent with a negative training example, then there is no conjunctive hypothesis consistent with the data, since by definition it cannot be made any more specific and still cover all of the positives.

Example Given positive and negative examples for which no conjunctive hypothesis is consistent, FIND-S returns a most specific hypothesis that nevertheless matches a negative example.

Inductive Bias A hypothesis space that does not include every possible binary function on the instance space incorporates a bias in the type of concepts it can learn. Any means that a concept learning system uses to choose between two functions that are both consistent with the training data is called inductive bias.

Forms of Inductive Bias Language bias: –The language for representing concepts defines a hypothesis space that does not include all possible functions (e.g. conjunctive descriptions). Search bias: –The language is expressive enough to represent all possible functions (e.g. disjunctive normal form) but the search algorithm embodies a preference for certain consistent functions over others (e.g. syntactic simplicity).

Unbiased Learning For instances described by n attributes each with m values, there are m^n instances and therefore 2^(m^n) possible binary functions. For m=2, n=10, there are 2^1024 ≈ 1.8 × 10^308 functions, of which only 59,049 can be represented by conjunctions (a small percentage indeed!). However, unbiased learning is futile, since if we consider all possible functions then simply memorizing the data without any effective generalization is an option.

Lessons Function approximation can be viewed as a search through a pre-defined space of hypotheses (a representation language) for a hypothesis that best fits the training data. Different learning methods assume different hypothesis spaces or employ different search techniques.

Varying Learning Methods Can vary the representation: –Numerical function –Rules or logical functions –Nearest neighbor (case-based) Can vary the search algorithm: –Gradient descent –Divide and conquer –Genetic algorithm

Evaluation of Learning Methods Experimental: Conduct well-controlled experiments that compare various methods on benchmark problems, gather data on their performance (e.g. accuracy, run-time), and analyze the results for significant differences. Theoretical: Analyze algorithms mathematically and prove theorems about their computational complexity, their ability to produce hypotheses that fit the training data, or the number of examples needed to produce a hypothesis that accurately generalizes to unseen data (sample complexity).