Learning, page 1, CSI 4106, Winter 2005

Symbolic learning

Points:
  Definitions
  Representation in logic
  What is an arch?
  Version spaces
  Candidate elimination
  Learning decision trees
  Explanation-based learning
Definitions

Learning is a change that helps improve future performance (a paraphrase of Herbert Simon's definition).

Broad categories of machine learning methods and systems:
  Symbolic (our focus in this brief presentation)
  Statistical
  Neural / connectionist
  Genetic / evolutionary
Definitions (2)

Various characterizations of symbolic learning:
  Learning can be supervised (pre-classified data) or unsupervised (conceptual clustering on raw data).
  Data for supervised learning can be many positive and negative examples, or a theory and one example.
  The goal of learning can be concept discovery (section 10.6), generalization from examples (inductive learning), a heuristic, a procedure, and so on.
Representation in logic

The AI techniques used in symbolic ML: search, graph matching, theorem proving.

Knowledge can be represented in logic:
  a list of elementary properties and facts about things;
  examples are conjunctive formulae with constants;
  generalized concepts are formulae with variables.
Representation in logic (2)

Two instances of a concept:
  size( i1, small ) ∧ colour( i1, red ) ∧ shape( i1, ball )
  size( i2, large ) ∧ colour( i2, red ) ∧ shape( i2, brick )
A generalization of these instances:
  size( X, Y ) ∧ colour( X, red ) ∧ shape( X, Z )

Another representation:
  obj( small, red, ball )
  obj( large, red, brick )
A generalization of these instances:
  obj( X, red, Z )
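The generalization step above can be sketched in Python (a sketch of mine, not from the slides; the tuple encoding of obj(Size, Colour, Shape) and the name generalize are assumptions): keep each constant that both instances share, and replace each position where they differ with a fresh variable.

```python
def generalize(inst1, inst2):
    """Most specific pattern that matches both instances.

    Instances are tuples of lowercase constants; variables are
    fresh uppercase names, one per position where the instances differ.
    """
    fresh = iter("XYZUVW")
    return tuple(a if a == b else next(fresh)
                 for a, b in zip(inst1, inst2))

# obj(small, red, ball) and obj(large, red, brick) generalize to
# obj(X, red, Y): the same concept as the slide's obj(X, red, Z),
# up to variable renaming.
print(generalize(("small", "red", "ball"), ("large", "red", "brick")))
# -> ('X', 'red', 'Y')
```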
What is an arch? Positive and negative examples

Learning the concept of an arch (section 10.1).

An arch:
  A hypothesis: an arch has three bricks as parts.
Another example, not an arch:
  Another hypothesis: two bricks support a third brick.
What is an arch? (2)

Another arch: two bricks support a pyramid.
We can generalize both positive examples if we have a taxonomy of blocks: the supported object is a polygon.
Not an arch: we have to specialize the hypothesis: two bricks that do not touch support a polygon.
Version spaces

This is a method of learning a concept from positive and negative examples.

In the first stage, we have to select the features that characterize the concepts, and their values. In our example:
  size = {large, small}
  colour = {red, white, blue}
  shape = {ball, brick, cube}

We will represent a feature "bundle" like this:
  obj( Size, Colour, Shape )
Version spaces (2)

There are 18 specific objects and 30 classes of objects (expressed with variables in place of some constants), variously generalized. They are all arranged into a graph called a version space. Part of this space for our example was shown as a diagram in the original slide.
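The counts of 18 objects and 30 classes can be checked by enumeration (a quick sketch, not part of the slides): each feature position holds either one of its values or a variable, and a generalized class must contain at least one variable.

```python
from itertools import product

sizes = ["large", "small"]
colours = ["red", "white", "blue"]
shapes = ["ball", "brick", "cube"]

# Fully specific objects: a constant in every position.
specific = list(product(sizes, colours, shapes))

# Generalized classes: allow a variable (marked "*") in each position,
# but require at least one variable overall.
VAR = "*"
generalized = [t for t in product(sizes + [VAR], colours + [VAR], shapes + [VAR])
               if VAR in t]

print(len(specific), len(generalized))  # 18 30
```

The arithmetic agrees with the slide: 2 × 3 × 3 = 18 specific objects, and (2+1)(3+1)(3+1) − 18 = 30 generalized classes.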
Version spaces (3)

Generalization and specialization in this very simple representation are also simple.

To generalize, replace a constant with a variable:
  obj( small, red, ball ) → obj( small, X, ball )
Further generalization requires introducing more (unique) variables:
  obj( small, X, ball ) → obj( small, X, Y )

To specialize, replace a variable with a constant (it must come from the set of allowed feature values):
  obj( small, X, Y ) → obj( small, blue, Y )
  obj( small, blue, Y ) → obj( small, blue, cube )
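The two operators can be sketched in Python (the tuple encoding and the function names are my own; the slides give only the notation): a pattern is a tuple in which uppercase strings are variables and lowercase strings are constants.

```python
ALLOWED = {0: {"large", "small"},
           1: {"red", "white", "blue"},
           2: {"ball", "brick", "cube"}}

def is_var(term):
    """Variables are capitalized, constants are lowercase."""
    return term[0].isupper()

def generalize_at(pattern, i, var):
    """Replace the constant at position i with a variable."""
    assert not is_var(pattern[i]) and is_var(var)
    return pattern[:i] + (var,) + pattern[i + 1:]

def specialize_at(pattern, i, value):
    """Replace the variable at position i with an allowed feature value."""
    assert is_var(pattern[i]) and value in ALLOWED[i]
    return pattern[:i] + (value,) + pattern[i + 1:]

# The chain of steps from the slide:
p = ("small", "red", "ball")
p = generalize_at(p, 1, "X")     # obj(small, X, ball)
p = generalize_at(p, 2, "Y")     # obj(small, X, Y)
p = specialize_at(p, 1, "blue")  # obj(small, blue, Y)
p = specialize_at(p, 2, "cube")  # obj(small, blue, cube)
print(p)  # ('small', 'blue', 'cube')
```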
Version spaces (4)

Other generalization operators (illustrated with a different representation) include the following.

Drop a conjunct:
  size( i1, small ) ∧ colour( i1, red ) ∧ shape( i1, ball ) → colour( i1, red ) ∧ shape( i1, ball )

Add a disjunct:
  colour( i2, red ) ∧ shape( i2, ball ) → (colour( i2, red ) ∨ colour( i2, blue )) ∧ shape( i2, ball )

Use a taxonomy (assuming that you have it!). Suppose we have a hierarchy of colours where "red" is a subclass of "primaryColour". We can generalize:
  colour( i3, red ) → colour( i3, primaryColour )
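Two of these operators can be sketched in Python (the representation and the colour hierarchy are illustrative assumptions): a conjunction is a set of (predicate, object, value) literals, and a taxonomy maps a value to its parent class.

```python
# Drop a conjunct: removing a literal from a conjunction generalizes it,
# because fewer conditions must hold.
formula = {("size", "i1", "small"),
           ("colour", "i1", "red"),
           ("shape", "i1", "ball")}
dropped = formula - {("size", "i1", "small")}
# dropped is colour(i1, red) AND shape(i1, ball)

# Climb a taxonomy: replace a value with its superclass.
taxonomy = {"red": "primaryColour", "blue": "primaryColour"}

def climb(literal):
    pred, obj, value = literal
    return (pred, obj, taxonomy.get(value, value))

print(climb(("colour", "i3", "red")))  # ('colour', 'i3', 'primaryColour')
```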
Version spaces (5)

The candidate elimination algorithm

There are three variants: general-to-specific, specific-to-general, and a combination of both directions. All three methods work with sets of hypotheses, that is, classes of concepts. We consider, one by one, a series of examples, both positive and negative.

In the specific-to-general method, the set S holds the (evolving) target concept and the set N stores negative examples.
Version spaces (6)

Initialize the concept set S to the first positive example. Initialize the concept set N to Ø. Then repeat:

For a positive example p:
  Replace every s ∈ S that does not match p with the minimal (most specific) generalization that matches p.
  Remove any s ∈ S more general than another s' ∈ S.
  Remove any s ∈ S that matches some n ∈ N.

For a negative example n:
  Remove any s ∈ S that matches n.
  Add n to N for future use.
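The loop above can be sketched for obj(Size, Colour, Shape) tuples (a sketch under my own encoding, not the course's code; uppercase strings are variables):

```python
def is_var(t):
    return t[0].isupper()

def matches(hyp, ex):
    """hyp matches ex if every position is a variable or an equal constant."""
    return all(is_var(h) or h == e for h, e in zip(hyp, ex))

def min_generalize(hyp, ex):
    """Minimal generalization of hyp that matches ex: introduce a fresh
    variable at each position where a constant disagrees with ex."""
    fresh = iter(v for v in "XYZUVW" if v not in hyp)
    return tuple(h if is_var(h) or h == e else next(fresh)
                 for h, e in zip(hyp, ex))

def specific_to_general(examples):
    """examples: list of (tuple, is_positive); the first must be positive."""
    S = [examples[0][0]]
    N = []
    for ex, positive in examples[1:]:
        if positive:
            S = [s if matches(s, ex) else min_generalize(s, ex) for s in S]
            # Remove any s strictly more general than another s'.
            S = [s for s in S
                 if not any(matches(s, s2) and not matches(s2, s) for s2 in S)]
            # Remove any s that matches a stored negative example.
            S = [s for s in S if not any(matches(s, n) for n in N)]
        else:
            S = [s for s in S if not matches(s, ex)]
            N.append(ex)
    return S

# Learning the concept of a ball:
print(specific_to_general([
    (("small", "white", "ball"), True),
    (("small", "red", "ball"), True),    # S becomes obj(small, X, ball)
    (("large", "blue", "brick"), False),
    (("large", "blue", "ball"), True),   # S becomes obj(Y, X, ball)
]))  # [('Y', 'X', 'ball')]
```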
Version spaces (7)

Example: obj( small, X, ball ) minimally generalizes obj( small, white, ball ) and obj( small, red, ball ).

The concept of a ball
Version spaces (8)

Now, general-to-specific. Initialize the concept set G to the most general concept. Initialize the concept set P to Ø. Then repeat:

For a negative example n:
  Replace every g ∈ G that matches n with the minimal (most general) specialization that does not match n.
  Remove any g ∈ G more specific than another g' ∈ G.
  Remove any g ∈ G that does not match some p ∈ P.

For a positive example p:
  Remove any g ∈ G that does not match p.
  Add p to P for future use.
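This direction can be sketched the same way (again my own encoding, not the course's code): minimal specialization replaces one variable with one allowed value that differs from the negative example, producing several candidate hypotheses.

```python
VALUES = {0: ["large", "small"],
          1: ["red", "white", "blue"],
          2: ["ball", "brick", "cube"]}

def is_var(t):
    return t[0].isupper()

def matches(hyp, ex):
    return all(is_var(h) or h == e for h, e in zip(hyp, ex))

def min_specializations(hyp, neg):
    """Minimally specialize hyp so it no longer matches neg: replace one
    variable with one allowed value that differs from neg at that position."""
    return [hyp[:i] + (v,) + hyp[i + 1:]
            for i, h in enumerate(hyp) if is_var(h)
            for v in VALUES[i] if v != neg[i]]

def general_to_specific(examples):
    G = [("Size", "Colour", "Shape")]   # the most general concept
    P = []
    for ex, positive in examples:
        if positive:
            G = [g for g in G if matches(g, ex)]
            P.append(ex)
        else:
            G = [h for g in G for h in
                 ([g] if not matches(g, ex) else min_specializations(g, ex))]
            # Remove any g strictly more specific than another g'.
            G = [g for g in G
                 if not any(matches(g2, g) and not matches(g, g2) for g2 in G)]
            # Remove any g that fails to match a stored positive example.
            G = [g for g in G if all(matches(g, p) for p in P)]
    return G

# Learning the concept of a ball:
print(general_to_specific([
    (("small", "red", "brick"), False),
    (("small", "white", "ball"), True),
    (("large", "red", "ball"), True),
]))  # [('Size', 'Colour', 'ball')]
```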
Version spaces (9)

Example: obj( large, Y, Z ), obj( X, Y, ball ),... minimally specialize obj( X, Y, Z ).

The concept of a ball
Version spaces (10)

The combined method. Initialize G to the most general concept and S to the first positive example. Then repeat:

For a positive example p:
  Remove any g ∈ G that does not match p.
  Replace every s ∈ S that does not match p with the most specific generalization that matches p.
  Remove any s ∈ S more general than another s' ∈ S.
  Remove any s ∈ S more general than some g ∈ G.

For a negative example n:
  Remove any s ∈ S that matches n.
  Replace every g ∈ G that matches n with the most general specialization that does not match n.
  Remove any g ∈ G more specific than another g' ∈ G.
  Remove any g ∈ G more specific than some s ∈ S.

If G = S = {c}, the learning of c succeeds. If G = S = Ø, learning fails.
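The combined algorithm can be sketched under the same illustrative tuple encoding (uppercase strings are variables; "strictly more general" means covers-but-not-covered; the example sequence below is my own, chosen so that the two boundary sets converge on "a red ball of any size"):

```python
VALUES = {0: ["large", "small"],
          1: ["red", "white", "blue"],
          2: ["ball", "brick", "cube"]}

def is_var(t):
    return t[0].isupper()

def matches(hyp, ex):
    return all(is_var(h) or h == e for h, e in zip(hyp, ex))

def more_general(a, b):
    """Strictly more general: a covers b but b does not cover a."""
    return matches(a, b) and not matches(b, a)

def min_generalize(hyp, ex):
    fresh = iter(v for v in "XYZUVW" if v not in hyp)
    return tuple(h if is_var(h) or h == e else next(fresh)
                 for h, e in zip(hyp, ex))

def min_specializations(hyp, neg):
    return [hyp[:i] + (v,) + hyp[i + 1:]
            for i, h in enumerate(hyp) if is_var(h)
            for v in VALUES[i] if v != neg[i]]

def candidate_elimination(examples):
    """examples: list of (tuple, is_positive); the first must be positive."""
    G = [("Size", "Colour", "Shape")]
    S = [examples[0][0]]
    for ex, positive in examples[1:]:
        if positive:
            G = [g for g in G if matches(g, ex)]
            S = [s if matches(s, ex) else min_generalize(s, ex) for s in S]
            S = [s for s in S if not any(more_general(s, s2) for s2 in S)]
            S = [s for s in S if not any(more_general(s, g) for g in G)]
        else:
            S = [s for s in S if not matches(s, ex)]
            G = [h for g in G for h in
                 ([g] if not matches(g, ex) else min_specializations(g, ex))]
            G = [g for g in G if not any(more_general(g2, g) for g2 in G)]
            G = [g for g in G if not any(more_general(s, g) for s in S)]
    return G, S

G, S = candidate_elimination([
    (("small", "red", "ball"), True),
    (("small", "red", "cube"), False),
    (("large", "red", "ball"), True),
    (("large", "white", "brick"), False),
    (("small", "red", "ball"), True),
    (("small", "blue", "ball"), False),
    (("large", "red", "ball"), True),
    (("large", "blue", "ball"), False),
    (("small", "red", "ball"), True),
])
print(G, S)  # [('Size', 'red', 'ball')] [('X', 'red', 'ball')]
```

Here G and S end up describing the same concept (a red ball of any size) up to variable renaming, so learning succeeds in the sense of the slide's termination condition.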
Version spaces (11)

The concept of a red ball