CS 540 - Fall 2016 (Shavlik©), Lecture 2
Today's Topics
- Chapter 2 in one slide
- Chapter 18: Machine Learning (ML)
- Creating an ML dataset
  - "Fixed-length feature vectors"
  - Relational/graph-based examples
- HW0 (due in one week)
- Getting 'labeled' training examples
- Train/tune/test sets
- N-fold cross validation
- Supervised learning and Venn diagrams
- Read the assigned section of the textbook and the Wikipedia articles linked to the class home page

9/8/16 — CS 540 Fall 2016 (Shavlik ©), Lecture 2
The Big AI Picture – Chapter 2
The study of 'agents' that exist in an environment and perceive, act, and learn.
The agent loop (from the slide's diagram): 1: Sense environment → 2: Reason → 3: Act → 4: Get feedback → 5: Learn.
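The five-step loop above can be sketched as a minimal agent skeleton. This is an illustrative assumption of one possible structure, not course code; all class and method names are made up for the sketch:

```python
# Minimal sketch of the slide's sense-reason-act-feedback-learn loop.
# All class and method names are illustrative assumptions.
class Agent:
    def __init__(self):
        self.knowledge = {}                    # percept -> preferred action

    def sense(self, environment):              # 1: Sense environment
        return environment["percept"]

    def reason(self, percept):                 # 2: Reason
        return self.knowledge.get(percept, "default-action")

    def act(self, action):                     # 3: Act (here: just report it)
        return action

    def learn(self, percept, action, feedback):  # 4: Get feedback, 5: Learn
        if feedback == "good":
            self.knowledge[percept] = action

agent = Agent()
percept = agent.sense({"percept": "obstacle-ahead"})
action = agent.reason(percept)                 # "default-action" so far
agent.act(action)
agent.learn(percept, "turn-left", "good")      # feedback: turning left worked
print(agent.reason(percept))                   # now prefers "turn-left"
```

The point of the sketch is that learning closes the loop: feedback from acting changes how the agent reasons the next time it sees the same percept.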
What Do You Think Machine Learning Means?
Given: ______
Do: ______
(Fill in your own definition.) Throughout the semester, think about what is missing in current ML compared to human learning.
What is Learning? - Herbert Simon
"Learning denotes changes in the system that … enable the system to do the same task … more effectively the next time." – Herbert Simon
"Learning is making useful changes in our minds." – Marvin Minsky
But remember: cheese and wine get better over time, yet they don't learn!
Learning from Labeled Examples
(Figure: two sets of hand-drawn figures labeled "Positive Examples" and "Negative Examples," plus a new figure asking "Category of this example?")
One concept consistent with the data: a solid red circle in a (regular?) polygon.
But what about other concepts that also fit, e.g. figures on the left side of the page, or figures drawn before 5pm on 2/2/89, etc.?
Supervised Machine Learning: Task Overview
Real World → Feature "Design" (usually done by humans) → Feature Space → Classifier Construction (done by a learning algorithm) → Concepts / Classes / Decisions
Standard Approach for Constructing an ML Dataset for a Task
Step 1: Choose a feature space. We will use fixed-length feature vectors:
- Choose N features
- Each feature i has Vi possible values
- Each example is represented by a vector of N feature values, i.e. a point in the feature space, e.g. <red, 50, round> for the features color, weight, and shape

Step 2: Collect examples ("I/O" pairs).
(The chosen features define a space whose axes here are color, weight, and shape.)
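As a minimal sketch of Step 1, the slide's example <red, 50, round> can be written as a fixed-length tuple; the feature names and values come from the slide, while the Python representation itself is an assumption:

```python
# One example as a fixed-length feature vector over N = 3 features.
FEATURE_NAMES = ("color", "weight", "shape")

example = ("red", 50, "round")        # the slide's <red, 50, round>

# Pair feature names with values for readability
as_dict = dict(zip(FEATURE_NAMES, example))
print(as_dict)   # {'color': 'red', 'weight': 50, 'shape': 'round'}
```

Every example in the dataset must have the same N positions in the same order; that is what "fixed-length" buys us.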
Another View of Standard ML Datasets – a Single Table (2D Array)
            Feature 1   Feature 2   ...   Feature N   Output Category
Example 1   0.0         small       ...   red         true
Example 2   9.3         medium      ...               false
Example 3   8.2                     ...   blue
  ...
Example M   5.7                     ...   green
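The single-table view maps directly onto an M × (N+1) array: one row per example, one column per feature plus a final label column. A sketch, using the table's values and representing the slide's blank cells as `None` (an assumption):

```python
# Dataset as a single table: one row per example,
# columns = feature values plus the output category.
# None marks cells left blank in the table.
rows = [
    # feat1,  feat2,    featN,   label
    (0.0,   "small",  "red",   True),    # Example 1
    (9.3,   "medium",  None,   False),   # Example 2
    (8.2,    None,    "blue",  None),    # Example 3
    (5.7,    None,    "green", None),    # Example M
]
features = [r[:-1] for r in rows]   # M x N feature matrix
labels   = [r[-1]  for r in rows]   # length-M output column
```

Separating the feature matrix from the label column is exactly the "I/O pairs" framing: the learner sees the features and must predict the label.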
Standard Feature Types for representing training examples – a source of "domain knowledge"
(Keep your eye out for places where domain knowledge is, or should be, used in ML.)
- Nominal (including Boolean): no ordering among possible values, e.g. color ∈ {red, blue, green} (vs. representing color as a frequency in Hertz, which would be ordered)
- Linear (or ordered): possible values are totally ordered, e.g. size ∈ {small, medium, large} (discrete), weight ∈ [0…500] (continuous)
- Hierarchical (not commonly used): possible values are partially ordered in an ISA hierarchy, e.g. shape: closed → {polygon, continuous}; polygon → {triangle, square}; continuous → {circle, ellipse}
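The nominal/linear distinction matters when features are encoded numerically. A common (hedged) sketch, not from the slides: one-hot encode nominal values, map ordered discrete values to integers that preserve the order, and rescale continuous values. The helper names and the [0, 500] weight range follow the slide's examples:

```python
# Illustrative encodings for the feature types on the slide.

# Nominal: no order among values, so one-hot encode
COLORS = ["red", "blue", "green"]
def one_hot(value, domain):
    return [1 if v == value else 0 for v in domain]

# Linear, discrete: map to integers that preserve the ordering
SIZE_RANK = {"small": 0, "medium": 1, "large": 2}

# Linear, continuous: often rescaled to [0, 1]
def normalize(weight, lo=0.0, hi=500.0):
    return (weight - lo) / (hi - lo)

print(one_hot("blue", COLORS))   # [0, 1, 0]
print(SIZE_RANK["medium"])       # 1
print(normalize(250.0))          # 0.5
```

Using an integer code for a nominal feature (e.g. red=0, blue=1, green=2) would invent an ordering that is not in the domain, which is why the one-hot encoding is used instead.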
Where We Are
- Have selected a 'concept' to learn
- Have chosen features to represent examples
- Have created at least 100 labeled examples
Next: learn a 'model' that can predict the output for NEW examples.
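Before learning that model, the agenda's train/tune/test topic says the labeled examples get partitioned. A minimal sketch; the 60/20/20 proportions are an assumption for illustration, not from the slides:

```python
import random

def train_tune_test_split(examples, seed=0):
    # Shuffle, then carve into 60% train / 20% tune / 20% test.
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = examples[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(0.6 * n)
    n_tune = int(0.2 * n)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_tune],
            shuffled[n_train + n_tune:])

train, tune, test = train_tune_test_split(list(range(100)))
print(len(train), len(tune), len(test))   # 60 20 20
```

The key property is that the three sets are disjoint: the model is fit on train, its settings are chosen on tune, and test is touched only once, to estimate performance on NEW examples.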
Concept Learning
Learning systems differ in how they represent concepts; all are built from training examples:
- Decision trees (e.g. ID3, C4.5, CART)
- Neural nets (trained via backpropagation)
- Rules (e.g. AQ, FOIL, Aleph)
- Weighted sums / SVMs, e.g. "If 5x1 + 9x2 – 3x3 > 12 Then +"
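The weighted-sum rule on the slide, "If 5x1 + 9x2 – 3x3 > 12 Then +", is a linear threshold unit; a direct sketch (the sample inputs are made up for illustration):

```python
# The slide's weighted-sum concept as a linear threshold classifier.
WEIGHTS = (5, 9, -3)
THRESHOLD = 12

def classify(x):
    score = sum(w * xi for w, xi in zip(WEIGHTS, x))
    return "+" if score > THRESHOLD else "-"

print(classify((1, 1, 0)))   # 5 + 9 - 0 = 14 > 12, so '+'
print(classify((1, 0, 1)))   # 5 + 0 - 3 = 2, not > 12, so '-'
```

The representations on the slide differ only in what a learned "concept" looks like (tree, network weights, rules, or a weighted sum); the underlying task of separating positives from negatives is the same.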