CS 540 - Fall 2016 (Shavlik©), Lecture 2


Today's Topics
- Chapter 2 in One Slide
- Chapter 18: Machine Learning (ML)
- Creating an ML Dataset
  - "Fixed-length feature vectors"
  - Relational/graph-based examples
- HW0 (due in one week)
- Getting 'Labeled' Training Examples
  - Train/Tune/Test Sets
  - N-fold Cross Validation
- Supervised Learning and Venn Diagrams
- Read Section 18.8.1 of the textbook and the Wikipedia articles linked to the class home page

9/8/16, CS 540 - Fall 2016 (Shavlik©), Lecture 2
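Of the topics listed above, N-fold cross validation is easy to sketch concretely. The following is a minimal illustration (the function name and round-robin fold assignment are my own choices, not from the lecture): the examples are partitioned into N disjoint folds, and each fold serves once as the test set while the remaining folds form the training set.

```python
def n_fold_splits(num_examples, n):
    """Yield (train_indices, test_indices) pairs for n-fold cross validation.

    Each of the n disjoint folds is used exactly once as the test set;
    the union of the other n-1 folds is the training set.
    """
    indices = list(range(num_examples))
    folds = [indices[i::n] for i in range(n)]  # round-robin fold assignment
    for i, test in enumerate(folds):
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        yield train, test
```

With 10 examples and 5 folds, each split tests on 2 examples and trains on the other 8, and every example appears in exactly one test fold.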

The Big AI Picture (Chapter 2)
AI is the study of 'agents' that exist in an environment and perceive, act, and learn:
1. Sense the environment
2. Reason
3. Act
4. Get feedback
5. Learn

What Do You Think Machine Learning Means?
Given: ?
Do: ?
Throughout the semester, think about what is missing in current ML compared to human learning.

What is Learning?
"Learning denotes changes in the system that … enable the system to do the same task … more effectively the next time." (Herbert Simon)
"Learning is making useful changes in our minds." (Marvin Minsky)
But remember: cheese and wine get better over time, yet they don't learn!

Learning from Labeled Examples
Given positive examples and negative examples, what is the category of a new example?
One candidate concept: a solid red circle in a (regular?) polygon.
But what about other concepts consistent with the same data?
- Figures on the left side of the page
- Figures drawn before 5pm 2/2/89
- etc.
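The candidate concept above ("a solid red circle in a polygon") can be written as a predicate over feature values. This is an illustrative sketch only; the feature encoding and names are my own, not from the slides:

```python
def matches_concept(fill, color, shape, container):
    """Hypothesized concept from the slide: a solid red circle
    inside a polygon. The feature encoding is illustrative."""
    return (fill == "solid" and color == "red"
            and shape == "circle" and container == "polygon")
```

The point of the slide is that many such predicates can be consistent with a small set of labeled examples, which is why choosing among them (and avoiding overly specific ones) is central to learning.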

Supervised Machine Learning: Task Overview
Real World → Feature "Design" (usually done by humans) → Feature Space → Classifier Construction (done by the learning algorithm) → Concepts / Classes / Decisions

Standard Approach for Constructing an ML Dataset for a Task
Step 1: Choose a feature space.
- We will use fixed-length feature vectors:
  - Choose N features.
  - Each feature i has Vi possible values.
  - Each example is represented by a vector of N feature values, i.e., a point in the feature space.
  - e.g., <red, 50, round> for the features color, weight, and shape
Step 2: Collect examples ("I/O" pairs).
The chosen features (color, weight, shape) define the space in which each example is a point.
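As a concrete sketch of the fixed-length representation (the variable names and sample values here are my own, apart from the <red, 50, round> example from the slide), each example is an N-tuple of feature values paired with a class label:

```python
# N = 3 features; each example is a fixed-length vector of feature
# values paired with an output label (an "I/O" pair).
FEATURE_NAMES = ("color", "weight", "shape")

dataset = [
    (("red", 50, "round"), True),    # the slide's example vector, labeled positive
    (("blue", 9, "square"), False),  # a made-up negative example
]

def as_dict(example):
    """Map a feature vector onto the feature names for readability."""
    features, label = example
    assert len(features) == len(FEATURE_NAMES)
    return dict(zip(FEATURE_NAMES, features)), label
```

Because every example has the same N slots in the same order, the whole dataset can be viewed as one table, which is exactly the "single 2D array" view on the next slide.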

Another View of a Standard ML Dataset: a Single Table (2D Array)

             Feature 1   Feature 2   ...   Feature N   Output Category
Example 1    0.0         small       ...   red         true
Example 2    9.3         medium      ...               false
Example 3    8.2                     ...   blue
...
Example M    5.7                     ...   green

Standard Feature Types for Representing Training Examples: a Source of "Domain Knowledge"
Keep your eye out for places where domain knowledge is (or should be) used in ML.
- Nominal (including Boolean): no ordering among the possible values.
  e.g., color ∈ {red, blue, green} (vs. color = 1000 Hertz)
- Linear (or ordered): the possible values of the feature are totally ordered.
  e.g., size ∈ {small, medium, large} (discrete); weight ∈ [0…500] (continuous)
- Hierarchical (not commonly used): the possible values are partially ordered in an ISA hierarchy.
  e.g., shape: "closed" divides into "polygon" and "continuous"; polygon into "triangle" and "square"; continuous into "circle" and "ellipse"
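The hierarchical case can be sketched as a parent-pointer table with an ISA test. The hierarchy itself is from the slide; the dictionary layout and function name are my own:

```python
# The slide's shape hierarchy: each value ISA its parent.
ISA_PARENT = {
    "polygon": "closed",
    "continuous": "closed",
    "triangle": "polygon",
    "square": "polygon",
    "circle": "continuous",
    "ellipse": "continuous",
}

def isa(value, ancestor):
    """True if `value` equals `ancestor` or lies below it in the ISA hierarchy."""
    while value is not None:
        if value == ancestor:
            return True
        value = ISA_PARENT.get(value)  # walk up; None at the root
    return False
```

Encoding the hierarchy explicitly is one place where domain knowledge enters the dataset design: a learner that knows triangle ISA polygon can generalize from triangles to polygons.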

Where We Are
- Have selected a 'concept' to learn
- Have chosen features to represent examples
- Have created at least 100 labeled examples
- Next: learn a 'model' that can predict the output for NEW examples

Concept Learning
Learning systems differ in how they represent concepts. From the same training examples, different algorithms produce different kinds of models:
- Backpropagation → Neural Net
- ID3, C4.5, CART → Decision Tree
- AQ, FOIL, Aleph → Rules (e.g., Φ ← X ∧ Y; Φ ← Z)
- SVMs → Weighted Sum (e.g., If 5x1 + 9x2 − 3x3 > 12 Then +)
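The weighted-sum representation is the easiest to make concrete. This sketch implements exactly the threshold rule on the slide (If 5x1 + 9x2 − 3x3 > 12 Then +); the function and variable names are my own:

```python
# The slide's weighted-sum concept: If 5*x1 + 9*x2 - 3*x3 > 12 Then +
WEIGHTS = (5, 9, -3)
THRESHOLD = 12

def classify(x):
    """Return '+' if the weighted sum of features exceeds the threshold,
    else '-'. `x` is a fixed-length feature vector (x1, x2, x3)."""
    score = sum(w * xi for w, xi in zip(WEIGHTS, x))
    return "+" if score > THRESHOLD else "-"
```

For instance, (1, 1, 0) scores 5 + 9 = 14 > 12 and is classified '+', while (0, 1, 1) scores 9 − 3 = 6 and is classified '-'. A learner of this representation chooses the weights and threshold from the training examples rather than having them supplied by hand.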