1 Machine Learning Spring 2010 Rong Jin

2 CSE847 Machine Learning
 Instructor: Rong Jin
 Office Hours: Tuesday 4:00pm-5:00pm, Thursday 4:00pm-5:00pm
 Textbook:
Machine Learning
The Elements of Statistical Learning
Pattern Recognition and Machine Learning
Many subjects are from papers
 Web site:

3 Requirements
 6~10 homework assignments
 One project for each person
Team: no more than 2 people
Topics: either assigned by the instructor or proposed by students themselves
Results: a project proposal, a progress report, and a final report
 Midterm exam & final exam

4 Goal
 Familiarize you with the state of the art in machine learning
Breadth: many different techniques
Depth: project
Hands-on experience
 Develop a machine learning way of thinking
Learn how to model real problems using machine learning techniques
Learn how to deal with real problems in practice

5 Course Outline
Theoretical Aspects
Information Theory
Optimization Theory
Probability Theory
Learning Theory
Practical Aspects
Supervised Learning Algorithms
Unsupervised Learning Algorithms
Important Practical Issues
Applications

6 Today's Topics
 Why machine learning?
 Example: learning to play backgammon
 General issues in machine learning

7 Why Machine Learning?
 Past: most computer programs were written mainly by hand
 Future: computers should be able to program themselves through interaction with their environment

8 Recent Trends
 Recent progress in algorithms and theory
 Growing flood of online data
 Increasing availability of computational power
 Growing industry

9 Three Niches for Machine Learning
 Data mining: using historical data to improve decisions
Medical records → medical knowledge
 Software applications that are difficult to program by hand
Autonomous driving
Image classification
 User modeling
Automatic recommender systems

10 Typical Data Mining Task
Given: 9,147 patient records, each describing a pregnancy and birth
Each record contains 215 features
Task: identify classes of future patients at high risk for Emergency Cesarean Section

11 Data Mining Results
One of 18 learned rules:
If no previous vaginal delivery,
and abnormal 2nd-trimester ultrasound,
and malpresentation at admission,
Then the probability of Emergency C-Section is 0.6

12 Credit Risk Analysis
Learned rules:
If Other-Delinquent-Account > 2
and Number-Delinquent-Billing-Cycles > 1
Then Profitable-Customer? = no

If Other-Delinquent-Account = 0
and (Income > $30K or Years-of-Credit > 3)
Then Profitable-Customer? = yes

13 Programs too Difficult to Program By Hand  ALVINN drives 70mph on highways

14 Programs too Difficult to Program By Hand  ALVINN drives 70mph on highways

15 Programs too Difficult to Program By Hand  Image Classification

16 Image Retrieval using Texts

17 Automatic Image Annotation  Automatically annotate images with textual words  Retrieve images with textual queries

18 Software that Models Users
History:
Description: A homicide detective and a fire marshal must stop a pair of murderers who commit videotaped crimes to become media darlings. Rating:
Description: Benjamin Martin is drawn into the American revolutionary war against his will when a brutal British commander kills his son. Rating:
Description: A biography of sports legend Muhammad Ali, from his early days to his days in the ring. Rating:
What to Recommend?
Description: A high-school boy is given the chance to write a story about an up-and-coming rock band as he accompanies it on their concert tour. Recommend: ?
Description: A young adventurer named Milo Thatch joins an intrepid group of explorers to find the mysterious lost continent of Atlantis. Recommend: ?
No Yes

19 Netflix Contest

20 Where is this Headed?
 Today: the tip of the iceberg
First-generation algorithms
Applied to well-formatted databases
Budding industry
 Opportunities for tomorrow
Multimedia databases
Robots
Autonomic computing
Bioinformatics
…

21 Relevant Disciplines  Artificial Intelligence  Statistics (particularly Bayesian Stat.)  Computational complexity theory  Information theory  Optimization theory  Philosophy  Psychology  …

22 Today's Topics
 Why machine learning?
 Example: learning to play backgammon
 General issues in machine learning

23 What is the Learning Problem?
 Learning = improving with experience at some task
Improve over task T
With respect to performance measure P
Based on experience E
 Example: learning to play backgammon
T: play backgammon
P: % of games won in the world tournament
E: opportunity to play against itself

24 Backgammon
 More than 10^20 states (boards)
 The best human players see only a small fraction of all boards during their lifetime
 Searching is hard because of the dice (branching factor > 100)

25 TD-Gammon by Tesauro (1995)
 Trained by playing against itself
 Now approximately equal to the best human players

26 Learn to Play Chess
 Task T: play chess
 Performance P: percent of games won in the world tournament
 Experience E: what experience?
How shall it be represented?
What exactly should be learned?
What specific algorithm should be used to learn it?

27 Choose a Target Function
 Goal: a policy π: b → m (choose a move m for board b)
 Choice of value function V: b, m → ℝ
B = boards, ℝ = real values

28 Choose a Target Function
 Goal: a policy π: b → m
 Choice of value function V: b, m → ℝ
or V: b → ℝ
B = boards, ℝ = real values

29 Value Function V(b): Example Definition
 If b is a final board that is won: V(b) = 1
 If b is a final board that is lost: V(b) = -1
 If b is not a final board: V(b) = E[V(b*)], where b* is the final board reached after playing optimally from b
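This definition is not directly computable, but it can be written down as a short sketch. In the Python below, is_final, is_won, is_lost, and optimal_final_board are hypothetical stand-ins for a game engine; the last one is exactly the piece we cannot implement in practice, which is why V(b) has to be approximated.

```python
def ideal_value(b):
    """Ideal target value V(b) from the definition above (not operational).

    is_final, is_won, is_lost and optimal_final_board are hypothetical
    game-engine helpers; optimal_final_board, the final board b* reached
    by optimal play (in expectation over the dice), is exactly what we
    cannot compute in practice, so V(b) must be approximated instead.
    """
    if is_final(b):
        if is_won(b):
            return 1
        if is_lost(b):
            return -1
        return 0  # drawn final board
    b_star = optimal_final_board(b)
    return ideal_value(b_star)
```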

30 Representation of Target Function V(b)
 The same value for each board: no learning
 Lookup table (one entry for each board): no generalization
 Summarize experience into:
Polynomials
Neural networks

31 Example: Linear Feature Representation
 Features:
p_b(b), p_w(b) = number of black (white) pieces on board b
u_b(b), u_w(b) = number of unprotected black (white) pieces
t_b(b), t_w(b) = number of black (white) pieces threatened by the opponent
 Linear function:
V(b) = w_0 p_b(b) + w_1 p_w(b) + w_2 u_b(b) + w_3 u_w(b) + w_4 t_b(b) + w_5 t_w(b)
 Learning: estimation of the parameters w_0, …, w_5
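A minimal Python sketch of this linear representation follows; the count_* feature extractors are hypothetical placeholders for real board-analysis code.

```python
def features(b):
    """Six board features for board b; the count_* helpers are hypothetical."""
    return [
        count_black_pieces(b),       # p_b(b)
        count_white_pieces(b),       # p_w(b)
        count_unprotected_black(b),  # u_b(b)
        count_unprotected_white(b),  # u_w(b)
        count_threatened_black(b),   # t_b(b)
        count_threatened_white(b),   # t_w(b)
    ]


def linear_value(w, b):
    """V(b) = w_0*p_b(b) + w_1*p_w(b) + ... + w_5*t_w(b)."""
    return sum(w_i * f_i for w_i, f_i in zip(w, features(b)))
```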

32 Tuning Weights: Gradient Descent Optimization
 Given: board b, predicted value V(b), desired value V*(b)
 Squared error on b: error(b) = (V*(b) - V(b))^2
 For each board feature f_i, update: w_i ← w_i + c · (V*(b) - V(b)) · f_i(b)
 This stochastically minimizes Σ_b (V*(b) - V(b))^2
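A sketch of this LMS-style update, building on the hypothetical features() and linear_value() from the previous sketch; c is the learning rate.

```python
def lms_update(w, b, target, c=0.01):
    """One stochastic (LMS-style) gradient step on a single board b.

    target plays the role of V*(b); repeating this update over many boards
    stochastically minimizes the summed squared error (V*(b) - V(b))**2.
    Uses the hypothetical features() and linear_value() defined above.
    """
    error = target - linear_value(w, b)  # V*(b) - V(b)
    return [w_i + c * error * f_i for w_i, f_i in zip(w, features(b))]
```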

33 Obtain Boards
 Random boards
 Boards from games played by beginners
 Boards from games played by professionals

34 Obtain Target Values
 A person provides the value V(b)
 Play until termination. If the outcome is
Win: V(b) ← 1 for all boards in the game
Loss: V(b) ← -1 for all boards in the game
Draw: V(b) ← 0 for all boards in the game
 Play one move: b → b', and set V(b) ← V(b')
 Play n moves: b → b' → … → b^(n), and set V(b) ← V(b^(n))
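The bootstrapped targets in the last two bullets can be sketched as follows; play_moves is a hypothetical game-engine helper, and linear_value is the approximator from the earlier sketch.

```python
def bootstrap_target(w, b, n=1):
    """Estimate a training target V*(b) by looking n moves ahead.

    play_moves(b, n) is a hypothetical helper returning the board reached
    after n further moves of (self-)play: b -> b' -> ... -> b^(n).
    With n = 1 this realizes the rule V(b) <- V(b'); playing to termination
    instead yields the +1 / -1 / 0 outcome targets.
    """
    b_n = play_moves(b, n)
    return linear_value(w, b_n)  # current estimate at the later board
```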

35 A General Framework
Machine Learning = Statistics (mathematical modeling) + Optimization (finding the optimal parameters)

36 Today's Topics
 Why machine learning?
 Example: learning to play backgammon
 General issues in machine learning

37 Important Issues in Machine Learning
 Obtaining experience
How to obtain experience? → supervised learning vs. unsupervised learning
How many examples are enough? → PAC learning theory
 Learning algorithms
Which algorithms can approximate the target function well, and when?
How does the complexity of a learning algorithm impact learning accuracy?
Is the target function learnable?
 Representing inputs
How to represent the inputs?
How to remove irrelevant information from the input representation?
How to reduce redundancy in the input representation?