COMP 328: Midterm Review Spring 2010 Nevin L. Zhang Department of Computer Science & Engineering The Hong Kong University of Science & Technology


Can be used as a cheat sheet.

Overview (Page 2)
- Algorithms for supervised learning
  - Decision trees
  - Naïve Bayes classifiers
  - Neural networks
  - Instance-based learning
  - Support vector machines
- General issues regarding supervised learning
  - Classification error and confidence intervals
  - Bias-variance tradeoff
  - PAC learning theory

Supervised Learning (Page 3)

Decision Trees (Page 4)

Decision Trees (Page 5)

Reduced-Error Pruning (Page 6)

Decision Trees (Page 7)
- Issues with attributes
  - Continuous attributes
  - Attributes with many values: use GainRatio instead of Gain
  - Missing values
- Tree construction is a search process
  - Can get stuck in a local minimum
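Since the slide only names GainRatio, here is a minimal sketch of how it is computed for a discrete attribute. This is not from the course materials; `entropy` and `gain_ratio` are hypothetical helper names.

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    counts = [labels.count(v) for v in set(labels)]
    return -sum((c / n) * math.log2(c / n) for c in counts)

def gain_ratio(values, labels):
    """Information gain of a discrete attribute divided by its split information.

    values[i] is the attribute value of example i, labels[i] its class label.
    """
    n = len(labels)
    gain = entropy(labels)
    split_info = 0.0
    for v in set(values):
        subset = [labels[i] for i in range(n) if values[i] == v]
        p = len(subset) / n
        gain -= p * entropy(subset)          # subtract weighted child entropy
        split_info -= p * math.log2(p)       # entropy of the split itself
    return gain / split_info if split_info > 0 else 0.0
```

An attribute with many distinct values gets a large SplitInfo denominator, which is exactly why GainRatio penalizes such splits relative to plain Gain.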

Naïve Bayes Classifier (Page 8)
- Can classify using this rule, but the full joint distribution is too expensive to estimate
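The rule the slide refers to is not in the transcript; in standard notation (reconstructed, not copied from the slide) the MAP rule and the naïve Bayes simplification are:

```latex
v_{MAP} = \operatorname*{argmax}_{v \in V} P(a_1, \dots, a_n \mid v)\, P(v)
\qquad\Longrightarrow\qquad
v_{NB} = \operatorname*{argmax}_{v \in V} P(v) \prod_{i} P(a_i \mid v)
```

The independence assumption replaces the exponentially large joint $P(a_1,\dots,a_n \mid v)$ with a product of per-attribute conditionals, which is what makes estimation tractable.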

Naïve Bayes Classifier (Page 9)

Learning Naïve Bayes Classifiers (Page 10)
- Laplace smoothing
- Handling continuous attributes
- When the independence assumption does not hold, evidence is double-counted
- Generalization: Bayesian networks
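As an illustration of Laplace smoothing in this setting, a minimal sketch of training and classification. The add-one smoothing choice and the `train_naive_bayes`/`classify` names are mine, not from the slides.

```python
from collections import Counter, defaultdict

def train_naive_bayes(examples, labels, domains):
    """Estimate P(v) and P(a_i = x | v) with add-one (Laplace) smoothing.

    examples[j] is a tuple of attribute values; domains[i] is the set of
    possible values of attribute i.
    """
    n = len(labels)
    classes = set(labels)
    prior = {v: labels.count(v) / n for v in classes}
    cond = defaultdict(dict)  # cond[(i, v)][x] = P(attribute i = x | class v)
    for i, domain in enumerate(domains):
        for v in classes:
            col = [examples[j][i] for j in range(n) if labels[j] == v]
            counts = Counter(col)
            for x in domain:
                # +1 in the numerator, +|domain| in the denominator
                cond[(i, v)][x] = (counts[x] + 1) / (len(col) + len(domain))
    return prior, cond

def classify(x, prior, cond, classes):
    """Pick argmax_v P(v) * prod_i P(x_i | v)."""
    def score(v):
        s = prior[v]
        for i, xi in enumerate(x):
            s *= cond[(i, v)][xi]
        return s
    return max(classes, key=score)
```

Smoothing keeps unseen attribute values from forcing an entire product to zero.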

Neural Networks (Page 11)
- For classification and regression

Neural Networks (Page 12)
- Activation functions
  - Step, sign
  - Sigmoid, tanh (hyperbolic tangent)
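These activation functions can be written out directly; a small reference sketch (the function names are mine):

```python
import math

def step(x):     # threshold unit: outputs 0 or 1
    return 1.0 if x >= 0 else 0.0

def sign(x):     # sign unit: outputs -1 or +1
    return 1.0 if x >= 0 else -1.0

def sigmoid(x):  # smooth, differentiable squashing to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):     # hyperbolic tangent, squashes to (-1, 1)
    return math.tanh(x)
```

The smooth ones matter for backpropagation: for example, sigmoid has the convenient derivative sigmoid(x) * (1 - sigmoid(x)).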

Neural Networks: Properties (Page 13)
- Perceptrons are linear classifiers
- A two-layer network with enough perceptron units can represent every Boolean function
- One hidden layer with enough sigmoid units can approximate any continuous function well

Neural Networks (Page 14)
- Perceptron training converges only when the data are linearly separable
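A minimal perceptron-rule sketch illustrating this point (the training loop and the `train_perceptron` name are assumptions, not from the slides). On linearly separable data such as Boolean AND it stops updating once every example is classified correctly; on XOR it would never stop.

```python
def train_perceptron(X, y, eta=1.0, max_epochs=100):
    """Perceptron training rule: w <- w + eta * (t - o) * x, with bias w[0].

    Stops as soon as an epoch makes no updates, which happens only if the
    data are linearly separable; otherwise it runs to max_epochs.
    """
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(max_epochs):
        updated = False
        for x, t in zip(X, y):
            net = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            o = 1.0 if net >= 0 else 0.0   # thresholded (step) output
            if o != t:
                w[0] += eta * (t - o)
                for i, xi in enumerate(x):
                    w[i + 1] += eta * (t - o) * xi
                updated = True
        if not updated:
            break
    return w
```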

Neural Networks (Page 15)
- Adaline learning: the delta rule
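For reference, the delta rule trains a linear (Adaline) unit $o = \vec{w} \cdot \vec{x}$ by gradient descent on squared error; a standard statement (reconstructed, since the slide body is not in the transcript):

```latex
E(\vec{w}) = \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2,
\qquad
\Delta w_i = \eta \sum_{d \in D} (t_d - o_d)\, x_{id}
```

Unlike the perceptron rule, this minimizes squared error and converges (to a least-squares fit) even when the data are not linearly separable.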

Neural Networks (Page 16)

Instance-Based Learning (Page 17)
- Lazy learning
  - k-NN
  - Distance-weighted k-NN (kernel regression)
  - Locally weighted regression
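A compact sketch of k-NN classification with optional distance weighting (the `knn_predict` helper is hypothetical; weighting votes by 1/d² is one common choice):

```python
import math

def knn_predict(X, y, query, k=3, weighted=False):
    """Classify `query` by majority vote among its k nearest neighbors.

    With weighted=True each neighbor's vote is weighted by 1/d^2
    (distance-weighted k-NN); an exact match decides immediately.
    """
    neighbors = sorted((math.dist(x, query), t) for x, t in zip(X, y))[:k]
    votes = {}
    for d, t in neighbors:
        if weighted and d == 0.0:
            return t  # query coincides with a training point
        votes[t] = votes.get(t, 0.0) + (1.0 / (d * d) if weighted else 1.0)
    return max(votes, key=votes.get)
```

"Lazy" here means all the work happens at query time: there is no training step at all, just stored examples.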

Support Vector Machines (Page 18)

SVM (Page 19)

SVM (Page 20)

SVM (Page 21)

SVM (Page 22)
- Data not linearly separable
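The standard way to handle non-separable data is the soft-margin formulation, with slack variables $\xi_i$ and a penalty parameter $C$ (a standard statement, not copied from the slide):

```latex
\min_{\vec{w},\, b,\, \xi} \;\; \frac{1}{2}\lVert\vec{w}\rVert^2 + C \sum_{i} \xi_i
\quad \text{s.t.} \quad y_i(\vec{w}\cdot\vec{x}_i + b) \ge 1 - \xi_i, \;\; \xi_i \ge 0
```

$C$ trades margin width against training errors: a large $C$ punishes slack heavily and fits the training data more closely.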

SVM (Page 23)

Nonlinear SVM (Page 24)

Impact of σ and C (Page 25)
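The σ here is the width of the Gaussian (RBF) kernel. For reference, the kernel itself; a minimal sketch assuming Euclidean inputs (the `rbf_kernel` name is mine):

```python
import math

def rbf_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel: K(x, z) = exp(-||x - z||^2 / (2 * sigma^2))."""
    sq = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-sq / (2.0 * sigma ** 2))
```

A small σ makes the kernel very local (each support vector influences only a tiny neighborhood, risking overfitting); a large σ makes the decision surface smoother. C, as in the soft-margin objective, controls how heavily training errors are penalized.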

Classifier Evaluation (Page 26)
- Relationship between
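One standard relationship in this area (e.g., in Mitchell's treatment) is the approximate confidence interval for the true error given the sample error. A sketch, not from the slides (the function name is mine; it uses the normal approximation, reasonable for n ≥ 30):

```python
import math

def error_confidence_interval(errors, n, z=1.96):
    """Approximate confidence interval for the true error of a hypothesis.

    errorS(h) = errors / n; the interval errorS +/- z * sqrt(errorS(1-errorS)/n)
    holds with ~95% confidence for z = 1.96 (normal approximation).
    """
    e = errors / n
    half = z * math.sqrt(e * (1.0 - e) / n)
    return e - half, e + half
```

For example, 12 errors on 40 test examples gives a sample error of 0.30 with a 95% interval of roughly ±0.14, a reminder that small test sets give wide intervals.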

Algorithm Evaluation / Model Selection (Page 27)
- Which learning algorithm to use?
- Given an algorithm, which model to use? (e.g., how many hidden units?)

Algorithm Evaluation / Model Selection (Page 28)

Bias-Variance Decomposition (Page 29)
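The decomposition itself is not in the transcript; in its standard squared-error form for regression (reconstructed, with target function $f$, noise variance $\sigma^2$, and $\hat{h}$ the hypothesis learned from a random training set $D$):

```latex
E\big[(y - \hat{h}(x))^2\big]
= \underbrace{\big(E_D[\hat{h}(x)] - f(x)\big)^2}_{\text{bias}^2}
+ \underbrace{E_D\big[(\hat{h}(x) - E_D[\hat{h}(x)])^2\big]}_{\text{variance}}
+ \underbrace{\sigma^2}_{\text{noise}}
```

Flexible models lower bias but raise variance; simple models do the opposite, which is the tradeoff on the next slide.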

Bias-Variance Tradeoff (Page 30)
- Applies to classification problems as well

PAC Learning Theory (Page 31)
- Probably approximately correct (PAC)
- Relationship between

PAC Learning Theory (Page 32)
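The usual statement, for a consistent learner over a finite hypothesis space $H$ (a standard bound, reconstructed rather than copied from the slide): with probability at least $1-\delta$, every hypothesis consistent with the training data has true error at most $\epsilon$, provided the number of examples $m$ satisfies

```latex
m \;\ge\; \frac{1}{\epsilon}\left(\ln\lvert H\rvert + \ln\frac{1}{\delta}\right)
```

Note the sample size grows only logarithmically in $\lvert H \rvert$ and $1/\delta$, but linearly in $1/\epsilon$.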

VC Dimension (Page 33)

Sample Complexity (Page 34)
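For infinite hypothesis spaces, sample complexity is stated in terms of VC dimension instead of $\ln\lvert H\rvert$. One standard bound (as given in Mitchell's Machine Learning text; stated here from memory, so treat it as a reference rather than the slide's exact content):

```latex
m \;\ge\; \frac{1}{\epsilon}\left(4\log_2\frac{2}{\delta} + 8\,VC(H)\,\log_2\frac{13}{\epsilon}\right)
```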