
CSci 8980: Data Mining (Fall 2002)
Vipin Kumar
Army High Performance Computing Research Center
Department of Computer Science, University of Minnesota

Instance-Based Classifiers
- Store the training instances
- Use the stored instances to predict the class label of unseen cases

Nearest-Neighbor Classifiers
- For data sets with continuous attributes
- Requires three things:
  - the set of stored instances
  - a distance metric to compute the distance between instances
  - the value of k, the number of nearest neighbors to retrieve
- For classification:
  - retrieve the k nearest neighbors
  - use the class labels of the nearest neighbors to determine the class label of the unseen instance (e.g., by taking a majority vote)
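A minimal sketch of this procedure in Python, assuming numeric feature vectors and a small hypothetical training set (the helper names and data are ours, not from the slides):

    import math
    from collections import Counter

    def euclidean(p, q):
        # Euclidean distance between two numeric feature vectors
        return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

    def knn_classify(train, x, k=3):
        # train: list of (feature_vector, class_label) pairs
        # 1. Retrieve the k stored instances closest to x
        neighbors = sorted(train, key=lambda rec: euclidean(rec[0], x))[:k]
        # 2. Take the majority vote of their class labels
        votes = Counter(label for _, label in neighbors)
        return votes.most_common(1)[0][0]

    # Hypothetical two-attribute training set
    train = [((1.0, 1.0), 'A'), ((1.2, 0.8), 'A'),
             ((5.0, 5.0), 'B'), ((5.5, 4.5), 'B')]
    print(knn_classify(train, (1.1, 0.9), k=3))  # -> 'A'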

Definition of Nearest Neighbor
The k nearest neighbors of an instance x are the k data points that have the smallest distances to x.

1-Nearest Neighbor: the decision boundary of a 1-nearest-neighbor classifier is given by the Voronoi diagram of the training instances.

Nearest Neighbor Classification
- Compute the distance between two points, e.g., Euclidean distance
- Determine the class label from the k nearest neighbors:
  - majority vote: take the majority vote of class labels among the k nearest neighbors
  - weighted vote: weigh each vote according to distance, e.g., with weight factor w = 1/d^2
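A sketch of the distance-weighted variant, reusing the euclidean helper from the sketch above; each neighbor votes with weight w = 1/d^2, and a small epsilon guards against division by zero when a neighbor coincides with x:

    def knn_classify_weighted(train, x, k=3, eps=1e-9):
        # Retrieve the k nearest neighbors, then weigh each vote by 1/d^2
        neighbors = sorted(train, key=lambda rec: euclidean(rec[0], x))[:k]
        weights = {}
        for features, label in neighbors:
            d = euclidean(features, x)
            weights[label] = weights.get(label, 0.0) + 1.0 / (d ** 2 + eps)
        return max(weights, key=weights.get)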

Nearest Neighbor Classification…
- Choosing the value of k:
  - if k is too small, the classifier is sensitive to noise points
  - if k is too large:
    - it can be computationally expensive
    - the neighborhood may include points from other classes

Nearest Neighbor Classification…
- Problems with the Euclidean measure:
  - high-dimensional data: the curse of dimensionality
  - can produce counter-intuitive results
  - one remedy is normalization, e.g., normalizing the vectors to unit length

Nearest Neighbor Classification…
- Other issues:
  - k-NN classifiers are lazy learners:
    - they do not build models explicitly
    - unlike eager learners such as decision tree induction and rule-based systems
    - classifying an unseen instance is relatively expensive
  - Scaling of attributes matters (see the sketch after this list):
    - Person: (height in meters, weight in lbs, Class)
    - height may vary from 1.5 m to 1.85 m
    - weight may vary from 90 lbs to 250 lbs
    - the distance measure could be dominated by differences in weight
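One common fix is to rescale each attribute to a comparable range before computing distances. A minimal sketch, assuming min-max scaling to [0, 1] (the helper name and data are ours):

    def min_max_scale(rows):
        # Rescale each attribute (column) of a list of numeric rows to [0, 1]
        cols = list(zip(*rows))
        lows = [min(c) for c in cols]
        highs = [max(c) for c in cols]
        return [tuple((v - lo) / (hi - lo) if hi > lo else 0.0
                      for v, lo, hi in zip(row, lows, highs))
                for row in rows]

    # Height in meters vs. weight in lbs: unscaled, weight dominates the distance
    people = [(1.50, 90.0), (1.85, 250.0), (1.70, 160.0)]
    print(min_max_scale(people))  # every attribute now lies in [0, 1]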

Bayes Classifier
- A probabilistic (Bayesian) framework for solving the classification problem
- Conditional probability: P(C|A) = P(A, C) / P(A) and P(A|C) = P(A, C) / P(C)
- Bayes theorem: P(C|A) = P(A|C) P(C) / P(A)

Example of Bayes Theorem
- Given:
  - a doctor knows that meningitis causes a stiff neck 50% of the time
  - the prior probability of any patient having meningitis is 1/50,000
  - the prior probability of any patient having a stiff neck is 1/20
- If a patient has a stiff neck, what is the probability that he/she has meningitis?
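Working through the slide's numbers with Bayes theorem (M = meningitis, S = stiff neck):

    P(M|S) = P(S|M) P(M) / P(S) = (0.5 × 1/50,000) / (1/20) = 0.0002

So even given a stiff neck, meningitis remains very unlikely: the tiny prior dominates the posterior.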

Bayesian Classifiers
- Consider each attribute and the class label as random variables
- Given a record with attributes (A1, A2, …, An):
  - the goal is to predict class C
  - specifically, we want to find the value of C that maximizes P(C | A1, A2, …, An)
- Can we estimate P(C | A1, A2, …, An) directly from data?

Bayesian Classifiers
- Approach:
  - compute the posterior probability P(C | A1, A2, …, An) for all values of C using Bayes theorem:
    P(C | A1, A2, …, An) = P(A1, A2, …, An | C) P(C) / P(A1, A2, …, An)
  - choose the value of C that maximizes P(C | A1, A2, …, An)
  - since P(A1, A2, …, An) is the same for every class, this is equivalent to choosing the value of C that maximizes P(A1, A2, …, An | C) P(C)
- How do we estimate P(A1, A2, …, An | C)?

Naïve Bayes Classifier
- Assume independence among the attributes Ai when the class is given:
  - P(A1, A2, …, An | Cj) = P(A1 | Cj) P(A2 | Cj) … P(An | Cj)
  - we can then estimate P(Ai | Cj) for all Ai and Cj
  - a new point is classified as Cj if P(Cj) ∏i P(Ai | Cj) is maximal

How to Estimate Probabilities from Data?
- Class priors: P(C) = Nc / N
  - e.g., in the 10-record example training set, P(No) = 7/10
- For discrete attributes: P(Ai | Ck) = |Aik| / Nc
  - where |Aik| is the number of instances that have attribute value Ai and belong to class Ck
  - examples: P(Married | No) = 4/7, P(Refund=Yes | Yes) = 0
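A sketch of this counting-based estimation over a list of records (the three records below are made up for illustration; they are not the training table from the slide):

    from collections import Counter, defaultdict

    def estimate_probabilities(records, class_key='Class'):
        # Estimate P(C) and P(attribute=value | C) by relative frequencies.
        # records: list of dicts mapping attribute names to discrete values.
        n = len(records)
        class_counts = Counter(r[class_key] for r in records)
        priors = {c: cnt / n for c, cnt in class_counts.items()}
        cond_counts = defaultdict(int)
        for r in records:
            for attr, value in r.items():
                if attr != class_key:
                    cond_counts[(attr, value, r[class_key])] += 1
        conditionals = {key: cnt / class_counts[key[2]]
                        for key, cnt in cond_counts.items()}
        return priors, conditionals

    data = [{'Refund': 'No', 'Married': 'Yes', 'Class': 'No'},
            {'Refund': 'Yes', 'Married': 'No', 'Class': 'No'},
            {'Refund': 'No', 'Married': 'No', 'Class': 'Yes'}]
    priors, conds = estimate_probabilities(data)
    print(priors['No'])                   # 2/3
    print(conds[('Refund', 'No', 'No')])  # 1/2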

How to Estimate Probabilities from Data?
- For continuous attributes:
  - Discretize the range into bins
    - one ordinal attribute per bin
    - this violates the independence assumption
  - Two-way split: (A < v) or (A >= v)
    - choose only one of the two splits as the new attribute
  - Assume the attribute obeys a certain probability distribution
    - typically, a normal distribution is assumed
    - use the data to estimate the parameters of the distribution (e.g., mean and standard deviation)
    - once the probability distribution is known, it can be used to estimate the conditional probability P(Ai | c)

How to Estimate Probabilities from Data?
- Normal distribution:
  P(Ai | cj) = (1 / sqrt(2π σij^2)) exp(-(Ai - μij)^2 / (2 σij^2))
  - one distribution for each (Ai, cj) pair
- For (Income, Class=No):
  - sample mean = 110
  - sample variance = 2975
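Plugging the test value Income = 120 into this density with the estimated parameters (a worked step whose result is used on the next slide):

    P(Income=120 | No) = (1 / sqrt(2π × 2975)) × exp(-(120 - 110)^2 / (2 × 2975)) ≈ 0.0072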

Example of Naïve Bayes Classifier
Given a test instance X = (Refund=No, Married, Income=120K):

- P(X | Class=No) = P(Refund=No | Class=No) × P(Married | Class=No) × P(Income=120K | Class=No)
  = 4/7 × 4/7 × 0.0072 = 0.0024
- P(X | Class=Yes) = P(Refund=No | Class=Yes) × P(Married | Class=Yes) × P(Income=120K | Class=Yes)
  = 1 × 0 × 1.2 × 10^-9 = 0

Since P(X|No) P(No) > P(X|Yes) P(Yes), we have P(No|X) > P(Yes|X), so Class = No.

Naïve Bayes Classifier
- If one of the conditional probabilities is zero, then the entire expression becomes zero
  - independence means the class-conditional probability is a product of probabilities
- Laplace correction (a special case of the m-estimate):
  - original estimate: P(Ai | C) = Nic / Nc
  - Laplace: P(Ai | C) = (Nic + 1) / (Nc + c), where c is the number of values of attribute Ai
  - m-estimate: P(Ai | C) = (Nic + m p) / (Nc + m), where p is a prior estimate of the probability and m is a weight
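For example, using the counts from the earlier estimation slide (10 records, 3 of class Yes, none of which has Refund=Yes, and c = 2 possible values for Refund), the Laplace-corrected estimate is P(Refund=Yes | Yes) = (0 + 1) / (3 + 2) = 1/5 instead of 0, so a single unseen attribute value no longer forces the whole product to zero.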

Example of Naïve Bayes Classifier
- A: attributes, M: mammals, N: non-mammals (the supporting training table from the slide is not reproduced here)
- Since P(A|M) P(M) > P(A|N) P(N), the instance is classified as a mammal

Naïve Bayes (Summary)
- Robust to isolated noise points
- Handles missing values by ignoring the instance during probability estimate calculations
- Robust to irrelevant attributes
- The independence assumption may not hold for some attributes
  - use other techniques such as Bayesian Belief Networks (BBN)

Model Evaluation
- Evaluating the performance of a model (in terms of its predictive ability)
- Confusion matrix:

                            PREDICTED CLASS
                        Class=Yes    Class=No
  ACTUAL   Class=Yes        a            b
  CLASS    Class=No         c            d

  a: TP (true positive)
  b: FN (false negative)
  c: FP (false positive)
  d: TN (true negative)

Model Evaluation
- What measures can we use to estimate performance from the confusion-matrix counts a, b, c, d above?
- The most widely used metric is accuracy:
  Accuracy = (a + d) / (a + b + c + d) = (TP + TN) / (TP + TN + FP + FN)

Cost Matrix

                            PREDICTED CLASS
           C(i|j)       Class=Yes     Class=No
  ACTUAL   Class=Yes    C(Yes|Yes)    C(No|Yes)
  CLASS    Class=No     C(Yes|No)     C(No|No)

C(i|j): the cost of misclassifying a class j example as class i

Cost-Sensitive Measures
- Precision: p = a / (a + c)
- Recall: r = a / (a + b)
- F-measure: F = 2rp / (r + p) = 2a / (2a + b + c)
- Precision is biased towards C(Yes|Yes) and C(Yes|No)
- Recall is biased towards C(Yes|Yes) and C(No|Yes)
- F-measure is biased towards all except C(No|No)
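A minimal sketch computing these measures from the confusion-matrix counts a, b, c, d defined above (the function name and test counts are ours):

    def evaluation_metrics(a, b, c, d):
        # a=TP, b=FN, c=FP, d=TN, as in the confusion matrix above
        accuracy = (a + d) / (a + b + c + d)
        precision = a / (a + c) if a + c else 0.0   # p = a / (a + c)
        recall = a / (a + b) if a + b else 0.0      # r = a / (a + b)
        f_measure = (2 * recall * precision / (recall + precision)
                     if recall + precision else 0.0)
        return accuracy, precision, recall, f_measure

    print(evaluation_metrics(a=40, b=10, c=5, d=45))
    # (0.85, 0.888..., 0.8, 0.842...)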

ROC (Receiver Operating Characteristic)
- Developed in the 1950s in signal detection theory to analyze noisy signals
  - characterizes the trade-off between positive hits and false alarms
- An ROC curve plots the TP rate (on the y-axis) against the FP rate (on the x-axis)
- The performance of each classifier is represented as a point on the ROC curve
  - changing the threshold of the algorithm, the sample distribution, or the cost matrix changes the location of the point

ROC Curve
- Example: a 1-dimensional data set containing 2 classes (positive and negative); any point located at x > t is classified as positive
- At threshold t: TP = 0.5, FN = 0.5, FP = 0.12, TN = 0.88

ROC Curve
Points (TP rate, FP rate):
- (0, 0): everything is declared negative
- (1, 1): everything is declared positive
- (1, 0): the ideal classifier
- Diagonal line:
  - random guessing
  - below the diagonal line, the prediction is the opposite of the true class
- Area under the ROC curve (AUC):
  - another metric for comparing classifiers
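A sketch of how the ROC curve and the area under it can be computed for a scoring classifier, by sweeping the decision threshold over the predicted scores; the scores and labels below are made up:

    def roc_points(scores, labels):
        # Return (FP rate, TP rate) points; labels: 1 = positive, 0 = negative
        pos = sum(labels)
        neg = len(labels) - pos
        # Sort by descending score; lowering the threshold admits one point at a time
        ranked = sorted(zip(scores, labels), reverse=True)
        tp = fp = 0
        points = [(0.0, 0.0)]
        for _, label in ranked:
            if label == 1:
                tp += 1
            else:
                fp += 1
            points.append((fp / neg, tp / pos))
        return points

    def auc(points):
        # Area under the ROC curve via the trapezoidal rule
        return sum((x2 - x1) * (y1 + y2) / 2
                   for (x1, y1), (x2, y2) in zip(points, points[1:]))

    pts = roc_points([0.9, 0.8, 0.6, 0.3], [1, 1, 0, 1])
    print(auc(pts))  # 2/3 for this toy example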