Ch. Eick: Introduction to Hierarchical Clustering and DBSCAN

Remaining Lectures in 2009
1. Advanced Clustering and Outlier Detection
2. Advanced Classification and Prediction
3. Top Ten Data Mining Algorithms (short)
4. Course Summary (short)
5. Assignment 5 Student Presentations

Clustering Part 2: Advanced Clustering and Outlier Detection
1. Hierarchical Clustering
2. More on Density-based Clustering: DENCLUE
3. [EM Top10-DM-Alg]
4. Cluster Evaluation Measures
5. Outlier Detection

More on Clustering
1. Hierarchical Clustering: to be discussed on Nov. 11
2. DBSCAN: will be used in the programming project

Hierarchical Clustering
- Produces a set of nested clusters organized as a hierarchical tree
- Can be visualized as a dendrogram: a tree-like diagram that records the sequence of merges or splits

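As a quick illustration of the dendrogram idea, the sketch below builds a hierarchy with SciPy and draws the tree; the toy data X and the choice of single (MIN) linkage are my assumptions, not part of the lecture.

import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram

X = np.random.rand(15, 2)          # hypothetical 2D data set
Z = linkage(X, method='single')    # nested clusters, merged bottom-up
dendrogram(Z)                      # tree-like diagram of the merge sequence
plt.show()
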
Agglomerative Clustering Algorithm
- The more popular hierarchical clustering technique
- The basic algorithm is straightforward:
  1. Compute the proximity matrix
  2. Let each data point be a cluster
  3. Repeat
  4.   Merge the two closest clusters
  5.   Update the proximity matrix
  6. Until only a single cluster remains
- The key operation is the computation of the proximity of two clusters
  - Different approaches to defining the distance between clusters distinguish the different algorithms (a sketch of the basic loop follows below)

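For concreteness, here is a from-scratch sketch of steps 1-6 using MIN (single link) as the cluster proximity; the function name, the Euclidean distance, and the brute-force search are illustrative assumptions, not the lecture's reference implementation.

import numpy as np

def agglomerative_min_linkage(X):
    # Step 1: compute the proximity (distance) matrix.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Step 2: let each data point be a cluster.
    clusters = [[i] for i in range(len(X))]
    merges = []
    # Steps 3 and 6: repeat until only a single cluster remains.
    while len(clusters) > 1:
        # Step 4: merge the two closest clusters (MIN linkage: cluster
        # distance = smallest pointwise distance).
        best = (0, 1, np.inf)
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(D[i, j] for i in clusters[a] for j in clusters[b])
                if d < best[2]:
                    best = (a, b, d)
        a, b, d = best
        merges.append((list(clusters[a]), list(clusters[b]), d))
        # Step 5: update the proximity information (done implicitly here by
        # recomputing pointwise minima from D on the next iteration).
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return merges   # the recorded merge sequence defines the dendrogram
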
Starting Situation
- Start with clusters of individual points and a proximity matrix
[Figure: individual points p1..p5 and their proximity matrix]

Intermediate Situation
- After some merging steps, we have some clusters
[Figure: clusters C1..C5 and their proximity matrix]

Intermediate Situation
- We want to merge the two closest clusters (C2 and C5) and update the proximity matrix.
[Figure: clusters C1..C5 and their proximity matrix]

After Merging
- The question is: "How do we update the proximity matrix?"
[Figure: clusters C1, C3, C4, and the merged cluster C2 U C5; the proximity-matrix entries involving C2 U C5 are marked "?"]

How to Define Inter-Cluster Similarity
- MIN
- MAX
- Group Average
- Distance Between Centroids
- Other methods driven by an objective function
  - Ward's Method uses squared error
[Figure: points p1..p5, their proximity matrix, and the question "Similarity?" between two clusters]

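As an aside, the inter-cluster similarity definitions listed above map directly onto SciPy's linkage methods; the sketch below shows that mapping, with the toy data X as an assumption.

import numpy as np
from scipy.cluster.hierarchy import linkage

X = np.random.rand(10, 2)
Z_min      = linkage(X, method='single')    # MIN (single link)
Z_max      = linkage(X, method='complete')  # MAX (complete link)
Z_groupavg = linkage(X, method='average')   # Group Average
Z_centroid = linkage(X, method='centroid')  # Distance Between Centroids
Z_ward     = linkage(X, method='ward')      # Ward's Method (squared error)
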
Cluster Similarity: Group Average
- The proximity of two clusters is the average of the pairwise proximities between points in the two clusters.
- Need to use average connectivity for scalability, since total proximity favors large clusters.

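Written out, the group-average proximity described above is (standard definition, restated here rather than copied from the deck):

\text{proximity}(C_i, C_j) = \frac{1}{|C_i|\,|C_j|} \sum_{p \in C_i} \sum_{q \in C_j} \text{proximity}(p, q)
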
2009 Teaching of Clustering
Clustering Part 1: Basics (September/October)
1. What is Clustering?
2. Partitioning/Representative-based Clustering: K-means, K-medoids
3. Density-Based Clustering, centering on DBSCAN
4. Region Discovery
5. Grid-based Clustering
6. Similarity Assessment
Clustering Part 2: Advanced Topics (November)

DBSCAN (http://www2.cs.uh.edu/~ceick/7363/Papers/dbscan.pdf)
- DBSCAN is a density-based algorithm.
  - Density = number of points within a specified radius (Eps)
  - Input parameters: MinPts and Eps
  - A point is a core point if it has more than a specified number of points (MinPts) within Eps; these are points in the interior of a cluster
  - A border point has fewer than MinPts points within Eps, but is in the neighborhood of a core point
  - A noise point is any point that is neither a core point nor a border point (a classification sketch follows below)

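The three definitions above translate into a few lines of code. The sketch below is a naive illustration; the toy data, the Eps and MinPts values, and the convention that a point's Eps-neighborhood includes the point itself (core points having at least MinPts neighbors) are my assumptions.

import numpy as np

def classify_points(X, eps, min_pts):
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighbors = [np.where(D[i] <= eps)[0] for i in range(n)]   # includes the point itself
    core = {i for i in range(n) if len(neighbors[i]) >= min_pts}
    labels = []
    for i in range(n):
        if i in core:
            labels.append('core')
        elif any(j in core for j in neighbors[i]):
            labels.append('border')   # not dense enough, but near a core point
        else:
            labels.append('noise')
    return labels

X = np.random.rand(100, 2)     # hypothetical data
print(classify_points(X, eps=0.1, min_pts=4))
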
DBSCAN: Core, Border, and Noise Points
[Figure: illustration of core, border, and noise points]

DBSCAN Algorithm (simplified view for teaching)
1. Create a graph whose nodes are the points to be clustered
2. For each core point c, create an edge from c to every point p in the Eps-neighborhood of c
3. Set N to the nodes of the graph
4. If N does not contain any core points, terminate
5. Pick a core point c in N
6. Let X be the set of nodes that can be reached from c by going forward;
   a. create a cluster containing X ∪ {c}
   b. N = N \ (X ∪ {c})
7. Continue with step 4
Remarks: points that are not assigned to any cluster are outliers; http://www2.cs.uh.edu/~ceick/7363/Papers/dbscan.pdf gives a more efficient implementation by performing steps 2 and 6 in parallel. (A sketch of the simplified view follows below.)

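A sketch of the simplified view in steps 1-7: the function name, the Euclidean distance, and the breadth-first search used to compute the reachable set X are illustrative assumptions, and the efficiency improvements of the original paper are deliberately ignored.

import numpy as np
from collections import deque

def dbscan_simplified(X, eps, min_pts):
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighborhoods = [set(np.where(D[i] <= eps)[0]) for i in range(n)]
    core = {i for i in range(n) if len(neighborhoods[i]) >= min_pts}

    # Steps 1-2: edges go from each core point to every point in its Eps-neighborhood.
    edges = {c: neighborhoods[c] for c in core}

    labels = [-1] * n           # -1 = not assigned to any cluster (outlier)
    N = set(range(n))           # step 3
    cluster_id = 0
    while N & core:             # step 4: terminate if N has no core points
        c = next(iter(N & core))            # step 5
        # Step 6: X = all nodes reachable from c by going forward along edges.
        reached, queue = {c}, deque([c])
        while queue:
            u = queue.popleft()
            for v in edges.get(u, ()):      # only core points have outgoing edges
                if v not in reached:
                    reached.add(v)
                    queue.append(v)
        for p in reached:                   # step 6a: create a cluster
            labels[p] = cluster_id
        N -= reached                        # step 6b: N = N \ (X u {c})
        cluster_id += 1                     # step 7: continue with step 4
    return labels

X = np.random.rand(200, 2)      # hypothetical data
print(dbscan_simplified(X, eps=0.08, min_pts=4))
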
DBSCAN: Core, Border and Noise Points
[Figure: original points and the point types (core, border, and noise) for Eps = 10, MinPts = 4]

When DBSCAN Works Well
[Figure: original points and the resulting clusters]
- Resistant to noise
- Can handle clusters of different shapes and sizes

When DBSCAN Does NOT Work Well
[Figure: original points and the clusterings obtained with (MinPts=4, Eps=9.75) and (MinPts=4, Eps=9.12)]
Problems with:
- Varying densities
- High-dimensional data

Assignment 3 Dataset: Earthquake

Assignment 3 Dataset: Complex9 (http://www2.cs.uh.edu/~ml_kdd/Complex&Diamond/2DData.htm)
[Figure: K-Means in Weka vs. DBSCAN in Weka on the Complex9 dataset]
Dataset: http://www2.cs.uh.edu/~ml_kdd/Complex&Diamond/Complex9.txt

DBSCAN: Determining Eps and MinPts
- The idea is that for points in a cluster, their k-th nearest neighbors are at roughly the same distance
- Noise points have their k-th nearest neighbor at a farther distance
- So, plot the sorted distance of every point to its k-th nearest neighbor (a sketch follows below)
[Figure: sorted k-distance plot separating non-core points from core points; run DBSCAN for MinPts=4 and Eps=5]

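A sketch of this heuristic: the toy data, k = 4 (mirroring MinPts = 4), and the use of scikit-learn and matplotlib are my choices, not the lecture's.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.neighbors import NearestNeighbors

X = np.random.rand(500, 2)              # hypothetical data
k = 4                                   # corresponds to MinPts = 4

# k + 1 neighbors because each point's nearest neighbor is the point itself.
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
distances, _ = nn.kneighbors(X)
k_dist = np.sort(distances[:, k])       # distance of each point to its k-th nearest neighbor

plt.plot(k_dist)
plt.xlabel('points sorted by k-distance')
plt.ylabel('distance to the %d-th nearest neighbor' % k)
plt.show()                              # choose Eps near the knee of this curve
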