
1 Dimensionality
Clustering Methods: Part 6
Ilja Sidoroff, Pasi Fränti
Speech and Image Processing Unit
Department of Computer Science
University of Joensuu, FINLAND

2 Dimensionality of data
Dimensionality of a data set = the minimum number of free variables needed to represent the data without information loss.
A d-attribute data set has an intrinsic dimensionality (ID) of M if its elements lie entirely within an M-dimensional subspace of R^d (M < d).

3 Dimensionality of data
Using more dimensions than necessary leads to problems:
–greater storage requirements
–slower algorithms
–finding clusters and building good classifiers is more difficult (curse of dimensionality)

4 Curse of dimensionality
When the dimensionality of the space increases, distance measures become less useful:
–all points are more or less equidistant
–most of the volume of a sphere is concentrated in a thin layer near its surface (see next slide)

5 V(r) – volume of a sphere with radius r
D – dimension of the sphere
V(r) = π^(D/2) r^D / Γ(D/2 + 1), so V(r) grows as r^D. The fraction of the volume lying in the shell between radii (1 − ε)r and r is therefore 1 − (1 − ε)^D, which approaches 1 as D grows.
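To make the shell concentration concrete, here is a minimal Python sketch (an illustration added here; it uses only the proportionality V(r) ∝ r^D from the formula above, and the 1 % shell width is an arbitrary choice):

```python
# Fraction of a D-dimensional ball's volume lying in the thin outer
# shell between radius 0.99*r and r.  Since V(r) is proportional to
# r**D, this fraction is 1 - 0.99**D, independent of r.
for D in (1, 2, 3, 10, 100, 1000):
    shell_fraction = 1.0 - 0.99 ** D
    print(f"D = {D:4d}: {100 * shell_fraction:5.1f} % of volume in the outer 1 % shell")
```

Already at D = 1000, essentially all of the volume sits in that 1 % shell, which is why points drawn from high-dimensional distributions end up roughly equidistant.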

6 Two approaches
Estimation of dimensionality
–knowing the ID of a data set can help in tuning classification or clustering performance
Dimensionality reduction
–projecting the data onto some subspace
–e.g. 2D/3D visualisation of a multi-dimensional data set
–may result in information loss if the subspace dimension is smaller than the ID

7 Goodness of the projection
Can be estimated by two measures:
Trustworthiness: points that are not neighbours in the input space should not be mapped as neighbours in the output space.
Continuity: points that are close in the input space should not be mapped far apart in the output space [11].

8 Trustworthiness
T(k) = 1 − A(k) Σ_{i=1..N} Σ_{j ∈ U_k(i)} (r(i,j) − k)
N – number of feature vectors
r(i,j) – the rank of data sample j in the ordering according to the distance from i in the original data space
U_k(i) – set of feature vectors that are in the size-k neighbourhood of sample i in the projection space but not in the original space
A(k) = 2 / (N k (2N − 3k − 1)) – scales the measure between 0 and 1

9 Continuity
C(k) = 1 − A(k) Σ_{i=1..N} Σ_{j ∈ V_k(i)} (r'(i,j) − k)
r'(i,j) – the rank of data sample j in the ordering according to the distance from i in the projection space
V_k(i) – set of feature vectors that are in the size-k neighbourhood of sample i in the original space but not in the projection space
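A sketch of both measures in Python, following the two formulas above with the usual normalizing factor A(k) (NumPy only; the function and variable names are ours, not from [11]):

```python
import numpy as np

def ranks(X):
    """ranks[i, j] = rank of sample j in the distance ordering from i
    (nearest neighbour has rank 1; the point itself gets the last rank)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude the point itself
    order = np.argsort(d, axis=1)
    r = np.empty_like(order)
    rows = np.arange(len(X))[:, None]
    r[rows, order] = np.arange(1, len(X) + 1)
    return r

def trustworthiness_continuity(X, Y, k):
    """T(k) and C(k) for data X and its projection Y (valid for k < N/2)."""
    N = len(X)
    rX, rY = ranks(X), ranks(Y)
    A = 2.0 / (N * k * (2 * N - 3 * k - 1))     # A(k): scales the sums to [0, 1]
    U = (rY <= k) & (rX > k)   # U_k(i): neighbours only in the projection space
    V = (rX <= k) & (rY > k)   # V_k(i): neighbours only in the original space
    T = 1.0 - A * np.sum((rX - k) * U)
    C = 1.0 - A * np.sum((rY - k) * V)
    return T, C
```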

10 Example data sets
Swiss roll: 20000 3D points, a 2D manifold in 3D space
http://isomap.stanford.edu

11 Example data sets
64 × 64 pixel images of hands in different positions
Each image can be considered a 4096-dimensional data vector
The data can also be interpreted in terms of finger extension and wrist rotation (2D)

12 Example data sets
http://isomap.stanford.edu

13 Synthetic data sets [11]
S-shaped manifold
Sphere
Six clusters

14 Principal component analysis (PCA)
Idea: find the directions of maximal variance and align the coordinate axes with them. If the variance along a direction is zero, that dimension is not needed.
Drawback: works well only with linear data [1]

15 PCA method (1/2)
Center the data so that its mean is zero
Calculate the covariance matrix of the data
Calculate the eigenvalues and eigenvectors of the covariance matrix
Arrange the eigenvectors in decreasing order of their eigenvalues
For dimensionality reduction, choose the desired number of eigenvectors (2 or 3 for visualization)

16 PCA method (2/2)
Intrinsic dimensionality = number of non-zero eigenvalues
Dimensionality reduction by projection: y_i = A x_i
Here x_i is the input vector, y_i the output vector, and A is the matrix whose rows are the eigenvectors corresponding to the largest eigenvalues. For visualization, typically 2 or 3 eigenvectors are preserved.
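A compact NumPy sketch of the whole procedure on slides 15–16 (the planar example data set is our own illustration):

```python
import numpy as np

def pca_project(X, m=2):
    """Center, covariance, eigendecomposition, sort, then project
    onto the m leading eigenvectors (y_i = A x_i)."""
    Xc = X - X.mean(axis=0)                  # zero-mean each attribute
    C = np.cov(Xc, rowvar=False)             # d x d covariance matrix
    evals, evecs = np.linalg.eigh(C)         # eigenvalues in ascending order
    idx = np.argsort(evals)[::-1]            # re-sort into decreasing order
    evals, evecs = evals[idx], evecs[:, idx]
    A = evecs[:, :m].T                       # rows = leading eigenvectors
    return Xc @ A.T, evals

# 500 points on a 2D plane embedded in R^3: two non-zero eigenvalues,
# the third is ~0, so the ID estimate is 2.
X = np.random.rand(500, 2) @ np.array([[1.0, 0.0, 0.5],
                                       [0.0, 1.0, 0.5]])
Y, evals = pca_project(X, m=2)
print(evals)
```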

17 Example of PCA
The distances between points change in the projections.
Test set c:
–two clusters are projected onto one cluster
–the S-shaped cluster is projected nicely

18 Another example of PCA [10]
Data set: points lying on a circle (x^2 + y^2 = 1), so ID = 1
PCA nevertheless yields two non-null eigenvalues, overestimating the ID
u, v – principal components

19 Limitations of PCA
Since the eigenvectors are orthogonal, PCA works well only with linear data
Tends to overestimate ID
Kernel PCA uses the so-called kernel trick to apply PCA to nonlinear data as well:
–map the data nonlinearly into a higher-dimensional space and perform the PCA analysis in that space

20 Multidimensional scaling (MDS)
Project the data into a new space while trying to preserve the distances between data points
Define a stress function E (the discrepancy between pairwise distances in the original and projection spaces)
E is minimized using some optimization algorithm
With certain stress functions (e.g. Kruskal's), a perfect projection exists when E = 0
The ID of the data is the smallest projection dimension for which a perfect projection exists

21 Metric MDS
The simplest stress function [2], raw stress:
E = Σ_{i<j} (d(x_i, x_j) − d(y_i, y_j))^2
d(x_i, x_j) – distance in the original space
d(y_i, y_j) – distance in the projection space
y_i, y_j – representations of x_i, x_j in the output space

22 Sammon's mapping
Sammon's mapping gives small distances a larger weight [5]:
E = (1 / Σ_{i<j} d(x_i, x_j)) Σ_{i<j} (d(x_i, x_j) − d(y_i, y_j))^2 / d(x_i, x_j)
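A sketch of the two stress functions above as Python functions (helper names are ours; in an actual MDS implementation either function would be minimized over the output coordinates Y, e.g. by gradient descent):

```python
import numpy as np

def pairwise(X):
    """Condensed vector of pairwise Euclidean distances, i < j."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return D[np.triu_indices(len(X), k=1)]

def raw_stress(X, Y):
    dX, dY = pairwise(X), pairwise(Y)
    return np.sum((dX - dY) ** 2)

def sammon_stress(X, Y):
    dX, dY = pairwise(X), pairwise(Y)
    # the 1/d(x_i, x_j) weight emphasizes small original distances
    return np.sum((dX - dY) ** 2 / dX) / np.sum(dX)
```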

23 Kruskal's stress
Ranking the point distances compensates for the shrinking of distances in lower-dimensional projections. In its stress-1 form:
E = sqrt( Σ_{i<j} (d(y_i, y_j) − d̂_ij)^2 / Σ_{i<j} d(y_i, y_j)^2 )
where the d̂_ij are reference distances that preserve the rank ordering of the original distances (obtained by monotone regression).

24 MDS example
Separates the clusters better than PCA
Local structures are not always preserved (leftmost test set)

25 Other MDS approaches
ISOMAP [12]
Curvilinear component analysis (CCA) [13]

26 Local methods
The previous methods are global in the sense that all of the input data is considered at once.
Local methods consider only some neighbourhood of each data point, so they may be computationally less demanding.
They try to estimate the topological dimension of the data manifold.

27 Fukunaga-Olsen algorithm [6]
Assume that the data can be divided into small regions, i.e. clustered
Each cluster (Voronoi set) of the data lies on an approximately linear surface => the PCA method can be applied to each cluster separately
Eigenvalues are normalized by dividing by the largest eigenvalue

28 Fukunaga-Olsen algorithm
ID is defined as the number of normalized eigenvalues that are larger than a threshold T
Defining a good threshold is a problem in itself
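A sketch of the Fukunaga-Olsen idea under simplifying assumptions: k-means (via SciPy) stands in for the Voronoi partitioning, the threshold T is the free parameter noted above, and taking the median of the per-cluster counts is our own choice of aggregation:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def fukunaga_olsen_id(X, n_clusters=10, T=0.05):
    """ID = number of normalized eigenvalues > T in local (per-cluster) PCA."""
    _, labels = kmeans2(X, n_clusters, minit='++')
    counts = []
    for c in range(n_clusters):
        cluster = X[labels == c]
        if len(cluster) <= X.shape[1]:       # too few points for a stable PCA
            continue
        evals = np.linalg.eigvalsh(np.cov(cluster, rowvar=False))[::-1]
        evals = evals / evals[0]             # normalize by the largest eigenvalue
        counts.append(int(np.sum(evals > T)))
    return int(np.median(counts))            # aggregate per-cluster estimates
```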

29 Near neighbour algorithm
Trunk's method [7]:
–An initial value for an integer parameter k is chosen (usually k = 1)
–The k nearest neighbours of each data vector are identified
–For each data vector i, the subspace spanned by the vectors from i to each of its k neighbours is constructed

30 Near neighbour algorithm
–The angle between the (k+1)-th near neighbour and its projection onto the subspace is calculated for each data vector
–If the average of these angles is below a threshold, the ID is k; otherwise k is increased and the process repeated
(figure: the (k+1)-th neighbour, the subspace, and the angle between them)

31 Pseudocode
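The pseudocode image from the original slide is not reproduced in this transcript; the following Python sketch is reconstructed from the description on slides 29–30 (the 30° threshold is an arbitrary example value):

```python
import numpy as np

def trunk_id(X, theta_deg=30.0, k_max=None):
    """Grow k until the mean angle between the (k+1)-th neighbour
    and the k-neighbour subspace drops below theta_deg."""
    N, d = X.shape
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    nn = np.argsort(D, axis=1)                # neighbours by increasing distance
    k_max = k_max or d                        # assumes N > k_max + 1
    for k in range(1, k_max + 1):
        angles = []
        for i in range(N):
            # subspace spanned by vectors from x_i to its k nearest neighbours
            V = X[nn[i, :k]] - X[i]           # k x d
            Q, _ = np.linalg.qr(V.T)          # orthonormal basis (d x k)
            w = X[nn[i, k]] - X[i]            # vector to the (k+1)-th neighbour
            proj = Q @ (Q.T @ w)              # projection onto the subspace
            cos = np.linalg.norm(proj) / np.linalg.norm(w)
            angles.append(np.degrees(np.arccos(np.clip(cos, 0.0, 1.0))))
        if np.mean(angles) < theta_deg:
            return k
    return k_max
```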

32 Near neighbour algorithm
It is not clear how to select a suitable value for the threshold
Improvements to Trunk's method:
–Pettis et al. [8]
–Verveer-Duin [9]

33 Fractal methods
Global methods, but with a different definition of dimensionality
Basic idea:
–count the number of observations f(r) inside a ball of radius r
–analyse the growth rate of f(r)
–if f grows as r^k, the dimensionality of the data can be considered to be k

34 Fractal methods
The dimensionality can be fractional, e.g. 1.5
Fractal methods therefore do not provide projections into a lower-dimensional space (what would R^1.5 be, anyway?)
Fractal dimensionality estimates can be used in time-series analysis etc. [10]

35 Fractal methods
Different definitions of fractal dimension [10]:
–Hausdorff dimension
–Box-counting dimension
–Correlation dimension
To get an accurate estimate of a dimension D, the data set cardinality must be at least 10^(D/2)

36 Hausdorff dimension
The data set is covered by cells s_i with variable diameters r_i, all r_i < r; in other words, we look for a collection of covering sets s_i with diameters less than or equal to r that minimizes the sum below.
The d-dimensional Hausdorff measure is
H^d(Γ) = lim_{r→0} inf Σ_i r_i^d,
where the infimum is taken over all such coverings.

37 Hausdorff dimension
For every data set Γ, H^d is infinite if d is less than some critical value D_H, and 0 if d is greater than D_H
The critical value D_H is the Hausdorff dimension of the data set

38 Box-counting dimension
The Hausdorff dimension is not easy to calculate
The box-counting dimension D_B is an upper bound of the Hausdorff dimension and does not usually differ from it:
D_B = lim_{r→0} ln v(r) / ln(1/r)
v(r) – the number of boxes of size r needed to cover the data set
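A NumPy sketch of a box-counting estimate: the occupied grid boxes v(r) are counted at several scales and D_B is taken as the slope of the log-log fit (the circle test data is our own illustration):

```python
import numpy as np

def box_counting_dimension(X, sizes):
    """Estimate D_B as the slope of log v(r) versus log(1/r)."""
    counts = []
    for r in sizes:
        boxes = np.floor(X / r).astype(int)        # grid-box index of each point
        counts.append(len(np.unique(boxes, axis=0)))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Example: points on a circle (a 1D manifold) give a slope near 1
t = np.random.rand(5000) * 2 * np.pi
X = np.c_[np.cos(t), np.sin(t)]
print(box_counting_dimension(X, sizes=[0.2, 0.1, 0.05, 0.02]))
```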

39 Box-counting dimension
Although the box-counting dimension is easier to calculate than the Hausdorff dimension, the algorithmic complexity grows exponentially with the set dimensionality => it can be used only for low-dimensional data sets
The correlation dimension is a computationally more feasible fractal dimension measure
The correlation dimension is a lower bound of the box-counting dimension

40 Correlation dimension
Let x_1, x_2, x_3, ..., x_N be the data points
The correlation integral can be defined as
C(r) = lim_{N→∞} (2 / (N(N−1))) Σ_{i<j} I(‖x_j − x_i‖ ≤ r)
I(x) is the indicator function: I(x) = 1 iff x is true, I(x) = 0 otherwise

41 Correlation dimension
C(r) measures the fraction of pairs of points that lie within distance r of each other. For small r it grows as r^(D_C), so the correlation dimension is defined as
D_C = lim_{r→0} ln C(r) / ln r
In practice D_C is estimated as the slope of the log C(r) versus log r curve over a suitable range of r.
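A matching sketch of the estimate (the Grassberger-Procaccia procedure discussed in [10]): C(r) is computed at several radii and D_C is read off as the slope of log C(r) versus log r:

```python
import numpy as np

def correlation_dimension(X, radii):
    """Estimate D_C as the slope of log C(r) versus log r."""
    N = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    d = D[np.triu_indices(N, k=1)]               # pairwise distances, i < j
    C = [2.0 * np.sum(d <= r) / (N * (N - 1)) for r in radii]
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# Example: points on a circle again give an estimate near 1
t = np.random.rand(2000) * 2 * np.pi
X = np.c_[np.cos(t), np.sin(t)]
print(correlation_dimension(X, radii=[0.05, 0.1, 0.2, 0.4]))
```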

42 Literature
1. M. Kirby, Geometric Data Analysis: An Empirical Approach to Dimensionality Reduction and the Study of Patterns, John Wiley and Sons, 2001.
2. J. B. Kruskal, Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis, Psychometrika 29 (1964) 1–27.
3. R. N. Shepard, The analysis of proximities: Multidimensional scaling with an unknown distance function, Psychometrika 27 (1962) 125–140.
4. R. S. Bennett, The intrinsic dimensionality of signal collections, IEEE Transactions on Information Theory 15 (1969) 517–525.
5. J. W. Sammon Jr., A nonlinear mapping for data structure analysis, IEEE Transactions on Computers C-18 (1969) 401–409.
6. K. Fukunaga, D. R. Olsen, An algorithm for finding intrinsic dimensionality of data, IEEE Transactions on Computers C-20 (2) (1971) 176–183.
7. G. V. Trunk, Statistical estimation of the intrinsic dimensionality of a noisy signal collection, IEEE Transactions on Computers C-25 (1976) 165–171.

43 Literature
8. K. Pettis, T. Bailey, A. K. Jain, R. Dubes, An intrinsic dimensionality estimator from near-neighbor information, IEEE Transactions on Pattern Analysis and Machine Intelligence 1 (1) (1979) 25–37.
9. P. J. Verveer, R. Duin, An evaluation of intrinsic dimensionality estimators, IEEE Transactions on Pattern Analysis and Machine Intelligence 17 (1) (1995) 81–86.
10. F. Camastra, Data dimensionality estimation methods: a survey, Pattern Recognition 36 (2003) 2945–2954.
11. J. Venna, Dimensionality reduction for visual exploration of similarity structures, PhD thesis manuscript (submitted), 2007.
12. J. B. Tenenbaum, V. de Silva, J. C. Langford, A global geometric framework for nonlinear dimensionality reduction, Science 290 (2000) 2319–2323.
13. P. Demartines, J. Herault, Curvilinear component analysis: A self-organizing neural network for nonlinear mapping in cluster analysis, IEEE Transactions on Neural Networks 8 (1) (1997) 148–154.

