Presentation transcript:

https://en.wikipedia.org/wiki/Reeb_graph
http://www.nature.com/srep/2013/130207/srep01236/full/srep01236.html

mapper.filters.kNN_distance(data, k, metricpar={}, callback=None)

The distance to the k-th nearest neighbor, as an (inverse) measure of density. Note how the number of nearest neighbors is counted: the first nearest neighbor of a data point is always the point itself, so k=1 makes no sense for a filter function; the resulting filter would be constantly zero. The parameter k=2 measures the distance from x_i to the nearest data point other than x_i itself.

[Figure: a point x and its four nearest neighbors, with the distances d_1(x), d_2(x), d_3(x), d_4(x) marked]

If x lies in a denser region than y, then d_k(x) < d_k(y).
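As a rough illustration (my addition, not from the slides), the following Python sketch computes this filter value with scipy's cKDTree and checks the density property above. The helper name knn_distance_sketch is hypothetical; the real implementation lives in mapper.filters.kNN_distance.

import numpy as np
from scipy.spatial import cKDTree

def knn_distance_sketch(data, k):
    """Distance from each point to its k-th nearest neighbor.
    k=1 would always be 0 (a point is its own nearest neighbor),
    so k must be at least 2 to give a useful filter."""
    tree = cKDTree(data)
    # query the k nearest neighbors of every point; column 0 is the point
    # itself (distance 0), so column k-1 is the distance to the k-th neighbor
    dist, _ = tree.query(data, k=k)
    return dist[:, k - 1]

rng = np.random.default_rng(0)
dense = rng.normal(0.0, 0.1, size=(200, 2))   # tightly clustered points
sparse = rng.normal(0.0, 2.0, size=(200, 2))  # spread-out points
d_dense = knn_distance_sketch(dense, k=5)
d_sparse = knn_distance_sketch(sparse, k=5)
print(d_dense.mean() < d_sparse.mean())  # True: denser region gives smaller d_k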

Image of a circle under the linear transformation T(x) = Ax, where A is a symmetric matrix.
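A minimal numerical sketch of this picture (an addition, assuming only numpy): for a symmetric A, the unit circle is mapped to an ellipse whose axes lie along the eigenvectors of A and whose half-axis lengths are the absolute values of the eigenvalues.

import numpy as np

# A symmetric 2x2 matrix: A = Q diag(w) Q^T with orthonormal eigenvectors Q,
# so T(x) = Ax maps the unit circle to an ellipse whose axes are the
# eigenvectors of A and whose half-lengths are |eigenvalues|.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

theta = np.linspace(0.0, 2.0 * np.pi, 400)
circle = np.stack([np.cos(theta), np.sin(theta)])   # shape (2, 400)
ellipse = A @ circle

w, Q = np.linalg.eigh(A)                 # eigenvalues, eigenvectors in columns
radii = np.linalg.norm(ellipse, axis=0)  # distance of each image point from the origin
print(w)                                 # [1. 3.]
print(radii.min(), radii.max())          # approximately 1.0 and 3.0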

https://en.wikipedia.org/wiki/File:Singular-Value-Decomposition.svg

The SVD decomposes M into three simple transformations: a rotation V*, a scaling Σ along the coordinate axes, and a second rotation U.
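The same factorization can be checked numerically; the sketch below (an addition, assuming numpy) reconstructs M from the three factors and verifies that U and V* have orthonormal columns and rows.

import numpy as np

M = np.array([[3.0, 1.0],
              [1.0, 2.0],
              [0.0, 1.0]])

# full_matrices=False gives the thin SVD: U is (3, 2), s has 2 entries, Vt is (2, 2)
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# M acts as: rotate by Vt (= V*), scale by diag(s), then rotate by U
M_rebuilt = U @ np.diag(s) @ Vt
assert np.allclose(M, M_rebuilt)

# the factors are orthogonal
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(2))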

mapper.filters.dm_eigenvector(data, k=0, mean_center=True, metricpar={}, verbose=True, callback=None)

Return the k-th eigenvector of the distance matrix. The matrix of pairwise distances is symmetric, so it has an orthonormal basis of eigenvectors. The parameter k can be either an integer or an array of integers (for multi-dimensional filter functions). The index is zero-based, and eigenvalues are sorted by absolute value, so k=0 returns the eigenvector corresponding to the eigenvalue of largest magnitude. If mean_center is True, the distance matrix is double-mean-centered before the eigenvalue decomposition. Reference: [R6], subsection “Principal metric SVD filters”.
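A sketch of the core computation (my addition, assuming numpy and scipy; the function name dm_eigenvector_sketch is hypothetical, and the real mapper.filters.dm_eigenvector handles more options):

import numpy as np
from scipy.spatial.distance import pdist, squareform

def dm_eigenvector_sketch(data, k=0, mean_center=True):
    """Illustrative re-implementation of the filter's core steps."""
    D = squareform(pdist(data))  # symmetric matrix of pairwise distances
    if mean_center:
        # double mean-centering: subtract row and column means, add grand mean
        row = D.mean(axis=1, keepdims=True)
        col = D.mean(axis=0, keepdims=True)
        D = D - row - col + D.mean()
    w, V = np.linalg.eigh(D)        # symmetric: real eigenvalues, orthonormal eigenvectors
    order = np.argsort(-np.abs(w))  # sort by absolute value, largest first
    return V[:, order[k]]           # k-th eigenvector, zero-based

rng = np.random.default_rng(0)
f = dm_eigenvector_sketch(rng.normal(size=(100, 3)))
print(f.shape)  # (100,): one filter value per data point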