Progress Report Alvaro Velasquez

Topic: “Subspace Clustering via Graph Regularized Sparse Coding”. I am currently using the SegTreck_201111 dataset to track objects of interest using sparse representations. The next step is to add a graph regularizer and Laplacian matrix and apply this framework to video segmentation.

I started by choosing an object of interest and splitting it into six patches. I computed the sparse representation of these patches by expressing each one as a linear combination of dictionary atoms weighted by a coefficient vector. The problem is posed as a minimization that penalizes the L1-norm of the coefficient matrix, which encourages sparsity. It is solved with the SPAMS library for Matlab, specifically the mexLasso function (the dictionary and the column-wise vector representations of the patches are unit-normalized first).
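The report itself uses mexLasso from SPAMS in Matlab. Purely as an illustration of the same L1-penalized minimization, here is a Python sketch using a simple ISTA solver; the dictionary, signal, and all dimensions are made up (324 mirrors a vectorized 18x18 patch), and ISTA is a stand-in for, not a reproduction of, the SPAMS solver.

```python
import numpy as np

def ista_lasso(D, x, lam=0.1, n_iter=500):
    """Minimize 0.5*||x - D a||_2^2 + lam*||a||_1 via ISTA
    (an illustrative stand-in for SPAMS mexLasso)."""
    L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        a = a - D.T @ (D @ a - x) / L    # gradient step on the squared error
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft-threshold
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((324, 100))
D /= np.linalg.norm(D, axis=0)           # unit-normalize dictionary columns
x = D[:, :3] @ np.array([1.0, -0.5, 0.8])  # toy patch built from three atoms
x /= np.linalg.norm(x)                   # unit-normalize the patch vector
a = ista_lasso(D, x)
nnz = int(np.count_nonzero(np.abs(a) > 1e-6))
print(nnz)                               # only a few nonzero coefficients
```

The soft-thresholding step is what drives most coefficients exactly to zero, which is the sparsity the L1 penalty is meant to produce.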

To track the patches across sequential frames of the video, we select the patches with the minimal Euclidean distance to the patches from the previous frame. Candidate matches are drawn from all possible patches within a subset of the image where the object of interest is likely to be (searching the whole image proved too expensive). Six patches were tracked per frame for 20 frames; only two of the 120 total patches were outliers.
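This nearest-patch search can be sketched as follows. This is a hypothetical Python illustration on synthetic frame data, not the actual tracking code; the search-box restriction mirrors the cost-saving step described above.

```python
import numpy as np

def best_match(prev_patch, frame, search_box, patch=18):
    """Scan a search box for the candidate patch closest in Euclidean
    distance to the patch from the previous frame.
    search_box = (row0, row1, col0, col1) limits the scan, since
    searching the whole image is too expensive."""
    r0, r1, c0, c1 = search_box
    best, best_d = None, np.inf
    for r in range(r0, r1 - patch + 1):
        for c in range(c0, c1 - patch + 1):
            cand = frame[r:r + patch, c:c + patch]
            d = np.linalg.norm(cand - prev_patch)   # Euclidean distance
            if d < best_d:
                best_d, best = d, (r, c)
    return best, best_d

rng = np.random.default_rng(1)
frame = rng.random((120, 160))
prev = frame[40:58, 60:78].copy()        # true location is (40, 60)
loc, dist = best_match(prev, frame, (30, 70, 50, 90))
print(loc)                               # recovers (40, 60)
```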

Each patch is 18x18 pixels and is represented as a 324x1 vector whose elements are initially all nonzero. After finding the sparse representation of each patch, the new 324x1 coefficient vectors contained between 10 and 20 nonzero elements, yet the Euclidean distance between the sparse reconstruction and the original representation differed by less than 0.1%.
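As a toy illustration of the kind of check behind these numbers (the dictionary, code, and noise level here are synthetic, not taken from the experiment), one can count the nonzero coefficients of a sparse code and compute its relative reconstruction error:

```python
import numpy as np

def sparsity_report(D, x, a, tol=1e-8):
    """Report nonzero count and relative Euclidean reconstruction error."""
    nnz = int(np.count_nonzero(np.abs(a) > tol))
    rel_err = float(np.linalg.norm(x - D @ a) / np.linalg.norm(x))
    return nnz, rel_err

# Synthetic example: a 324-dim patch built from 15 dictionary atoms,
# matching the 10-20 nonzero range reported above.
rng = np.random.default_rng(2)
D = rng.standard_normal((324, 600))
D /= np.linalg.norm(D, axis=0)           # unit-norm atoms
a = np.zeros(600)
a[rng.choice(600, 15, replace=False)] = rng.standard_normal(15)
x = D @ a + 1e-4 * rng.standard_normal(324)  # near-exact reconstruction
nnz, rel_err = sparsity_report(D, x, a)
print(nnz, rel_err)
```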