Fast quantum algorithms for Least Squares Regression and Statistic Leverage Scores. Yang Liu and Shengyu Zhang, The Chinese University of Hong Kong.


Part I: Linear regression. Output a "quantum sketch" of the solution.
Part II: Computing leverage scores and matrix coherence. Output the target numbers.

Part I: Linear regression

Closed-form solution
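The closed-form least-squares solution is x = (AᵀA)⁻¹Aᵀb, equivalently x = A⁺b. The slides carry no code; a minimal numpy sketch with hypothetical toy data:

```python
import numpy as np

# Overdetermined system: A is n x d with n > d and full column rank.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

# Normal-equations solution x = (A^T A)^{-1} A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Equivalent, numerically preferable route via lstsq (pseudoinverse).
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Here b lies exactly in the column span of A, so both routes return x = (1, 1).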

Relaxations. [1] K. Clarkson, D. Woodruff. STOC. [2] J. Nelson, H. Nguyen. FOCS, 2013.
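The classical relaxation sketched here: compress with a random matrix S and solve the smaller problem min ‖SAx − Sb‖ (Clarkson–Woodruff and Nelson–Nguyen use sparse embeddings that apply in input-sparsity time). A hedged numpy illustration with a dense Gaussian sketch standing in for those embeddings; all sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 2000, 5, 200            # sketch n rows down to m << n

A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Gaussian sketch; the cited works use sparse embeddings, same idea.
S = rng.standard_normal((m, n)) / np.sqrt(m)

x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# The sketched solution approximates the full least-squares solution.
rel_err = np.linalg.norm(x_sketch - x_full) / np.linalg.norm(x_full)
```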

Quantum sketch. [1] A. Harrow, A. Hassidim, S. Lloyd. PRL, 2009.
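The "quantum sketch" of HHL: rather than the classical vector, the algorithm prepares a quantum state proportional to the solution. Writing A (Hermitian) with eigenpairs (λⱼ, |vⱼ⟩), the standard statement is:

```latex
% Expand the input state in the eigenbasis of A:
%   |b\rangle = \sum_j \beta_j |v_j\rangle .
% The HHL algorithm outputs a state proportional to the solution:
\[
|x\rangle \;\propto\; A^{-1}|b\rangle \;=\; \sum_j \frac{\beta_j}{\lambda_j}\,|v_j\rangle .
\]
```

Reading out all entries of x from |x⟩ would forfeit the speedup; only global statistics of x are cheaply accessible.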

Controversy

LSR results. [1] N. Wiebe, D. Braun, S. Lloyd. PRL, 2012.

Our algorithm for LSR

Algorithm tool: the phase estimation quantum algorithm, which outputs the eigenvalue for a given eigenvector.
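Phase estimation: given an eigenvector |u⟩ of a unitary U with U|u⟩ = e^{2πiφ}|u⟩, t ancilla qubits yield a t-bit estimate of φ. The outcome k is measured with probability |(1/2ᵗ) Σⱼ e^{2πi j(φ − k/2ᵗ)}|², which peaks at k ≈ φ·2ᵗ. A classical numpy simulation of this output distribution (an illustrative sketch, not the circuit itself):

```python
import numpy as np

def phase_estimation_distribution(phi, t):
    """Outcome distribution of t-qubit phase estimation for eigenphase phi.

    Pr[k] = |(1/2^t) * sum_j exp(2*pi*1j*j*(phi - k/2^t))|^2,
    concentrated near k = phi * 2^t.
    """
    N = 2 ** t
    j = np.arange(N)
    k = np.arange(N)
    # amplitude of each outcome k after the inverse QFT
    amps = np.exp(2j * np.pi * np.outer(phi - k / N, j)).sum(axis=1) / N
    return np.abs(amps) ** 2

probs = phase_estimation_distribution(phi=0.25, t=3)
best = int(np.argmax(probs))      # estimate of phi is best / 2**3
```

Since φ = 0.25 is exactly representable in 3 bits, the distribution puts all mass on k = 2.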

Extension 1: Ridge regression
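Ridge regression regularizes the normal equations: x = (AᵀA + λI)⁻¹Aᵀb. A minimal numpy sketch, reusing hypothetical toy data:

```python
import numpy as np

def ridge(A, b, lam):
    """Closed-form ridge solution x = (A^T A + lam*I)^{-1} A^T b."""
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x0 = ridge(A, b, 0.0)      # lam = 0 recovers ordinary least squares
x1 = ridge(A, b, 10.0)     # larger lam shrinks the solution toward 0
```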

Extension 2: Truncated SVD
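Truncated SVD regression keeps only the top-k singular directions: x_k = V_k Σ_k⁻¹ U_kᵀ b. A hedged numpy sketch (data hypothetical):

```python
import numpy as np

def truncated_svd_solve(A, b, k):
    """Rank-k truncated-SVD solution x_k = V_k diag(1/s_k) U_k^T b."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 4))
b = rng.standard_normal(50)

# Taking k = rank(A) recovers the ordinary least-squares solution.
x_full = truncated_svd_solve(A, b, 4)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Smaller k discards ill-conditioned directions, trading bias for stability.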

Part II: Statistic leverage scores. [1] M. Mahoney. Randomized Algorithms for Matrices and Data. Foundations & Trends in Machine Learning, 2010.

Computing leverage scores. [1] P. Drineas, M. Magdon-Ismail, M. Mahoney, D. Woodruff. JMLR, 2012.
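For A of size n × d (n ≥ d), the i-th leverage score is ℓᵢ = ‖U_{i,:}‖², where U is any orthonormal basis for col(A); matrix coherence is maxᵢ ℓᵢ. A minimal exact computation in numpy (Drineas et al. approximate these faster via sketching; the quantum algorithm in the talk targets the same quantities):

```python
import numpy as np

def leverage_scores(A):
    """Exact leverage scores: squared row norms of an orthonormal basis
    for col(A). For full-column-rank A they lie in [0, 1] and sum to d."""
    Q, _ = np.linalg.qr(A)          # thin QR: Q has orthonormal columns
    return np.sum(Q ** 2, axis=1)

rng = np.random.default_rng(2)
A = rng.standard_normal((100, 3))
scores = leverage_scores(A)
coherence = scores.max()
```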

Algorithm for LSR

Algorithm

Summary. We give efficient quantum algorithms for two canonical problems on sparse inputs:
–Least squares regression
–Statistical leverage scores
The problems are linear-algebraic, not group-, number-, or polynomial-theoretic.