On ℓ1,q Regularized Regression. Authors: Han Liu and Jian Zhang. Presented by Jun Liu.


Problem (1)

Problem (2): the number of groups is much larger than the number of samples.
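The slide formulas did not survive the transcript. As a sketch, the ℓ1,q regularized regression problem studied here has the standard grouped-penalty form (notation mine, not copied from the slides):

```latex
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{m}}
  \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2
  \;+\; \lambda \sum_{g=1}^{G} \lVert \beta_g \rVert_q ,
```

where $y \in \mathbb{R}^n$ is the response, $X \in \mathbb{R}^{n \times m}$ the design, and $\beta$ is partitioned into $G$ groups $\beta_g$. The case $q = 2$ recovers the group Lasso; $q = \infty$ gives the ℓ1,∞ penalty.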

Outline
- Propositions 2.1, 2.2 (subgradient; linear dependence)
- Definition (properties to be established)
- Theorem 3.1 (variable selection consistency)
- Lemma 4.1 (technical lemma)
- Assumption 1, Theorem 4.3 (consistency, linear model)
- Assumption 2, Theorem 4.5 (inequality, misspecified model)
- Assumption 4, Theorem 5.1 (risk consistency)

We want to find a subgradient vector for each group such that its q'-norm (the dual norm, with 1/q + 1/q' = 1) is either equal to a constant value (for nonzero groups) or bounded by it (for zero groups).
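As a hedged reconstruction of the lost formulas, this is the standard subgradient optimality condition for a grouped ℓq penalty on a least-squares objective: $\hat{\beta}$ is optimal iff for every group $g$

```latex
\frac{1}{n}\, X_g^{\top}\!\left(y - X\hat{\beta}\right) = \lambda\, v_g ,
\qquad
\begin{cases}
\lVert v_g \rVert_{q'} = 1,\;\; v_g^{\top}\hat{\beta}_g = \lVert \hat{\beta}_g \rVert_q ,
  & \hat{\beta}_g \neq 0,\\[2pt]
\lVert v_g \rVert_{q'} \le 1,
  & \hat{\beta}_g = 0,
\end{cases}
\qquad \frac{1}{q} + \frac{1}{q'} = 1,
```

using that the subdifferential of $\lVert \cdot \rVert_q$ is $\{v : \lVert v \rVert_{q'} \le 1,\; v^{\top}\beta = \lVert \beta \rVert_q\}$, which forces the dual norm to equal 1 at nonzero points.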

Scale invariant, sign preserving.

Hint: this result is similar to the Lasso case. The key is that rank(X) ≤ n, so that any m > n columns of X are linearly dependent.
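A quick numerical illustration of this rank argument (my own sketch, not from the paper): with n samples and m > n columns, rank(X) ≤ n < m, so X always has a nontrivial null space, i.e. some v ≠ 0 with Xv = 0.

```python
import numpy as np

def null_space_witness(n=5, m=8, seed=0):
    """For an n x m matrix with m > n, rank(X) <= n < m, so the m
    columns are linearly dependent: some v != 0 satisfies X v = 0."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, m))
    # At most n singular values can be nonzero, so the trailing right
    # singular vector lies in the null space of X.
    _, _, Vt = np.linalg.svd(X)
    v = Vt[-1]
    return np.linalg.matrix_rank(X), float(np.linalg.norm(X @ v))

rank, residual = null_space_witness()
```

Here `residual` is numerically zero even though every column of the random Gaussian X is nonzero, which is exactly the degeneracy the hint exploits.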

g_j, for j = 2, …, s, is not changed.

Key Points in the Proof
- Objective: two parts
- Tool: proof by construction
- The solution is not unique

Part 1

Part 2
