Chebyshev Estimator
Presented by: Orr Srour
References
1. Y. C. Eldar, A. Beck, and M. Teboulle, "A Minimax Chebyshev Estimator for Bounded Error Estimation," to appear in IEEE Trans. Signal Processing, 2007.
2. A. Beck and Y. C. Eldar, "Regularization in Regression with Bounded Noise: A Chebyshev Center Approach," SIAM J. Matrix Anal. Appl. 29(2), 606-625, 2007.
3. J. (Slava) Chernoi and Y. C. Eldar, "Extending the Chebyshev Center Estimation Technique," TBA.
Chebyshev Center - Agenda
- Introduction
  - CC - Basic Formulation
  - CC - Geometric Interpretation
  - CC - So why not..?
- Relaxed Chebyshev Center (RCC)
  - Formulation of the problem
  - Relation with the original CC
  - Feasibility of the original CC
  - Feasibility of the RCC
- CLS as a CC relaxation
  - CLS vs. RCC
- Constraints formulation
- Extended Chebyshev Center
Notations
- y (boldface lowercase) - a vector; y_i - the i'th component of the vector y
- A (boldface uppercase) - a matrix
- x̂ (hat) - the estimated vector of x
- A ≻ B (A ⪰ B) - A - B is PD (PSD)
The Problem
Estimate the deterministic parameter vector x from the observations
  y = Ax + w
with: A - an n x m model matrix, w - a perturbation (noise) vector.
LS Solution
When nothing else is known, a common approach is to find the vector that minimizes the data error:
  min_x ||y - Ax||^2
Known as "least squares", this solution can be written explicitly (assuming A has full column rank):
  x̂_LS = (A^T A)^(-1) A^T y
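A minimal NumPy sketch of the LS estimator above; the problem sizes, the random model matrix, and the noise level are illustrative assumptions, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 5
A = rng.standard_normal((n, m))                 # model matrix (full column rank w.h.p.)
x_true = rng.standard_normal(m)                 # unknown deterministic parameter vector
y = A @ x_true + 0.05 * rng.standard_normal(n)  # observations y = Ax + w

# Explicit normal-equations solution (A^T A)^{-1} A^T y ...
x_ls = np.linalg.solve(A.T @ A, A.T @ y)
# ... and the numerically safer equivalent:
x_ls_stable, *_ = np.linalg.lstsq(A, y, rcond=None)
```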
But...
In practical situations A is often ill-conditioned, which leads to poor LS results.
Regularized LS
Assume we have some simple prior information regarding the parameter vector x. Then we can use regularized least squares (RLS), e.g. with a quadratic (Tikhonov) penalty:
  min_x ||y - Ax||^2 + λ ||Lx||^2
whose solution is x̂_RLS = (A^T A + λ L^T L)^(-1) A^T y.
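A hedged sketch of the Tikhonov-style RLS above; lam = 0.1 and L = I are arbitrary illustrative choices, not values from the slides:

```python
import numpy as np

def rls(A, y, lam=0.1):
    """Tikhonov-regularized LS: argmin ||y - Ax||^2 + lam * ||x||^2 (here L = I)."""
    m = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ y)
```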
But...
What if we have some prior information regarding the noise vector as well? And what if we have some more complicated information regarding the parameter vector x?
Assumptions
From now on we assume that the noise is norm-bounded:
  ||w|| ≤ ρ
and that x lies in a set defined by k quadratic constraints:
  C = { x : f_i(x) = x^T Q_i x + 2 g_i^T x + d_i ≤ 0, i = 1..k },  Q_i ⪰ 0
(hence C is the intersection of k ellipsoids).
Assumptions
The feasible parameter set of x is then given by:
  Q = { x ∈ C : ||y - Ax|| ≤ ρ }
(hence Q is compact). Q is assumed to have a non-empty interior.
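To make the feasible set concrete, here is a small membership test for Q under the parameterization above; the function name and the list-based description of the ellipsoids (Qs, gs, ds) are assumptions for illustration:

```python
import numpy as np

def in_Q(x, A, y, rho, Qs, gs, ds, tol=1e-9):
    """True iff x satisfies the noise bound and every ellipsoid constraint f_i(x) <= 0."""
    if np.linalg.norm(y - A @ x) > rho + tol:      # noise constraint ||y - Ax|| <= rho
        return False
    return all(x @ Q @ x + 2 * g @ x + d <= tol    # f_i(x) = x^T Q_i x + 2 g_i^T x + d_i
               for Q, g, d in zip(Qs, gs, ds))
```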
Constrained Least Squares (CLS)
Given the prior knowledge, a popular estimation strategy is:
  x̂_CLS = argmin_{x ∈ C} ||y - Ax||^2
- Minimization of the data error over C.
- But: the noise constraint is unused...
- More importantly, it doesn't necessarily lead to a small estimation error ||x̂ - x||^2.
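A CVXPY sketch of the CLS strategy, minimizing the data error over C only (the noise bound is deliberately ignored, mirroring the criticism above); the helper name and the (Qs, gs, ds) description of C are illustrative assumptions:

```python
import cvxpy as cp

def cls(A, y, Qs, gs, ds):
    """argmin_{x in C} ||y - Ax||^2; each Q_i must be PSD for quad_form to be convex."""
    x = cp.Variable(A.shape[1])
    constraints = [cp.quad_form(x, Q) + 2 * g @ x + d <= 0
                   for Q, g, d in zip(Qs, gs, ds)]
    cp.Problem(cp.Minimize(cp.sum_squares(y - A @ x)), constraints).solve()
    return x.value
```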
Chebyshev Center
The goal: an estimator with a small estimation error.
Suggested method: minimize the worst-case error over all feasible vectors:
  min_x̂ max_{x ∈ Q} ||x̂ - x||^2
Chebyshev Center – Geometric Interpretation
Alternative representation:
  min_{x̂, r} r  s.t.  ||x̂ - x||^2 ≤ r for all x ∈ Q
-> find the smallest ball (hence its center x̂ and its radius r) which encloses the set Q.
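As a geometric illustration only (not the estimator itself): for a finite point cloud standing in for Q, the smallest enclosing ball can be computed directly from the min-max definition with CVXPY; the sampled points are an assumption:

```python
import cvxpy as cp
import numpy as np

pts = np.random.default_rng(1).standard_normal((50, 2))  # stand-in samples from Q
c = cp.Variable(2)   # candidate center
r = cp.Variable()    # ball radius
prob = cp.Problem(cp.Minimize(r), [cp.norm(p - c) <= r for p in pts])
prob.solve()
print(c.value, r.value)  # Chebyshev center and radius of the point cloud
```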
Chebyshev Center – Geometric Interpretation
[Figure: the set Q and its smallest enclosing ball; the Chebyshev center is the ball's center.]
Chebyshev Center
This problem is more commonly known as finding the "Chebyshev center".
Pafnuty Lvovich Chebyshev, 16.5.1821 – 08.12.1894
Chebyshev Center – The problem
- The inner maximization is non-convex (maximization of a convex quadratic).
- Computing the CC is therefore a hard optimization problem.
- It can be solved efficiently over the complex domain for an intersection of 2 ellipsoids.
Relaxed Chebyshev Center (RCC)
Let us consider the inner maximization first:
  max_x { ||x̂ - x||^2 : f_i(x) ≤ 0, i = 0..k }
and, expanding the objective:
  ||x̂ - x||^2 = ||x̂||^2 - 2 x̂^T x + x^T x
where the noise constraint is absorbed as f_0(x) = ||y - Ax||^2 - ρ^2, i.e. Q_0 = A^T A, g_0 = -A^T y, d_0 = ||y||^2 - ρ^2.
Relaxed Chebyshev Center (RCC)
Denoting Δ = x x^T, we can write the optimization problem as:
  max_{(Δ, x) ∈ G} { ||x̂||^2 - 2 x̂^T x + tr(Δ) }   <- concave (in fact linear) in (Δ, x)
with:
  G = { (Δ, x) : tr(Q_i Δ) + 2 g_i^T x + d_i ≤ 0, i = 0..k, Δ = x x^T }   <- not convex, because of Δ = x x^T
Relaxed Chebyshev Center (RCC)
Let us replace G with:
  T = { (Δ, x) : tr(Q_i Δ) + 2 g_i^T x + d_i ≤ 0, i = 0..k, Δ ⪰ x x^T }   <- convex
and write the RCC as the solution of:
  min_x̂ max_{(Δ, x) ∈ T} { ||x̂||^2 - 2 x̂^T x + tr(Δ) }
(Δ ⪰ x x^T is indeed a convex constraint: by a Schur complement it is equivalent to the linear matrix inequality [[Δ, x], [x^T, 1]] ⪰ 0.)
Relaxed Chebyshev Center (RCC)
- T is bounded.
- The objective is concave (linear) in (Δ, x).
- The objective is convex in x̂.
- Hence we can replace the order: min-max to max-min.
Relaxed Chebyshev Center (RCC)
The inner minimization is a simple quadratic problem whose solution is x̂ = x. Thus the RCC problem can be written as:
  max_{(Δ, x) ∈ T} { tr(Δ) - ||x||^2 }
Note: this is a convex optimization problem (a concave objective maximized over a convex set).
RCC as an upper bound for CC
RCC is not generally equal to the CC (except for k = 1 over the complex domain). Since G ⊆ T, we have, for every x̂:
  max_{(Δ, x) ∈ G} { ||x̂||^2 - 2 x̂^T x + tr(Δ) } ≤ max_{(Δ, x) ∈ T} { ||x̂||^2 - 2 x̂^T x + tr(Δ) }
Hence the RCC provides an upper bound on the optimal minimax value.
RCC Solution
Theorem: the RCC estimator has the regularized least-squares form:
  x̂_RCC = ( α_0 A^T A + Σ_{i=1..k} α_i Q_i )^(-1) ( α_0 A^T y - Σ_{i=1..k} α_i g_i )
RCC Solution
where α_0, ..., α_k are the optimal solutions of an associated convex program in the k + 1 dual variables, subject to:
  α_i ≥ 0, i = 0..k
(see reference [1] for the explicit program; an equivalent SDP form is given on the next slide).
RCC Solution – as SDP
Or as a semidefinite program (SDP); one standard way to write the RCC in the primal variables, with an epigraph variable t for ||x||^2, is:
  max_{Δ, x, t} tr(Δ) - t
s.t.:
  tr(Q_i Δ) + 2 g_i^T x + d_i ≤ 0, i = 0..k
  [[Δ, x], [x^T, 1]] ⪰ 0   (i.e., Δ ⪰ x x^T)
  [[I, x], [x^T, t]] ⪰ 0   (i.e., t ≥ ||x||^2)
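A hedged CVXPY sketch of the convex RCC problem above (after eliminating x̂ via x̂ = x); the function name and the (Qs, gs, ds) lists are assumptions, and the noise constraint is folded in as Q_0 = A^T A, g_0 = -A^T y, d_0 = ||y||^2 - rho^2:

```python
import cvxpy as cp
import numpy as np

def rcc(A, y, rho, Qs, gs, ds):
    """RCC estimate: maximize tr(Delta) - ||x||^2 over the relaxed set T."""
    m = A.shape[1]
    Qs = [A.T @ A] + list(Qs)      # absorb the noise constraint as f_0
    gs = [-A.T @ y] + list(gs)
    ds = [y @ y - rho**2] + list(ds)
    Delta = cp.Variable((m, m), symmetric=True)
    x = cp.Variable(m)
    cons = [cp.trace(Q @ Delta) + 2 * g @ x + d <= 0
            for Q, g, d in zip(Qs, gs, ds)]
    # Schur-complement LMI equivalent to Delta >= x x^T:
    cons.append(cp.bmat([[Delta, cp.reshape(x, (m, 1))],
                         [cp.reshape(x, (1, m)), np.ones((1, 1))]]) >> 0)
    cp.Problem(cp.Maximize(cp.trace(Delta) - cp.sum_squares(x)), cons).solve()
    return x.value                 # the RCC estimator
```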
Feasibility of the CC
Proposition: x̂_CC is feasible (i.e., x̂_CC ∈ Q).
Proof: let us write the optimization problem as min_x̂ g(x̂), with:
  g(x̂) = max_{x ∈ Q} ||x̂ - x||^2
1. g is convex in x̂ (a maximum of convex functions).
2. g is strictly convex (each ||x̂ - x||^2 is strictly convex in x̂).
3. Hence g has a UNIQUE minimizer.
Feasibility of the CC
Let us assume that x̂_CC is infeasible, and denote by z its projection onto Q (z ≠ x̂_CC). By the projection theorem, for every x ∈ Q:
  (x - z)^T (x̂_CC - z) ≤ 0
and therefore:
  ||x̂_CC - x||^2 = ||x̂_CC - z||^2 + ||z - x||^2 + 2 (x̂_CC - z)^T (z - x) ≥ ||x̂_CC - z||^2 + ||z - x||^2
Feasibility of the CC
So, for every x ∈ Q:
  ||z - x||^2 ≤ ||x̂_CC - x||^2 - ||x̂_CC - z||^2
which, using the compactness of Q (the maxima are attained) and ||x̂_CC - z||^2 > 0, implies:
  g(z) < g(x̂_CC)
But this contradicts the optimality of x̂_CC. Hence x̂_CC is unique and feasible.
Feasibility of the RCC
Proposition: x̂_RCC is feasible.
Proof: uniqueness follows from the approach used earlier. Let us prove feasibility by showing that the x-part of any feasible point of the RCC problem is also feasible for the CC problem, i.e. belongs to Q.
Feasibility of the RCC
Let (Δ̂, x̂) be a solution of the RCC problem, so x̂_RCC = x̂ and:
  tr(Q_i Δ̂) + 2 g_i^T x̂ + d_i ≤ 0,  Δ̂ ⪰ x̂ x̂^T
Since Q_i ⪰ 0 and Δ̂ - x̂ x̂^T ⪰ 0:
  tr(Q_i Δ̂) ≥ tr(Q_i x̂ x̂^T) = x̂^T Q_i x̂
We get:
  f_i(x̂) = x̂^T Q_i x̂ + 2 g_i^T x̂ + d_i ≤ tr(Q_i Δ̂) + 2 g_i^T x̂ + d_i ≤ 0
hence x̂ ∈ Q.
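A quick numeric sanity check of the key inequality used in the proof (tr(Q Δ) ≥ x^T Q x whenever Δ ⪰ x x^T and Q ⪰ 0); all matrices here are random stand-ins:

```python
import numpy as np

rng = np.random.default_rng(2)
m = 4
x = rng.standard_normal(m)
B = rng.standard_normal((m, m))
Q = B.T @ B                        # a PSD matrix
P = rng.standard_normal((m, m))
Delta = np.outer(x, x) + P.T @ P   # Delta = x x^T + (PSD term), so Delta >= x x^T
assert np.trace(Q @ Delta) >= x @ Q @ x - 1e-9
```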
CLS as CC relaxation
We now show that CLS is also a (looser) relaxation of the Chebyshev center. Reminder:
  x̂_CLS = argmin_{x ∈ C} ||y - Ax||^2
CLS as CC relaxation
Note that ||y - Ax||^2 ≤ ρ^2 is equivalent to tr(A^T A x x^T) - 2 y^T A x + ||y||^2 - ρ^2 ≤ 0.
Define the following CC relaxation:
  V = { (Δ, x) : f_i(x) ≤ 0, i = 1..k (unharmed), tr(A^T A Δ) - 2 y^T A x + ||y||^2 - ρ^2 ≤ 0 (relaxed), Δ ⪰ x x^T }
Only the noise constraint is relaxed; the constraints on x itself are kept intact.
CLS as CC relaxation
Theorem: the CLS estimate is the same as the relaxed CC over V (here CCV).
Proof sketch: let (Δ̂, x̂_V) be the CCV solution and x̂_CLS the CLS solution, and suppose x̂_V ≠ x̂_CLS. Like the RCC, the CCV is a strictly convex problem, so its solution is unique.
CLS as CC relaxation
Define
  Δ_0 = Δ̂ - x̂_V x̂_V^T + x̂_CLS x̂_CLS^T
It is easy to show that (Δ_0, x̂_CLS) ∈ V (hence it is a valid solution for the CCV): Δ_0 ⪰ x̂_CLS x̂_CLS^T follows from Δ̂ ⪰ x̂_V x̂_V^T, the constraints f_i involve only x, and the relaxed noise constraint holds since ||y - A x̂_CLS||^2 < ||y - A x̂_V||^2.
CLS as CC relaxation
Denote by h(Δ, x) = tr(Δ) - ||x||^2 the objective of the CCV. By definition:
  h(Δ_0, x̂_CLS) = h(Δ̂, x̂_V)
while the relaxed noise constraint is strictly slack (> 0 margin) at (Δ_0, x̂_CLS). Adding a small multiple of the identity to Δ_0 therefore keeps the point in V and strictly increases h, contradicting the optimality of (Δ̂, x̂_V). Hence x̂_V = x̂_CLS.
CLS vs. RCC
Now, as in the proof of the feasibility of the RCC (via tr(Q_i Δ) ≥ x^T Q_i x), we know that:
  T ⊆ V
And so:
  max_{(Δ, x) ∈ T} { tr(Δ) - ||x||^2 } ≤ max_{(Δ, x) ∈ V} { tr(Δ) - ||x||^2 }
Which means that the CLS estimate is the solution of a looser relaxation than that of the RCC.
Modeling Constraints
- The RCC optimization method is based upon a relaxation of the set Q.
- Different characterizations of Q may lead to different relaxed sets.
- Indeed, the RCC depends on the specific chosen form of Q (unlike the CC and the CLS).
Linear Box Constraints
Suppose we want to append box constraints upon x:
  l_i ≤ x_i ≤ u_i
These can also be written as:
  (x_i - l_i)(x_i - u_i) ≤ 0, i.e., x_i^2 - (l_i + u_i) x_i + l_i u_i ≤ 0
Which of the two is preferable...?
Linear Box Constraints
Define the two corresponding relaxed sets (showing only the box-related constraints):
  T_1 = { (Δ, x) : l_i ≤ x_i ≤ u_i, Δ ⪰ x x^T }
  T_2 = { (Δ, x) : Δ_ii - (l_i + u_i) x_i + l_i u_i ≤ 0, Δ ⪰ x x^T }
Linear Box Constraints
Suppose (Δ, x) ∈ T_2; then:
  Δ_ii - (l_i + u_i) x_i + l_i u_i ≤ 0
Since Δ ⪰ x x^T implies Δ_ii ≥ x_i^2, it follows that:
  x_i^2 - (l_i + u_i) x_i + l_i u_i ≤ 0
Which can be written as:
  (x_i - l_i)(x_i - u_i) ≤ 0, i.e., l_i ≤ x_i ≤ u_i
Linear Box Constraints
Hence: T_2 ⊆ T_1. T_1 is a looser relaxation -> T_2 (the quadratic form) is preferable.
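A small numeric illustration of the containment T_2 ⊆ T_1 (all values are arbitrary stand-ins): a pair (Δ, x) satisfying the componentwise quadratic constraint automatically lies in the box, via Δ_ii ≥ x_i^2:

```python
import numpy as np

l, u = -1.0, 1.0
x = np.array([0.3, -0.8])
Delta = np.outer(x, x) + 0.01 * np.eye(2)                     # Delta >= x x^T
quad_ok = np.all(np.diag(Delta) - (l + u) * x + l * u <= 0)   # T_2 constraint
box_ok = np.all((l <= x) & (x <= u))                          # T_1 constraint
print(quad_ok, box_ok)  # quad_ok implies box_ok; the converse can fail
```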
Linear Box Constraints
An example in R^2: the constraints have been chosen as the intersection of:
- a randomly generated ellipsoid
- the box [-1, 1] x [-1, 1]
Linear Box Constraints
[Figure: feasible set and estimates for the R^2 example.]
An example – Image Deblurring
- x is a 16 x 16 image, rastered into a vector of length 256.
- A is a 256 x 256 matrix representing atmospheric turbulence blur (half-bandwidth 4, standard deviation 0.8).
- w is a white Gaussian noise vector with standard deviation 0.05.
- The observations are y = Ax + w.
- We want x back...
An example – Image Deblurring
[Figure: reconstructions by LS, RLS (with a suitable regularization parameter), CLS, and RCC.]
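An end-to-end sketch of the deblurring setup; the Gaussian band matrix below is a generic stand-in for the slides' atmospheric-blur model (the half-bandwidth 4, standard deviation 0.8, and noise std 0.05 follow the slide; everything else is an assumption), and only the LS and RLS estimates are computed here:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 16
idx = np.arange(N)
# 1-D Gaussian blur (half-bandwidth 4, std 0.8); the 2-D blur is its Kronecker square.
B1 = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 0.8) ** 2)
B1[np.abs(idx[:, None] - idx[None, :]) > 4] = 0.0
A = np.kron(B1, B1)                           # 256 x 256 blurring matrix
x = rng.random(N * N)                         # stand-in 16 x 16 image, rastered
y = A @ x + 0.05 * rng.standard_normal(N * N) # observations y = Ax + w

x_ls = np.linalg.lstsq(A, y, rcond=None)[0]   # ill-conditioned: noise gets amplified
lam = 0.1                                     # illustrative regularization weight
x_rls = np.linalg.solve(A.T @ A + lam * np.eye(N * N), A.T @ y)
```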
Questions..?
Now is the time...