How do I know the answer if I’m not sure of the question?

Presentation transcript:

How do I know the answer if I’m not sure of the question? Putting robustness into estimation K. E. Schubert 11/7/00

Familiar Picture?

Basic Problem Picture of something that has been blurred. If I know how it was blurred, then I should be able to clean it up. If the system A is invertible, then I can recover the original from the blurred picture: x = A†b.

Familiar Picture

Encountering Resistance Consider a simpler problem: an unknown resistor. Take current and voltage measurements and plot them out. We want to fit a line to the points, but no measurement is perfect, so no line fits all the points exactly. We want the “best” fit.

Measured Values

Gauss’ Stellar Problem The orbit of Ceres. The errors were in people’s measurements. Consider the distance from the measurements to the equation being fit, and minimize the square of this distance: min ||Ax − b||². The solution is x = (AᵀA)⁻¹Aᵀb = A†b.
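The normal-equations solution above can be sketched in a few lines; the line-fit data here are made up for illustration:

```python
import numpy as np

# Hypothetical data: fit a line y = c0 + c1*t to noisy observations.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 20)
b = 2.0 + 0.5 * t + 0.05 * rng.standard_normal(t.size)

# Build the design matrix A so that the model reads A @ x ≈ b.
A = np.column_stack([np.ones_like(t), t])

# Normal-equations solution x = (A^T A)^{-1} A^T b, i.e. x = A† b.
x = np.linalg.solve(A.T @ A, A.T @ b)

# Same answer via the pseudoinverse (numerically the safer route).
x_pinv = np.linalg.pinv(A) @ b
print(x, np.allclose(x, x_pinv))
```

In practice `np.linalg.lstsq` or the pseudoinverse is preferred over forming AᵀA explicitly, since squaring the matrix squares its condition number.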

Understanding Solution In our problem A and b are vectors. We are finding the scaling of A nearest to b: the least-squares solution projects b onto the line spanned by A, and the residual Ax − b is orthogonal to A.

Resistor Solved We want to find the slope 1/R in i = (1/R)v. In Ax = b form: A is the vector of voltages, b is the vector of currents, and x is the slope. Then 1/R = v†i.

Best line

Reasonable Question What if I instead considered v = iR? The errors are assumed to be in v now, and R = i†v. How do the measured resistances compare?
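A minimal sketch of the two fits, using hypothetical measurements of a 100-ohm resistor; for vectors, v†i and i†v are just the scalar least-squares slopes:

```python
import numpy as np

# Hypothetical resistor measurements (true R = 100 ohms), noisy currents.
rng = np.random.default_rng(1)
v = np.linspace(1.0, 10.0, 15)                      # applied voltages
i = v / 100.0 + 1e-4 * rng.standard_normal(v.size)  # measured currents

# Fit i = (1/R) v, errors assumed in i: slope 1/R = v† i.
R_from_iv = 1.0 / (np.dot(v, i) / np.dot(v, v))

# Fit v = i R, errors assumed in v: R = i† v.
R_from_vi = np.dot(i, v) / np.dot(i, i)

print(R_from_iv, R_from_vi)  # the two estimates differ slightly
```

Which variable carries the errors changes the answer, which is exactly the point of the slide.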

Comparison of Methods

Errors in Both A has errors (the actual matrix is A + dA). We still want to minimize the distance, min ||(A + dA)x − b||², but we need to know something about dA: the worst dA in a bounded region, the best dA in a bounded region, or the dA that makes Ax = b consistent.

Worst in a Bounded Region If we keep the worst case acceptable, the rest will be fine. The perturbation is restricted to a bounded region (||dA|| bounded), and the solution projects b onto the farthest perturbed matrix A + dA.

Best in a Bounded Region Pick the best dA, but limit the options: again ||dA|| is restricted to a bounded region, and the solution projects b onto the nearest perturbed matrix A + dA.

Consistent Equation (TLS) This is called Total Least Squares: project to the nearest consistent pair of A and b in the augmented space. There is no bound on dA; it can be as big as needed!
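The classical TLS solution comes from the SVD of the augmented matrix [A | b]: take the right singular vector belonging to the smallest singular value. A sketch with made-up data (it assumes the last component of that vector is nonzero):

```python
import numpy as np

# Hypothetical overdetermined system with noise in both A and b.
rng = np.random.default_rng(2)
x_true = np.array([1.0, -2.0])
A = rng.standard_normal((30, 2))
b = A @ x_true
A_noisy = A + 0.01 * rng.standard_normal(A.shape)
b_noisy = b + 0.01 * rng.standard_normal(b.shape)

# TLS: smallest right singular vector of the augmented matrix [A | b].
C = np.column_stack([A_noisy, b_noisy])
_, _, Vt = np.linalg.svd(C)
v = Vt[-1]               # right singular vector for the smallest singular value
x_tls = -v[:-1] / v[-1]  # requires v[-1] != 0
print(x_tls)
```

The minimizing dA (and db) make (A + dA)x = b + db exactly consistent, matching the slide's geometric picture.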

General Regression Problems All of the techniques mentioned so far, including least squares, fall into the general category of regression. For most, the solution is found by taking the gradient and setting it equal to zero, giving x = (AᵀA + λI)⁻¹Aᵀb together with an equation for λ, which is solved by finding its roots (Newton’s method or bisection).
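A minimal sketch of this scheme for the worst-case bounded-dA problem, assuming it reduces to the regularized form min ||Ax − b|| + η||x|| (η and the data here are hypothetical). Setting the gradient to zero gives x = (AᵀA + λI)⁻¹Aᵀb with λ = η||Ax − b||/||x||, solved below by fixed-point iteration rather than Newton's method or bisection:

```python
import numpy as np

# Sketch: solve min_x ||Ax - b|| + eta*||x||  (worst case over ||dA|| <= eta)
# by iterating on the scalar lam in x(lam) = (A^T A + lam*I)^{-1} A^T b.
# The data and eta are made up for illustration.
rng = np.random.default_rng(3)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)
eta = 0.1

lam = 0.0
for _ in range(200):
    x = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ b)
    # Stationarity condition: lam = eta * ||Ax - b|| / ||x||.
    lam_new = eta * np.linalg.norm(A @ x - b) / np.linalg.norm(x)
    if abs(lam_new - lam) < 1e-12:
        break
    lam = lam_new

print(lam, x)
```

Starting from λ = 0 the iterates increase monotonically toward the fixed point here, but in general a safeguarded root finder (bisection, as the slide says) is the more reliable choice.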

Resistor by TLS

Simple Picture Consider a city skyline: a nice one-dimensional picture. Only consider the outline of the buildings, so height is a function of horizontal distance.

Hazy Day Smog and haze blur the image, rounding the corners off. We want to get the corners back.

Least Squares Fails! The blurring works like a Gaussian distribution, and we don’t know the exact blur.
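One way to see the failure: a Gaussian blur acts as a matrix whose rows are shifted Gaussians, and that matrix is terribly conditioned, so the naive least-squares deblur x = A†b amplifies any error in b or in the assumed blur width. A sketch with a hypothetical blur width and image size:

```python
import numpy as np

# Hypothetical 1-D Gaussian blur matrix: row j is a Gaussian centered at j.
n, sigma = 50, 3.0
j = np.arange(n)
A = np.exp(-((j[:, None] - j[None, :]) ** 2) / (2 * sigma**2))
A /= A.sum(axis=1, keepdims=True)   # normalize each row

# Huge condition number: tiny data errors explode in x = A† b.
print(np.linalg.cond(A))
```

The blur averages neighboring pixels, so its small singular values are nearly zero; inverting them is what wrecks the least-squares reconstruction.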

TLS Too Optimistic! TLS assumes things are consistent and allows dA to be large.

More Robust Solutions Picking a solution with some restrictions yields good results.

Conclusions Least squares has nice properties and generally works well, but problems can arise even in simple settings. The errors are fundamental: we must account for errors in the basic system. A robust solution works well for all nearby systems; it cannot do as well as the best case, nor as badly as the worst (a compromise).