
Agenda
- Project 2: due this Thursday
- Office hours: Wed 10:30-12
- Image blending
- Background: constrained optimization

Recall: goal

Formulation: find the best patch f
Given the vector field v (the pasted gradient), find the values of f in the unknown region Ω that minimize:

$$\min_f \iint_\Omega \lvert \nabla f - v \rvert^2$$

(Figure: the pasted gradient v, the mask, the background f*, and the unknown region Ω.)

Notation
- Destination image: f* (table)
- Source image: g (table)
- Output image: f (table)
- Ω: set of (i,j) pixel coordinates from f* we want to replace (a list of pairs)
- ∂Ω: set of (i,j) pixel coordinates on the border of Ω (a list of pairs)

We'll use p = (i,j) to denote a pixel location:
- $g_p$ is the pixel value at p = (i,j) from the source image
- $f_\Omega$ is the set of pixels we're trying to find

The score sums over all pairs of neighboring pixels $\langle p,q \rangle$ that touch Ω:

$$S(f) = \sum_{\langle p,q \rangle} \big(f_p - f_q - (g_p - g_q)\big)^2$$

with the constraint that $f_p = f^*_p$ for all p in ∂Ω.
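To make the notation concrete, here is a minimal NumPy sketch of the score S(f) over 4-connected neighbor pairs. The function and variable names are ours, not from the slides; omega is a boolean mask over the image.

```python
import numpy as np

def score(f, g, omega):
    """S(f): sum of (f_p - f_q - (g_p - g_q))^2 over 4-connected
    neighbor pairs <p,q> that touch the unknown region omega."""
    total = 0.0
    H, W = f.shape
    for i in range(H):
        for j in range(W):
            for di, dj in ((0, 1), (1, 0)):       # each pair counted once
                qi, qj = i + di, j + dj
                if qi < H and qj < W and (omega[i, j] or omega[qi, qj]):
                    v_pq = g[i, j] - g[qi, qj]    # pasted gradient v
                    total += (f[i, j] - f[qi, qj] - v_pq) ** 2
    return total
```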

Optimization
What is the optimal $f_\Omega$ without the above constraint? What is known versus unknown?
Variational formulation of the solution: the best patch is the one that produces the lowest score S(f), subject to the constraint $f_p = f^*_p$ for all p in ∂Ω. (From here on we drop the Ω subscript on f.)

Optimization
Pretend the constraint wasn't there: how do we find the lowest-scoring $f_\Omega$?
1) Brute-force search
   - Keep guessing different patches f and scoring them
   - Output the best-scoring one
2) Gradient descent
   - Guess a patch f. Update the guess with $f \leftarrow f - \eta\, \nabla S(f)$ (a sketch follows below)
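A minimal sketch of the gradient-descent option. The step size eta and step count are arbitrary choices, and `grad` stands for any routine returning $\nabla S(f)$ (one possible implementation appears two slides down); none of these names come from the slides.

```python
def gradient_descent(f0, grad, eta=0.1, steps=500):
    """Generic descent loop: repeatedly step against the gradient.
    `grad` maps a patch f to dS/df; eta and steps are hypothetical."""
    f = f0.astype(float).copy()
    for _ in range(steps):
        f -= eta * grad(f)
    return f
```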

How to estimate the gradient?
- In general, we can always do it numerically (a sketch follows below)
- For the above quadratic function, we can calculate it in closed form
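As a sketch of the numerical route: estimate each partial derivative $\partial S / \partial f_p$ by perturbing one pixel and rescoring (central differences). This continues the earlier `score` sketch; the epsilon value is an arbitrary choice.

```python
def grad_score(f, g, omega, eps=1e-3):
    """Numerical gradient of S via central differences: perturb one
    pixel, rescore. O(|omega|) rescorings -- slow but calculus-free."""
    f = f.astype(float).copy()
    grad = np.zeros(f.shape)
    for i, j in zip(*np.nonzero(omega)):
        f[i, j] += eps
        s_plus = score(f, g, omega)
        f[i, j] -= 2 * eps
        s_minus = score(f, g, omega)
        f[i, j] += eps                            # restore original value
        grad[i, j] = (s_plus - s_minus) / (2 * eps)
    return grad
```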

Optimization
1) Brute-force search
   - Keep guessing different patches f and scoring them
   - Output the best-scoring one
2) Gradient descent
   - Guess a patch f. Update the guess with $f \leftarrow f - \eta\, \nabla S(f)$
What happens when the gradient is zero?

Optimization
1) Brute-force search
   - Keep guessing different patches f and scoring them
   - Output the best-scoring one
2) Gradient descent
   - Guess a patch f. Update the guess with $f \leftarrow f - \eta\, \nabla S(f)$
3) Closed-form solution (for simple functions)
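A one-variable illustration of option 3 (our own toy example, not from the slides): for a quadratic score, setting the derivative to zero solves the problem directly, with no iteration.

```latex
S(f) = (f - a)^2 + (f - b)^2
S'(f) = 2(f - a) + 2(f - b) = 0 \;\Rightarrow\; f = \tfrac{a + b}{2}
```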

Constrained optimization
How do we handle the constraints?
1) Brute-force search
   - Keep guessing different patches f and scoring them
   - Output the best-scoring one
2) Gradient descent
   - Guess a patch f. Update the guess with $f \leftarrow f - \eta\, \nabla S(f)$
   - Correct $f_p = f^*_p$ after each gradient update (a sketch follows below)
What happens when the gradient is zero?
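A minimal sketch of option 2 with the correction step, i.e. projected gradient descent, reusing the earlier `grad_score` sketch. Depending on whether ∂Ω is taken inside or outside Ω, the correction may be a no-op; it is shown to mirror the slide's "correct $f_p = f^*_p$" instruction. Names and step sizes are ours.

```python
def constrained_descent(f_star, g, omega, d_omega, eta=0.1, steps=500):
    """Projected gradient descent: step against the gradient of S,
    then re-impose the boundary constraint f_p = f*_p on dOmega."""
    f = f_star.astype(float).copy()            # initialize from the background
    for _ in range(steps):
        f[omega] -= eta * grad_score(f, g, omega)[omega]
        f[d_omega] = f_star[d_omega]           # the correction (projection) step
    return f
```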

Lagrangian optimization
If there were no constraint, we'd have a closed-form solution. Is there a way to get a closed-form solution that uses the constraint?

Lagrangian optimization
$\min f(x,y)$ such that $g(x,y) = 0$
Imagine we want to synthesize a "two-pixel" patch.

Lagrangian optimization
$\min f(x,y)$ such that $g(x,y) = 0$
At a constrained minimum, the gradient of f need not vanish; instead

$$\nabla f(x,y) + \lambda\, \nabla g(x,y) = 0 \quad \text{and} \quad g(x,y) = 0$$

Write the conditions with a single equation (just for convenience): define

$$F(x,y,\lambda) = f(x,y) + \lambda\, g(x,y)$$

At a minimum of F, its gradient is 0. Therefore, the following conditions hold:

$$\frac{\partial F}{\partial x} = 0, \quad \frac{\partial F}{\partial y} = 0, \quad \frac{\partial F}{\partial \lambda} = g(x,y) = 0$$
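A worked instance of the two-pixel patch (our own toy numbers, not from the slides): x is the boundary pixel pinned to the background value a, y is the unknown pixel, and v is the pasted gradient.

```latex
f(x,y) = (y - x - v)^2, \qquad g(x,y) = x - a
F(x,y,\lambda) = (y - x - v)^2 + \lambda\,(x - a)
\partial F / \partial y = 2(y - x - v) = 0 \;\Rightarrow\; y = x + v
\partial F / \partial x = -2(y - x - v) + \lambda = 0 \;\Rightarrow\; \lambda = 0
\partial F / \partial \lambda = x - a = 0 \;\Rightarrow\; x = a, \; y = a + v
```

The boundary pixel keeps its background value and the unknown pixel reproduces the pasted gradient, which is exactly what the blending formulation asks for.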

Multiple constraints
$\min f(x,y)$ such that $g_1(x,y) = 0$ and $g_2(x,y) = 0$:

$$F(x,y,\lambda_1,\lambda_2) = f(x,y) + \lambda_1\, g_1(x,y) + \lambda_2\, g_2(x,y)$$

What is f(x,y) in our case? What is $g_1(x,y)$?

Lagrangian optimization
Setting the gradient of the Lagrangian to zero gives:
- $f_p = f^*_p$ for p in ∂Ω (border pixels)
- $\partial S / \partial f_p = 0$ for all other p in Ω

Since S is quadratic in f, the above yields a set of linear equations: $A f = b$, so $f = A^{-1} b$.
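A minimal sketch of assembling and solving that linear system with SciPy sparse matrices, using the 4-neighbor stencil implied by the score S. Function and variable names are ours; omega is a boolean mask whose known neighbors supply the boundary values.

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import spsolve

def poisson_blend(f_star, g, omega):
    """Solve the linear system A f = b for the unknown pixels in omega.
    Row for pixel p: sum_q (f_p - f_q) = sum_q (g_p - g_q), with known
    neighbors f_q = f*_q moved onto the right-hand side b."""
    f_star = f_star.astype(float)
    g = g.astype(float)
    H, W = f_star.shape
    idx = {p: k for k, p in enumerate(zip(*np.nonzero(omega)))}
    A = lil_matrix((len(idx), len(idx)))
    b = np.zeros(len(idx))
    for (i, j), k in idx.items():
        for di, dj in ((0, 1), (1, 0), (0, -1), (-1, 0)):
            q = (i + di, j + dj)
            if not (0 <= q[0] < H and 0 <= q[1] < W):
                continue                       # neighbor falls off the image
            A[k, k] += 1
            b[k] += g[i, j] - g[q]             # pasted gradient v_pq
            if q in idx:
                A[k, idx[q]] -= 1              # unknown neighbor stays on the left
            else:
                b[k] += f_star[q]              # known pixel on or beyond the border
    f = f_star.copy()
    f[omega] = spsolve(A.tocsr(), b)
    return f
```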