Easy Optimization Problems, Relaxation, Local Processing for a single variable.


Multiscale solvers
Coarsening: create a hierarchy of problems (graphs, equations, systems of particles, etc.)

[Figure: the original system and its 1st, 2nd, and 3rd coarsenings]


[Figure: the coarsest-level solution]

Multiscale solvers
1. Coarsening: create a hierarchy of problems (graphs, equations, systems of particles, etc.)
2. Solve the coarsest level
3. Uncoarsening:
 - Initialize the solution on a finer level from the coarser level by interpolation
 - Improve the initial solution by local processing
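
The three steps above can be sketched on a toy problem. The following is a minimal sketch, not the lecture's solver: it minimizes the 1D chain energy E(x) = sum_i (x_i - x_{i+1})^2 with both endpoints fixed (whose exact minimizer is the straight line), coarsening by halving the number of points, solving the coarsest level directly, and uncoarsening by linear interpolation followed by local Gauss-Seidel relaxation. All function names here are illustrative.

```python
import numpy as np

def relax(x, sweeps=3):
    # Local processing: Gauss-Seidel sweeps for E(x) = sum_i (x[i] - x[i+1])^2
    # with x[0] and x[-1] held fixed. The one-variable minimizer for each
    # interior point is the average of its two neighbors.
    for _ in range(sweeps):
        for i in range(1, len(x) - 1):
            x[i] = 0.5 * (x[i - 1] + x[i + 1])
    return x

def multiscale_solve(a, b, n):
    """Solve the chain problem on n+1 points with ends fixed at a and b."""
    if n <= 2:
        # Coarsest level: solve directly (the minimizer is the straight line)
        return np.linspace(a, b, n + 1)
    # Coarsening: pose the same problem with half the points
    coarse = multiscale_solve(a, b, n // 2)
    # Uncoarsening: initialize the finer level by linear interpolation
    x = np.interp(np.linspace(0.0, 1.0, n + 1),
                  np.linspace(0.0, 1.0, len(coarse)), coarse)
    # Improve the initial solution by local processing
    return relax(x)
```

On this toy problem the interpolated coarse solution is already exact, so relaxation only has to preserve it; on real problems the local sweeps do the remaining work at each level.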

Local processing
Main assumption: the solution on the larger scales has already been obtained by the coarser levels.
 - At each level, apply only local changes.
 - Since this is done iteratively, there is no need to solve each level to the optimum, just to approach it.

Variable-by-variable strict unconstrained minimization
 - Discrete (combinatorial) case: Ising model

2D Ising spins
 - Minimize E(s) = -sum_{i,j} ( h1 s_{i,j} s_{i+1,j} + h2 s_{i,j} s_{i,j+1} ), with spins s_{i,j} in {-1, 1}
 - Periodic boundary condition
 - Initialize randomly: s_{i,j} = 1 or -1, each with probability .5

Exc#1: 2D Ising spins exercise
 - Minimize E(s) = -sum_{i,j} ( h1 s_{i,j} s_{i+1,j} + h2 s_{i,j} s_{i,j+1} ), with h1, h2 in {-1, 1}
 - Periodic boundary condition
 - Initialize randomly: s_{i,j} = 1 or -1, each with probability .5
1. Go over the grid in lexicographic order; for each spin choose 1 or -1, whichever minimizes the energy (choose with probability ½ when the two possibilities have the same energy), until no changes are observed.
2. Repeat 3 times for each of the 4 possibilities of (h1, h2).
3. Is the global minimum achievable?
4. What local minima do you observe?
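
A minimal sketch of step 1 of the exercise, assuming the energy takes the nearest-neighbor form written above (the exact form was an image in the original slides, so the coupling convention here is an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(s, h1, h2):
    # E = -sum h1*s[i,j]*s[i+1,j] - sum h2*s[i,j]*s[i,j+1], periodic
    return -(h1 * np.sum(s * np.roll(s, -1, axis=0))
             + h2 * np.sum(s * np.roll(s, -1, axis=1)))

def sweep(s, h1, h2):
    """One lexicographic sweep; each spin is set to whichever of +/-1
    minimizes the energy given its four neighbors. Returns #flips."""
    n = s.shape[0]
    flips = 0
    for i in range(n):
        for j in range(n):
            # The terms of E containing s[i,j] equal -field * s[i,j]
            field = (h1 * (s[(i - 1) % n, j] + s[(i + 1) % n, j])
                     + h2 * (s[i, (j - 1) % n] + s[i, (j + 1) % n]))
            # sign(field) minimizes; on a tie, pick +/-1 with probability 1/2
            new = np.sign(field) if field != 0 else rng.choice([-1, 1])
            if new != s[i, j]:
                s[i, j] = new
                flips += 1
    return flips

def relax_until_stable(s, h1, h2, max_sweeps=100):
    for _ in range(max_sweeps):
        if sweep(s, h1, h2) == 0:
            break
    return s
```

Each pointwise update never increases E (tie flips leave it unchanged), so the sweeps descend to a local minimum; whether that is the global minimum is exactly question 3.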

Variable-by-variable strict unconstrained minimization
 - Discrete (combinatorial) case: Ising model
 - Quadratic case: P=2

Necessary optimality conditions
Let x* be a local minimum of f, and assume f is continuously differentiable in some domain D containing x*. Then the 1st-order necessary condition is
  grad f(x*) = 0.
If in addition f is twice continuously differentiable within D, then the 2nd-order necessary condition is
  the Hessian H(x*) is positive semidefinite.

Sufficient optimality conditions
Let f be twice continuously differentiable in a domain D, and let x* in D satisfy the conditions
  grad f(x*) = 0,  H(x*) positive definite.
Then x* is a strict unconstrained local minimum of f. If, in addition, f is quadratic, the local minimum is also the unique global minimum.
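
A small numerical illustration of both conditions on a quadratic, where everything is explicit: for f(x) = 0.5 x^T A x - b^T x the gradient is A x - b and the Hessian is A, so the stationary point A x* = b with A positive definite is the unique global minimum (the matrix and vector below are arbitrary example data):

```python
import numpy as np

# f(x) = 0.5 x^T A x - b^T x, with A symmetric positive definite
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

x_star = np.linalg.solve(A, b)   # stationary point: A x* = b

# 1st-order necessary condition: the gradient A x - b vanishes at x*
assert np.allclose(A @ x_star - b, 0.0)
# 2nd-order sufficient condition: the Hessian A is positive definite
assert np.all(np.linalg.eigvalsh(A) > 0)
# Since f is quadratic, x* is therefore the unique global minimum.
```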

Pointwise relaxation for P=2
 - Minimize E(x) = sum_{ij} w_ij (x_i - x_j)^2
 - Pick a variable x_i; fix all other x_j (j ≠ i) at their current values
 - Minimize E over x_i alone: a quadratic functional in one variable – easy to solve!

Pointwise relaxation for P=2 (cont.)
 - Check the 2nd derivative: d²E/dx_i² = 2 sum_j w_ij > 0  =>  unique minimum!
 - Put x_i at the weighted average location of its graph neighbors: x_i = (sum_j w_ij x_j) / (sum_j w_ij)
 - Go over all variables in lexicographic order
Problem: this does not preserve the volume demands!
Reinforce the volume demands at the end of each sweep.
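
A minimal sketch of one such sweep, under the assumed form E(x) = sum_{ij} w_ij (x_i - x_j)^2 written above (the volume-demand reinforcement is omitted; a couple of variables are held fixed instead so the sweep cannot collapse everything to one point):

```python
import numpy as np

def p2_sweep(x, W, fixed):
    """One lexicographic sweep of pointwise relaxation for
    E(x) = 0.5 * sum_{ij} w_ij (x_i - x_j)^2.
    Each free variable moves to the weighted average of its graph
    neighbors: the unique one-variable minimizer, since the second
    derivative 2 * sum_j w_ij is positive."""
    for i in range(len(x)):
        if i in fixed:
            continue
        w = W[i]                      # weights to i's neighbors (W[i, i] = 0)
        total = w.sum()
        if total > 0:
            x[i] = (w @ x) / total    # weighted average location
    return x

def p2_energy(x, W):
    d = x[:, None] - x[None, :]
    return 0.5 * np.sum(W * d * d)
```

Because each update is the exact minimizer of its one-variable subproblem, the energy can only decrease (or stay put) across a sweep.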

Variable-by-variable strict unconstrained minimization
 - Discrete (combinatorial) case: Ising model
 - Quadratic case: P=2
 - General functional: P=1, P>2

Exc#2: Pointwise relaxation for P=1
 - Minimize E(x) = sum_{ij} w_ij |x_i - x_j|
 - Pick a variable x_i; fix all other x_j (j ≠ i) at their current values
 - Minimize E over x_i
 - Find the optimal location for x_i