Common Intersection of Half-Planes in R²

Common Intersection of Half-Planes in R²

PROBLEM (Common intersection of half-planes in R²): Given n half-planes H₁, H₂, ..., Hₙ in R², compute their intersection H₁ ∩ H₂ ∩ ... ∩ Hₙ.

There is a simple O(n²) algorithm for computing the intersection of n half-planes in R²: add the half-planes one at a time, clipping the current convex region (of size O(n)) against each.

Theorem: The intersection of n half-planes in R² can be found in Θ(n log n) time, and this is optimal.

Common Intersection of Half-Planes in R²

Theorem: The intersection of n half-planes in R² can be found in Θ(n log n) time, and this is optimal.

Proof.
(1) For the upper bound, we solve the problem by divide-and-conquer: split the half-planes into two halves, recurse, and merge the solutions to the sub-problems by intersecting the two resulting convex polygons in linear time. This gives T(n) = 2T(n/2) + O(n) = O(n log n).
(2) For the lower bound, we show that Sorting ∝_{O(n)} Common intersection of half-planes. Given n real numbers x₁, ..., xₙ, let Hᵢ : y ≥ 2xᵢx − xᵢ², the half-plane above the tangent to the parabola y = x² at x = xᵢ. Once P = H₁ ∩ H₂ ∩ ... ∩ Hₙ is formed, we may read off the xᵢ's in sorted order from the slopes 2xᵢ of successive edges of P.
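
The reduction can be checked numerically. The sketch below is illustrative only (the function name and the grid sweep are ours, a toy substitute for walking the polygon's edges): each input number xᵢ contributes the tangent line y = 2xᵢx − xᵢ², and the boundary of P is the upper envelope of these lines, whose edge slopes increase from left to right.

```python
def sorted_via_halfplanes(xs, step=0.25):
    """Recover xs in ascending order by sweeping the boundary of
    P = {(x, y) : y >= 2*x_i*x - x_i**2 for all i}, the intersection of the
    half-planes above the tangents to y = x^2.  The boundary is the upper
    envelope of the tangent lines; its edge slopes 2*x_i increase from left
    to right, so the sweep meets the x_i in sorted order.  Assumes distinct
    x_i with pairwise gaps larger than `step`."""
    lo, hi = min(xs) - 1.0, max(xs) + 1.0
    met = []
    x = lo
    while x <= hi:
        # index of the tangent line attaining the upper envelope at abscissa x
        i = max(range(len(xs)), key=lambda j: 2 * xs[j] * x - xs[j] ** 2)
        if not met or met[-1] != xs[i]:
            met.append(xs[i])
        x += step
    return met

print(sorted_via_halfplanes([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```

Since sorting n numbers requires Ω(n log n) comparisons and the construction above takes only O(n), the intersection itself cannot be computed faster than Ω(n log n).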

Linear Programming in R²

PROBLEM (2-variable LP): Minimize ax + by, subject to aᵢx + bᵢy + cᵢ ≤ 0, i = 1, ..., n.

2-variable LP ∝_{O(n)} Common intersection of half-planes in R².

Theorem: A linear program in two variables and n constraints can be solved in O(n log n) time.

Linear Programming in R²

Theorem: A linear program in two variables and n constraints can be solved in Θ(n) time.

It can be solved by the prune-and-search technique. This technique discards not only redundant constraints (those that are also irrelevant to the half-plane intersection task) but also constraints that are guaranteed not to contain the vertex extremizing the objective function (the optimum vertex).

Linear Programming in R²

The 2-variable LP problem

  Minimize ax + by subject to aᵢx + bᵢy + cᵢ ≤ 0, i = 1, ..., n.   (LP1)

can be transformed in O(n) time, by setting Y = ax + by and X = x (assuming b ≠ 0), into

  Minimize Y subject to αᵢX + βᵢY + cᵢ ≤ 0, i = 1, ..., n.   (LP2)

where αᵢ = aᵢ − (a/b)bᵢ and βᵢ = bᵢ/b.
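
The substitution can be written out directly as a sanity check. This is a sketch with our own names (`transform`, the tuple encoding); the constraint direction aᵢx + bᵢy + cᵢ ≤ 0 and b ≠ 0 are assumed as above.

```python
def transform(a, b, constraints):
    """Rewrite 'minimize a*x + b*y s.t. ai*x + bi*y + ci <= 0' (LP1) in the
    variables X = x, Y = a*x + b*y as 'minimize Y s.t.
    alpha_i*X + beta_i*Y + ci <= 0' (LP2).  Requires b != 0."""
    assert b != 0
    return [(ai - (a / b) * bi, bi / b, ci) for (ai, bi, ci) in constraints]

# Spot check: each constraint takes the same value at a sample point,
# whether written in (x, y) or in (X, Y) coordinates.
a, b = 2.0, 3.0
cons = [(1.0, 1.0, -1.0), (-2.0, 0.5, 4.0)]
x, y = 0.5, 0.7
X, Y = x, a * x + b * y
for (ai, bi, ci), (al, be, ci2) in zip(cons, transform(a, b, cons)):
    assert abs((ai * x + bi * y + ci) - (al * X + be * Y + ci2)) < 1e-12
```

If b = 0 the objective is a function of x alone, and a symmetric substitution with the roles of x and y exchanged applies.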

Linear Programming in R²

In the new form we must compute the smallest Y over the vertices of the convex polygon P (the feasible region) determined by the constraints.

[Figure: feasible polygon P in the (X, Y) plane, with the optimum (lowest) vertex marked.]

Linear Programming in R²

To avoid constructing the entire boundary of P, we proceed as follows. Depending on whether βᵢ is zero, negative, or positive, we partition the index set {1, 2, ..., n} into sets I₀, I₋, I₊.

[Figure: polygon P bounded below by F₋(X) and above by F₊(X), between the vertical lines X = u₁ and X = u₂.]

Linear Programming in R²

I₀: all constraints in I₀ are vertical lines and determine the feasible interval for X:
  u₁ ≤ X ≤ u₂, where u₁ = max{−cᵢ/αᵢ : i ∈ I₀, αᵢ < 0} and u₂ = min{−cᵢ/αᵢ : i ∈ I₀, αᵢ > 0}.

I₊: all constraints in I₊ bound Y from above and define a piecewise-linear upward-convex function
  F₊(X) = min_{i ∈ I₊} (σᵢX + τᵢ), where σᵢ = −(αᵢ/βᵢ) and τᵢ = −(cᵢ/βᵢ).

I₋: all constraints in I₋ bound Y from below and define a piecewise-linear downward-convex function
  F₋(X) = max_{i ∈ I₋} (σᵢX + τᵢ), where σᵢ = −(αᵢ/βᵢ) and τᵢ = −(cᵢ/βᵢ).
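
A direct transcription of this partition (a sketch with our own names; constraints are (αᵢ, βᵢ, cᵢ) triples meaning αᵢX + βᵢY + cᵢ ≤ 0):

```python
def partition(constraints):
    """Split constraints into the interval bounds [u1, u2] (from I0) and the
    envelope functions F+ (from I+) and F- (from I-)."""
    u1, u2 = float("-inf"), float("inf")
    upper, lower = [], []           # (sigma_i, tau_i) line pairs
    for alpha, beta, c in constraints:
        if beta == 0:               # I0: alpha*X + c <= 0, a vertical constraint
            if alpha > 0:
                u2 = min(u2, -c / alpha)
            elif alpha < 0:
                u1 = max(u1, -c / alpha)
        elif beta > 0:              # I+: Y <= sigma*X + tau;  F+ = min of these
            upper.append((-alpha / beta, -c / beta))
        else:                       # I-: Y >= sigma*X + tau;  F- = max of these
            lower.append((-alpha / beta, -c / beta))
    Fplus = lambda X: min(s * X + t for s, t in upper)
    Fminus = lambda X: max(s * X + t for s, t in lower)
    return u1, u2, Fplus, Fminus

# Feasible region: |X| <= Y <= 5 with -3 <= X <= 3.
u1, u2, Fp, Fm = partition(
    [(1, -1, 0), (-1, -1, 0), (0, 1, -5), (1, 0, -3), (-1, 0, -3)])
print(u1, u2, Fm(2.0), Fp(2.0))  # -3.0 3.0 2.0 5.0
```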

Linear Programming in R²

Our problem thus becomes:

  Minimize F₋(X) subject to F₋(X) ≤ F₊(X) and u₁ ≤ X ≤ u₂.   (LP3)

Given a value X′ of X, the primitive called evaluation computes F₊(X′) and F₋(X′) in O(n) time:
  if H(X′) = F₋(X′) − F₊(X′) > 0, then X′ is infeasible;
  if H(X′) = F₋(X′) − F₊(X′) ≤ 0, then X′ is feasible.

Linear Programming in R²

Given a value X′ of X in [u₁, u₂], we are able to reach one of the following conclusions in O(n) time:
 - X′ is infeasible and there is no solution to the problem;
 - X′ is infeasible and we know on which side of X′ (right or left) any feasible value of X may lie;
 - X′ is feasible and we know on which side of X′ (right or left) the minimum of F₋(X) lies;
 - X′ achieves the minimum of F₋(X).
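
The case analysis can be sketched as follows (our own function, not from the slides; `lower`/`upper` are the (σ, τ) line lists for F₋ and F₊). The side tests use one-sided slopes: H = F₋ − F₊ is convex, so the signs of its one-sided derivatives at X′ tell where H can still be ≤ 0, and likewise the one-sided slopes of the convex function F₋ tell where its minimum lies.

```python
def evaluate(lower, upper, Xq, eps=1e-9):
    """O(n) evaluation at abscissa Xq: returns 'no solution', 'left',
    'right', or 'optimal', matching the four conclusions above."""
    Fm = max(s * Xq + t for s, t in lower)   # F-(Xq), a max of lines (convex)
    Fp = min(s * Xq + t for s, t in upper)   # F+(Xq), a min of lines (concave)
    # slopes of the lines attaining F- and F+ at Xq
    sm = [s for s, t in lower if abs(s * Xq + t - Fm) < eps]
    sp = [s for s, t in upper if abs(s * Xq + t - Fp) < eps]
    if Fm - Fp > 0:                          # Xq infeasible: H(Xq) > 0
        if min(sm) - max(sp) > 0:            # left derivative of H is positive
            return "left"                    # feasible X can only lie left
        if max(sm) - min(sp) < 0:            # right derivative of H is negative
            return "right"
        return "no solution"                 # Xq minimizes H and H(Xq) > 0
    # Xq feasible: locate the minimum of the convex function F-
    if min(sm) > 0:
        return "left"
    if max(sm) < 0:
        return "right"
    return "optimal"

lower = [(1.0, 0.0), (-1.0, 0.0)]            # F-(X) = |X|
upper = [(0.0, 5.0)]                         # F+(X) = 5
print(evaluate(lower, upper, 2.0))           # left: the minimum of F- lies left of 2
```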

Linear Programming in R²

We should choose the abscissa X′ at which evaluation takes place so that, if the algorithm does not terminate immediately, at least a fixed fraction δ of the currently active constraints can be pruned. We then get the overall running time

  T(n) ≤ Σᵢ k(1 − δ)^{i−1} n < kn/δ = O(n).
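
Written out, with k·m the cost of one stage on m active constraints, the bound is a geometric series:

```latex
T(n) \;\le\; \sum_{i \ge 1} k\,(1-\delta)^{\,i-1}\, n
     \;=\; \frac{kn}{1-(1-\delta)}
     \;=\; \frac{kn}{\delta}
     \;=\; O(n).
```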

Linear Programming in R²

We show that the value δ = 1/4 can be achieved, as follows. Consider a generic stage with M active constraints, and let I₊ and I₋ be the index sets defined earlier, with |I₊| + |I₋| = M. We partition each of I₊ and I₋ into pairs of constraints. For each pair (i, j) of I₊, in O(M) total time:
 - If σᵢ = σⱼ (the corresponding straight lines are parallel), then one of the two can be eliminated (Fig a).
 - Otherwise, let Xᵢⱼ denote the abscissa of their intersection:
   - if Xᵢⱼ < u₁ or Xᵢⱼ > u₂, then one of the two can be eliminated (Figs b, c);
   - if u₁ ≤ Xᵢⱼ ≤ u₂, then we retain Xᵢⱼ with no elimination.
Each pair (i, j) of I₋ is handled analogously, also in O(M) total time.

[Figure: elimination cases for a pair of lines Y = σᵢX + τᵢ and Y = σⱼX + τⱼ. Fig a: parallel lines, the dominated one is eliminated. Fig b: Xᵢⱼ < u₁, one line is eliminated. Fig c: u₂ < Xᵢⱼ, one line is eliminated.]

Linear Programming in R²

For all pairs in which neither member has been eliminated, we compute the abscissa of their intersection. Thus, if k constraints have been eliminated, we obtain a set S of (M − k)/2 intersection abscissae. O(M)
Find the median X_{1/2} of S. O(M)
If X_{1/2} is not the extremizing abscissa, we test on which side of X_{1/2} the optimum lies. O(M)
Then half of the Xᵢⱼ's lie in the region known not to contain the optimum, and for each such Xᵢⱼ one constraint of its pair can be eliminated (Fig d). O(M)
This concludes the stage, with the result that at least k + [(M − k)/2]/2 ≥ M/4 constraints have been eliminated.
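
The final count is easy to verify: k constraints go in the pairing pass, and at least half of the (M − k)/2 retained abscissae fall on the discarded side, each eliminating one more constraint, so the total is at least k + (M − k)/4 = (M + 3k)/4 ≥ M/4. A quick numeric check of this bound (our own, not part of the slides):

```python
# Verify k + ((M - k) / 2) / 2 >= M / 4 for every 0 <= k <= M.
for M in range(1, 200):
    for k in range(M + 1):
        assert k + ((M - k) / 2) / 2 >= M / 4
print("bound holds for all M < 200")
```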

[Fig d: the optimum lies to the left of X_{1/2}; for every intersection abscissa Xᵢⱼ to the right of X_{1/2}, one constraint of the corresponding pair is eliminated.]

Linear Programming in R²

Prune-and-search algorithm for the 2-variable LP problem:
 1. Transform (LP1) to (LP2) and (LP3). O(M)
 2. For each pair of constraints: if σᵢ = σⱼ, or Xᵢⱼ < u₁, or Xᵢⱼ > u₂, eliminate one constraint. O(M)
 3. Let S be the intersection abscissae Xᵢⱼ of the remaining pairs, with u₁ ≤ Xᵢⱼ ≤ u₂. Find the median X_{1/2} of S and test on which side of X_{1/2} the optimum lies. O(M)
 4. Half of the Xᵢⱼ's lie in the region known not to contain the optimum; for each such Xᵢⱼ, eliminate one constraint. O(M)
 5. Repeat from step 2 with the surviving constraints until the optimum is found.
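
For checking an implementation of the algorithm above, a brute-force reference solver is handy. The sketch below is entirely our own (it enumerates candidate vertices in O(n³) rather than pruning, so it illustrates only the problem, not the O(n) bound): it minimizes ax + by subject to aᵢx + bᵢy + cᵢ ≤ 0 by intersecting every pair of constraint boundaries and keeping the best feasible point.

```python
from itertools import combinations

def brute_force_lp(a, b, constraints, eps=1e-9):
    """Minimize a*x + b*y s.t. ai*x + bi*y + ci <= 0 for (ai, bi, ci) in
    constraints.  Returns (value, (x, y)) at an optimum vertex, or None if
    no candidate vertex is feasible.  Assumes a bounded, nondegenerate LP,
    so that some optimum lies at the intersection of two constraint lines."""
    feasible = lambda x, y: all(
        ai * x + bi * y + ci <= eps for ai, bi, ci in constraints)
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < eps:               # parallel boundaries: no vertex
            continue
        # solve a1*x + b1*y = -c1, a2*x + b2*y = -c2 by Cramer's rule
        x = (b1 * c2 - b2 * c1) / det
        y = (a2 * c1 - a1 * c2) / det
        if feasible(x, y):
            v = a * x + b * y
            if best is None or v < best[0]:
                best = (v, (x, y))
    return best

# minimize y subject to y >= 2 - x and y >= x  (optimum at x = y = 1)
print(brute_force_lp(0.0, 1.0, [(-1.0, -1.0, 2.0), (1.0, -1.0, 0.0)]))
# (1.0, (1.0, 1.0))
```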

Common Intersection

Common intersection of half-planes in R²: Θ(n log n)
2-variable linear programming: Θ(n)

We must point out that explicit construction of the feasible polytope is not a viable approach to linear programming in higher dimensions, because the number of vertices can grow exponentially with the dimension; for example, the n-dimensional hypercube has 2ⁿ vertices. The size of the common intersection of half-spaces in Rᵏ can be exponential in k, while the time complexity of k-variable linear programming is polynomial in k. Hence the two problems are not equivalent in higher dimensions.