Minicourse on parameterized algorithms and complexity Part 4: Linear programming Dániel Marx (slides by Daniel Lokshtanov) Jagiellonian University in Kraków.

Minicourse on parameterized algorithms and complexity. Part 4: Linear programming. Dániel Marx (slides by Daniel Lokshtanov). Jagiellonian University in Kraków, April 21-23, 2015.

Linear Programming. n real-valued variables x_1, x_2, ..., x_n. Linear objective function. Linear (in)equality constraints. Solvable in polynomial time.
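
As a concrete illustration (my own, not from the slides), a tiny LP solved with SciPy's linprog; the instance is arbitrary:

import numpy as np
from scipy.optimize import linprog

# Minimize x_1 + x_2 subject to x_1 + 2*x_2 >= 1, 2*x_1 + x_2 >= 1, 0 <= x_i <= 1.
# linprog takes "<=" constraints, so each ">= 1" is written as a negated "<= -1".
c = np.array([1.0, 1.0])
A_ub = -np.array([[1.0, 2.0], [2.0, 1.0]])
b_ub = -np.ones(2)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1), (0, 1)], method="highs")
print(res.fun, res.x)   # optimum 2/3, attained at x_1 = x_2 = 1/3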

Integer Linear Programming. n integer-valued variables x_1, x_2, ..., x_n. Linear objective function. Linear (in)equality constraints. NP-complete. Lingo: Linear Programs (LPs), Integer Linear Programs (ILPs).
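
For contrast, the same instance with integrality constraints, using scipy.optimize.milp (available in SciPy 1.9 and later; again my own illustration, not from the slides):

import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Same constraints as the LP above, but x_1, x_2 must now be integers:
# the optimum jumps from 2/3 to 1, e.g. at x = (1, 0).
c = np.array([1.0, 1.0])
cons = LinearConstraint(np.array([[1.0, 2.0], [2.0, 1.0]]), lb=1.0)
res = milp(c=c, constraints=cons, integrality=np.ones(2), bounds=Bounds(0, 1))
print(res.fun, res.x)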

Vertex Cover. Have seen a kernel with O(k^2) vertices, will see a kernel with 2k vertices.

Vertex Cover (I)LP
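
For reference, the standard Vertex Cover (I)LP formulation:

minimize    the sum of x_v over all vertices v
subject to  x_u + x_v >= 1   for every edge uv
            x_v in {0, 1}    for every vertex v (ILP); relaxed to 0 <= x_v <= 1 in the LP.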

Nemhauser Trotter Theorem
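
The theorem, in its usual form: the Vertex Cover LP always has an optimal solution in which every variable is 0, 1/2, or 1, and if V_0, V_{1/2}, V_1 denote the vertices receiving value 0, 1/2, 1 in such a solution, then some minimum vertex cover S satisfies V_1 ⊆ S ⊆ V_1 ∪ V_{1/2}.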

Matchings and Hall Sets. A matching in a graph is a set of edges no two of which share an endpoint. A matching saturates a vertex set S if every vertex in S is incident to a matching edge. A vertex set S is a Hall set if it is independent and |N(S)| < |S|. A Hall set can never be saturated!

Hall's Theorem. Theorem: a bipartite graph has a matching that saturates every left-hand side vertex ⇔ there is no Hall set on the left-hand side.

Hall's Theorem Example. [Figures: a matching saturating the left-hand side (so no Hall set), and a Hall set (so no matching).]

Nemhauser Trotter Theorem

Nemhauser Trotter Proof. [Figure: the bipartite graph used in the proof, with Left and Right vertex sets.] This clearly proves (a), but why does it prove (b)?

Reduction Rule. If there exists an optimal LP solution that sets x_v to 1, then there exists an optimal vertex cover that selects v. ⇒ Remove v from G and decrease k by 1. Correctness follows from Nemhauser-Trotter; polynomial time by LP solving.

Kernel. After the reduction rules, no vertex gets LP value 1 and no vertex gets LP value 0 (isolated vertices are removed), so the optimal LP solution assigns 1/2 to every vertex and OPT_LP = n/2. If n > 2k then OPT_LP > k and we reject; otherwise at most 2k vertices remain.
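
A sketch of this kernelization in Python (my own illustration; it assumes SciPy is available and that the solver returns a basic optimal solution, which for this LP is always half-integral):

import numpy as np
from scipy.optimize import linprog

def vertex_cover_lp_kernel(n, edges, k):
    """Nemhauser-Trotter kernel: returns (kept vertices, kept edges, new budget), or None to reject."""
    if not edges:
        return set(), [], k
    # Vertex Cover LP: min sum_v x_v  s.t.  x_u + x_v >= 1 for every edge, 0 <= x_v <= 1.
    c = np.ones(n)
    A = np.zeros((len(edges), n))
    for i, (u, v) in enumerate(edges):
        A[i, u] = A[i, v] = 1.0
    res = linprog(c, A_ub=-A, b_ub=-np.ones(len(edges)), bounds=[(0, 1)] * n, method="highs")
    if res.fun > k + 1e-9:
        return None                                    # OPT_LP > k: no vertex cover of size <= k
    x = res.x
    ones = [v for v in range(n) if x[v] > 0.75]        # value 1: goes into the cover
    halves = set(v for v in range(n) if 0.25 < x[v] < 0.75)
    # Keep only the 1/2-vertices; vertices with value 0 disappear, their edges being covered by the 1-vertices.
    kernel_edges = [(u, v) for (u, v) in edges if u in halves and v in halves]
    return halves, kernel_edges, k - len(ones)         # at most 2*(k - len(ones)) vertices remain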

Above LP Vertex Cover

Vertex Cover Above LP
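
Here the question is still whether G has a vertex cover of size at most k, but the parameter is the gap k − OPT_LP rather than k itself.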

Reduction Rule. Recall the reduction rules from the kernel for Vertex Cover: – If there exists an optimal LP solution that sets x_v to 1, then there exists an optimal vertex cover that selects v. – Remove v from G and decrease k by 1. – Remove vertices of degree 0.

Does the reduction affect k − OPT_LP? Reduction rule: if there exists an optimal LP solution that sets x_v to 1, remove v and decrease k by 1. OPT_LP decreases by exactly 1. Why? [Figure: an optimal solution assigning 1 to v, next to a feasible LP solution of G \ v.] So k − OPT_LP is unchanged!
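
Spelling the argument out: if an optimal LP solution sets x_v = 1, then restricting it to G \ v stays feasible, so OPT_LP(G \ v) ≤ OPT_LP(G) − 1; conversely, any feasible LP solution of G \ v extends to G by setting x_v = 1, so OPT_LP(G) ≤ OPT_LP(G \ v) + 1. Hence OPT_LP(G \ v) = OPT_LP(G) − 1, and the new measure (k − 1) − OPT_LP(G \ v) equals the old measure k − OPT_LP(G).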

Branching

Branching - Analysis
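
In its standard form, the branching behind these two slides works as in the Python sketch below; the sketch is my own, and lp_opt, nt_reduce, and the graph methods (nodes, neighbors, number_of_edges, remove) are hypothetical helpers standing in for the LP solving and reduction rules above:

def vc_above_lp(G, k):
    # Apply the reduction rules first; afterwards every vertex has LP value exactly 1/2.
    G, k = nt_reduce(G, k)
    if k < 0 or lp_opt(G) > k:
        return False                # even the LP lower bound already exceeds the budget
    if G.number_of_edges() == 0:
        return True
    v = next(iter(G.nodes()))
    Nv = list(G.neighbors(v))
    # Branch: either v is in the cover, or v is not and then all of N(v) must be.
    return (vc_above_lp(G.remove([v]), k - 1)
            or vc_above_lp(G.remove(Nv + [v]), k - len(Nv)))

In each branch the measure k − OPT_LP drops by at least 1/2, which gives a running time of the form 4^(k − OPT_LP) · poly(n).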

Vertex Cover recap. Is this useful when compared to a 1.38^k algorithm?

Almost 2-SAT. Delete at most k variables* from a 2-SAT formula so that it becomes satisfiable. (* Deleting a variable removes all clauses that contain the variable.)

Odd Cycle Transversal (OCT). Delete at most k vertices so that the remaining graph is bipartite (contains no odd cycle). We will give algorithms for Almost 2-SAT and OCT, using FPT-reductions to Vertex Cover above LP!

Odd Cycle Transversal → Almost 2-SAT. [Figure: an example graph on vertices x, y, z and the corresponding 2-SAT clauses.]
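
The reduction in its usual form, as code (the clause representation is my own choice, not from the slides):

def oct_to_almost_2sat(edges):
    # One variable per vertex; for every edge (u, v) the two clauses
    # (u OR v) and (NOT u OR NOT v), which both hold exactly when u and v
    # get different truth values, i.e. end up on different sides of a bipartition.
    clauses = []
    for u, v in edges:
        clauses.append(((u, True), (v, True)))     # (u OR v)
        clauses.append(((u, False), (v, False)))   # (NOT u OR NOT v)
    return clauses

Deleting a set of at most k vertices meeting every odd cycle corresponds to deleting the same k variables, so (G, k) is a yes-instance of OCT exactly when the produced formula with budget k is a yes-instance of Almost 2-SAT.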

Almost 2-SAT → Vertex Cover above LP (parameter k − OPT_LP).

Consequences

LP versus ILP. We saw an application of LPs in parameterized algorithms. ILP solving is NP-hard. Useless for algorithms? No! We can use parameterized algorithms for Integer Linear Programming.

Integer Linear Programming. Theorem: ILP can be solved in time k^(4.5k) · poly(L), where k is the number of variables and L is the number of bits encoding the instance.

Closest String. Input: strings s_1, ..., s_n, all of the same length, and an integer k. Question: is there a string s (of that length) within Hamming distance k of every s_i? Note: the parameter is the number of strings n, not k.

Closest String as Hit & Miss. For every position, we need to choose the letter of the solution string s. For every string that s differs from at that position, the distance increases by one. We can't miss any string more than k times.

Closest String Alphabet Reduction. We can assume that the alphabet size is at most n: in every column the solution string may as well use one of the at most n letters occurring there, so the letters of each column can be renamed to {1, ..., n}.

Column Types. Two columns are of the same type if they are identical: every input string s_j has the same letter in both columns.

Closest String ILP. After alphabet reduction, there are at most n^n column types. Count the number of columns of each column type.

ILP. For each column type t, make n variables, one for each letter: x_{t,c} counts the columns of type t in which the solution string uses letter c. Constraints: for each column type t, these variables add up to the number of columns of type t.

Objective Function. For a string s_i and column type t, let s_i[t] be the letter of s_i in columns of type t. For each string s_i, its distance from the solution string s is d_i = Σ_t Σ_{c ≠ s_i[t]} x_{t,c}. Objective: minimize max_i d_i.

Algorithm for Closest String. Build the ILP above: it has at most n · n^n + 1 variables, a number depending only on n, so by the Integer Linear Programming theorem it can be solved in f(n) · poly(input size) time. Hence Closest String is FPT parameterized by the number of strings.
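
A sketch of the whole algorithm in Python (my own illustration; it models the ILP with the PuLP library and its bundled solver, which the slides of course do not assume):

from collections import Counter
import pulp   # ILP modelling library, assumed to be installed

def closest_string(strings, k):
    n, L = len(strings), len(strings[0])
    letters = sorted({c for s in strings for c in s})          # letters occurring in the input (at most n after the alphabet reduction)
    # Column type of position p: the tuple of letters the input strings have at position p.
    counts = Counter(tuple(s[p] for s in strings) for p in range(L))

    prob = pulp.LpProblem("closest_string", pulp.LpMinimize)
    # x[(t, c)] = number of columns of type t in which the solution string uses letter c.
    x = {(t, c): pulp.LpVariable(f"x_{i}_{c}", lowBound=0, cat="Integer")
         for i, t in enumerate(counts) for c in letters}
    z = pulp.LpVariable("z", lowBound=0)                       # the maximum distance
    prob += z                                                  # objective: minimise max_i d_i
    for t, cnt in counts.items():
        prob += pulp.lpSum(x[(t, c)] for c in letters) == cnt  # every column of type t gets some letter
    for i in range(n):
        # d_i = number of columns in which the solution letter differs from s_i's letter.
        d_i = pulp.lpSum(x[(t, c)] for t in counts for c in letters if c != t[i])
        prob += d_i <= z
    prob.solve()
    return pulp.value(z) <= k + 1e-6                           # is some string within distance k of every s_i?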