First sample, then be greedy: an improved way to use the greedy algorithm
Moran Feldman, The Open University of Israel
Based on the paper: "Greed Is Good: Near-Optimal Submodular Maximization via Greedy Optimization". To appear in COLT 2017. Joint work with Christopher Harshaw and Amin Karbasi.

Problems of Interest
Given a ground set N, we want to solve: max f(S) s.t. S ⊆ N and S obeys a constraint C.
Example constraints: the size of S must be at most k; S must be a legal matching.
The greedy algorithm is often used for such problems in practice.
The Greedy Algorithm: while there are more elements that can be added to the solution, pick the element among them whose addition to the solution increases the value by the most, and add it.
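
As a concrete illustration (not part of the original slides), here is a minimal Python sketch of the greedy algorithm for the special case of a cardinality constraint |S| ≤ k. The objective f, the ground set, and k are assumed to be supplied by the caller; all names are hypothetical.

```python
def greedy(ground_set, f, k):
    """Greedy sketch for: max f(S) s.t. |S| <= k.

    f maps a (frozen)set of elements to a number. In every iteration we
    add the element whose addition increases the value of the solution
    the most, as described on the slide.
    """
    solution = frozenset()
    candidates = set(ground_set)
    while candidates and len(solution) < k:
        best = max(candidates, key=lambda e: f(solution | {e}) - f(solution))
        solution = solution | {best}
        candidates.remove(best)
    return solution


# Tiny usage example with a linear objective (hypothetical weights).
weights = {'a': 5.0, 'b': 3.0, 'c': 1.0}
f = lambda S: sum(weights[e] for e in S)
print(greedy(weights.keys(), f, k=2))  # the set {'a', 'b'}
```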

Why does it Work?
Theoretical results show that the greedy algorithm guarantees a good approximation ratio when:
The objective function is submodular. Submodular functions are a general family of functions generalizing linear functions; they will be discussed more in a minute…
The constraint belongs to one of a few general families of constraints. Here we consider a family named "k-extendible systems".
Intuitively, a constraint is k-extendible if, given an element e and a feasible set S, making S + e feasible requires removing at most k elements of S.
Examples: a matroid constraint (1-extendible), a matching constraint (2-extendible), and any intersection of such constraints.

Submodular Functions
Intuition: a function is submodular if it captures the concept of diminishing returns: adding an element to a small set increases the objective by more than adding the same element to a larger set.
Definition: a set function f: 2^N → ℝ is submodular if
f(A + u) – f(A) ≥ f(B + u) – f(B)  ∀ A ⊆ B ⊆ N, u ∉ B,
or, equivalently,
f(A) + f(B) ≥ f(A ∪ B) + f(A ∩ B)  ∀ A, B ⊆ N.
Submodular functions can be found in: combinatorics (2 examples soon), algorithmic game theory, image processing, machine learning.
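
To make the first (diminishing-returns) definition concrete, here is a small brute-force checker; this is only an illustrative sketch that works for tiny ground sets, and the helper name and tolerance are my own choices, not something from the talk.

```python
import math
from itertools import combinations

def is_submodular(ground_set, f, tol=1e-9):
    """Brute-force test of f(A + u) - f(A) >= f(B + u) - f(B)
    for all A subset of B subset of N and u not in B.
    Exponential time: tiny ground sets only."""
    elements = list(ground_set)
    subsets = [frozenset(c) for r in range(len(elements) + 1)
               for c in combinations(elements, r)]
    for B in subsets:
        for A in subsets:
            if not A <= B:
                continue
            for u in elements:
                if u in B:
                    continue
                if f(A | {u}) - f(A) < f(B | {u}) - f(B) - tol:
                    return False
    return True


# Example: sqrt(|S|) is a concave function of the cardinality, hence submodular.
print(is_submodular({1, 2, 3}, lambda S: math.sqrt(len(S))))  # True
```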

Example 1: Cut Function
A directed graph G = (V, E) with capacities c_e ≥ 0 on the arcs. For every S ⊆ V, f(S) is the total capacity of the arcs leaving S:
f(S) = Σ_{(u, v) ∈ E : u ∈ S, v ∉ S} c_{(u, v)}.
Observation: f(S) is a non-negative submodular function.
(The slide's figure shows an example set S with f(S) = 3.)
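
A short sketch of this objective in code (the graph below is a made-up example, not the one from the slide's figure):

```python
def cut_value(arcs, S):
    """f(S) = total capacity of the arcs leaving S.

    arcs: dict mapping an arc (u, v) to its capacity c_(u,v) >= 0.
    S: a set of vertices.
    """
    return sum(c for (u, v), c in arcs.items() if u in S and v not in S)


arcs = {('a', 'b'): 2.0, ('a', 'c'): 1.0, ('b', 'c'): 4.0}
print(cut_value(arcs, {'a'}))  # 3.0
```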

Example 2: Coverage Function
Elements E = {e1, e2, …, en} and sets s1, s2, …, sm ⊆ E. For every S = {s_i1, s_i2, …, s_ik}, f(S) is the number of elements covered by the chosen sets:
f(S) = |s_i1 ∪ s_i2 ∪ … ∪ s_ik|.
Observation: f(S) is a non-negative monotone submodular function.
Monotone: adding elements can only increase the value of a set. Formally, for sets A ⊆ B ⊆ N, f(A) ≤ f(B).
(The slide's figure illustrates five sets S1, …, S5.)
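
A sketch of a coverage objective, plus a quick check of both properties on a made-up instance (names and data are hypothetical):

```python
def coverage(sets, S):
    """f(S) = number of elements covered by the chosen sets.

    sets: dict mapping a set name to a frozenset of elements.
    S: a collection of chosen set names.
    """
    covered = set()
    for name in S:
        covered |= sets[name]
    return len(covered)


sets = {'s1': frozenset({1, 2}), 's2': frozenset({2, 3}), 's3': frozenset({3, 4})}

# Monotone: adding a set never decreases the value.
assert coverage(sets, {'s1'}) <= coverage(sets, {'s1', 's2'})

# Diminishing returns: the marginal value of s3 w.r.t. a small collection
# is at least its marginal value w.r.t. a larger collection.
small, large = {'s1'}, {'s1', 's2'}
gain_small = coverage(sets, small | {'s3'}) - coverage(sets, small)  # 2
gain_large = coverage(sets, large | {'s3'}) - coverage(sets, large)  # 1
assert gain_small >= gain_large
```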

Known Results
[Fisher et al. (1978), Jenkyns (1976)] If f is a non-negative monotone submodular function and C is a k-extendible system, then the greedy algorithm is a (k + 1)-approximation algorithm for the problem: max f(S) s.t. S ⊆ N and S obeys the constraint C.
Remark: if f is linear, the approximation ratio improves to k.

Our Contribution
Our algorithm:
Create a sample set S containing every element of N with probability p, independently.
Run the greedy algorithm on S (instead of on N).
What do we get? The algorithm runs faster than the greedy algorithm, and its approximation ratio is max{k, 1/p} for linear functions and max{k + 1, 1/p} for non-negative monotone submodular functions. In particular, if p = 1/k (linear) or p = 1/(k + 1) (monotone submodular), there is no loss in the approximation ratio, just a speed up. Another advantage will be described later…
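
A minimal sketch of the proposed "sample, then be greedy" algorithm, reusing the greedy() routine sketched earlier; the choice of p is left to the caller (for example, p = 1/(k + 1) for monotone submodular objectives, per the slide above).

```python
import random

def sample_then_greedy(ground_set, f, k, p):
    """First sample each element of the ground set independently with
    probability p, then run greedy on the sampled set only."""
    sample = {e for e in ground_set if random.random() < p}
    return greedy(sample, f, k)  # greedy() as sketched after the earlier slide
```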

Greedy Analysis Idea
The presentation of the analysis assumes a linear function. For every iteration of the greedy algorithm, we are interested in two entities:
The gain of the iteration is the increase in the value of the solution during that iteration.
The damage of the iteration is the decrease, during the iteration, in the difference between the maximum value of a feasible set containing the solution (what we can potentially get) and the value of the solution.
If c ∙ Gain ≥ Damage in every iteration, then the approximation ratio is at most c.

Greedy Analysis Idea (cont.)
Greedy picks correctly: it picks an element from the (current) optimal solution. There is no change in the value of the optimal solution, so the gain and the damage are both equal to the change in the value of the solution: Damage = Gain.
Greedy picks incorrectly: it picks an element outside of the (current) optimal solution. At most k other elements have to be removed from the optimal solution, and the removed elements are less valuable than the added one, so the change in the value of the optimal solution is at most k ∙ Gain. Hence, Damage ≤ k ∙ Gain.
The two cases are not balanced!

How Does the Sampling Come in?
An alternative view on our algorithm: run greedy, and when greedy tries to select an element, select it with probability p; otherwise, remove it from the instance.
Observation: iterations in which the element is added behave like in standard greedy.
Big question: what happens in iterations in which the element is removed?
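
The alternative view can itself be sketched in code: instead of sampling up front, the coin for an element is flipped only when greedy is about to select it. With independent coins this is intended to produce the same output distribution as the first-sample-then-greedy sketch above; the sketch below assumes a cardinality constraint and hypothetical names, as before.

```python
import random

def lazy_sampling_greedy(ground_set, f, k, p):
    """Alternative view: run greedy; whenever it tries to select an
    element, keep it with probability p (a "selecting iteration"),
    otherwise discard the element from the instance ("non-selecting
    iteration") and continue."""
    solution = frozenset()
    candidates = set(ground_set)
    while candidates and len(solution) < k:
        best = max(candidates, key=lambda e: f(solution | {e}) - f(solution))
        candidates.remove(best)
        if random.random() < p:          # selecting iteration
            solution = solution | {best}
        # otherwise: non-selecting iteration, best is dismissed
    return solution
```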

Analyzing Non-selecting Iterations
Let e denote the element greedy tries to select (and then dismisses). The solution does not change, so the gain is 0.
Greedy picks correctly: the dismissal removes an element from the current optimal solution, so Damage ≤ f(e).
Greedy picks incorrectly: the dismissal does not change the current optimal solution, so Damage = 0.

Summing It All Up
Selecting iteration (probability p): if greedy picks correctly, Damage ≤ f(e); if greedy picks incorrectly, Damage ≤ k ∙ f(e). The gain is f(e).
Non-selecting iteration (probability 1 – p): if greedy picks correctly, Damage ≤ f(e); if greedy picks incorrectly, Damage = 0. The gain is 0.
Expectation: the expected gain is p ∙ f(e); the expected damage is at most f(e) when greedy picks correctly and at most p ∙ k ∙ f(e) when it picks incorrectly.
In both cases: E[Damage] ≤ max{k, 1/p} ∙ p ∙ f(e) = max{k, 1/p} ∙ E[Gain].
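
For completeness, here is the expectation arithmetic behind the last two lines written out; this is my reconstruction of the computation, using the gain/damage terminology of the earlier slides.

```latex
\begin{align*}
\mathbb{E}[\mathrm{Gain}] &= p \cdot f(e) + (1 - p) \cdot 0 = p \cdot f(e),\\
\text{correct pick:}\quad
\mathbb{E}[\mathrm{Damage}] &\le p \cdot f(e) + (1 - p) \cdot f(e)
  = f(e) = \tfrac{1}{p} \cdot \mathbb{E}[\mathrm{Gain}],\\
\text{incorrect pick:}\quad
\mathbb{E}[\mathrm{Damage}] &\le p \cdot k \cdot f(e) + (1 - p) \cdot 0
  = k \cdot \mathbb{E}[\mathrm{Gain}],\\
\text{in both cases:}\quad
\mathbb{E}[\mathrm{Damage}] &\le \max\{k,\, 1/p\} \cdot \mathbb{E}[\mathrm{Gain}].
\end{align*}
```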

Result for Non-monotone Functions
For non-negative (not necessarily monotone) submodular functions, the greedy algorithm has no theoretical guarantee, even in the absence of a constraint.
Our algorithm (sampling + greedy) works for this case as well: for an appropriate choice of the parameter p, it achieves an approximation ratio of k + 2 + 1/k.
Intuitively, the algorithm works because no single "bad" element can be selected with high probability.

Experiment

Questions?