Aspects of Submodular Maximization Subject to a Matroid Constraint Moran Feldman Based on A Unified Continuous Greedy Algorithm for Submodular Maximization.

Aspects of Submodular Maximization Subject to a Matroid Constraint
Moran Feldman

Based on:
– A Unified Continuous Greedy Algorithm for Submodular Maximization. Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (FOCS 2011).
– Submodular Maximization with Cardinality Constraints. Niv Buchbinder, Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (SODA 2014).
– Comparing Apples and Oranges: Query Tradeoff in Submodular Maximization. Niv Buchbinder, Moran Feldman and Roy Schwartz (SODA 2015).

Submodular Maximization Subject to a Matroid Constraint: What? Why?
The problem generalizes classical problems: Max-SAT, Max-Cut, k-cover, GAP…
Applications: machine learning, image processing, algorithmic game theory.

Region Sensor Coverage
Sensors: k_L large sensors and k_S small sensors.
Objective: cover as much of the region as possible with the sensors.
Observation: coverage exhibits diminishing returns.

Influence in Social Networks
Objective: sell SuperPhones.
Means: give k phones away for free.
Again, we have diminishing returns.

Image Summarization
Objective: select k pictures representing as much of the trip as possible.
Once more, diminishing returns; these can be quantified by image processing techniques [Tschiatschek et al., NIPS 2014].

The 3 Components of the Problem
1. A ground set N of elements: possible positions for large/small sensors, social network users, images.
2. A valuation function f: assigns numerical values to subsets and exhibits diminishing returns. Such a function is called submodular.

Example and Formal Definition
A set function f over a ground set N is submodular if for every A ⊆ B ⊆ N and u ∉ B:
f(A + u) − f(A) ≥ f(B + u) − f(B).
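The diminishing-returns inequality can be checked on a toy coverage function. This is a minimal sketch, not from the talk; the sensor names and covered cells are made up.

```python
# Made-up coverage instance: each sensor covers a set of grid cells.
coverage = {
    "s1": {1, 2, 3},
    "s2": {3, 4},
    "s3": {4, 5, 6},
}

def f(S):
    """Value of a set of sensors = number of cells covered (submodular)."""
    covered = set()
    for s in S:
        covered |= coverage[s]
    return len(covered)

def marginal(S, u):
    """Marginal contribution f(S + u) - f(S)."""
    return f(S | {u}) - f(S)

# Diminishing returns: the marginal of s2 w.r.t. the smaller set A is at
# least its marginal w.r.t. the larger set B (A subset of B, s2 not in B).
A = {"s1"}
B = {"s1", "s3"}
assert marginal(A, "s2") >= marginal(B, "s2")
```

Here s2 adds cell 4 to A but adds nothing to B, since B already covers cells 1 through 6.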

The Third Component: a Matroid Constraint
Examples: a cardinality constraint (|S| ≤ k), a partition matroid constraint (|S ∩ N_i| ≤ k_i for each part N_i of a partition of the ground set), and more "fancy" constraints such as graphical and linear matroids.

The Problem
Instance: a ground set N, a submodular function f and a matroid constraint.
Objective: find a subset S ⊆ N obeying the constraint and maximizing f(S).
The problem generalizes NP-hard problems, so an exact algorithm is unlikely; we seek approximation algorithms. An α-approximation algorithm outputs a feasible S with f(S) ≥ α · f(OPT); a randomized α-approximation algorithm outputs a feasible S with E[f(S)] ≥ α · f(OPT).

Aspects of Submodular Maximization Subject to a Matroid Constraint
What can be done in polynomial time?
Which additional properties of f can help us?
How fast can it be done?
Changing the model: online, streaming, secretary…

Remarks
Non-negativity assumption: if negative values are allowed, f(OPT) might be 0, in which case any non-zero multiplicative approximation guarantee would already imply an exact algorithm.
Value oracle: if f were given as an explicit table of all 2^|N| values, polynomial time in the input size would be trivial. Instead, we assume access to a value oracle for f and require time complexity polynomial in |N|. Example table:

S       f(S)
∅       0
{a}     1
{b}     2
{a, b}  2
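A sketch of the value-oracle access model, using the small table from the slide: the algorithm may only query f on specific sets, never inspect the table as a whole.

```python
# The function from the slide, stored behind an oracle interface.
table = {
    frozenset(): 0,
    frozenset({"a"}): 1,
    frozenset({"b"}): 2,
    frozenset({"a", "b"}): 2,
}

def value_oracle(S):
    """Answer a single value query f(S)."""
    return table[frozenset(S)]
```

Note that this f is submodular: the marginal of b drops from 2 (w.r.t. ∅) to 1 (w.r.t. {a}).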

The Field of "Submodular Maximization"
Our problem is an important representative of the field, demonstrating its techniques and aspects. The field has many applications and a long history, going back at least to the 1970s [Fisher et al., Math. Program. Stud. 1978], and it has been very active in recent years.

First Case – Cardinality Constraint
Constraint: cardinality (|S| ≤ k).
Objective function: non-negative and submodular. Some results additionally assume a monotone objective, i.e., f(A) ≤ f(B) whenever A ⊆ B ⊆ N.

Summary of Results
Hardness results in this presentation are unconditional: they are based on information-theoretic arguments.
The greedy algorithm: a natural algorithm with an approximation ratio of 1 − 1/e for monotone functions [Fisher et al., Math. Program. Stud. 1978], which is the best possible [Nemhauser and Wolsey, Math. Oper. Res. 1978].
Random Greedy [Buchbinder et al., SODA 2014]: a simple variant of greedy achieving an approximation ratio of 1/e for non-monotone functions. The same work also achieves the state-of-the-art approximation ratio for this case; the corresponding hardness is due to [Oveis Gharan and Vondrák, SODA 2011].

Faster Algorithms… [Buchbinder et al., SODA 2015]
Oracle queries as a complexity measure: queries are nontrivial in many applications, the measure is independent of the computational model, and the number of oracle queries typically captures the time complexity up to polylogarithmic factors.
Monotone functions: ratio 1 − 1/e − ε; previous result due to [Badanidiyuru and Vondrák, SODA 2014]; our result uses O(n ln ε⁻¹) oracle queries.
Non-monotone functions: ratio 1/e − ε; previous result is Random Greedy with O(nk) oracle queries; our result is the state of the art.

The Greedy Algorithm
1. Start with the empty solution.
2. Do k times:
3.   Add to the solution the element contributing the most.
Analysis: let S be the current solution. For monotone f, f(S ∪ OPT) ≥ f(OPT).
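The three steps above can be sketched in a few lines; the toy coverage instance below is made up for illustration.

```python
# Made-up coverage instance: each element covers some cells.
cover = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1, 2, 3}}

def f(S):
    """Number of cells covered -- a monotone submodular function."""
    return len(set().union(*(cover[u] for u in S))) if S else 0

def greedy(N, f, k):
    """Start empty; k times, add the element with the largest marginal gain."""
    S = set()
    for _ in range(k):
        u = max((x for x in N if x not in S),
                key=lambda x: f(S | {x}) - f(S))
        S.add(u)
    return S

S = greedy(set(cover), f, 2)
```

On this instance greedy first picks d (gain 3), then c (gain 1), covering all four cells.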

Analysis (cont.)
Observation: by submodularity, the elements of OPT \ S (together) increase the value by at least f(S ∪ OPT) − f(S) ≥ f(OPT) − f(S).
Conclusion: since |OPT| ≤ k, some element increases the value by at least [f(OPT) − f(S)] / k.
Hence f(S_{i+1}) ≥ f(S_i) + [f(OPT) − f(S_i)] / k, which by induction gives f(S_k) ≥ [1 − (1 − 1/k)^k] · f(OPT) ≥ (1 − 1/e) · f(OPT).

The Average Observation
Recall: by submodularity, the elements of OPT \ S (together) increase the value by at least f(S ∪ OPT) − f(S).
Let M be the set of the k elements with the largest marginal contributions to S.
Conclusion: a uniformly random element of M increases the value, in expectation, by at least [f(S ∪ OPT) − f(S)] / k.
This simple observation has applications for non-monotone functions and for fast algorithms.

The Random Greedy Algorithm
1. Start with the empty solution.
2. Do k times:
3.   Let M be the set of the k elements with the largest marginal contributions to the solution.
4.   Add a uniformly random element of M to the solution.
Analysis: for monotone functions, the "Average Observation" plugged into the above analysis gives an approximation ratio of 1 − 1/e in expectation. For non-monotone functions, the analysis fails only because we can no longer bound f(S ∪ OPT) ≥ f(OPT).
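The four steps above can be sketched as follows. This is an illustrative sketch, not the paper's exact pseudocode: the toy objective is made up, and the algorithm's "dummy" elements of zero marginal value are simulated by skipping non-positive gains.

```python
import random

# Made-up coverage instance reused as a toy objective.
cover = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1, 2, 3}}

def f(S):
    return len(set().union(*(cover[u] for u in S))) if S else 0

def random_greedy(N, f, k, rng=None):
    """k times: take the k elements with the largest marginal contributions
    to the current solution and add a uniformly random one of them."""
    rng = rng or random.Random(0)
    S = set()
    for _ in range(k):
        rest = sorted((x for x in N if x not in S),
                      key=lambda x: f(S | {x}) - f(S), reverse=True)
        M = rest[:k]
        u = rng.choice(M)
        if f(S | {u}) - f(S) > 0:  # dummy-element trick for non-monotone f
            S.add(u)
    return S

S = random_greedy(set(cover), f, 2)
```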

Intuition – Why is Randomness Important?
There might be an element u which looks good, i.e., has a large marginal contribution. However, this element might be evil: any solution containing it is poor. The (deterministic) greedy might be tempted to take u; a randomized algorithm has a chance to avoid u.

How Bad Can f(S ∪ OPT) Be?
All the elements outside OPT (together) can only decrease the value, at worst to 0. What happens if S contains every element with probability at most p?
Theorem: if S is a random set containing every element u with probability at most p, then E[f(S ∪ OPT)] ≥ (1 − p) · f(OPT). (Intuitively, E[f(S ∪ OPT)] as a function of p is concave by submodularity, equals f(OPT) at p = 0 and is at least 0 at p = 1.)

Random Greedy and Non-monotone Functions
An element is selected with probability at most 1/k in every iteration, so each element belongs to S_i with probability at most 1 − (1 − 1/k)^i.
By the theorem, E[f(S_i ∪ OPT)] ≥ (1 − 1/k)^i · f(OPT).
Plugging this into the above analysis gives an approximation ratio of (roughly) 1/e.

Making Random Greedy Fast
Let u_1, u_2, …, u_k be the elements of M in decreasing order of marginal contribution, and let p_i be the probability that u_i is added to the solution.
For monotone functions, it suffices that p_1 ≥ p_2 ≥ … ≥ p_k and p_1 + p_2 + … + p_k = 1.
For non-monotone functions, we need p_1 = p_2 = … = p_k = 1/k.

A Fast Algorithm for Monotone Functions
1. Start with the empty solution.
2. Do k times:
3.   Randomly choose a subset A containing (n/k) · ln ε⁻¹ elements.
4.   Add to the solution the element of A contributing the most.
Approximation ratio: A contains in expectation ln ε⁻¹ elements of M, so with probability at least 1 − ε we have A ∩ M ≠ ∅. Hence p_1 + p_2 + … + p_k ≥ 1 − ε, and by symmetry p_1 ≥ p_2 ≥ … ≥ p_k, yielding an approximation ratio of 1 − 1/e − ε.
How fast is this algorithm? Each iteration requires O(n/k · log ε⁻¹) oracle queries, and there are k iterations, for a total of O(n · log ε⁻¹) oracle queries.
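A sketch of the sampling-based algorithm, again on a made-up coverage instance. The sample size (n/k) · ln(1/ε) follows the slide's analysis; on this tiny instance the sample happens to cover the whole pool, so the run reduces to plain greedy.

```python
import math
import random

# Made-up coverage instance.
cover = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1, 2, 3}}

def f(S):
    return len(set().union(*(cover[u] for u in S))) if S else 0

def fast_greedy(N, f, k, eps=0.1, rng=None):
    """Each round, sample about (n/k)*ln(1/eps) elements and add the best."""
    rng = rng or random.Random(0)
    n = len(N)
    size = max(1, math.ceil((n / k) * math.log(1 / eps)))
    S = set()
    for _ in range(k):
        pool = [x for x in N if x not in S]
        A = rng.sample(pool, min(size, len(pool)))
        u = max(A, key=lambda x: f(S | {x}) - f(S))
        S.add(u)
    return S

S = fast_greedy(set(cover), f, 2)
```

The point of the sampling is that each round costs O(n/k · log ε⁻¹) queries instead of O(n), which matters when k is large.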

A Fast Algorithm for Non-monotone Functions
Why not use the same algorithm? If |A ∩ M| > 1, we always select the best element of the intersection. Consequently, p_1 ≫ p_2 ≫ … ≫ p_k, although we need them to be (roughly) equal.
Desired solution: select an element u ∈ A ∩ M uniformly at random. Unfortunately, determining A ∩ M requires us to look at all the elements – too costly.

A Fast Algorithm for Non-monotone Functions (cont.)
To make |A ∩ M| concentrate, we need A to be larger.
Algorithm:
1. Start with the empty solution.
2. Do k times:
3.   Randomly choose a subset A (larger than before).
4.   Let B be the set of the best elements in A, with |B| = E[|A ∩ M|].
5.   Add a uniformly random element of B to the solution.

What Did We Get?
Approximation ratio: the algorithm almost mimics Random Greedy, and its approximation ratio is e⁻¹ − ε.
Oracle queries: each iteration requires O(nε⁻²/k · log ε⁻¹) oracle queries, and there are k iterations, for a total of O(nε⁻² · log ε⁻¹) oracle queries. Q.E.D.

General Matroid Constraint
General algorithmic scheme: solve a fractional relaxation, then round the solution.
Rounding can be done without loss: pipage rounding [Calinescu et al., SIAM J. Comp. 2011] or swap rounding [Chekuri et al., FOCS 2010].
In the next slides we will define the relaxation and explain how to approximately solve it.

Summary of Results
Continuous Greedy [Calinescu et al., SIAM J. Comp. 2011]: a (1 − 1/e)-approximation for monotone functions, optimal even for cardinality constraints [Nemhauser and Wolsey, Math. Oper. Res. 1978].
Measured Continuous Greedy [Feldman et al., FOCS 2011]: a 1/e-approximation for non-monotone functions – the state of the art, improving over the previous result of [Chekuri et al., STOC 2011]; the corresponding hardness is due to [Oveis Gharan and Vondrák, SODA 2011]. It also gives improved results for special cases when the function is monotone: Submodular Max-SAT and Submodular Welfare.

Relaxation
The matroid constraint is replaced by the matroid polytope: the convex hull of the characteristic vectors of the feasible sets. For example, over the ground set N = {a, b, c}, the set {a, b} corresponds to the vector (1, 1, 0).
The objective function is replaced by its multilinear extension F over [0, 1]^N: for a vector x, let R(x) be a random set containing every element u ∈ N independently with probability x_u; then F(x) = E[f(R(x))]. For integral points, f and F agree.
The relaxation: optimize F over the matroid polytope.
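The definition F(x) = E[f(R(x))] suggests the standard sampling estimator; a minimal sketch, reusing the two-element table from the value-oracle slide:

```python
import random

# The function from the value-oracle slide.
table = {frozenset(): 0, frozenset({"a"}): 1,
         frozenset({"b"}): 2, frozenset({"a", "b"}): 2}

def f(S):
    return table[frozenset(S)]

def F_estimate(f, x, samples=1000, rng=None):
    """Average f over independent draws of R(x): each element u is
    included independently with probability x[u]."""
    rng = rng or random.Random(0)
    total = 0.0
    for _ in range(samples):
        R = {u for u, p in x.items() if rng.random() < p}
        total += f(R)
    return total / samples

# At integral points the estimate is exact: F agrees with f there.
assert F_estimate(f, {"a": 1.0, "b": 0.0}, samples=50) == f({"a"})
```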

Before Presenting the Algorithms…
The algorithms we describe are continuous processes; an implementation has to discretize the process by working in small steps.
The multilinear extension F cannot, in general, be evaluated exactly, but it can be approximated arbitrarily well by sampling.

The Continuous Greedy
For every time point t ∈ [0, 1]: consider the directions corresponding to the feasible sets, and move in the (locally) best direction at a speed of 1.
Feasibility: the output is a convex combination of feasible sets.
Approximation ratio: OPT is a good direction. By monotonicity, moving from y toward OPT (the "real" direction y + 1_OPT) is at least as good as moving toward y ∨ 1_OPT (the "imaginary" direction), and by submodularity F is concave along non-negative directions, so the gain is at least F(y ∨ 1_OPT) − F(y).
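A discretized sketch of the process, specialized (for brevity) to a cardinality constraint, where the best feasible direction is simply the top-k elements by estimated marginal value. The coverage instance and the parameters `steps` and `samples` are made up for illustration.

```python
import random

# Made-up coverage instance.
cover = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1, 2, 3}}

def f(S):
    return len(set().union(*(cover[u] for u in S))) if S else 0

def continuous_greedy(N, f, k, steps=20, samples=100, rng=None):
    """In each of `steps` rounds, estimate every element's marginal value
    at the fractional point y and raise the top-k coordinates by 1/steps."""
    rng = rng or random.Random(0)
    N = list(N)
    y = {u: 0.0 for u in N}

    def F(x):  # sampled multilinear extension
        total = 0
        for _ in range(samples):
            total += f({u for u in N if rng.random() < x[u]})
        return total / samples

    for _ in range(steps):
        base = F(y)
        gain = {u: F({**y, u: 1.0}) - base for u in N}
        best = sorted(N, key=lambda u: gain[u], reverse=True)[:k]
        for u in best:
            y[u] = min(1.0, y[u] + 1.0 / steps)
    return y

y = continuous_greedy(set(cover), f, 2)
```

The returned fractional point lies in the relaxed feasible region and would then be rounded (e.g., by swap rounding) to an integral solution.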

Approximation Ratio – Analysis
By the above discussion, and since F(y ∨ 1_OPT) ≥ f(OPT) by monotonicity: dF(y(t))/dt ≥ f(OPT) − F(y(t)).
By non-negativity, F(y) ≥ 0 at time 0. Hence, F(y(t)) ≥ (1 − e⁻ᵗ) · f(OPT).
At time t = 1, for monotone functions: F(y(1)) ≥ (1 − 1/e) · f(OPT).

Measured Continuous Greedy
Main idea: find the best "imaginary" direction and walk in it, i.e., scale the step of each coordinate u by (1 − y_u).
Feasibility: the matroid polytope is down-monotone, so reducing the step cannot take us outside of the polytope.
Approximation ratio: the previous analysis still works.
Gain: in some cases, this allows running the process for more time. Recall that this yields optimal approximation ratios for Submodular Max-SAT and Submodular Welfare.
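The single mechanical difference from the continuous greedy step can be sketched in isolation (a sketch under the discretization above, not the paper's exact pseudocode): each chosen coordinate moves by δ · (1 − y_u) instead of δ, which is what keeps y_u ≤ 1 − e⁻ᵗ.

```python
def measured_step(y, B, delta):
    """One measured continuous greedy step: coordinates of the chosen
    feasible set B move by delta * (1 - y[u]) instead of delta."""
    y = dict(y)
    for u in B:
        y[u] += delta * (1 - y[u])
    return y

# Repeatedly stepping the same coordinate: y_u = 1 - (1 - delta)^m,
# which stays strictly below 1 (matching the 1 - e^{-t} bound).
y = {"a": 0.0}
for _ in range(10):
    y = measured_step(y, {"a"}, 0.1)
```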

Non-monotone Functions
Uses of monotonicity in the analysis: (1) the "real" direction is better than the "imaginary" one; (2) bounding F(y ∨ 1_OPT).
The first use disappears, since the measured continuous greedy walks directly in the imaginary direction. For the second, an old trick: at time t, every element u satisfies y_u ≤ 1 − e⁻ᵗ (compared with y_u ≤ t in the continuous greedy), so every element appears with probability at most 1 − e⁻ᵗ in R(y). By the theorem above, F(y ∨ 1_OPT) ≥ e⁻ᵗ · f(OPT).

Non-monotone Functions (cont.)
By the above discussion: dF(y(t))/dt ≥ e⁻ᵗ · f(OPT) − F(y(t)).
By non-negativity, F(y) ≥ 0 at time 0. Hence, F(y(t)) ≥ t · e⁻ᵗ · f(OPT).
At time t = 1: F(y(1)) ≥ f(OPT) / e.

Future Work – Monotone Functions
The basic question (what can be done in polynomial time) is answered, but many open problems remain in other aspects:
– Fast algorithms: almost-linear-time algorithms for more general constraints.
– Online and streaming algorithms: getting tight bounds; important for big data applications.

Future Work (cont.) – Non-monotone Functions
Largely terra incognita ("here be dragons"):
– Optimal approximation ratio: for a cardinality constraint? For general matroids?
– Properties that can help: symmetry? Others?
– Other aspects: fast algorithms, online, …

Other Main Fields of Interest
Online and secretary algorithms: state-of-the-art result for the Matroid Secretary Problem [Feldman et al., SODA 2015].
Algorithmic game theory: mechanism design; analysis of game models inspired by combinatorial problems.

Additional Results on Submodular Maximization
– Nonmonotone Submodular Maximization via a Structural Continuous Greedy Algorithm. Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (ICALP 2011).
– Improved Competitive Ratios for Submodular Secretary Problems. Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (APPROX 2011).
– Improved Approximations for k-Exchange Systems. Moran Feldman, Joseph (Seffi) Naor, Roy Schwartz and Justin Ward (ESA 2011).
– A Tight Linear Time (1/2)-Approximation for Unconstrained Submodular Maximization. Niv Buchbinder, Moran Feldman, Joseph (Seffi) Naor and Roy Schwartz (FOCS 2012).
– Online Submodular Maximization with Preemption. Niv Buchbinder, Moran Feldman and Roy Schwartz (SODA 2015).