submodular optimization
-
Faster Submodular Maximization for Several Classes of Matroids
Monika Henzinger, Paul Liu, Jan Vondrák, Da Wei Zheng
[pdf]
In this work, we develop algorithms for monotone submodular maximization constrained by graphic, transversal, or laminar matroids, running in time near-linear in the size of their representation. Our algorithms achieve an optimal approximation of $1-1/e-\epsilon$ and both generalize and accelerate the results of Ene and Nguyen [ICALP '19]. In fact, the running time of our algorithm cannot be improved within the fast continuous greedy framework of Badanidiyuru and Vondrák [SODA '14].
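For context, a minimal sketch of the classic greedy algorithm for monotone submodular maximization under a matroid constraint, which gives only a $1/2$-approximation (the paper's algorithms achieve the optimal $1-1/e-\epsilon$ in near-linear time instead). All names here are illustrative, not the paper's implementation:

```python
def greedy_matroid(elements, marginal_gain, is_independent):
    """Classic greedy: repeatedly add the element with the largest
    marginal gain that keeps the solution independent in the matroid.
    Gives a 1/2-approximation for monotone submodular objectives."""
    solution = []
    remaining = list(elements)
    while True:
        best, best_gain = None, 0.0
        for e in remaining:
            if is_independent(solution + [e]):
                g = marginal_gain(solution, e)
                if g > best_gain:
                    best, best_gain = e, g
        if best is None:  # no feasible element adds positive value
            break
        solution.append(best)
        remaining.remove(best)
    return solution
```

For example, with a coverage objective and a rank-2 uniform matroid (`is_independent = lambda S: len(S) <= 2`), the greedy picks the two sets that together cover the most ground elements.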
-
Streaming Submodular Maximization under Matroid Constraints
Moran Feldman, Paul Liu, Ashkan Norouzi-Fard, Ola Svensson, Jan Vondrák, and Rico Zenklusen
[pdf]
Recent progress in (semi-)streaming algorithms for monotone submodular function maximization has led to tight results for a simple cardinality constraint. However, current techniques fail to give a similar understanding for natural generalizations, including matroid constraints. Our paper improves on the previously best approximation guarantees of $1/4$ and $1/2$ for single-pass and multi-pass streaming algorithms, respectively. In fact, our multi-pass streaming algorithm is tight in that any algorithm with a better guarantee than $1/2$ must make several passes through the stream, and any algorithm that beats our guarantee of $1-1/e$ must make linearly many passes (as well as an exponential number of value oracle queries).
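To illustrate the single-pass setting, here is a hedged sketch of the classic swapping-based streaming algorithm (in the spirit of Chakrabarti and Kale), which achieves the $1/4$ guarantee this paper improves upon. For simplicity it is shown for a rank-$k$ uniform matroid; the function names are illustrative, not the paper's:

```python
def streaming_swap(stream, k, marginal_gain):
    """One pass over the stream, keeping at most k elements.
    Each kept element remembers the marginal gain it had when
    inserted; a newcomer evicts the cheapest kept element only if
    its own gain is at least twice the recorded gain."""
    kept = []  # list of (element, gain_at_insertion)
    for e in stream:
        g = marginal_gain([x for x, _ in kept], e)
        if len(kept) < k:
            kept.append((e, g))
        else:
            x, gx = min(kept, key=lambda t: t[1])
            if g >= 2 * gx:  # swap threshold
                kept.remove((x, gx))
                kept.append((e, g))
    return [x for x, _ in kept]
```

With a modular (additive) objective and $k = 2$, streaming the weights $1, 5, 2, 10$ ends with the two heaviest elements kept.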
-
A Polynomial Lower Bound on Adaptive Complexity of Submodular Maximization
Wenzheng Li, Paul Liu, Jan Vondrák
[pdf] [slides] [video]
Informally, the adaptive model measures complexity by the longest chain of sequentially dependent calls to the objective function $f$ made by the algorithm. The model rewards highly parallel algorithms and penalizes algorithms with high 'sequentiality'. Much recent work has been devoted to designing efficient algorithms for submodular optimization (SUBMOD) in the adaptive model. We focus instead on lower bounds, and provide the first lower bound showing that any approximation algorithm for monotone SUBMOD requires polynomially many adaptive rounds as the approximation factor approaches $1-1/e$.
For the unconstrained non-monotone maximization problem, we show a positive result. For every instance, and every $\delta > 0$, either we obtain a $(1/2-\delta)$-approximation in 1 round, or a $(1/2 + \Omega(\delta^2))$-approximation in $O(1/\delta^2)$ rounds. Unlike the cardinality-constrained case, there cannot be an instance where (i) it is impossible to achieve an approximation factor better than $1/2$ regardless of the number of rounds, and (ii) it takes $r$ rounds to achieve a factor of $1/2 - O(1/r)$.
-
Submodular Optimization in the MapReduce Model
Paul Liu and Jan Vondrák
[pdf]
Submodular optimization has received significant attention in both practice and theory, as a wide array of problems in machine learning, auction theory, and combinatorial optimization have submodular structure. In practice, these problems often involve large amounts of data, and must be solved in a distributed way. One popular framework for running such distributed algorithms is MapReduce. In this paper, we present two simple algorithms for cardinality constrained submodular optimization in the MapReduce model: the first is a $(1/2 - o(1))$-approximation in 2 MapReduce rounds, and the second is a $(1 - 1/e - \epsilon)$-approximation in $\frac{1+o(1)}{\epsilon}$ MapReduce rounds.
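The two-round structure can be sketched as a distributed greedy: round 1 runs greedy independently on each machine's shard, and round 2 runs greedy over the pooled outputs. This is a simplified illustration of the general pattern, not the paper's exact algorithm, and all names are hypothetical:

```python
def distributed_greedy(shards, k, marginal_gain):
    """Two-round MapReduce-style greedy for a cardinality constraint k.
    Round 1 (map): each shard is reduced to a local greedy solution.
    Round 2 (reduce): greedy is re-run on the pooled local solutions."""
    def greedy(elements):
        solution, cand = [], list(elements)
        for _ in range(k):
            best, best_gain = None, 0.0
            for e in cand:
                g = marginal_gain(solution, e)
                if g > best_gain:
                    best, best_gain = e, g
            if best is None:
                break
            solution.append(best)
            cand.remove(best)
        return solution

    local_solutions = [greedy(shard) for shard in shards]   # round 1
    pooled = [e for sol in local_solutions for e in sol]
    return greedy(pooled)                                   # round 2
```

The key design point is that only the small local solutions (at most $k$ elements per machine) are shuffled between rounds, keeping communication low; the analysis in the paper shows how much approximation quality such a scheme retains.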