
Operations Research Seminar: Dan Dadush

November 20, 2018

2:30 p.m.

CORE, b-135

A Friendly Smoothed Analysis of the Simplex Method

Dan Dadush, CWI

Explaining the excellent practical performance of the simplex method for linear programming has been a major topic of research for over 50 years. One of the most successful frameworks for understanding the simplex method was given by Spielman and Teng (JACM '04), who developed the notion of smoothed analysis. Starting from an arbitrary linear program with d variables and n constraints, Spielman and Teng analyzed the expected runtime over random perturbations of the LP (the smoothed LP), where Gaussian noise of variance sigma^2 is added to the LP data. In particular, they gave a two-stage shadow vertex simplex algorithm which uses an expected O(d^55 n^86 sigma^{-30}) number of simplex pivots to solve the smoothed LP. Their analysis and runtime were substantially improved by Deshpande and Spielman (FOCS '05) and later by Vershynin (SICOMP '09). The fastest current algorithm, due to Vershynin, solves the smoothed LP using an expected O(d^3 log^3 n sigma^{-4}) number of pivots (for small sigma), improving the dependence on n from polynomial to logarithmic.
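
To make the smoothed model concrete, here is a minimal sketch, assuming numpy, of how a smoothed LP instance can be generated; the function name smoothed_lp, the choice to perturb only (A, b), and the omission of the row normalization used by Spielman and Teng are illustrative simplifications, not taken from the talk.

```python
import numpy as np

def smoothed_lp(A, b, c, sigma, rng=None):
    # Add i.i.d. N(0, sigma^2) noise to the constraint data of the LP
    #   max c^T x  subject to  A x <= b,
    # with d variables and n constraints. Perturbing only (A, b) and
    # skipping row normalization are simplifications of this sketch.
    rng = np.random.default_rng() if rng is None else rng
    A_hat = A + sigma * rng.standard_normal(A.shape)
    b_hat = b + sigma * rng.standard_normal(b.shape)
    return A_hat, b_hat, c

# Example: n = 5 constraints, d = 2 variables, noise level sigma = 0.1.
A = np.array([[1., 0.], [0., 1.], [-1., 0.], [0., -1.], [1., 1.]])
b = np.ones(5)
c = np.array([1., 1.])
A_hat, b_hat, c = smoothed_lp(A, b, c, sigma=0.1)
```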
While the original proof of Spielman and Teng has since been substantially simplified, the resulting analyses are still quite long and complex, and the parameter dependencies remain far from optimal. In this work, we make substantial progress on this front, providing an improved and simpler analysis of the shadow simplex method, where our main algorithm requires an expected O(d^2 sqrt(log n) sigma^{-2}) number of simplex pivots. We obtain our results via an improved shadow bound, which was key to earlier analyses as well, combined with algorithmic techniques of Borgwardt (ZOR '82) and Vershynin. As an added bonus, our analysis is completely modular, allowing us to obtain non-trivial bounds for perturbations beyond Gaussians, such as Laplace perturbations.
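
For intuition about the shadow bound: the shadow is the two-dimensional projection of the feasible polytope onto the plane spanned by the objective and an auxiliary direction, and the shadow vertex rule pivots only through vertices that appear on this polygon, so the expected number of shadow vertices controls the number of pivots. The sketch below, assuming numpy and scipy, counts the vertices of such a shadow for a polytope given as a convex hull of sample points (in the spirit of Borgwardt's model); the Gaussian point set and the directions c, u are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import ConvexHull

def shadow_vertex_count(points, c, u):
    # Orthonormal basis for the plane spanned by c and u
    # (assumes c and u are linearly independent).
    q1 = c / np.linalg.norm(c)
    u_perp = u - (u @ q1) * q1
    q2 = u_perp / np.linalg.norm(u_perp)
    # Project the points; the shadow of conv(points) is the
    # convex hull of the projected points.
    shadow = points @ np.column_stack([q1, q2])
    # Vertices of the shadow polygon; the shadow path from
    # optimizing u to optimizing c walks along part of this polygon.
    return len(ConvexHull(shadow).vertices)

rng = np.random.default_rng(0)
pts = rng.standard_normal((200, 4))   # 200 Gaussian points in R^4
c = rng.standard_normal(4)            # objective direction
u = rng.standard_normal(4)            # auxiliary start direction
print(shadow_vertex_count(pts, c, u))
```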

This is joint work with Sophie Huiberts (CWI).
