Statistical Query Algorithms for Stochastic Convex Optimization

Research paper by Vitaly Feldman, Cristobal Guzman, Santosh Vempala

Indexed on: 30 Dec '15
Published on: 30 Dec '15
Published in: Computer Science - Learning


Stochastic convex optimization, where the objective is the expectation of a random convex function, is an important and widely used method with numerous applications in machine learning, statistics, operations research and other areas. We study the complexity of stochastic convex optimization given only statistical query (SQ) access to the objective function. We show that well-known and popular methods, including first-order iterative methods and polynomial-time methods, can be implemented using only statistical queries. For many cases of interest we derive nearly matching upper and lower bounds on the estimation (sample) complexity, including linear optimization in the most general setting. We then present several consequences for machine learning and differential privacy, as well as concrete lower bounds on the power of convex-optimization-based methods. A new technical ingredient of our work is a family of SQ algorithms for estimating the mean vector of a distribution over vectors in $\mathbb{R}^d$ with optimal estimation complexity. This is a natural problem in its own right, and we show that our solutions can be used to obtain substantially improved SQ versions of Perceptron and other online algorithms for learning halfspaces.
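To make the access model concrete, here is a minimal, hedged sketch of the SQ setting the abstract describes. All names (`stat_oracle`, `sq_mean_estimate`) are illustrative and not from the paper: a STAT($\tau$) oracle answers the expectation of a bounded query function up to additive tolerance $\tau$, and the naive mean-vector estimator issues one such query per coordinate (the paper's contribution is algorithms with optimal estimation complexity, which this sketch does not implement).

```python
import random

random.seed(0)

def stat_oracle(samples, q, tau):
    # Simulate a STAT(tau) oracle: the algorithm never sees raw
    # samples, only E[q(x)] perturbed by noise of magnitude <= tau.
    exact = sum(q(x) for x in samples) / len(samples)
    return exact + random.uniform(-tau, tau)

def sq_mean_estimate(samples, d, tau):
    # Naive coordinate-wise estimator: d statistical queries, one per
    # coordinate. Each answer is tau-accurate, so the estimate is
    # tau-accurate in the l_infinity norm; the paper's algorithms
    # improve on this baseline for other norms of interest.
    return [stat_oracle(samples, (lambda i: lambda x: x[i])(i), tau)
            for i in range(d)]

d, n, tau = 4, 2000, 0.01
samples = [[random.gauss(0.3, 1.0) for _ in range(d)] for _ in range(n)]
est = sq_mean_estimate(samples, d, tau)
true = [sum(x[i] for x in samples) / n for i in range(d)]
err = max(abs(e - t) for e, t in zip(est, true))
```

In this toy simulation the oracle's answers deviate from the empirical mean by at most `tau` in every coordinate, so `err <= tau`; a real SQ lower bound argument treats the noise as adversarial rather than random, which this sketch does not model.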