Feb 26, 2021
Standard results in stochastic convex optimization bound the number of samples that an algorithm needs to generate a point with small function value in expectation. More nuanced high probability guarantees are rare, and typically either rely on “light-tail” noise assumptions or exhibit worse sample complexity. In this work, we show that a wide class of stochastic optimization algorithms for strongly convex problems can be augmented with high confidence bounds at an overhead cost that is only logarithmic in the confidence level and polylogarithmic in the condition number. The procedure we propose, called proxBoost, is elementary and builds on two well-known ingredients: robust distance estimation and the proximal point method. We discuss consequences for both streaming (online) algorithms and offline algorithms based on empirical risk minimization.
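To make the two ingredients concrete, here is a minimal, hypothetical sketch of a proxBoost-style wrapper in Python. Everything in it is an assumption for illustration: the function names (`prox_boost`, `robust_select`, `solve`), the doubling schedule for the proximal parameter, and the number of trials are not taken from the paper. The only grounded ideas are the two named ingredients: independent low-confidence runs combined by robust distance estimation (pick the candidate whose median distance to the others is smallest), wrapped inside a proximal point loop over regularized subproblems.

```python
import numpy as np


def robust_select(candidates):
    """Robust distance estimation: return the candidate whose median
    distance to the other candidates is smallest.  If most independent
    runs land near the minimizer, this point does too, with high
    probability."""
    candidates = np.asarray(candidates)
    dists = np.linalg.norm(
        candidates[:, None, :] - candidates[None, :, :], axis=-1
    )
    return candidates[np.argmin(np.median(dists, axis=1))]


def prox_boost(solve, f, grad_f, x0, mu, rounds=5, trials=9, budget=1000):
    """Hypothetical proxBoost-style sketch (not the paper's exact schedule).

    `solve(g, grad_g, x_init, budget)` is any user-supplied stochastic
    solver with an in-expectation guarantee for strongly convex objectives.
    Each round builds the proximal subproblem
        g_k(x) = f(x) + (lam_k / 2) * ||x - center||^2,
    solves it `trials` times independently, and boosts the result to high
    confidence via robust distance estimation.
    """
    center = np.asarray(x0, dtype=float)
    lam = mu  # proximal regularization, illustratively doubled each round
    for _ in range(rounds):
        def g(x, c=center, l=lam):
            return f(x) + 0.5 * l * np.linalg.norm(x - c) ** 2

        def grad_g(x, c=center, l=lam):
            return grad_f(x) + l * (x - c)

        # independent low-confidence runs of the base solver
        candidates = [solve(g, grad_g, center, budget) for _ in range(trials)]
        # boost to high confidence via robust distance estimation
        center = robust_select(candidates)
        lam *= 2.0
    return center
```

In this sketch the only overhead relative to the base solver is the number of independent trials per round (logarithmic in the target confidence level) and the number of proximal rounds, which is the mechanism behind the logarithmic and polylogarithmic overhead described above.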
Written by
Damek Davis
Dmitriy Drusvyatskiy
Lin Xiao
Junyu Zhang