Feb 26, 2021
Standard results in stochastic convex optimization bound the number of samples that an algorithm needs to generate a point with small function value in expectation. More nuanced high-probability guarantees are rare, and typically either rely on “light-tailed” noise assumptions or exhibit worse sample complexity. In this work, we show that a wide class of stochastic optimization algorithms for strongly convex problems can be augmented with high-confidence bounds at an overhead cost that is only logarithmic in the confidence level and polylogarithmic in the condition number. The procedure we propose, called proxBoost, is elementary and builds on two well-known ingredients: robust distance estimation and the proximal point method. We discuss consequences for both streaming (online) algorithms and offline algorithms based on empirical risk minimization.
Written by
Damek Davis
Dmitriy Drusvyatskiy
Lin Xiao
Junyu Zhang
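To make the two ingredients concrete, here is a minimal sketch (not the authors' code) of how a proxBoost-style procedure could be assembled on a toy problem: a low-confidence base solver is boosted by robust distance estimation (run it several times independently and keep the candidate closest to the majority of the others), and an outer proximal-point loop solves increasingly regularized subproblems. The quadratic objective, heavy-tailed noise, SGD base solver, and all constants (trial counts, prox parameters, step sizes) below are illustrative assumptions, not values prescribed by the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy strongly convex objective: f(x) = 0.5 * ||x - x_star||^2,
# observed only through gradients corrupted by heavy-tailed noise.
dim = 20
x_star = rng.normal(size=dim)

def noisy_grad(x, center, reg):
    """Stochastic gradient of f(x) + (reg/2)||x - center||^2 with heavy-tailed noise."""
    noise = rng.standard_t(df=2.5, size=dim)  # heavy-tailed noise (illustrative)
    return (x - x_star) + reg * (x - center) + noise

def base_solver(x0, center, reg, n_steps=2000):
    """Plain SGD: a 'low confidence' solver that is only accurate in expectation."""
    x = x0.copy()
    for t in range(1, n_steps + 1):
        x = x - (1.0 / ((1.0 + reg) * t)) * noisy_grad(x, center, reg)
    return x

def robust_estimate(x0, center, reg, n_trials=11):
    """Robust distance estimation: run the base solver n_trials times independently
    and return the candidate whose median distance to the other candidates is
    smallest (a proxy for being close to most of the cloud)."""
    candidates = [base_solver(x0, center, reg) for _ in range(n_trials)]
    def median_dist(c):
        return np.median([np.linalg.norm(c - other) for other in candidates])
    return min(candidates, key=median_dist)

def prox_boost(x0, n_outer=6):
    """Outer proximal-point loop: each stage solves a more strongly regularized
    subproblem centered at the previous, robustly estimated iterate."""
    x = x0.copy()
    for j in range(n_outer):
        reg = 2.0 ** j  # geometrically increasing prox parameter (illustrative)
        x = robust_estimate(x, center=x, reg=reg)
    return x

x_hat = prox_boost(np.zeros(dim))
print("distance to minimizer:", np.linalg.norm(x_hat - x_star))
```

In this sketch the selection step plays the role of robust distance estimation: by a standard median/majority argument, the chosen candidate lies close to the minimizer with probability that improves exponentially in the number of trials, even though each individual run carries only an in-expectation guarantee. The outer proximal loop then trades that distance guarantee for a function-value guarantee on the regularized subproblems.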