The Road Less Scheduled

November 06, 2024

Abstract

Existing learning rate schedules that do not require specification of the optimization stopping step T are greatly outperformed by learning rate schedules that depend on T. We propose an approach that avoids the need for this stopping time by eschewing the use of schedules entirely, while exhibiting state-of-the-art performance compared to schedules across a wide family of problems, from convex optimization to large-scale deep learning. Our Schedule-Free approach introduces no additional hyperparameters over standard optimizers with momentum. Our method is a direct consequence of a new theory we develop that unifies scheduling and iterate averaging. Schedule-Free AdamW is the core algorithm behind our winning entry to the MLCommons 2024 AlgoPerf Algorithmic Efficiency Challenge Self-Tuning track.
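
Since this page only states the idea at a high level, below is a minimal NumPy sketch of the kind of recursion the Schedule-Free approach is built on: gradients are evaluated at an interpolation y between a fast SGD iterate z and a running average x of those iterates, so the averaged point can be returned at any step and no stopping time T needs to be specified. The update form follows the paper's schedule-free SGD recursion as best understood here; the function name, the toy least-squares objective, and the hyperparameter values are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def schedule_free_sgd(grad, x0, lr=0.01, beta=0.9, steps=1000):
    """Minimal sketch of a schedule-free SGD loop (illustrative,
    not the authors' reference code).

    z takes plain gradient steps, x keeps a uniform running average of
    the z iterates, and gradients are evaluated at y, an interpolation
    between the two. No learning rate schedule or stopping step T is
    required: x can be returned at any iteration.
    """
    z = np.array(x0, dtype=float)  # base-optimizer iterate
    x = z.copy()                   # averaged iterate (the point returned)
    for t in range(1, steps + 1):
        y = (1.0 - beta) * z + beta * x  # gradient evaluation point
        z = z - lr * grad(y)             # ordinary SGD step on z
        c = 1.0 / (t + 1)                # uniform averaging weight
        x = (1.0 - c) * x + c * z        # online average of the z iterates
    return x

# Toy usage on a convex least-squares problem (assumed example):
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
w = schedule_free_sgd(lambda w: A.T @ (A @ w - b), x0=np.zeros(5))
```

Under this parameterization, beta = 0 reduces to classic Polyak-Ruppert averaging (gradients at z, averaging afterward) and beta = 1 to primal averaging (gradients at the averaged point); intermediate values such as 0.9 play a role analogous to momentum, which is the sense in which the theory unifies scheduling and iterate averaging.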


Written by

Aaron Defazio

Alice Yang

Harsh Mehta

Konstantin Mishchenko

Ahmed Khaled

Ashok Cutkosky

Publisher

NeurIPS

Research Topics

Theory

Core Machine Learning

Related Publications

August 16, 2024

THEORY

REINFORCEMENT LEARNING

Dual Approximation Policy Optimization

Zhihan Xiong, Maryam Fazel, Lin Xiao


August 12, 2024

CORE MACHINE LEARNING

Contrastive Predict-and-Search for Mixed Integer Linear Programs

Arman Zharmagambetov, Yuandong Tian, Aaron Ferber, Bistra Dilkina, Taoan Huang


August 09, 2024

CORE MACHINE LEARNING

Benchmarking Attacks on Learning with Errors

Emily Wenger, Eshika Saxena, Mohamed Malhou, Ellie Thieu, Kristin Lauter


August 02, 2024

CORE MACHINE LEARNING

GenCO: Generating Diverse Designs with Combinatorial Constraints

Arman Zharmagambetov, Yuandong Tian

