Fluctuation-dissipation relations for stochastic gradient descent

May 03, 2019

Abstract

The notion of the stationary equilibrium ensemble has played a central role in statistical mechanics. In machine learning as well, training acts as a generalized form of equilibration that drives the probability distribution of model parameters toward stationarity. Here, we derive stationary fluctuation-dissipation relations that link measurable quantities and hyperparameters in the stochastic gradient descent algorithm. These relations hold exactly for any stationary state and can, in particular, be used to adaptively set the training schedule. We can further use the relations to efficiently extract information about the loss-function landscape, such as the magnitudes of its Hessian and its anharmonicity. Our claims are empirically verified.
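
As a concrete illustration of how such a relation can be monitored during training, the sketch below tracks the simplest relation for plain SGD, ⟨θ · ∇f_B(θ)⟩ = (η/2) ⟨‖∇f_B(θ)‖²⟩, which follows from requiring ⟨θ · θ⟩ to be stationary under the update θ ← θ − η ∇f_B(θ). This is a minimal sketch rather than code from the paper: the toy least-squares problem, the learning rate, the smoothing factor, the 0.9 threshold, and the learning-rate halving schedule are all illustrative assumptions.

```python
# Minimal sketch: monitor the simplest stationary fluctuation-dissipation
# relation for plain SGD,
#     < theta . grad_B f >  =  (eta / 2) < || grad_B f ||^2 >,
# which follows from requiring < theta . theta > to be stationary under the
# update theta <- theta - eta * grad_B f. The running ratio of the two sides
# is used as a crude stationarity check to decide when to decay eta.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: least-squares loss f(theta) = mean_i 0.5 * (x_i . theta - y_i)^2
n_samples, dim, batch_size = 10_000, 20, 32
X = rng.normal(size=(n_samples, dim))
theta_true = rng.normal(size=dim)
y = X @ theta_true + 0.1 * rng.normal(size=n_samples)

theta = np.zeros(dim)
eta = 0.05                       # learning rate (illustrative value)
running_lhs = running_rhs = 0.0  # running averages of theta.g and |g|^2
beta = 0.999                     # smoothing factor for the running averages

for step in range(1, 200_001):
    idx = rng.integers(0, n_samples, size=batch_size)
    residual = X[idx] @ theta - y[idx]
    grad = X[idx].T @ residual / batch_size    # minibatch gradient

    # Accumulate both sides of the relation *before* the update.
    running_lhs = beta * running_lhs + (1 - beta) * (theta @ grad)
    running_rhs = beta * running_rhs + (1 - beta) * (grad @ grad)

    theta -= eta * grad                        # plain SGD update

    if step % 10_000 == 0:
        ratio = 2 * running_lhs / (eta * running_rhs + 1e-12)
        print(f"step {step:>7d}  eta {eta:.4f}  FDR ratio {ratio:.3f}")
        # A ratio near 1 signals (approximate) stationarity at this eta;
        # decay the learning rate and continue training.
        if ratio > 0.9:
            eta *= 0.5
```

A ratio of the two sides that approaches 1 indicates that the parameter distribution has stabilized at the current learning rate, which is a natural point at which to decay it; the specific threshold and decay factor above are arbitrary choices for the sketch.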

Download the Paper

AUTHORS

Sho Yaida

Publisher

ICLR

Research Topics

Theory

Related Publications

March 28, 2024

THEORY

CORE MACHINE LEARNING

On the Identifiability of Quantized Factors

Vitoria Barin Pacela, Kartik Ahuja, Simon Lacoste-Julien, Pascal Vincent

July 08, 2023

THEORY

NLP

Language acquisition: do children and language models follow similar learning stages?

Linnea Evanson, Yair Lakretz, Jean Remi King

May 01, 2023

THEORY

CORE MACHINE LEARNING

Meta-Learning in Games

Keegan Harris, Ioannis Anagnostides, Gabriele Farina, Mikhail Khodak, Zhiwei Steven Wu, Tuomas Sandholm, Maria-Florina Balcan

November 30, 2022

THEORY

A Simple Convergence Proof of Adam and Adagrad

Alexandre Defossez, Leon Bottou, Nicolas Usunier, Francis Bach
