Rich Feature Construction for the Optimization-Generalization Dilemma

June 30, 2022


There is often a dilemma between ease of optimization and robust out-of-distribution (OoD) generalization. For instance, many OoD methods rely on penalty terms that are hard to optimize: they are either too strong to optimize reliably or too weak to achieve their goals. We propose to initialize the networks with a rich representation containing a palette of potentially useful features, ready to be used by even simple models. On the one hand, a rich representation provides a good initialization for the optimizer. On the other hand, it also provides an inductive bias that helps OoD generalization. Such a representation is constructed with the Rich Feature Construction (RFC) algorithm, also called the Bonsai algorithm, which consists of a succession of training episodes. During discovery episodes, we craft a multi-objective optimization criterion and its associated datasets in a manner that prevents the network from using the features constructed in the previous iterations. During synthesis episodes, we use knowledge distillation to force the network to simultaneously represent all the previously discovered features. Initializing the networks with Bonsai representations consistently helps six OoD methods achieve top performance on the COLOREDMNIST benchmark (Arjovsky et al., 2020). The same technique substantially outperforms comparable results on the WILDS CAMELYON17 task (Koh et al., 2021), eliminates the high result variance that plagues other methods, and makes hyperparameter tuning and model selection more reliable.
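To illustrate the discovery/synthesis structure described above, here is a minimal toy sketch in NumPy. It is not the paper's implementation: linear features stand in for network representations, projecting out previously discovered directions stands in for the criterion that prevents reuse of earlier features, and simple stacking stands in for knowledge distillation. All function names and parameters are illustrative.

```python
import numpy as np

def discovery_episode(X, y, prev_features, lr=0.1, steps=200):
    """Fit a new linear feature on (X, y) while projecting out directions
    already captured by previously discovered features -- a toy stand-in
    for the paper's mechanism that blocks reuse of earlier features."""
    rng = np.random.default_rng(0)
    w = rng.normal(size=X.shape[1])
    for _ in range(steps):
        # remove components along previously discovered directions
        for p in prev_features:
            w -= (w @ p) / (p @ p) * p
        preds = X @ w
        grad = X.T @ (preds - y) / len(y)  # squared-loss gradient
        w -= lr * grad
    # final projection so the returned feature is orthogonal to earlier ones
    for p in prev_features:
        w -= (w @ p) / (p @ p) * p
    return w / np.linalg.norm(w)

def synthesis_episode(features):
    """Combine all discovered features into one rich representation
    (the paper distills them into a single network; stacking is the
    toy analogue here)."""
    return np.stack(features)

# Toy data whose target mixes two independent input directions.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
y = X[:, 0] + 0.5 * X[:, 1]

features = []
for _ in range(2):  # two discovery episodes
    features.append(discovery_episode(X, y, features))
rich = synthesis_episode(features)  # rich initialization: one row per feature
```

Because each discovery episode is barred from reusing earlier directions, the second feature captures signal the first one missed, and the synthesized representation keeps both, which is the intuition behind seeding a simple downstream model with a palette of features.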



Written by

Leon Bottou

David Lopez-Paz

Jianyu Zhang


