Open Source Evolutionary Structured Optimization

May 11, 2020

Abstract

Nevergrad is a derivative-free optimization platform that gathers both a wide range of optimization methods and a wide range of test functions on which to evaluate them. Some of these functions have very particular structures that standard methods cannot exploit. The most recent feature of Nevergrad is the ability to conveniently define a search domain, so that many algorithms in Nevergrad can automatically rescale variables and/or take into account their possibly logarithmic or discrete nature, as well as any user-defined mutation or recombination operator. Since many problems are efficiently solved using problem-specific operators, Nevergrad now enables using such operators within generic algorithms: the underlying structure of the problem is user-defined information that several families of optimization methods can use and benefit from. We explain how this API can help analyze optimization methods and how to use it to optimize a structured photonics physical testbed, and we show that this can produce significant improvements.
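
The following is a minimal sketch of the kind of structured search domain the abstract refers to, assuming a recent version of Nevergrad's parametrization API; the variable names and the toy objective are illustrative, not taken from the paper.

```python
# A minimal sketch of a structured search domain in Nevergrad.
# Variable names and the toy objective are illustrative assumptions.
import nevergrad as ng

# Structured domain: a bounded continuous variable, a log-distributed
# variable, and a discrete (categorical) choice.
domain = ng.p.Instrumentation(
    thickness=ng.p.Scalar(lower=0.0, upper=1.0),        # rescaled continuous variable
    wavelength=ng.p.Log(lower=1e-2, upper=1.0),         # logarithmic nature
    material=ng.p.Choice(["silver", "gold", "glass"]),  # discrete nature
)

def loss(thickness: float, wavelength: float, material: str) -> float:
    """Toy objective standing in for a real (e.g. photonics) simulation."""
    return (thickness - 0.3) ** 2 + wavelength + (0.0 if material == "glass" else 0.1)

# Any optimizer in Nevergrad can now run on this structured domain.
optimizer = ng.optimizers.NGOpt(parametrization=domain, budget=200)
recommendation = optimizer.minimize(loss)
print(recommendation.value)  # ((), {'thickness': ..., 'wavelength': ..., 'material': ...})
```

The same parameter objects also expose mutation settings (for example, `set_mutation` on scalar and array parameters in current versions), which is the kind of hook through which problem-specific mutation or recombination operators can be plugged into generic algorithms.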

AUTHORS

Jérémy Rapin

Daniel Haziza

Olivier Teytaud

Antoine Moreau

Emmanuel Centeno

Pauline Bennet

Publisher

Evolutionary Computation Software Systems Workshop at GECCO
