July 09, 2021
Fixed-point iterations are at the heart of numerical computing and are often a computational bottleneck in real-time applications, which typically need only a fast solution of moderate accuracy. Classical acceleration methods for fixed-point problems focus on designing algorithms with theoretical guarantees that apply to any fixed-point problem. We present neural fixed-point acceleration, a framework that uses ideas from meta-learning and classical acceleration algorithms to automatically learn to accelerate convex fixed-point problems drawn from a distribution. We apply our framework to SCS, the state-of-the-art solver for convex cone programming, and design models and loss functions to overcome the challenges of learning over unrolled optimization and of acceleration instabilities. Our work brings neural acceleration to any optimization problem expressible with CVXPY.
Publisher
NeurIPS AutoML Workshop
Research Topics
Core Machine Learning
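The classical acceleration methods the abstract refers to include Anderson acceleration, a standard scheme for speeding up fixed-point iterations of the form x = f(x). As a minimal illustrative sketch only (the function and parameter names below are hypothetical, not the paper's or SCS's API), the following Python code implements a plain fixed-point loop with type-II Anderson acceleration over a sliding memory of past iterates:

```python
import numpy as np

def anderson_fixed_point(f, x0, m=5, tol=1e-8, max_iter=100):
    """Solve x = f(x) with type-II Anderson acceleration of memory m.

    Hypothetical sketch for exposition; not the paper's method or API.
    """
    x = np.asarray(x0, dtype=float)
    X, F = [], []                           # recent iterates and their images f(x)
    for k in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx - x) < tol:
            return x, k                     # converged: x is (numerically) a fixed point
        X.append(x)
        F.append(fx)
        X, F = X[-(m + 1):], F[-(m + 1):]   # keep a sliding window of at most m+1 points
        n = len(X)
        if n == 1:
            x = fx                          # first step: plain fixed-point update
            continue
        # Residuals g_i = f(x_i) - x_i for the window, stacked as columns.
        G = np.stack([fi - xi for fi, xi in zip(F, X)], axis=1)
        # Minimize ||sum_i a_i g_i|| subject to sum_i a_i = 1 via the standard
        # unconstrained reformulation over differences of consecutive residuals.
        dG = G[:, 1:] - G[:, :-1]
        gamma, *_ = np.linalg.lstsq(dG, G[:, -1], rcond=None)
        a = np.zeros(n)
        a[-1] = 1.0
        a[:-1] += gamma
        a[1:] -= gamma
        # Next iterate mixes the mapped history with the computed weights.
        x = sum(ai * fi for ai, fi in zip(a, F))
    return x, max_iter

# Example: accelerate the elementwise map x -> cos(x).
x_star, iters = anderson_fixed_point(np.cos, np.zeros(3))
```

On the example map x -> cos(x), the accelerated loop typically reaches the fixed point near 0.739 in far fewer iterations than the plain update x_{k+1} = cos(x_k); the paper's framework learns problem-distribution-specific analogues of this kind of acceleration rather than using fixed weights.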