December 12, 2020
Bridging logical and algorithmic reasoning with modern machine learning techniques is a fundamental challenge with potentially transformative impact. On the algorithmic side, many NP-hard problems can be expressed as integer programs, in which the constraints serve as their combinatorial specification. In this work, we aim to fully integrate integer programming solvers into neural network architectures by providing gradient update rules for both the objective and the constraints. The resulting end-to-end trainable architectures can jointly extract features from raw data and solve a suitable (learned) combinatorial problem with state-of-the-art integer programming solvers. We experimentally validate our approach in multiple ways: on random constraints, on solving knapsack instances from their natural-language descriptions, and on a popular keypoint-matching benchmark from computer vision.
Publisher
NeurIPS Workshop on Learning Meets Combinatorial Optimization
Research Topics
Core Machine Learning
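
To make the core idea concrete, below is a minimal, self-contained sketch of embedding an integer programming solver as a differentiable layer. It is not the paper's implementation: the brute-force binary solver, the `ILPLayer` name, and the `lam` hyperparameter are all illustrative, and the backward pass uses the informed-perturbation trick of Vlastelica et al. (2020) for the cost gradient only; the paper additionally derives update rules for the constraints, which this sketch omits.

```python
import torch

def solve_binary_ilp(c, A, b):
    """Brute-force solver for: min c^T x  s.t.  A x <= b,  x in {0,1}^n.

    Stands in for a real ILP solver in this sketch; only practical
    for tiny n, and assumes at least one feasible point exists.
    """
    n = c.shape[0]
    best_x, best_val = None, float("inf")
    for k in range(2 ** n):
        x = torch.tensor([(k >> i) & 1 for i in range(n)], dtype=c.dtype)
        if torch.all(A @ x <= b) and (val := float(c @ x)) < best_val:
            best_x, best_val = x, val
    return best_x

class ILPLayer(torch.autograd.Function):
    """Differentiable ILP layer (sketch).

    The solver map c -> argmin is piecewise constant, so its true
    Jacobian is zero almost everywhere. The backward pass instead
    re-solves at a cost perturbed in the direction of the incoming
    gradient and returns a finite difference of the two solutions.
    """

    lam = 10.0  # interpolation strength; an illustrative hyperparameter

    @staticmethod
    def forward(ctx, c, A, b):
        x = solve_binary_ilp(c.detach(), A, b)
        ctx.save_for_backward(c, A, b, x)
        return x

    @staticmethod
    def backward(ctx, grad_x):
        c, A, b, x = ctx.saved_tensors
        # Solve a second time at the perturbed cost, then compare.
        x_pert = solve_binary_ilp(c + ILPLayer.lam * grad_x, A, b)
        grad_c = -(x - x_pert) / ILPLayer.lam
        return grad_c, None, None  # constraint gradients omitted here
```

A toy usage, with illustrative numbers: gradients flow from a downstream loss back into the cost vector through the solver call.

```python
c = torch.tensor([-1.0, -2.0, -3.0], requires_grad=True)
A = torch.tensor([[1.0, 1.0, 1.0]])   # pick at most two items
b = torch.tensor([2.0])
x = ILPLayer.apply(c, A, b)           # a tiny knapsack-like ILP
loss = ((x - torch.tensor([1.0, 0.0, 1.0])) ** 2).sum()
loss.backward()                       # populates c.grad via the trick above
```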