December 17, 2018
Linear least-squares regression with a “design” matrix A approximates a given matrix B via minimization of the spectral- or Frobenius-norm discrepancy ||AX − B|| over all conformingly sized matrices X. Also popular is low-rank approximation to B through the “interpolative decomposition,” which traditionally has no supervision from any auxiliary matrix A. The traditional interpolative decomposition selects certain columns of B and constructs numerically stable (multi)linear interpolation from those columns to all columns of B, thus approximating all of B via the chosen columns. Accounting for regression with an auxiliary matrix A leads to a “regression-aware interpolative decomposition,” which selects certain columns of B and constructs numerically stable (multi)linear interpolation from the corresponding least-squares solutions to the least-squares solutions X minimizing ||AX − B|| for all columns of B. The regression-aware decompositions reveal the structure inherent in B that is relevant to regression against A; they effectively enable supervision to inform classical dimensionality reduction, which otherwise has been restricted to strictly unsupervised learning.
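The construction described above reduces to two standard computations: a least-squares solve against A for every column of B, followed by an interpolative decomposition of the resulting solutions rather than of B itself. The sketch below illustrates this in Python with NumPy's lstsq and SciPy's scipy.linalg.interpolative routines; the matrix sizes, the synthetic low-rank data, and the variable names (A, B, X, k, and so on) are illustrative assumptions, not the paper's own code or experiments.

```python
import numpy as np
import scipy.linalg.interpolative as sli

rng = np.random.default_rng(0)
m, n, p, k = 200, 50, 80, 10   # hypothetical sizes: A is m x n, B is m x p, rank k

# Synthetic data chosen so the least-squares solutions are nearly rank k.
A = rng.standard_normal((m, n))
W = rng.standard_normal((n, k)) @ rng.standard_normal((k, p))
B = A @ W + 1e-3 * rng.standard_normal((m, p))

# Least-squares solutions for all columns of B at once:
# X = argmin_X ||A X - B||_F, one column of X per column of B.
X = np.linalg.lstsq(A, B, rcond=None)[0]

# Interpolative decomposition of X rather than of B: select k column indices
# and interpolation coefficients mapping the k selected least-squares
# solutions onto all p of them.
idx, proj = sli.interp_decomp(np.asfortranarray(X), k)
cols = idx[:k]                 # the columns of B selected by the decomposition

X_skel = sli.reconstruct_skel_matrix(np.asfortranarray(X), k, idx)
X_approx = sli.reconstruct_matrix_from_id(X_skel, idx, proj)

print("selected columns of B:", cols)
print("relative error in X:", np.linalg.norm(X - X_approx) / np.linalg.norm(X))
```

Running the ID on X instead of B is the whole difference from the unsupervised decomposition: the selected columns of B are those whose least-squares solutions span the others, so the reduction respects the regression against A.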