Neural Manifold Ordinary Differential Equations
December 10, 2020
To better conform to data geometry, recent deep generative modelling techniques adapt Euclidean constructions to non-Euclidean spaces. In this paper, we study normalizing flows on manifolds. Previous work has developed flow models for specific cases; however, these advancements hand-craft layers on a manifold-by-manifold basis, restricting generality and inducing cumbersome design constraints. We overcome these issues by introducing Neural Manifold Ordinary Differential Equations, a manifold generalization of Neural ODEs, which enables the construction of Manifold Continuous Normalizing Flows (MCNFs). MCNFs require only local geometry (therefore generalizing to arbitrary manifolds) and compute probabilities with continuous change of variables (allowing for a simple and expressive flow construction). We find that leveraging continuous manifold dynamics produces a marked improvement for both density estimation and downstream tasks.
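As background, the continuous change of variables that MCNFs build on is, in the Euclidean setting of Neural ODEs, the standard instantaneous change of variables; the sketch below states that identity, while the manifold construction in the paper adapts it using only local geometry:

\[
\frac{dz(t)}{dt} = f(z(t), t),
\qquad
\frac{\partial \log p(z(t))}{\partial t} = -\operatorname{tr}\!\left(\frac{\partial f}{\partial z(t)}\right),
\]

so the log-density of a sample is obtained by integrating the trace term along the ODE trajectory, rather than by composing discrete, manifold-specific layers.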
Research Topics
Core Machine Learning