December 11, 2020
We present a new class of learnable Riemannian manifolds whose metric is parameterized by a deep neural network. The core manifold operations, specifically the Riemannian exponential and logarithmic maps, are computed with approximate numerical solvers. Input and parameter gradients are obtained through an adjoint sensitivity analysis, which lets us fit geodesics and distances by gradient-based optimization of both on-manifold values and the manifold itself. We demonstrate the method's ability to model smooth, flexible metric structures in graph embedding tasks.
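To make the setup concrete, here is a minimal sketch of the idea, not the authors' implementation: a small randomly initialized network stands in for the learned metric, producing a positive-definite matrix G(x) = L(x)L(x)ᵀ + εI; the exponential map is then approximated by Euler-integrating the geodesic ODE ẍᵏ = −Γᵏᵢⱼ ẋⁱ ẋʲ, with Christoffel symbols obtained from finite-difference derivatives of G (the paper instead uses adjoint sensitivity analysis for gradients; all names and constants below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
D = 2  # manifold dimension (illustrative)

# Hypothetical tiny MLP with fixed random weights, standing in for the
# learned metric network; in the paper this network is trained.
W1 = rng.normal(size=(8, D)); b1 = rng.normal(size=8)
W2 = 0.1 * rng.normal(size=(D * D, 8)); b2 = 0.1 * rng.normal(size=D * D)

def metric(x):
    """Positive-definite metric G(x) = L(x) L(x)^T + eps * I."""
    h = np.tanh(W1 @ x + b1)
    L = (W2 @ h + b2).reshape(D, D)
    return L @ L.T + 0.1 * np.eye(D)

def christoffel(x, h=1e-4):
    """Christoffel symbols from finite-difference derivatives of G.

    dG[l, i, j] approximates the partial derivative d_l g_ij.
    """
    dG = np.stack([(metric(x + h * e) - metric(x - h * e)) / (2 * h)
                   for e in np.eye(D)])
    Ginv = np.linalg.inv(metric(x))
    # Gamma^k_ij = 1/2 g^{kl} (d_i g_jl + d_j g_il - d_l g_ij)
    return 0.5 * np.einsum(
        'kl,ijl->kij', Ginv,
        dG + dG.transpose(1, 0, 2) - dG.transpose(1, 2, 0))

def exp_map(x, v, steps=100):
    """Approximate Riemannian exponential map: Euler-integrate the
    geodesic ODE  xdd^k = -Gamma^k_ij xd^i xd^j  from (x, v) to time 1."""
    dt = 1.0 / steps
    for _ in range(steps):
        a = -np.einsum('kij,i,j->k', christoffel(x), v, v)
        x, v = x + dt * v, v + dt * a
    return x
```

For example, `exp_map(np.zeros(D), np.array([1.0, 0.0]))` shoots a geodesic from the origin with unit initial velocity; with a zero velocity it returns the start point unchanged. A production version would replace the Euler loop with a proper ODE solver and differentiate through it with the adjoint method rather than finite differences.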
Publisher
NeurIPS Workshop on Differential Geometry for ML
Research Topics
Core Machine Learning