NLP

Neural Database Operator Model

October 21, 2020

Abstract

Our goal is to answer queries over facts stored in a text memory. The key challenge in NeuralDBs (Thorne et al., 2020), compared to open-book NLP such as question answering (Rajpurkar et al., 2016, inter alia), is that possibly thousands of facts must be aggregated to provide a single answer, without direct supervision. The challenges represented in NeuralDBs are important to the NLP and database communities alike: discrete reasoning over text (Dua et al., 2019), retriever-based QA (Dunn et al., 2017), and multi-hop QA (Welbl et al., 2018; Yang et al., 2018) are all common components of the task.
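
To make the task setting concrete, here is a minimal, purely illustrative sketch (not the authors' neural model): facts live in a textual memory, and a query may require aggregating several of them to produce one answer. The example facts, the keyword-matching retriever, and the COUNT aggregation below are hypothetical stand-ins for the learned components described in the paper.

```python
# Toy illustration of the NeuralDB task setting (hypothetical example, not the paper's model):
# facts are stored as natural-language sentences, and a single query may require
# aggregating many of them without direct supervision for the intermediate steps.

FACTS = [
    "Nicholas lives in Paris.",
    "Sheryl lives in Paris.",
    "John lives in London.",
    "Sheryl is married to Nicholas.",
]

def toy_count_query(facts, keyword):
    """Retrieve facts mentioning `keyword`, then aggregate them with COUNT.
    A simple stand-in for retrieval plus an aggregation operator."""
    support = [fact for fact in facts if keyword in fact]
    return len(support), support

if __name__ == "__main__":
    answer, support = toy_count_query(FACTS, "Paris")
    print(f"How many people live in Paris? -> {answer}")  # -> 2
    print("Supporting facts:", support)
```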

AUTHORS

James Thorne

Alon Halevy

Majid Yazdani

Marzieh Saeidi

Sebastian Riedel

Publisher

WeCNLP

Related Publications

September 05, 2024

CONVERSATIONAL AI

NLP

Transfusion: Predict the Next Token and Diffuse Images with One Multi-Modal Model

Chunting Zhou, Lili Yu, Arun Babu, Kushal Tirumala, Michihiro Yasunaga, Leonid Shamis, Jacob Kahn, Luke Zettlemoyer, Omer Levy, Xuezhe Ma

August 20, 2024

CONVERSATIONAL AI

NLP

Lumos : Empowering Multimodal LLMs with Scene Text Recognition

Ashish Shenoy, Yichao Lu, Srihari Jayakumar, Debojeet Chatterjee, Mohsen Moslehpour, Pierce Chuang, Abhay Harpale, Vikas Bhardwaj, Di Xu (SWE), Shicong Zhao, Ankit Ramchandani, Luna Dong, Anuj Kumar

August 11, 2024

NLP

LM Transparency Tool: Interactive Tool for Analyzing Transformer Language Models

Igor Tufanov, Karen Hambardzumyan, Javier Ferrando, Lena Voita

August 11, 2024

NLP

MuTox: Universal MUltilingual Audio-based TOXicity Dataset and Zero-shot Detector

Marta R. Costa-jussa, Mariano Coria Meglioli, Pierre Andrews, David Dale, Kae Hansanti, Elahe Kalbassi, Christophe Ropers, Carleigh Wood
