Linguistic Structure from Deep Learning

May 04, 2020


Modern deep neural networks achieve impressive performance on tasks demanding extensive linguistic skills, such as machine translation. This has revived interest in probing the extent to which these models induce genuine grammatical knowledge from the raw data they are exposed to, and in whether they can thus shed new light on long-standing questions about the fundamental priors necessary for language acquisition. In this article, we survey representative studies investigating the syntactic abilities of deep networks, and we discuss the broader implications that such work has for theoretical linguistics.



Written by

Marco Baroni

Tal Linzen


Annual Review of Linguistics
