Are Transformers a Modern Version of ELIZA? Observations on French Object Verb Agreement

Title: Are Transformers a Modern Version of ELIZA? Observations on French Object Verb Agreement
Publication Type: Conference proceedings paper
Year of Conference: 2021
Authors: Li, Bingzhi, Guillaume Wisniewski, and Benoît Crabbé
Conference Name: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Pagination: 4599–4610
Publication Date: November
Publisher: Association for Computational Linguistics
Conference Location: Online and Punta Cana, Dominican Republic
Abstract

Many recent works have demonstrated that the unsupervised sentence representations of neural networks encode syntactic information, by observing that neural language models are able to predict the agreement between a verb and its subject. We take a critical look at this line of research by showing that it is possible to achieve high accuracy on this agreement task with simple surface heuristics, indicating a possible flaw in our assessment of neural networks' syntactic ability. Our fine-grained analyses of results on long-range French object-verb agreement show that, unlike LSTMs, Transformers are able to capture a non-trivial amount of grammatical structure.
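To illustrate the kind of surface heuristic the abstract alludes to, the sketch below implements a hypothetical "agree with the nearest preceding noun" baseline for number agreement. The function name, the tagging scheme, and the toy sentence are illustrative assumptions, not taken from the paper; the example also shows how such a heuristic can fail on long-range object-verb agreement, where the French past participle agrees with a fronted object rather than the closest noun.

```python
# Hedged sketch: a hypothetical surface heuristic for number agreement.
# Tokens are (word, pos, number) triples; number is "sg", "pl", or None.
# All names and the toy data are assumptions for illustration.

def nearest_noun_heuristic(tokens, verb_index):
    """Predict a verb's number as that of the closest preceding noun."""
    for word, pos, number in reversed(tokens[:verb_index]):
        if pos == "NOUN":
            return number
    return "sg"  # arbitrary default when no noun precedes the verb

# Toy example: "les pommes que Jean a mangées" ("the apples that Jean
# ate"); the participle "mangées" agrees with the fronted object
# "pommes" (plural), not with the nearest noun "Jean" (singular).
sentence = [
    ("les", "DET", None),
    ("pommes", "NOUN", "pl"),
    ("que", "PRON", None),
    ("Jean", "NOUN", "sg"),
    ("a", "AUX", "sg"),
    ("mangées", "VERB", "pl"),
]

# The heuristic picks "Jean" and predicts "sg", while the gold
# agreement is "pl" -- exactly the long-range case the paper probes.
print(nearest_noun_heuristic(sentence, 5))  # prints "sg"
```

On sentences where the closest noun happens to share the object's number, such a heuristic scores well despite encoding no syntax, which is why the authors argue that raw agreement accuracy overstates a model's grammatical ability.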