Title | Are Transformers a Modern Version of ELIZA? Observations on French Object Verb Agreement |
Publication Type | Conference Paper
Year of Conference | 2021
Authors | Li, Bingzhi, Guillaume Wisniewski, and Benoît Crabbé |
Conference Name | Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Pagination | 4599–4610 |
Date Published | November
Publisher | Association for Computational Linguistics |
Conference Location | Online and Punta Cana, Dominican Republic |
Abstract | Many recent works have demonstrated that unsupervised sentence representations of neural networks encode syntactic information by observing that neural language models are able to predict the agreement between a verb and its subject. We take a critical look at this line of research by showing that it is possible to achieve high accuracy on this agreement task with simple surface heuristics, indicating a possible flaw in our assessment of neural networks' syntactic ability. Our fine-grained analyses of results on long-range French object-verb agreement show that, contrary to LSTMs, Transformers are able to capture a non-trivial amount of grammatical structure.