Unifying Parsing and Tree-Structured Models for Generating Sentence Semantic Representations

Title: Unifying Parsing and Tree-Structured Models for Generating Sentence Semantic Representations
Publication Type: Conference Paper
Year of Conference: 2022
Authors: Simoulin, Antoine, and Benoît Crabbé
Conference Name: Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop
Pagination: 267–276
Publisher: Association for Computational Linguistics
Conference Location: Hybrid: Seattle, Washington + Online
Abstract

We introduce a novel tree-based model that learns its composition function together with its structure. The architecture produces sentence embeddings by composing words according to an induced syntactic tree. The parsing and the composition functions are explicitly connected and therefore learned jointly. As a result, the sentence embedding is computed according to an interpretable linguistic pattern and may be used on any downstream task. We evaluate our encoder on downstream tasks, and we observe that it outperforms tree-based models relying on external parsers. In some configurations, it is even competitive with the BERT base model. Our model is capable of supporting multiple parser architectures. We exploit this property to conduct an ablation study comparing different parser initializations. We explore to what extent the trees produced by our model compare with linguistic structures and how this initialization impacts downstream performance. We empirically observe that downstream supervision makes it difficult to produce stable parses and to preserve linguistically relevant structures.
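The core idea of the abstract, composing word embeddings bottom-up along an induced binary tree to obtain a sentence embedding, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the composition function here is a single tanh layer with random weights (`W` is a hypothetical parameter), whereas in the paper both the parser and the composition function are learned jointly.

```python
# Minimal sketch of tree-structured sentence composition.
# Assumption: a binary tree over token indices is given (the paper
# induces it jointly with the composition function).
import numpy as np

rng = np.random.default_rng(0)
DIM = 4
W = rng.standard_normal((DIM, 2 * DIM))  # hypothetical composition weights

def compose(left, right):
    # Merge two child embeddings into one parent embedding.
    return np.tanh(W @ np.concatenate([left, right]))

def embed(tree, word_vecs):
    # A tree is either a word index (leaf) or a (left, right) pair.
    if isinstance(tree, int):
        return word_vecs[tree]
    left, right = tree
    return compose(embed(left, word_vecs), embed(right, word_vecs))

# Example: a 3-word sentence with the induced tree ((0, 1), 2).
words = rng.standard_normal((3, DIM))
sentence_embedding = embed(((0, 1), 2), words)
print(sentence_embedding.shape)  # (4,)
```

The root vector serves as the sentence embedding and can be fed to any downstream task, which is what makes the induced trees directly interpretable as the composition order.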