|Title||Unifying Parsing and Tree-Structured Models for Generating Sentence Semantic Representations|
|Publication Type||Article in conference proceedings|
|Conference Year||2022|
|Authors||Simoulin, Antoine, and Benoît Crabbé|
|Conference Name||Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop|
|Publisher||Association for Computational Linguistics|
|Conference Location||Hybrid: Seattle, Washington + Online|
We introduce a novel tree-based model that learns its composition function together with its structure. The architecture produces sentence embeddings by composing words according to an induced syntactic tree. The parsing and composition functions are explicitly connected and, therefore, learned jointly. As a result, the sentence embedding is computed according to an interpretable linguistic pattern and may be used on any downstream task. We evaluate our encoder on downstream tasks and observe that it outperforms tree-based models relying on external parsers. In some configurations, it is even competitive with the BERT base model. Our model is capable of supporting multiple parser architectures. We exploit this property to conduct an ablation study comparing different parser initializations. We explore to what extent the trees produced by our model compare with linguistic structures and how this initialization impacts downstream performance. We empirically observe that downstream supervision makes it difficult to produce stable parses and to preserve linguistically relevant structures.
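The abstract describes computing a sentence embedding by composing word vectors bottom-up along an induced binary tree. A minimal sketch of that kind of recursive composition, purely illustrative: the vocabulary, the tree, and the tanh-over-linear-map composition function are assumptions for the example, not the paper's actual learned components (which are trained jointly with the parser).

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Toy word embeddings (hypothetical; in the paper these are learned).
vocab = {w: rng.normal(size=DIM) for w in ["the", "cat", "sat", "down"]}

# One composition step: merge two child vectors into a parent vector.
# A tanh over a linear map, as in classic recursive networks; the
# paper's composition function is learned jointly with the parser.
W = rng.normal(scale=0.1, size=(DIM, 2 * DIM))

def compose(left, right):
    return np.tanh(W @ np.concatenate([left, right]))

def embed(tree):
    """Bottom-up composition over a binary tree given as nested tuples."""
    if isinstance(tree, str):
        return vocab[tree]
    left, right = tree
    return compose(embed(left), embed(right))

# An assumed induced binary tree for "the cat sat down".
sentence_embedding = embed((("the", "cat"), ("sat", "down")))
print(sentence_embedding.shape)
```

The key point the abstract makes is that the tree fed to `embed` is not supplied by an external parser but induced by the model itself, so the parsing decisions and the composition weights receive gradients from the same downstream objective.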