News and Events: Conferences

Please note that this news item has been archived, and may contain outdated information or links.

15 - 16 August 2022, ESSLLI Workshop "End-to-End Compositional Models of Vector-Based Semantics"

Date: 15 - 16 August 2022
Location: Galway, Ireland
Deadline: Monday 16 May 2022

This workshop focuses on end-to-end implementations of vector-based compositional architectures. In such architectures, not only the elementary word embeddings are learned from data, but also the categories/types and their internal composition, so that neural methods can be applied to learn how the structure of syntactic derivations is systematically mapped to operations on the data-driven word representations. For this last step, the workshop invites approaches that do not require the semantic operations to be linear maps: restricting the meaning algebra to finite-dimensional vector spaces and linear maps risks losing vital information encoded in the syntactic derivations.
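To make the contrast concrete, here is a minimal illustrative sketch (not taken from any workshop submission) of folding a binary syntactic derivation into a sentence vector, using either a purely linear composition map or a small non-linear one. All names, dimensions, and the toy derivation are hypothetical.

```python
# Minimal sketch: composing word vectors along a binary derivation tree,
# with a linear vs. a non-linear learned composition function.
import torch
import torch.nn as nn

DIM = 64  # hypothetical embedding dimension


class LinearComposer(nn.Module):
    """Composes two constituent vectors with a single linear map."""
    def __init__(self, dim=DIM):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim, bias=False)

    def forward(self, left, right):
        return self.proj(torch.cat([left, right], dim=-1))


class NonLinearComposer(nn.Module):
    """Composes two constituent vectors with a small MLP (non-linear map)."""
    def __init__(self, dim=DIM):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, 2 * dim),
            nn.Tanh(),
            nn.Linear(2 * dim, dim),
        )

    def forward(self, left, right):
        return self.mlp(torch.cat([left, right], dim=-1))


def compose(tree, embeddings, composer):
    """Recursively fold a derivation tree into a single sentence vector.

    A tree is either a word (str) or a pair (left_subtree, right_subtree).
    """
    if isinstance(tree, str):
        return embeddings[tree]
    left, right = tree
    return composer(compose(left, embeddings, composer),
                    compose(right, embeddings, composer))


# Toy usage: in an end-to-end system the embeddings and the derivation
# would themselves be learned or predicted from data.
vocab = {w: torch.randn(DIM) for w in ["dogs", "chase", "cats"]}
derivation = ("dogs", ("chase", "cats"))  # hypothetical binary derivation
sentence_vec = compose(derivation, vocab, NonLinearComposer())
print(sentence_vec.shape)  # torch.Size([64])
```

In an end-to-end setting, the composer, the word embeddings, and the supertagger/parser producing the derivation would all be trained jointly on downstream objectives such as those listed below.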

On the evaluation side, we welcome work on modern NLP tasks for evaluating sentence embeddings, such as Natural Language Inference, sentence-level classification, and sentence disambiguation. We are especially interested in work that uses compositionality to investigate the syntactic sensitivity of large-scale language models.

The workshop welcomes but is not limited to contributions addressing the following topics:
- End-to-end models of compositional vector-based semantics
- Supervised and unsupervised models for wide-coverage supertagging and parsing
- Approaches to learning word/sentence representations
- Tasks and datasets requiring or benefiting from syntax
- Analysis of model performance on syntactically motivated tasks
- Multi-task learning/joint training of syntactic and semantic representations
- Using compositional methods to assess neural network behaviour
- Explainable models of sentence representation

For more information, see https://compositioncalculus.sites.uu.nl/workshop or contact Gijs Wijnholds at .
