BEGIN:VCALENDAR
VERSION:2.0
PRODID:ILLC Website
X-WR-TIMEZONE:Europe/Amsterdam
BEGIN:VTIMEZONE
TZID:Europe/Amsterdam
X-LIC-LOCATION:Europe/Amsterdam
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:19700329T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:19701025T030000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
UID:/NewsandEvents/Archives/2022/newsitem/13478/15
 ---16-August-2022-ESSLLI-Workshop-End-to-End-Compo
 sitional-Models-of-Vector-Based-Semantics-
DTSTAMP:20220308T131630Z
SUMMARY:ESSLLI Workshop "End-to-End Compositional 
 Models of Vector-Based Semantics"
DTSTART;VALUE=DATE:20220815
DTEND;VALUE=DATE:20220817
LOCATION:Galway\, Ireland
DESCRIPTION:This workshop focuses on end-to-end implementations of
  vector-based compositional architectures. This means that not only
  the elementary word embeddings are obtained from data\, but also the
  categories/types and their internal composition\, so that neural
  methods can then be applied to learn how the structure of syntactic
  derivations can be systematically mapped to operations on the
  data-driven word representations. For this last step\, the workshop
  invites approaches that do not require the semantic operations to be
  linear maps\, since restricting the meaning algebra to
  finite-dimensional vector spaces and linear maps means that vital
  information encoded in syntactic derivations may be lost in
  translation.\n\nOn the evaluation side\, we welcome work on modern
  NLP tasks for evaluating sentence embeddings\, such as Natural
  Language Inference\, sentence-level classification\, and sentence
  disambiguation tasks. Special interest goes out to work that uses
  compositionality to investigate the syntactic sensitivity of
  large-scale language models.\n\nThe workshop welcomes\, but is not
  limited to\, contributions addressing the following
  topics:\n- End-to-end models of compositional vector-based
  semantics\n- Supervised and unsupervised models for wide-coverage
  supertagging and parsing\n- Approaches to learning word/sentence
  representations\n- Tasks and datasets requiring or benefiting from
  syntax\n- Analysis of model performance on syntactically motivated
  tasks\n- Multi-task learning/joint training of syntactic and
  semantic representations\n- Using compositional methods to assess
  neural network behaviour\n- Explainable models of sentence
  representation
X-ALT-DESC;FMTTYPE=text/html:<div>\n  <p>This workshop focuses on
  <em>end-to-end</em> implementations of vector-based compositional
  architectures. This means that not only the elementary word
  embeddings are obtained from data\, but also the categories/types
  and their internal composition\, so that neural methods can then be
  applied to learn how the structure of syntactic derivations can be
  systematically mapped to operations on the data-driven word
  representations. For this last step\, the workshop invites
  approaches that do not require the semantic operations to be linear
  maps\, since restricting the meaning algebra to finite-dimensional
  vector spaces and linear maps means that vital information encoded
  in syntactic derivations may be lost in
  translation.</p>\n\n  <p>On the evaluation side\, we welcome work
  on modern NLP tasks for evaluating sentence embeddings\, such as
  Natural Language Inference\, sentence-level classification\, and
  sentence disambiguation tasks. Special interest goes out to work
  that uses compositionality to investigate the syntactic sensitivity
  of large-scale language models.</p>\n</div><div>\n  <p>The workshop
  welcomes\, but is not limited to\, contributions addressing the
  following topics:<br>\n  - End-to-end models of compositional
  vector-based semantics<br>\n  - Supervised and unsupervised models
  for wide-coverage supertagging and parsing<br>\n  - Approaches to
  learning word/sentence representations<br>\n  - Tasks and datasets
  requiring or benefiting from syntax<br>\n  - Analysis of model
  performance on syntactically motivated tasks<br>\n  - Multi-task
  learning/joint training of syntactic and semantic
  representations<br>\n  - Using compositional methods to assess
  neural network behaviour<br>\n  - Explainable models of sentence
  representation</p>\n</div>
URL:https://compositioncalculus.sites.uu.nl/worksh
 op
CONTACT:Gijs Wijnholds <g.j.wijnholds@uu.nl>
END:VEVENT
END:VCALENDAR
