BEGIN:VCALENDAR
VERSION:2.0
PRODID:ILLC Website
X-WR-TIMEZONE:Europe/Amsterdam
BEGIN:VTIMEZONE
TZID:Europe/Amsterdam
X-LIC-LOCATION:Europe/Amsterdam
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:19700329T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:19701025T030000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
UID:/NewsandEvents/Archives/2020/newsitem/11616/19
 -February-2020-Computational-Linguistics-Seminar-J
 onas-Groschwitz
DTSTAMP:20200213T150039Z
SUMMARY:Computational Linguistics Seminar\, Jonas Groschwitz
ATTENDEE;ROLE=Speaker:Jonas Groschwitz (Saarland U
 niversity)
DTSTART;VALUE=DATE:20200219
LOCATION:ILLC Seminar Room F1.15\, Science Park 107\, Amsterdam
DESCRIPTION:In this talk\, I will discuss our parser for semantic 
 graphs such as Abstract Meaning Representation (AMR). Our approach 
 combines neural models with mechanisms from compositional semantic 
 construction. Key to this approach is the Apply-Modify (AM) algebra\, 
 which we developed to both reflect linguistic principles and yield a 
 simple parsing model. In particular\, the AM algebra allows us to 
 find consistent latent compositional structures for our training 
 data\, which is crucial when training a compositional parser. The 
 parser then employs neural supertagging and dependency models to 
 predict interpretable\, meaningful operations that construct the 
 semantic graph. The result is a semantic parser with strong 
 performance across diverse graphbanks that also provides insights 
 into the compositional patterns of the graphs.
X-ALT-DESC;FMTTYPE=text/html:\n  <p>In this talk\, I will discuss our 
 parser for semantic graphs such as Abstract Meaning Representation 
 (AMR). Our approach combines neural models with mechanisms from 
 compositional semantic construction. Key to this approach is the 
 Apply-Modify (AM) algebra\, which we developed to both reflect 
 linguistic principles and yield a simple parsing model. In 
 particular\, the AM algebra allows us to find consistent latent 
 compositional structures for our training data\, which is crucial 
 when training a compositional parser. The parser then employs neural 
 supertagging and dependency models to predict interpretable\, 
 meaningful operations that construct the semantic graph. The result 
 is a semantic parser with strong performance across diverse 
 graphbanks that also provides insights into the compositional 
 patterns of the graphs.</p>\n
URL:http://projects.illc.uva.nl/LaCo/CLS/
END:VEVENT
END:VCALENDAR
