BEGIN:VCALENDAR
VERSION:2.0
PRODID:ILLC Website
X-WR-TIMEZONE:Europe/Amsterdam
BEGIN:VTIMEZONE
TZID:Europe/Amsterdam
X-LIC-LOCATION:Europe/Amsterdam
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:19700329T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:19701025T030000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
UID:/NewsandEvents/Archives/2018/newsitem/9912/15-
 May-2018-Computational-Linguistics-Seminar-Arianna
 -Bisazza
DTSTAMP:20180423T133600Z
SUMMARY:Computational Linguistics Seminar\, Arianna
  Bisazza
ATTENDEE;ROLE=X-SPEAKER;CN="Arianna Bisazza (Leiden University)"
 :invalid:nomail
DTSTART;TZID=Europe/Amsterdam:20180515T110000
LOCATION:Room F1.15\, Science Park 107\, Amsterdam
DESCRIPTION:What makes recurrent neural networks w
 ork so well for next word prediction? Do neural tr
 anslation models learn to extract linguistic featu
 res from raw data and exploit them in any explicab
 le way? In this talk I will give an overview of re
 cent work\, including my own\, that aims at answerin
 g these questions. I will also present recent expe
 riments on the importance of recurrency for captur
 ing hierarchical structure with sequential models.
X-ALT-DESC;FMTTYPE=text/html:\n  <p>What makes rec
 urrent neural networks work so well for next word 
 prediction? Do neural translation models learn to 
 extract linguistic features from raw data and expl
 oit them in any explicable way? In this talk I wil
 l give an overview of recent work\, including my own
 \, that aims at answering these questions. I will
  also present recent experiments on the importance
  of recurrency for capturing hierarchical structure
  with sequential models.</p>\n
URL:http://projects.illc.uva.nl/LaCo/CLS/
END:VEVENT
END:VCALENDAR
