News and Events: Conferences

Please note that this news item has been archived and may contain outdated information or links.

17 - 20 December 2020, 2nd Tsinghua Interdisciplinary Workshop on Logic, Language, and Meaning (TLLM 2020), Online

Date: 17 - 20 December 2020
Location: Online
Deadline: Saturday 30 November 2019

Monotonicity, in various forms, is a pervasive phenomenon in logic, linguistics, and related areas. In theoretical linguistics, monotonicity properties are relevant to a large array of semantic phenomena and to the presence of pragmatic inferences such as scalar implicatures. In logic and mathematics, monotonicity guarantees the existence of fixed points and the well-formedness of inductive definitions. Monotonicity is also closely tied to reasoning, in formal as well as natural languages. Recent logical and linguistic work on monotonicity has also found its way into computational systems for natural language processing and cognitive models of human reasoning. The goal of our workshop is to bring together researchers working on monotonicity or related properties, from different fields and perspectives.

The first day of the workshop was to be devoted to two tutorials:
1. Jakub Szymanik (University of Amsterdam): Monotonicity in Logic
2. Gennaro Chierchia (Harvard University): Monotonicity in Language
The remaining days were to consist of invited and contributed talks.

Due to the ongoing pandemic, the organizers have decided to hold TLLM 2020 online in December.

The Programme Committee cordially invites all researchers to submit their papers for presentation. Abstracts are not to exceed two pages of A4 or letter-sized paper, including data and references, preferably with 1-inch (2.54 cm) margins on all sides, set in a font no smaller than 11 points. The abstract should have a clear title and should not identify the author(s). The abstract must be submitted electronically in PDF format, via EasyChair.

For more information, see
