%Nr: DS-1995-19
%Author: Erik Aarts
%Title: Investigations in Logic, Language and Computation
As the title says, this PhD thesis is on the interface between logic,
language and computation. The main perspective of this thesis is that
of complexity theory. The merit of applying complexity theory to such
problems is that a complexity analysis provides several kinds of
information: it shows not only how difficult a problem is, but also why
it is difficult, and possibly how we can change the problem so that it
becomes less difficult.
The thesis is about the complexity of two grammar formalisms and a
programming language. The grammar formalisms are categorial grammar based
on Lambek calculi and a restricted form of the context-sensitive rewrite
grammars: the acyclic context-sensitive grammars. The programming language
is Prolog.
Complexity theory distinguishes two types of complexity: time complexity and
space complexity. Time complexity is defined as the number of steps that
some machine needs to solve a problem. Space complexity is the amount of
memory the machine needs. For an introduction to complexity theory, see
Garey and Johnson (1979). The problems we are going to analyze (i.e.
estimating how much space and time it costs to solve these problems) are the
following.
For the grammar formalisms we have some generator G that generates
sentences. In categorial grammar, the generator consists of a lexicon of
words associated with types, and some (Lambek-based) calculus. In acyclic
context-sensitive grammars, the generator is a context-sensitive grammar.
Besides the generator we also have an input sentence. The problem
we try to solve is the following: "Is the input sentence generated by the
generator?".
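For the categorial case, the recognition question can be sketched with a
small CKY-style recognizer for classical (AB) categorial grammar. The
lexicon and type encoding below are purely illustrative, and this applicative
fragment is much simpler than the Lambek calculi studied in the thesis:

```python
# Toy lexicon (hypothetical). Atomic types are strings;
# ('/', res, arg)  encodes res/arg  (seeks its argument to the right);
# ('\\', arg, res) encodes arg\res  (seeks its argument to the left).
LEX = {
    "john":  {"np"},
    "mary":  {"np"},
    "loves": {("/", ("\\", "np", "s"), "np")},   # type (np\s)/np
}

def combine(a, b):
    """Return the types derivable from adjacent types a, b."""
    out = set()
    if isinstance(a, tuple) and a[0] == "/" and a[2] == b:
        out.add(a[1])            # forward application:  res/arg + arg => res
    if isinstance(b, tuple) and b[0] == "\\" and b[1] == a:
        out.add(b[2])            # backward application: arg + arg\res => res
    return out

def recognize(words, goal="s"):
    n = len(words)
    # chart[i][j] = set of types derivable for the span words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEX[w])
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):
                for a in chart[i][k]:
                    for b in chart[k][j]:
                        chart[i][j] |= combine(a, b)
    return goal in chart[0][n]

print(recognize(["john", "loves", "mary"]))   # True
```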
A Prolog program consists of a number of facts and rules. The rules say how
we can derive new conclusions from the given facts. The problem we will
discuss in this thesis is: "Given a program with rules and facts, is
conclusion C derivable?".
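The derivability question can be illustrated on a variable-free ("ground")
fragment, where naive forward chaining to a fixed point decides it. All
predicate names below are hypothetical, and real Prolog of course adds
variables and unification:

```python
# Sketch: facts are atoms, each rule is (head, [body atoms]).
# This models the ground reading of, e.g.,  path_a_c :- edge_a_b, path_b_c.
FACTS = {"edge_a_b", "edge_b_c"}
RULES = [
    ("path_a_b", ["edge_a_b"]),
    ("path_b_c", ["edge_b_c"]),
    ("path_a_c", ["edge_a_b", "path_b_c"]),
]

def derivable(goal, facts, rules):
    known = set(facts)
    changed = True
    while changed:                       # iterate rules to a fixed point
        changed = False
        for head, body in rules:
            if head not in known and all(b in known for b in body):
                known.add(head)          # every body atom is derivable
                changed = True
    return goal in known

print(derivable("path_a_c", FACTS, RULES))   # True
```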
At first sight there seems to be no relation between the recognition problem
for grammar formalisms and the derivability problem in Prolog. But there are
similarities. The Prolog rules can be seen as a representation of infinitely
many context-free rewrite rules. Every substitution for the variables gives
another rewrite rule. The facts can be seen as infinitely many so-called
epsilon-rules, i.e. they rewrite to the empty string. Suppose we want to
know whether a conclusion C is derivable from a program. We transform the
program to an "infinite context-free grammar" with start symbol C. Then the
derivability problem is equivalent to the question whether the empty string
is grammatical. We have reduced the derivability problem to a recognition
problem.
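For a ground program this reduction can be made concrete: each rule
p :- q, r becomes a rewrite rule p -> q r, each fact becomes an
epsilon-rule, and derivability of C amounts to asking whether C rewrites
to the empty string. A minimal sketch with hypothetical symbols (with
variables the grammar becomes infinite, as described above):

```python
# Nonterminal -> list of right-hand sides; [] is an epsilon-rule.
GRAMMAR = {
    "c": [["p", "q"]],          # from the rule  c :- p, q.
    "q": [["p"]],               # from the rule  q :- p.
    "p": [[]],                  # from the fact  p.
}

def derives_empty(symbol, grammar, seen=frozenset()):
    """Does `symbol` rewrite to the empty string in `grammar`?"""
    if symbol in seen:          # guard against cyclic rules
        return False
    for rhs in grammar.get(symbol, []):
        if all(derives_empty(s, grammar, seen | {symbol}) for s in rhs):
            return True         # every symbol on the rhs derives epsilon
    return False

print(derives_empty("c", GRAMMAR))   # True: c is derivable
```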
Because the grammar is infinite, we cannot simply apply standard
parsing theory to Prolog theorem provers. But we can use elements of
parsing theory to obtain theorem provers. The relationship between Prolog
and grammars has been worked out in (Deransart and Maluszynski 1993). They
show that Prolog programs are equivalent to Wgrammars.
Organization of the thesis:
The thesis consists of two parts that can be read separately:
"Grammar Formalisms", and "Programming in Logic".
Part I consists of the following chapters:
- Chapter 1 Introduction
- Chapter 2 Non-associative Lambek calculus NL
- Chapter 3 The second order fragment of Lambek calculus
- Chapter 4 Acyclic Context-Sensitive Grammars
Part II consists of the following chapters:
- Chapter 5 Complexity of Prolog programs
- Chapter 6 Proof of the time complexity result
- Chapter 7 The link between the first part and the second part. It shows
how the method from chapter 5 can be used to prove that membership for
fragments of L can be decided in polynomial time.
Appendix A contains an implementation in Prolog of the efficient
meta-interpreter described in chapter 6. Appendix B contains a reduction from
3SAT to ACSG.