From syntactic structures to logical semantics
Alexandre Dikovsky, LINA, FRE CNRS 2729, Université de Nantes; Christian Retoré, LaBRI-CNRS, INRIA-Futurs & Université Bordeaux 1
ESSLLI 2006, Málaga
July 31st to August 4th, 2006
These lecture notes (slides) present the syntax-semantics interface according to various syntactic and semantic models. They contain five parts:
1. An introductory case for Syntax and Logical Semantics
2. Classical notions and results for TAGs and for dependency grammars
3. Categorial and generating dependency grammars
4. Underspecified semantics and the compositional grammar interface
5. Categorial minimalist grammars
This does not mean that no other subject will be addressed: in particular, if time allows, we will present more generative semantics, for instance a recent computational approach to binding theory for pronoun interpretation (Bonato 2006).
An introductory case for Syntax and Logical Semantics: categorial grammars
Christian Retoré, Université Bordeaux 1
Équipe Signes (linguistic signs, grammar and meaning: logical algorithmics of language)
INRIA-Futurs, LaBRI-CNRS and Département des Sciences du Langage, Université Bordeaux 3
Contents: Linguistic domains; Mathematical models in linguistics; A syntax classic: formal language theory; Grammar and Logic: a natural and a traditional link; Syntax and semantics in type theory; Conclusion
Università di Verona, Maggio 2006
Linguistic domains
Language is a wide subject, hence one of the main outcomes of linguistics has been to divide its study into domains:
phonetics: the study of the sounds of language. Acoustics; the phonatory and auditory systems.
phonology: abstract sounds, a discrete system.
Bali / Paris: indistinguishable for Japanese speakers.
prosody: pauses, intonation, stress.
"Je serai très heureux de venir parler au LaBRI, laboratoire auquel je dois ma formation initiale en informatique, par exemple sur la lambda-DRT."
"Je serai très heureux de venir parler au LaBRI — laboratoire auquel je dois ma formation initiale en informatique — par exemple sur la lambda-DRT."
morphology: word structure.
derivational morphology [word creation]: prefixes, suffixes, compounds; the category may change.
noble → noblesse, petit → petitesse
maison → maisonnette, camion → camionnette, carpe → carpette?
inflectional morphology [conjugation, agreement, case marking]: no category change.
arriver → arriv[er][ons], cheval → chevaux
syntax: sentence structure.
*Je fais la réparer / Je la fais réparer
*[[Peter [eats an]] apple] / Pierre [eats [an apple]]
semantics: word meaning, sentence meaning.
lexical semantics: word meanings and the relations between them.
livre: print (physical object), read (information)
logical semantics: two independent aspects.
truth-conditional semantics: under which conditions is a statement true? (meaning = logical formula)
compositional semantics: computing the meaning of a compound expression out of the meanings of its parts (λ-calculus handles meaning composition and substitution).
pragmatics: using language to communicate in a given situation. 1st and 2nd persons, now, here, there, ...
Let's rather go to this restaurant.
Mathematical models in linguistics
probabilities, statistics
example: part-of-speech tagging using the previous words: if the previous words are an article, a noun and an adjective, the next word is probably not an article.
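The tagging idea above can be sketched with a tiny bigram model. The tag inventory and the counts below are invented for illustration only; real taggers estimate these probabilities from large annotated corpora.

```python
# Minimal sketch of probabilistic POS tagging: estimate how likely the next
# tag is, given the previous tag. The corpus fragments are hypothetical.
from collections import Counter

corpus_tags = [
    ["ART", "NOUN", "VERB", "ART", "ADJ", "NOUN"],
    ["ART", "ADJ", "NOUN", "VERB", "ART", "NOUN"],
    ["PRO", "VERB", "ART", "NOUN"],
]

bigrams = Counter()    # counts of (previous tag, next tag)
unigrams = Counter()   # counts of previous tags
for tags in corpus_tags:
    for prev, nxt in zip(tags, tags[1:]):
        bigrams[(prev, nxt)] += 1
        unigrams[prev] += 1

def p_next(prev, nxt):
    """P(next tag | previous tag), maximum-likelihood estimate."""
    return bigrams[(prev, nxt)] / unigrams[prev] if unigrams[prev] else 0.0

print(p_next("ADJ", "ART"))   # 0.0 : an article never follows an adjective here
print(p_next("ADJ", "NOUN"))  # 1.0
```

Real systems smooth these estimates and condition on longer tag histories, but the principle is exactly the slide's: the context makes some categories improbable.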
formal grammars
• morphology: automata, transducers
numbers, dates; chanterons → chanter, 1st person plural, future
• generative syntax, formal language theory
[La [petite brise]] [la glace]
[La petite] [brise [la glace]]
Il [regarde [une passante]] [avec des lunettes noires].
Il [regarde [une [passante [avec des lunettes noires]]]]
Elle [[a trouvé] [son [parapluie bizarre]]]
Elle [[[a trouvé] [son parapluie]] [bizarre]]
Syntax extended towards semantics:
* He_i liked two books that Chomsky_i wrote.
How many books that Chomsky_i wrote did he_i like?
logic
• logic for semantics
all doctors are drivers
(therefore) all French doctors are French drivers
*(therefore) all good doctors are good drivers
I had three coins. I lost one of them. I look for it.
I had three coins. I lost two of them. *I put it into my pocket.
• a particularity of linear logic and resource logics: logic also describes syntax; parse structure = proof, deduction (= graph)
A syntax classic: formal language theory
In linguistics, computer science, mathematics, biology: compilation, concurrency, group theory, ...
formal grammar: a linguistic notion due to Noam Chomsky (with mathematical contributions by Marcel-Paul Schützenberger)
first formal grammar (context-free): Pāṇini, 5th century BC
A human language is not the set of the utterances of its speakers: new sentences identified as correct by its speakers can always be produced, for instance E_{i+1} = He believes that E_i.
→ Hypothesis: a human language is a set of (unconscious) rules: children overgeneralize once they have acquired a rule: "holded".
hence, formal grammars and Chomsky's hierarchy
• Distinction competence / performance: the grammar vs. our actual use
The wolf ate the goat.
The goat that the wolf ate ate the cabbage.
? The cabbage that the goat that the wolf ate ate belonged to the ferryman.
?? The ferryman to whom the cabbage that the goat that the wolf ate ate belonged owns boats.
??? The boats that the ferryman to whom the cabbage that the goat that the wolf ate ate belonged owns are green.
(nevertheless correct, given a pencil and time)
what do the rules (expressing speakers' competence) look like, i.e. where are human languages in the hierarchy?
two principles:
sentences can be analysed (understood) in a sensible (= polynomial?) time
the grammar should be learnable from positive examples only (a criterion often left out)
sentences can be analysed in polynomial time
• regular languages are not enough: (previous example) Subject1 Subject2 Subject3 ... Verb3 Verb2 Verb1
• context-free languages are not enough: (completives in Dutch) Subject1 Subject2 Subject3 ... Verb1 Verb2 Verb3
• more than context-free, but still parsable in polynomial time: TAGs, or context-free grammars with movement or with unification
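The contrast between the two word-order patterns can be made concrete. A nested (German-like) Subject/Verb pattern matches like balanced brackets, so a stack suffices; the cross-serial (Dutch-like) pattern needs a queue, which is exactly what pushes it beyond context-free power. The tokens below are a toy encoding, not a linguistic analysis.

```python
# Toy check of the two dependency patterns on indexed tokens ("S", i) / ("V", i).
from collections import deque

def nested_ok(tokens):
    """Nested pattern S1 S2 S3 V3 V2 V1: checkable with a stack (context-free)."""
    stack = []
    for kind, i in tokens:
        if kind == "S":
            stack.append(i)
        elif not stack or stack.pop() != i:
            return False
    return not stack

def cross_serial_ok(tokens):
    """Cross-serial pattern S1 S2 S3 V1 V2 V3: checkable with a queue
    (needs mildly context-sensitive power in the grammar)."""
    queue = deque()
    for kind, i in tokens:
        if kind == "S":
            queue.append(i)
        elif not queue or queue.popleft() != i:
            return False
    return not queue

nested = [("S", 1), ("S", 2), ("S", 3), ("V", 3), ("V", 2), ("V", 1)]
cross  = [("S", 1), ("S", 2), ("S", 3), ("V", 1), ("V", 2), ("V", 3)]
print(nested_ok(nested), cross_serial_ok(cross))  # True True
```

Each checker rejects the other pattern, which is the point: the two dependency shapes require different recognition machinery.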
Some properties of universal grammar (assumed because of learnability):
• A noun phrase receives a case, and only finite verbs assign case.
It seems that Mary is arriving.
Mary seems to arrive.
*It seems (that) Mary (to) arrive.
• A pronoun must be bound (syntactic position) by its antecedent (a configuration in the parse tree).
Carlotta's dog thinks that he hates him. (he ≠ him; other equalities are possible)
Carlotta's dog thinks that he hates himself. (he = himself; all other equalities are possible)
Grammar and Logic: a natural and a traditional link
Ancient times (Aristotle, Dionysius Thrax), the Middle Ages (scholastics), the seventeenth century (Port-Royal)... The sentence has a logical structure.
My children will have a pizza.
∀x (child(x) ⇒ ∃y (pizza(y) ∧ eat_future(x,y)))
∃y (pizza(y) ∧ ∀x (child(x) ⇒ eat_future(x,y)))
A pizza will be served to my children.
∃y (pizza(y) ∧ ∀x (child(x) ⇒ eat_future(x,y)))
∀x (child(x) ⇒ ∃y (pizza(y) ∧ eat_future(x,y)))
Subtle differences: every, each, all, ... vs. a, some, ...
Generalised quantifiers
Many quantifiers in natural language: most of, many, a few.
Most politicians read a book.
Numbers also have scope properties:
Put eight drops into three spoons of water. (3 × 8 = 24 drops? 8 drops?)
Interpretation, possible worlds, intensionality
Truth-conditional semantics: the meaning of a statement is the set of the possible worlds in which it is true.
This student thinks that Chomsky is a computer scientist.
In any world in which the beliefs of this student are true, Chomsky is a computer scientist.
De re and de dicto readings
James Bond believes that there is a spy in the lab.
James Bond thinks that Professor Busquets is a spy.
James Bond found a microphone in the drawer.
Compositionality
Frege: the meaning of a compound expression is a function of the meanings of its parts.
Paul, whom I know, has not yet arrived.
Limits:
If a farmer owns a donkey, then he_i beats it_j.
If (∃f ∃d Donkey(d) ∧ Farmer(f) ∧ Own(f,d)) then B(f,d): the second f and d are FREE??
Limits of the purely logical approach:
I had three coins. I lost one of them. I look for it.
I had three coins. I lost two of them. *I put it into my pocket.
Semantic aspects of syntactic categories
Categories and parts of speech have a semantic or logical counterpart: a parallel between the main categories, Verbs and Nouns, and Predicates and Individuals in logic.
• Noun phrases: individuals (real individuals or quantified individual variables)
• Verbs, verb phrases: predicates
• Adjectives: like nouns (agreement) or verbs (they express a property)
• Prepositional phrases: neither nouns nor verbs
Syntax and semantics in type theory
Among the linguistic domains (phonetics; phonology and prosody; derivational morphology; inflectional morphology; syntax; semantics, both logical and lexical; pragmatics), we focus on syntax and logical semantics, using as tools: formal grammars, deductive systems, the lambda calculus, higher order logic.
Objectives
• Parsing: producing semantic representations. Natural language database queries; translation.
• Generating sentences from semantic representations. Text synthesis or generation; natural language answers to database queries; translation.
Key points: syntax and compositional semantics, the strengths of categorial grammars.
Composition
A semantically oriented syntactic formalism: categorial grammars.
A natural implementation of compositional semantics: higher order logic in the simply typed lambda calculus.
Morphology: agreement categories? or feature unification? or preprocessing with already modified categories? or a module with transducers?
On the syntactic side, we could compile the categorial grammar into a more efficient one (e.g. an RCG).
General picture
On the syntactic side:
• Words are mapped to formulae (syntactic categories) describing their syntactic behaviour.
• A logical calculus allows compound expressions to be mapped to syntactic categories. (The rules are common to all languages.)
• If the category of a sequence of words is S, then it is a sentence.
On the semantic side:
• Each syntactic category a is mapped to a semantic type a*.
• Each word of syntactic category u is associated with a λ-term of type u*.
• Every syntactic composition operation corresponds to an operation on the semantic λ-terms.
• Each expression of syntactic category u is mapped to a λ-term of type u*.
• Sentences are mapped to logical formulae.
Lambek grammars (1958)
A Lambek grammar = a lexicon Lex; Lex(word) = a finite set of categories (depicting the syntactic behaviour of the word).
Syntactic categories: L ::= P | L / L | L \ L, where P ∈ {S, np, n, ...}
m1 · · · mn ∈ Language(Lex) iff ∀i ∃ti ∈ Lex(mi) such that t1, ..., tn ⊢ S.
Lexicalised grammar = the rules are universal (only the lexicon varies to describe different languages).
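The application-only fragment of this calculus (forward application B/A, A ⇒ B and backward application A, A\B ⇒ B, i.e. an AB-grammar approximation without the introduction rules) can be recognised by a small CYK-style chart. This is a sketch with a three-word toy lexicon, not a full Lambek prover.

```python
# CYK-style recogniser for the application-only (AB) fragment.
# Categories: strings for primitives, ('/', B, A) for B/A, ('\\', A, B) for A\B.

def fapp(x, y):
    """Combine adjacent categories x y, if an application rule fires."""
    if isinstance(x, tuple) and x[0] == '/' and x[2] == y:
        return x[1]                      # B/A , A => B
    if isinstance(y, tuple) and y[0] == '\\' and y[1] == x:
        return y[2]                      # A , A\B => B
    return None

def recognise(cats, goal='S'):
    """cats: one category per word (lexical ambiguity omitted for brevity)."""
    n = len(cats)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, c in enumerate(cats):
        chart[i][i + 1].add(c)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            k = i + span
            for j in range(i + 1, k):
                for x in chart[i][j]:
                    for y in chart[j][k]:
                        r = fapp(x, y)
                        if r is not None:
                            chart[i][k].add(r)
    return goal in chart[0][n]

np_, n_ = 'np', 'n'
lex = {'the': ('/', np_, n_), 'car': n_, 'sleeps': ('\\', np_, 'S')}
print(recognise([lex['the'], lex['car'], lex['sleeps']]))  # True
```

"the car sleeps" reduces as np/n, n ⇒ np and then np, np\S ⇒ S; any other word order fails.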
Regarding semantics, we use two base types to describe logical formulae:
• e: individuals
• t: truth values
sleep: e → t, a function mapping individuals to truth values.
like: e → (e → t), a (curried) function from pairs of individuals to truth values.
Notation: λx.u is the function mapping the variable x to the term u. To compute (λx.u) t we replace x with t in u.
Example: ((λx.λy.((like y) x)) Maria) Peter = (λy.((like y) Maria)) Peter = ((like Peter) Maria), with like: e → (e → t) and Peter, Maria: e.
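The substitution-and-reduction step just described can be implemented in a few lines. This is a minimal sketch: terms are tagged tuples, and to avoid capture-avoiding renaming we simply assume all bound variable names are distinct, which holds for the worked example.

```python
# A minimal β-reducer. Terms: ('var', x) | ('lam', x, body) | ('app', f, a).
# Assumption: all bound variable names are distinct (no capture possible).

def subst(t, x, s):
    """Replace free occurrences of variable x in t by s."""
    tag = t[0]
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, s))
    return ('app', subst(t[1], x, s), subst(t[2], x, s))

def reduce_(t):
    """Leftmost-outermost β-reduction to normal form."""
    if t[0] == 'app':
        f = reduce_(t[1])
        if f[0] == 'lam':
            return reduce_(subst(f[2], f[1], t[2]))
        return ('app', f, reduce_(t[2]))
    if t[0] == 'lam':
        return ('lam', t[1], reduce_(t[2]))
    return t

# ((λx.λy.((like y) x)) Maria) Peter
like = ('var', 'like')
term = ('app',
        ('app',
         ('lam', 'x', ('lam', 'y',
          ('app', ('app', like, ('var', 'y')), ('var', 'x')))),
         ('var', 'Maria')),
        ('var', 'Peter'))
print(reduce_(term))
# ('app', ('app', ('var', 'like'), ('var', 'Peter')), ('var', 'Maria'))
```

The normal form is ((like Peter) Maria), exactly the result computed by hand above.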
Some examples of syntactic categories
• noun phrases, proper names: np
• common nouns (apple, car): n
• a, the: np / n
• sleeps: np \ S
• eats: (np \ S) / np, np \ S
• red: n / n
• who, that (subject relative pronouns): (n \ n) / (np \ S)
Some examples of expected syntactic categories
• car: n
• the car that just passed: np
• eats an apple: np \ S
Some examples of semantic types
• noun phrases, proper names: e
• common nouns (apple, car): e → t
• a, the: (e → t) → e (or the quantifier-raised form)
• sleeps: e → t
• eats: e → (e → t), e → t
• red: (e → t) → (e → t)
• who, that (subject relative pronouns): (e → t) → ((e → t) → (e → t))
Rules of the syntactic system
\i: from a proof of B whose leftmost free hypothesis is A, infer A \ B (the rule binds A).
\e: from Δ ⊢ A and Γ ⊢ A \ B, infer Δ, Γ ⊢ B.
/i: from a proof of B whose rightmost free hypothesis is A, infer B / A (the rule binds A).
/e: from Γ ⊢ B / A and Δ ⊢ A, infer Γ, Δ ⊢ B.
Meaning of the connectives / and \, and of the rules
u : B / A means that if u is followed by v : A then u v : B.
u : A \ B means that if u is preceded by v : A then v u : B.
Example: the : np / n and car : n yield the car : np (modus ponens, or elimination).
The introduction rules allow virtual constituents to be introduced and erased: if u : A / B and v : B / C we obtain u v : A / C. Indeed, taking x : C, we have u v x : A with two elimination rules, and then u v : A / C with an introduction rule.
Example: very : (n / n) / (n / n) yields very very : (n / n) / (n / n).
Relation CFG ↔ Lambek grammars
Words: terminals. Categories and subcategories: non-terminals.
• CFG → Lambek grammars
  • put the CFG in Greibach normal form;
  • if X → a T U V, add to the lexicon a : ((X / V) / U) / T;
  • the resulting Lambek grammar is weakly equivalent to the CFG (rather easy).
• Lambek grammar → CFG
  • let k be the maximum size of a category in the lexicon;
  • for every provable sequent A, B ⊢ C with A, B, C smaller than k, add a rule C → A B;
  • for each a : A in the lexicon, add a rule A → a;
  • the obtained CFG (in Chomsky normal form) is weakly equivalent to the original Lambek grammar (conjectured by Chomsky in 1963, proved by Pentus in 1993).
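The easy direction above is mechanical enough to write down. The sketch below turns one Greibach-normal-form rule into its lexical category; the rule X → a T U V and the tuple encoding of slashes are illustrative assumptions.

```python
# CFG (Greibach normal form) -> Lambek lexicon, one rule at a time.
# A rule X -> a N1 ... Nk yields a : ((X / Nk) / ...) / N1.
# Slashed categories encoded as ('/', B, A) for B / A.

def rule_to_category(lhs, rhs):
    """rhs = [terminal, N1, ..., Nk]; return (terminal, category)."""
    cat = lhs
    for nt in reversed(rhs[1:]):      # peel argument nonterminals from the right
        cat = ('/', cat, nt)
    return rhs[0], cat

word, cat = rule_to_category('X', ['a', 'T', 'U', 'V'])
print(word, cat)  # a ('/', ('/', ('/', 'X', 'V'), 'U'), 'T')
```

Applying the category to T, U, V in turn by forward elimination gives (X/V)/U, then X/V, then X, mirroring the derivation X → a T U V.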
Syntactic categories and semantic types (Montague 1970)
Logical formulae in the typed λ-calculus with two base types: individuals e, truth values t.
n-ary predicate: e → (e → (· · · → t)); n-argument function: e → (e → (· · · → e)).
Logical constants: ∧, ∨, ⇒ : t → (t → t); ∃, ∀ : (e → t) → t.
Morphism from syntactic types to semantic types:
S* = t (sentences: truth values)
np* = e (individuals)
n* = e → t (unary predicates)
(A \ B)* = (B / A)* = A* → B* (inductive extension of the morphism to all syntactic categories)
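The morphism is a one-clause recursion, sketched below with the same tuple encoding of slashed categories used informally throughout ( ('/', B, A) for B/A, ('\\', A, B) for A\B; arrow types as ('->', argument, result) ).

```python
# The (.)* homomorphism: S* = t, np* = e, n* = e -> t,
# and (A \ B)* = (B / A)* = A* -> B*.

BASE = {'S': 't', 'np': 'e', 'n': ('->', 'e', 't')}

def star(cat):
    """Map a syntactic category to its semantic type."""
    if isinstance(cat, str):
        return BASE[cat]
    if cat[0] == '/':                 # ('/', B, A) is B / A
        _, b, a = cat
    else:                             # ('\\', A, B) is A \ B
        _, a, b = cat
    return ('->', star(a), star(b))

print(star(('\\', 'np', 'S')))  # sleeps : np \ S  gives  e -> t
print(star(('/', 'np', 'n')))   # a determiner np / n gives (e -> t) -> e
```

Note that both slashes collapse to the same arrow type: word order matters syntactically but not semantically.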
An example of a tiny lexicon
We provide a lexicon analysing a single (!) sentence: « Some statements speak about themselves. »
(Notation: x^v is a variable or constant x of type v.)
word: statements
syntactic category u: n = St ("statements" is a common noun)
semantic type u*: e → t = St* (semantically, statements is a unary predicate)
semantic representation (λ-term of type u*): λx^e (statement^{e→t} x)
This predicate maps x to the truth value of "x is a statement".
word: speak_about
syntactic category u: (np \ S) / np = SpA (speak_about waits for a noun phrase on its right, and then for a noun phrase on its left)
semantic type u*: e → (e → t) = SpA* (semantically, "speak_about" is a binary predicate)
semantic representation: λy^e λx^e ((speak_about^{e→(e→t)} x) y), a function of two individuals which is true whenever the second one (the subject) speaks about the first one (the object)
word: themselves
syntactic category u: ((np \ S) / np) \ (np \ S) = X ("themselves" (object) waits for a transitive verb on its left and then produces a sentence missing its subject)
semantic type u*: (e → (e → t)) → (e → t) = X* (semantically, "themselves" maps a binary predicate P(x,y), the transitive verb, to a unary predicate)
semantic representation: λP^{e→(e→t)} λx^e ((P x) x), obtained by requiring the two arguments to be equal: P(x,y) becomes P(x,x)
word: some
syntactic category u: (S / (np \ S)) / n = E ("some" (subject) waits for a noun on its right, then for a sentence missing its subject on its right, and yields a sentence)
semantic type u*: (e → t) → ((e → t) → t) = E* (given a unary predicate P (the noun) and a unary predicate Q (the verb phrase), "some" builds a closed formula)
semantic representation: λP^{e→t} λQ^{e→t} (∃^{(e→t)→t} (λx^e (∧^{t→(t→t)} (P x) (Q x)))); the formula built by "some" is ∃x. P(x) ∧ Q(x)
Summary of the lexicon (x^v: variable or constant x of type v):
some: (S / (np \ S)) / n = E; (e → t) → ((e → t) → t) = E*; λP^{e→t} λQ^{e→t} (∃ (λx^e (∧ (P x) (Q x))))
statements: n = St; e → t = St*; λx^e (statement^{e→t} x)
speak_about: (np \ S) / np = SpA; e → (e → t) = SpA*; λy^e λx^e ((speak_about^{e→(e→t)} x) y)
themselves: ((np \ S) / np) \ (np \ S) = X; (e → (e → t)) → (e → t) = X*; λP^{e→(e→t)} λx^e ((P x) x)
Syntactic analysis
(S / (np \ S)) / n , n , (np \ S) / np , ((np \ S) / np) \ (np \ S) ⊢ S ?
From E ⊢ (S / (np \ S)) / n and St ⊢ n, by /e: E, St ⊢ S / (np \ S).
From SpA ⊢ (np \ S) / np and X ⊢ ((np \ S) / np) \ (np \ S), by \e: SpA, X ⊢ np \ S.
By /e: E, St, SpA, X ⊢ S.
Semantic skeleton of the sentence
From E* ⊢ (e → t) → ((e → t) → t) and St* ⊢ e → t, by →e: E*, St* ⊢ (e → t) → t.
From SpA* ⊢ e → (e → t) and X* ⊢ (e → (e → t)) → (e → t), by →e: SpA*, X* ⊢ e → t.
By →e: E*, St*, SpA*, X* ⊢ t.
Corresponding λ-term: ((e^{E*} s^{St*}) (x^{X*} p^{SpA*})) : t
Computing the semantic representation
Replace each variable by the semantic λ-term from the lexicon (of the same type):
((λP^{e→t} λQ^{e→t} (∃ (λx^e (∧ (P x) (Q x))))) (λx^e (statement^{e→t} x))) ((λP^{e→(e→t)} λx^e ((P x) x)) (λy^e λx^e ((speak_about^{e→(e→t)} x) y)))
↓ β
(λQ^{e→t} (∃ (λx^e (∧^{t→(t→t)} (statement^{e→t} x) (Q x))))) (λx^e ((speak_about^{e→(e→t)} x) x))
↓ β
∃ (λx^e (∧ (statement^{e→t} x) ((speak_about^{e→(e→t)} x) x)))
in other words:
∃x : e (statement(x) ∧ speak_about(x,x))
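The whole computation can be replayed by writing the lexical λ-terms as Python functions that build formula strings. This is only a sketch of the composition step: the predicate names follow the text, and formulas are plain strings rather than logical objects.

```python
# The four lexical entries as curried functions building formula strings.
statements = lambda x: f"statement({x})"
speak_about = lambda y: lambda x: f"speak_about({x},{y})"  # λy λx. speak_about(x,y)
themselves = lambda P: lambda x: P(x)(x)                   # λP λx. P(x)(x)
some = lambda P: lambda Q: f"∃x ({P('x')} ∧ {Q('x')})"     # λP λQ. ∃x P(x) ∧ Q(x)

# Composition following the syntactic analysis:
# (some statements) applied to (themselves speak_about)
sentence = some(statements)(themselves(speak_about))
print(sentence)  # ∃x (statement(x) ∧ speak_about(x,x))
```

The function applications mirror the two elimination steps of the semantic skeleton, and β-reduction is simply Python's evaluation.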
Summary
• Grammar = a lexicon providing words with (complex) syntactic categories.
• Syntactic analysis = typing the sentence with S
= a proof of S in a resource-sensitive logic.
• Morphism from syntactic categories to semantic types: a syntactic analysis = a non-commutative resource-sensitive proof → an intuitionistic proof = a λ-term expressing the compositional structure (actually still linear).
• Replace the variables with λ-terms from the lexicon (non-linear) and perform β-reduction → a formula representing the meaning of the analysed sentence.
• The system presented here is syntactically poor. Natural extension: a more sophisticated logic such as Moortgat's multimodal extension with modalities and postulates.
Conclusion
• For a small fragment, we have a computable relation between some aspects of meaning (roughly speaking, who does what, and quantifier scopes) and the syntactic structure of a statement.
• From a syntactic viewpoint the fragment is much too restricted:
  • discontinuous constituents: Je ne sais pas.
  • medial extraction: The woman who_i called t_i yesterday will call again.
  • clitic pronouns in Romance languages: Je la fais réparer. Je sais la réparer.
• Semantically restricted as well:
  • Time and space are not well taken into account.
  • To keep compositionality one must modify the semantic logic.
  • Coreference between pronouns and their antecedents is not handled.
  • Adding parameters complicates the model.
  • The possible-worlds interpretation is not fully satisfactory.
  • Incorporating lexical semantics (relations between higher-order predicates) is not easy.
• Practically, there are not many logical-categorial resources (except Grail for Dutch):
  • Large lexicons? (automated acquisition from annotated corpora, Grail)
  • Efficient parsing algorithms? (supertagging, distance minimization, Grail)
  • Difficulties in coping with incorrect sentences.
Many open research directions.
Classical notions and results for TAGs and for dependency grammars
Alexander Dikovsky
LINA, University of Nantes
ESSLLI 2006 – p.1
PLAN
1. SYNSEM interface
2. Constituent structures
3. TAG, MCTAG
4. Dependency Graphs
1. SYNSEM interface
Syntax and semantics in a lexicalized view, schematically: starting from a sentence, axioms (the words) and rules (universal, or word-specific?) derive the distinguished category S; in parallel, a λ-term is built, yielding the compositional meaning.
Questions:
- Rule to rule?
- A filter for syntax?
- Generation?
- Semantic analysis?
- Lexical rules?
1.1 Replies
"Rule-to-rule?": unrealistic.
"Filter for syntax?": inefficient (hence inadequate).
"Generation / syntactic analysis?": the interfaces are asymmetric. Generation is simpler: lexical ambiguity is resolved (if not intentional), syntactic ambiguity is lowered, anaphora are resolved, the communicative structure is prescribed.
Parsing must resolve all of these problems.
1.2 Compositional SYNSEM interface
Morphisms between semantic and syntactic composition operators.
Generation Γ maps each semantic operator φ to a syntactic operator f_φ:
Γ(φ(π¹_sem, ..., π^k_sem)) = f_φ(Γ π¹_sem, ..., Γ π^k_sem)
Parsing Π maps each syntactic operator f to a semantic operator φ_f:
Π(f(S¹_syn, ..., S^k_syn)) = φ_f(Π S¹_syn, ..., Π S^k_syn)
1.3 Compositional interface definition
- an algebra A_syn of syntactic structures under syntactic compositions,
- an extension of A_syn to grammars G with efficient parsing,
- an algebra A_sem of semantic structures under semantic compositions,
- a compositional denotational semantics for A_sem,
- an extension of this semantics to G,
- efficient algorithms for Γ and Π.
Syntactic structures: constituents and dependencies. Two classes of grammars: TAGs/MCTAGs for constituents, D-grammars for dependencies.
1.4 Formal semantic structures
- logical expressions,
- formulas valid in models,
- proofs in formal systems,
- static (intra-sentence) / dynamic (discourse-level).
Challenges: compositionality; realistic complexity.
Underspecified semantic structures.
1.5 The underspecified semantics idea
EX [Hob83]: (In most democratic countries) (most politicians) can fool (most of the people) (on almost every issue) (most of the time).
Five scopal NPs, hence 5! = 120 scope combinations, none entailed by another, yet a single pertinent reading.
Proposed solution:
- interpret the scopal NPs in situ (in their argument positions),
- put off the scope constraints to an extra-semantic (reasoning) level.
Two cases of underspecified semantics: hole semantics for TAGs, and discourse plan semantics for D-grammars.
2. C-structures
CS over L = Σ ∪ N (S ∈ N: the sentence category) and F: features (⊥_F: undefined).
A CS T is a labelled featured tree:
- U: nodes (u0: the root),
- ◁ (immediate dominance): C ◁ C1 means C1 is an immediate constituent of C,
- ◁* (tree partial order of dominance),
- ≺ (partial order of precedence),
- μ: U → L (labelling),
- ν: L × F → A (feature assignment).
Notation: X_u[f:v], or X_u^v when f is implied: u ∈ U, μ(u) = X, ν(X, f) = v (node u labelled with X ∈ L, with f-value v).
2.2 Some axioms (complete axiomatization in [RVS94]):
- sister nodes are linearly ordered: u ◁ u1 ∧ u ◁ u2 ⇒ u1 = u2 ∨ u1 ≺ u2 ∨ u2 ≺ u1,
- precedence excludes dominance: u1 ≺ u2 ⇒ ¬(u1 ◁* u2) ∧ ¬(u2 ◁* u1),
- precedence is inherited under dominance: u1 ≺ u2 ∧ u1 ◁* u1′ ∧ u2 ◁* u2′ ⇒ u1′ ≺ u2′,
- only nonterminal-labelled nodes dominate: u1 ◁ u2 ⇒ μ(u1) ∈ N.
PR: ≺ defines a linear (word) order WO on the leaves of T.
w_T, the yield of T: the string of labels of the leaves in WO.
EX: a CS with head-selection feature h for "John periodically writes to Mary": S over NP (John) and VP_h; VP_h over Adv (periodically) and VP_h; VP_h over V_h (writes_h) and PP_to; PP_to over P_to,h (to_h) and NP (Mary).
2.3 Substitution and adjunction
SUBSTITUTION T1 = T[u_B := T′]: a nonterminal leaf u of T labelled B is replaced by a CS T′ whose root is labelled B.
ADJUNCTION T1 = T[u_B :=_a T′]_Γ: at an internal node u of T labelled B, an auxiliary CS T′ (root and foot node both labelled B) is inserted: the subtree T0 of T rooted at u is excised, T′ takes its place, and T0 is re-attached at the foot of T′, under a condition Γ on the features at the nodes involved.
Important constraints: u*: u is a foot node; u_NA: no adjunction at u.
REM: sub is monotone; adj can be made so.
3. TAG, MCTAG
For a finite set G of CS and a set of operations F, ⟨G⟩_F is the set of terminal CS in the closure of G under F; L_F(G) =df { w_T | T ∈ ⟨G⟩_F }.
Δ_F =df { ⟨G⟩_F | G finite }; L_F =df { L_F(G) | G finite }.
PR: L_sub = L_CF.
TAG languages [JLT75]: L_TAG =df L_{sub,adj}.
PR: Δ_{sub,adj} = Δ_adj, hence L_{sub,adj} = L_adj.
CS matrix: a finite list of CS ⟨T1, ..., Tk⟩.
mc-adjunction: mcadj(T; u1, ..., uk; ⟨T1, ..., Tk⟩) =df T[u1 :=_a T1, ..., uk :=_a Tk]_Γ, simultaneously.
Multi-component TAG languages [Wei88]: L_MCTAG =df L_mcadj.
3.1 Expressiveness
PR: { aⁿbⁿcⁿdⁿ | n ≥ 0 } ∈ L_TAG \ L_CF.
PR: { aⁿbⁿcⁿdⁿaⁿbⁿcⁿdⁿ | n ≥ 0 } ∈ L_MCTAG \ L_TAG.
PR: MCTAG languages can be NP-complete and non-semi-linear [Ram94]; they are polynomial and semi-linear under some constraints on mcadj [VSW94].
3.2 The copy language is TAG
PR: COPY = { ww | w ∈ {a, b}* } ∈ L_TAG.
G: an initial tree α rewriting S to ε, and two auxiliary trees β1, β2 with S_NA root and foot, inserting an a (respectively a b) on each half of the copied string; iterated adjunctions along the spine derive ww.
3.3 TAGs for natural language processing
lexicalized: each tree T ∈ G has a unique terminal head h_T.
subcategorized: the head h_T corresponds to a predicate p_{h_T}; there is a bijection between the arguments of p_{h_T} and the nonterminal leaves (slots) of T (EX: complements of verbs). Hence sub is needed.
Modifiers and circumstantials are implemented by adjunction trees.
semantically founded: each T has a semantics defined through that of p_{h_T}; the semantics is compositional.
3.4 Lexicalized TAG
G: elementary trees T1, ..., T5 anchored by John, Mary, writes, to and periodically (the adverb tree adjoining at VP).
T ∈ ⟨{T1, T2, T3, T4, T5}⟩_{sub,adj}, with yield: John periodically writes to Mary.
Problems:
- expressing discontinuous dependencies,
- unnatural modifier-circumstantial dependencies,
- clauses in complement slots (defined through adj and not sub).
4. Dependency Graphs
4.1 Projections, projectivity
R: syntactic relations, Σ: words, N: nonterminals.
Dependency graph (DS) of w = x1...xn ∈ (Σ ∪ N)*: a graph D = ⟨{x1, ..., xn}, →, ≺, ρ⟩, where:
- xi → xj: xj depends on xi (xi: governor, xj: subordinate),
- ≺: a linear order on x1, ..., xn,
- ρ: → → R; for xi → xj, ρ(xi, xj) = d is denoted xi →d xj.
w =df w_D: the carrier of D.
Domination →*: the reflexive-transitive closure of →.
Projection of x: proj(x) =df { x′ | x →* x′ }.
Dependency tree (DT): ⟨{x1, ..., xn}, →⟩ is a tree.
Projective DT: all node projections are continuous intervals.
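The projectivity condition can be checked directly from the definition: compute each node's projection and test that it is a contiguous interval of positions. A minimal sketch, encoding the tree as a dependent-to-governor map over word positions (0 standing for an artificial root):

```python
# Projectivity check for a dependency tree given as {dependent: governor}.

def projection(node, heads):
    """All positions reachable from node via the dependency relation."""
    nodes = {node}
    changed = True
    while changed:
        changed = False
        for d, g in heads.items():
            if g in nodes and d not in nodes:
                nodes.add(d)
                changed = True
    return nodes

def is_projective(heads):
    """Projective iff every projection is a continuous interval."""
    for w in heads:
        proj = projection(w, heads)
        if max(proj) - min(proj) + 1 != len(proj):
            return False
    return True

# "John(1) periodically(2) writes(3) to(4) Mary(5)":
# writes governs John, periodically and to; to governs Mary.
proj_tree = {1: 3, 2: 3, 4: 3, 5: 4, 3: 0}
print(is_projective(proj_tree))   # True
# a crossing structure: 1 governs 3, 2 governs 4
crossing = {3: 1, 4: 2, 1: 0, 2: 1}
print(is_projective(crossing))    # False
```

In the crossing case the projection of node 2 is {2, 4}, which skips position 3, so projectivity fails.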
4.2 Dependencies vs. constituents
Projective DTs can be seen as a by-product of the selection of a head constituent in non-unit constituents ([Ior63], [Gla66], [Rob70], [Jac77]):
- head selection: in every non-unit constituent C there is exactly one selected immediate constituent: C ◁ sel(C),
- for a unit constituent C = x, head(C) =df x,
- for a non-unit constituent C, head(C) =df head(sel(C)),
- if C1 is a non-selected immediate constituent of C (C ◁ C1, C1 ≠ sel(C)), then head(C) governs head(C1): head(C) → head(C1).
PR: The DG induced by head selection is a projective DT (see [DM00] for more facts).
4.3 DT induced by head selection
EX: the projective DT derived from the CS with selected heads for "John periodically writes to Mary": writes governs John, periodically and to; to governs Mary.
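The induction of dependencies from head selection is itself a small recursion: the head of every non-selected immediate constituent depends on the head of the selected one. A sketch, encoding each non-unit constituent as its list of children plus the index of the selected child:

```python
# Dependencies induced by head selection in a headed constituent tree.
# A constituent is either a word (str) or {'children': [...], 'head': index}.

def head_of(c):
    """The lexical head: follow selected constituents down to a word."""
    return c if isinstance(c, str) else head_of(c['children'][c['head']])

def dependencies(c, deps=None):
    """Collect (governor, subordinate) pairs top-down."""
    if deps is None:
        deps = []
    if isinstance(c, str):
        return deps
    h = head_of(c['children'][c['head']])
    for i, child in enumerate(c['children']):
        if i != c['head']:
            deps.append((h, head_of(child)))
        dependencies(child, deps)
    return deps

# "John periodically writes to Mary", heads as in the example
pp = {'children': ['to', 'Mary'], 'head': 0}
vp_low = {'children': ['writes', pp], 'head': 0}
vp = {'children': ['periodically', vp_low], 'head': 1}
s = {'children': ['John', vp], 'head': 1}
print(dependencies(s))
# [('writes', 'John'), ('writes', 'periodically'), ('writes', 'to'), ('to', 'Mary')]
```

The resulting tree is projective, in line with the proposition above.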
4.4 Non-projective DT
EX: many adequate DTs are not projective.
4.5 Generalized Dependency Structures
gDS: a k-component DG D = ⟨D0, ..., D_{k−1}⟩, k ≥ 1, where:
- one maximal connected component D0 is selected as the head component,
- one node in D0 is selected as the head of D0 (and of D).
EX: a two-component gDS for a comparative construction, with dependencies pred, prepos-compar, prepos, restr, dobj and prep-iobj relating MORE, THAN, noun groups and a transitive verb.
4.6 Monotone gDS composition
For gDS δ and δ1, the composition of δ1 for a node α in δ, δ[α := δ1] (or the simultaneous composition δ[α1, ..., αn := δ1, ..., δn]):
- unites the dependencies of δ and δ1; the head of δ1 inherits the dependencies of α,
- preserves the order: w_{δ[α := δ1]} = x w1 y whenever w_δ = x label(α) y and w_{δ1} = w1.
4.7 Example of gDS composition
REM: w_δ = u x v and w_{δ1} = z imply w_{δ[x := δ1]} = u z v.
So dependency grammars, defined as the closure of finite sets of gDS under this composition, generate exactly the CF-languages.
A more realistic solution in the next lesson.
Conclusion
- Compositionality of syntax is related to the monotonicity of syntactic structure composition.
- Challenge: a compositional definition of syntactic structures expressing discontinuous dependencies.
- A completely compositional definition of syntactic structures leads to grammars with efficient parsing.
References
[DM00] A. Ja. Dikovskij and L. S. Modina. Dependencies on the other side of the Curtain. Traitement Automatique des Langues (TAL), 41(1):79–111, 2000.
[Gla66] A. V. Gladkij. Matematiceskaja lingvistika [Mathematical Linguistics]. Novosibirskij Gosudarstvennyj Universitet [Novosibirsk State University], 1966.
[Hob83] J. Hobbs. An improper treatment of quantification in ordinary English. In Proc. ACL-83, pages 57–63, 1983.
[Ior63] L. N. Iordanskaja. O nekotoryh svojstvah pravil'noj sintaksiceskoj struktury [On some properties of correct syntactic structure]. Voprosy yazykoznanija [The problems of linguistics], (4), 1963.
[Jac77] R. Jackendoff. X' Syntax: A Study of Phrase Structure. MIT Press, Cambridge, MA, 1977.
[JLT75] A. K. Joshi, L. S. Levy, and M. Takahashi. Tree adjunct grammars. Journ. of Comput. and Syst. Sci., 10(1):136–163, 1975.
[Ram94] O. Rambow. Formal and Computational Aspects of Natural Language Syntax. PhD thesis, University of Pennsylvania, 1994.
[Rob70] Jane J. Robinson. Dependency structures and transformational rules. Language, 46(2):259–285, 1970.
[RVS94] J. Rogers and K. Vijay-Shanker. Obtaining trees from their descriptions: an application to Tree-Adjoining Grammars. Computational Intelligence, 10(4):401–421, 1994.
[VSW94] K. Vijay-Shanker and D. J. Weir. The Equivalence of Four Extensions of Context-Free Grammars. Mathematical Systems Theory, 27:511–545, 1994.
[Wei88] D. J. Weir. Characterizing mildly context-sensitive grammar formalisms. PhD thesis, University of Pennsylvania, 1988.
Categorial and generating dependency grammars
Alexander Dikovsky
LINA, University of Nantes
PLAN
1. Categorial dependency grammars
2. Dependency structure grammars
3. CDG vs. DSG and complexity
1. Categorial Dependency Grammars (CDG) ([Dik04])
1.1 Categories Cat(C) and valencies V(C):
- primitive categories C: det, pred, copul, etc.
- elementary categories:
  - neutral valencies for local dependencies: primitive C, iterative, repetitive and optional (C?) categories,
  - polarized valencies for non-local discontinuous dependencies: positive valencies for the beginnings of left and right discontinuous dependencies, negative valencies for their ends,
  - host valencies and anchored valencies #C for the (adjacent) ends of left / right discontinuous dependencies.
1.2 Categories and dependencies
Complex categories (1st order): [L1\...\Li\C/Rj/.../R1], where L1, ..., Li, C, Rj, ..., R1 are elementary.
EX 1. theory: a category whose head governs the local dependencies det, modif, attr-rel and dir-obj (dependency diagram).
EX 2. être: auxiliary host verb for a clitic in pre-position, expecting a left-subordinate subject and a right-subordinate participle (a category of the form clit-dobj\pred\S/aux).
1.2 Categories and dependencies (continued)
(dependency diagrams: 'he took more candies from Tom than from Mary', where the discontinuous dependency compconj links 'more' to 'than' across restr and compar; 'la commission ne la lui a pas refusée', where the clitics and the negation induce the discontinuous dependencies clit-dobj, clit-iobj and compos-neg anchored on the auxiliary 'a')
1.3 Monoid of potentials
Dual valencies v and v̆, v ∈ V(C): together, a positive valency and its dual negative valency define a discontinuous dependency C.
Potential: a string of polarized valencies in V(C)*. EX: π is a string of dual C-, B- and A-valencies.
Neutralization through reduction Γ ⊢ Γ′: Γ = Γ1 v Γ2 v̆ Γ3 ⊢ Γ′ = Γ1 Γ2 Γ3, provided the First Available (FA) condition holds: Γ2 has no occurrences of v or v̆.
Monoid of potentials: this reduction being terminating and confluent, every potential Γ has a unique FA-normal form Γ^FA. Hence the product
Γ1 ⊙ Γ2 =df (Γ1 Γ2)^FA
is correct. EX: the FA-normal form π^FA of the π above retains only its two unmatched B-valencies.
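The FA neutralization just defined can be prototyped directly. The sketch below is an illustrative assumption (potentials modelled as lists of (sign, name) pairs instead of the arrow glyphs of the slides), not the authors' implementation. Since FA always pairs a negative valency with the closest available positive dual of the same name, the normal form can be computed in one pass with a stack per valency name, like bracket matching:

```python
def fa_normalize(potential):
    """First Available (FA) neutralization, sketched.

    A potential is a list of (sign, name) pairs: ('+', 'A') for a positive
    (begin) valency and ('-', 'A') for its dual negative (end).  FA pairs
    each negative valency with the closest preceding unmatched positive
    dual, with no free occurrence of either in between: bracket matching
    done independently per valency name.  The reduction is confluent, so
    a single left-to-right pass yields the unique FA-normal form."""
    pending = {}                   # name -> stack of unmatched '+' indices
    keep = [True] * len(potential)
    for i, (sign, name) in enumerate(potential):
        if sign == '+':
            pending.setdefault(name, []).append(i)
        elif pending.get(name):    # a '-' meets its first available '+'
            keep[pending[name].pop()] = keep[i] = False
    return [v for i, v in enumerate(potential) if keep[i]]
```

For example, in [+A, -B, -A, +B] the A-valencies neutralize while both B-valencies survive (the -B precedes its would-be dual), which is exactly the FA behaviour.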
1.4 Tree constraints
In complex categories [L1\...\Li\C/Rj/.../R1] ∈ Cat(C):
– the Li, Ri cannot be negative end valencies, though they may be negative hosts,
– C is primitive, or a negative end valency, or an anchored negative valency #C.
These constraints are waived for the generalized polarized categories gCat(C), in which all polarized valencies are factored out into a potential:
C^P or [L1\...\Li\C/Rj/.../R1]^P, where C, Li, Ri are neutral and P is a potential; C and P may be empty, and C^ε = C.
EX: [pred\S/aux] with the clit-dobj valency in its potential for clit-dobj\pred\S/aux; [aux] with the potential clit-dobj clit-iobj for clit-dobj\clit-iobj\aux.
1.5 CDG definition frame
CDG G = (W, C, S, δ): W: words, C: primitive categories, S ∈ C, δ : W → 2^Cat(C) a finite substitution (lexicon). Generalized CDG (gCDG): gCat(C) in place of Cat(C).
D-form (D, Γ) of a sentence w = a1 ... an:
– D: a dependency structure on w,
– Γ: a string of positioned categories,
– initial D-form: ((a1 ... an, ∅), C1^1 ... Cn^n), Ci ∈ δ(ai), 1 ≤ i ≤ n,
– terminal D-form: (D, S^j).
G(D, w) iff there is Γ0 ∈ δ(w) such that ((w, ∅), Γ0) ⊢* (D, S^j), where ⊢ is derivability in a dependency calculus.
D(G) =df {D | ∃w G(D, w)}, L(G) =df {w | ∃D G(D, w)}.
1.6 Generalized dependency calculus
Ll. ((V, E), Γ1 C^{P1,i} [C\β]^{P2,j} Γ2) ⊢ ((V, E ∪ {ai ←C– aj}), Γ1 β^{P1P2,j} Γ2), C ∈ C.
Il. ((V, E), Γ1 C^{P1,i} [C*\β]^{P2,j} Γ2) ⊢ ((V, E ∪ {ai ←C– aj}), Γ1 [C*\β]^{P1P2,j} Γ2), C ∈ C.
Ωl. ((V, E), Γ1 [C*\β]^{P,i} Γ2) ⊢ ((V, E), Γ1 β^{P,i} Γ2).
Dl. ((V, E), Γ1 α^{P1 (+C),i P (−C),j P2} Γ2) ⊢ ((V, E ∪ {ai ←C– aj}), Γ1 α^{P1 P P2} Γ2), if (+C) P (−C) satisfies the FA-condition (here +C / −C stand for a positive valency and its dual).
Exr: Write the corresponding rules for C+ and C?.
1.7 A gCDG for a non-CF language
G1:
a ↦ A, [A\A] (each carrying a positive A-valency in its potential)
b ↦ [B/C], [A\S/C] (each carrying the dual negative A-valency)
c ↦ C, [B\C]
TH: L(G1) = {aⁿbⁿcⁿ | n ≥ 0}.
A proof of a³b³c³ ∈ L(G1): (derivation: the types assigned to a³b³c³ reduce by Ll and Lr steps to S with a potential of three dual A-valency pairs, which three Dl steps neutralize, yielding S)
1.7 Corresponding dependency structure
(diagram: the dependency structure of a³b³c³ derived above) — it is not a dependency tree.
1.8 Dependency calculus for DT [DD04]
Ll. ((V, E), Γ1 C^i [C\β]^j Γ2) ⊢ ((V, E ∪ {ai ←C– aj}), Γ1 β^j Γ2), C ∈ C.
Vl. ((V, E), Γ1 [β\α]^j Γ2) ⊢ ((V, E), Γ1 β^j α^j Γ2), β a polarized or anchored valency (valencies are factored out).
Al. ((V, E), Γ1 (#α)^i α^j Γ2) ⊢ ((V, E), Γ1 α^i Γ2), α a polarized valency (anchoring).
Cl. ((V, E), Γ1 α^i β^j Γ2) ⊢ ((V, E), Γ1 β^j α^i Γ2), where α is a polarized valency and β either is a polarized valency or is a category with no polarized or anchored subexpressions (commutation).
Dl. ((V, E), Γ1 C^i C̆^j Γ2) ⊢ ((V, E ∪ {ai ←C– aj}), Γ1 Γ2) (adjacent dual valencies neutralize).
1.8 Iteration rules
Il. ((V, E), Γ1 C^i [C*\β]^j Γ2) ⊢ ((V, E ∪ {ai ←C– aj}), Γ1 [C*\β]^j Γ2), C ∈ C.
Ωl. ((V, E), Γ1 [C*\β]^i Γ2) ⊢ ((V, E), Γ1 β^i Γ2).
Generalized dependency calculus – gCDG – L(gCDG)
Dependency calculus – CDG – L(CDG)
1.9 CDG examples
w = the red wines of Bordeaux are excellent
the ↦ det, red ↦ modif, Bordeaux ↦ prepos, excellent ↦ copul-attr,
wines ↦ [modif\det\pred/post-attr], of ↦ [post-attr/prepos], are ↦ [pred\S/copul-attr]
1.9 CDG examples (continued)
EX: w = les vins rouges bordelais sont excellents
les ↦ det, bordelais ↦ modif, rouges ↦ modif, excellents ↦ copul-attr,
vins ↦ [det\pred/modif*], sont ↦ [pred\S/copul-attr]
1.10 A DT through a proof in CDG
EX: δ: Les vins rouges bordelais sont excellents ↦
det, [det\pred/modif*], modif, modif, [pred\S/copul-attr], copul-attr
(derivation: Ll consumes det into [det\pred/modif*]; two Ir steps consume the two modif's; Ωr discharges the iterated modif*; Lr consumes copul-attr; a final Ll yields S)
1.11 CDG for {aⁿbⁿcⁿ | n ≥ 0}
a ↦ #A, [A\#A] (anchored A-valencies)
b ↦ categories pairing a dual A-valency with B and C; c ↦ C, [B\C]
(derivation of a³b³c³: Vl/Vr steps factor the valencies out of the categories, Cl steps commute them into place, Lr and Ll steps consume the B's and C's, and Al together with the valency rules reduces the whole to S)
1.12 CDG languages
Proposition 1. For every CDG G, the dependency structures in D(G) are trees.
Projective CDG: those not using the polarized valencies.
Proposition 2. For every projective CDG G, the DS in D(G) are projective DTs.
Proposition 3. Projective CDG are weakly equivalent to CF-grammars: L(CDG_proj) = L(CF).
CDG are more expressive than CF-grammars:
Proposition 4. {aⁿbⁿcⁿ | n ≥ 0} ∈ L(CDG) \ L(CF).
They are weakly equivalent to gCDG:
Proposition 5. L(CDG) = L(gCDG), but D(CDG) ⊊ D(gCDG).
2. Dependency Structure Grammars [BDF05]
2.1 Generalized DSG (gDSG): G = (T, N, S, R)
– T: terminals
– N: nonterminals
– S ∈ N: axiom
– R: rewriting rules A → δ, where δ is a gDS over T ∪ N with an assignment of left and right potentials to its nodes.
EX: (rule diagram: a restr rule rewriting CompRestrG as MORE ... THAN over nonterminals NG and PG, with dual compconj potentials on MORE and THAN and a compar dependency)
2.2 Derivation trees
Derivation trees generate gDS together with their potentials: applying a rule r: A → δ0 to subderivations yielding gDS δ1, ..., δk with potentials π1, ..., πk produces the neutralized composed gDS δ = (δ0(α1, ..., αk))^neutr and the product potential π = π1 ⊙ ... ⊙ πk composed with the potentials of δ0's nodes.
Complete derivation tree T: π(T) = ε.
Languages: δ(T) ∈ ∆(G) and w(δ(T)) ∈ L(G) iff T is a complete derivation tree of G.
2.3 Non-CF gDSG-language
G1: S → A c, A → ab c | ab A (with dual polarized a-valencies linking the a's and the c's)
L(G1) = {aⁿbⁿcⁿ | n ≥ 0}
(derivation-tree diagram for a³b³c³, showing how the a-valencies are passed up and neutralized)
2.4 Derivation of discontinuous dependencies
(diagram: derivation of 'he took more candies from Tom than from Mary' through rules V G1–V G7, building the local dependencies pred, restr, dobj, iobj, prepos and compar, and the discontinuous dependency compconj between MORE and THAN)
2.5 Dependency discontinuity measure
Valency deficit σ(G): the max size of potentials in complete derivation trees of G.
σ(G1) = ∞. This is unlikely to be the case for natural languages, where σ(G) is bounded by 2 or 3.
!! Dutch cross-serial dependencies [BKPZ82] and the similar:
omdat Wim Jan Marie de kinderen zag helpen leren zwemmen
(because Wim saw Jan help Marie teach the children to swim)
(dependency diagram with pred, inf-dobj, iobj, conj, pred-of-inf and det dependencies)
Pairing Wim –pred→ zag and Jan –pred→ helpen would conflict with the FA rule.
But Wim –pred→ zag and Jan –pred-of-inf→ helpen do not!!
3. DSG vs. CDG and Complexity
Strong generative power: TH D(CDG_proj) ⊊ D(CDG) ⊊ D(gCDG) ⊊ D(gDSG)
Weak generative power: TH L(CFG) = L(CDG_proj) = L(gDSG_{σ<ω}) ⊊ L(CDG) = L(gCDG) ⊆ L(gDSG)
(gDSG_{σ<ω}: gDSG with bounded valency deficit).
TH [DD04]
1. The worst-case complexity of CDG parsing is O(n⁵4ᵖ), where p is the number of polarized valencies.
2. If σ(G) is bounded by a constant, then the complexity is O(n³).
3.1 DSG, CDG vs. TAG, MCTAG (1)
For Lm =df {d0 a0ⁿ d1 a1ⁿ ... dm amⁿ dm+1 | n ≥ 0} and for all m > 4, Lm ∉ L(TAG).
TH [Dik04] Lm is generated by a CDG Gm in which a0 carries the dual valencies of all the Ai, each ai is anchored through #Ai, and the delimiters di close the blocks (d0 ↦ [S/D0], dm+1 ↦ Dm).
Lcopy =df {wcw | w ∈ {a, b}*} ∈ L(TAG). Hypothesis: Lcopy ∉ L(DSG).
Unlikely, since L(DSG) is closed under union and concatenation.
3.2 DSG, CDG vs. TAG, MCTAG (2)
MIX =df {w ∈ {a, b, c}* | |w|a = |w|b = |w|c}
Hypothesis [E. Bach]: MIX ∉ L(MCTAG).
TH [BDF05] MIX ∈ L(CDG).
TABLE OF CATEGORY ASSIGNMENTS: a receives categories combining S with dual B- and C-valencies in all orders; b receives B and its dual; c receives C and its dual.
(diagram of the landscape: Lcopy ∈? DSG; TAG ⊆ LinIG ⊆ CombCG; L5 separating DSG and CDG; MIX ∉? MCTAG; CF ⊂ DSG_{σ<c} ⊂ CDG; MCTAG ⊂ MinG ⊂ MCFG)
Conclusion
– The First-Available dual-valency rule underlies compositional definitions of discontinuous dependencies
– The FA-rule is linguistically founded
– FA-rule valency-based grammars are efficiently parsable
References
[BDF05] D. Bechet, A. Dikovsky, and A. Foret. Dependency Structure Grammars. In P. Blache and E. Stabler, editors, Proc. of the 5th Int. Conf. "Logical Aspects of Computational Linguistics" (LACL'2005), LNAI 3492, pages 18–34, 2005.
[BKPZ82] J. Bresnan, R. M. Kaplan, S. Peters, and A. Zaenen. Cross-serial dependencies in Dutch. Linguistic Inquiry, 13(4):613–635, 1982.
[DD04] Michael Dekhtyar and Alexander Dikovsky. Categorial Dependency Grammars. In M. Moortgat and V. Prince, editors, Proc. of Intern. Conf. on Categorial Grammars, pages 76–91, Montpellier, 2004.
[Dik04] Alexander Dikovsky. Dependencies as Categories. In "Recent Advances in Dependency Grammars", COLING'04 Workshop, pages 90–97, 2004.
Underspecified semantics and compositional grammar interface
Alexander Dikovsky
LINA, University of Nantes
PLAN
1. Underspecified semantics
2. TAG - hole semantics interface
3. Discourse Plans
4. DP - CDG interface
1. Underspecified semantics
Why underspecification? E.g. for generation:
– generating from a fully specified logical form leads to excessive complexity
– in machine translation, resolution of quantifier scope has a minor effect on the translation, so a partially defined semantic structure may suffice to couple the corresponding syntactic structures
– on the other hand, flattening the recursive semantic structure can lead to spurious ambiguity, so an intermediate abstraction level is needed
Ideally: an underspecified semantic structure should preserve enough information to construct all and only the possible readings
1.1 Underspecified MRS [CFSP99]
Language: conventional predicate calculus extended with flat conjunction and disjunction and generalized quantifiers (GQ): GenQ(x, R, B) =df λB.λR.Qx φQ(R(x), B(x))
EX: Gen∀(x, dog(x), Gen∃(y, white(y) ∧ cat(y), chase(x, y))) for 'every dog chases some white cat'
Partial description of the scopes of GQs using handle variables H:
⟨h0, {h1: Gen∀(x, h2, h3), h2: dog(x), h4: chase(x, y), h5: Gen∃(y, h6, h7), h6: white(y), h7: cat(y)}⟩ (body, constraints) subsumes
⟨h0, {h1: Gen∀(x, h2, h3), h2: dog(x), h4: chase(x, y), h5: Gen∃(y, h6, h7), h6: white(y), h7: cat(y)}, {h3 ≥ h4}⟩
The constraint h3 ≥ h4 implies: h1 overscopes h4.
REM: Subsumption is monotone and includes composition of MRSs.
1.2 Hole semantics [Bos95]
Language: another extension of predicate calculus with GQ: H (hole constants), Lc (label constants), Lv (label variables), K (basic-language variables and constants).
Formulas F:
– l : Rⁿ(i1, ..., in) ∈ F (including GQ), l ∈ Lc ∪ Lv, ij ∈ K ∪ H ∪ Lc ∪ Lv
– h ≥ l ∈ F (h overscopes l), h ∈ H, l ∈ Lc
– φ ∧ ψ ∈ F for φ ∈ F and ψ ∈ F
Saturated formulas φ: those without label variables
Notation: S(φ): all scope (i.e. label and hole) constants in φ
Hole semantics, continued
The scoping relation of a saturated formula φ is the minimal partial order ≥φ on S(φ) such that:
– k ≥φ k for all k ∈ S(φ)
– k ≥φ k′ if k ≥ k′ is in φ
– l ≥φ k′, l ≥φ k″ if l : Rⁿ(..., k′, ..., k″, ...) is in φ
Plugging: an injection P : H(φ) → L(φ).
A plugging P is possible for φ when the relation generated by ≥φ together with k ≥ P(k) is a partial order.
The possible pluggings P of φ define its scoping models, i.e. the basic-language formulas resulting from φ by specializations of scopes consistent with φ.
Hole semantics, finished
EX: Every dog chases a cat
φ: l0: ∀x(h1, h2), h1 ≥ l1, l1: D(x), h2 ≥ l2, l2: Ch(x, y), l3: ∃y(h3, h4), h3 ≥ l4, l4: C(y), h4 ≥ l2
The only two possible pluggings for φ:
P1 = {h1 ↦ l1, h2 ↦ l3, h3 ↦ l4, h4 ↦ l2}
P2 = {h1 ↦ l1, h2 ↦ l2, h3 ↦ l4, h4 ↦ l0}
correspond to two different readings:
l0: ∀x(l1, l3), l1: D(x), l2: Ch(x, y), l3: ∃y(l4, l2), l4: C(y)
l0: ∀x(l1, l2), l1: D(x), l2: Ch(x, y), l3: ∃y(l4, l0), l4: C(y).
This technique is applied to other basic languages, in particular to DRT [KvGR].
Analytic SYN→SEM interface: the TAG case
Π : f (syntactic) ↦ φ(f) (semantic)
2. TAG - hole semantics interface [GK03]
Interface: predicate arguments (linked with NP-groups by identical variables) are added through substitution; adverbials and quantifiers are added through adjunction.
EX: John loves Mary
(derived tree: S over NP^{x1} and VP, with NP substitutions for John (name(j, john)) and Mary (name(m, mary)) and the verbal semantics l0: love(x1, x2))
resulting in: l0: love(j, m), name(j, john), name(m, mary)
TAG - hole semantics interface, finished
EX: every dog barks
Notation: Nˢ vs. N_s: the variable s unifies with the superscript / subscript term.
(elementary trees for 'every', 'dog' and 'barks', with the semantic attachments l0: ∀x(h1, h2), h1 ≥ s1, h2 ≥ s2, l1: dog(x1), l2: bark(x2))
resulting in: l0: ∀x(h1, h2), h1 ≥ l1, h2 ≥ l2, l1: dog(x), l2: bark(x)
3. Discourse Plans [Dik03, DS05]
DPL: an underspecified semantic structure language extending the standard logical syntax with cognitive features sufficient to represent the various linguistic features marked by surface-language means (prosodic, morphological, syntactic or lexical). DPL:
– uses abstract situations (different from the referential situations of [BP83]) as elementary predications
– accounts for communicative structure
– is closely related to dependency structures
– has a dynamic compositional semantics close to that of DRT [Kam81, vEK97]
3.1 Abstract situations
Invariants of communicative views (= semantic diatheses)
(diagram: the abstract situation sit(give), realized by the surface forms 'gives', 'given ... by', 'to give', 'giving' through views — canonical, dth(passive), dth(gerund), dth(inf) — which fix the salience and type of the actants)
EX (G. Frege, "Begriffsschrift" [Fre79]): 'Bei Platae siegten die Griechen über die Perser' and 'Bei Platae wurden die Perser von den Griechen besiegt' use the same situation: siegen(SBJ, OBJ).
Arguments of situations vary from diathesis to diathesis according to their intended communicative ranks.
Their types constrain values: s, n, q, c and instances (e.g., na ≤ n, s_causeff ≤ s, q_quant ≤ q, c_intens ≤ c)
3.2 Situation definition (profiles)
Canonical profile, EX:
give⟨SBJ_na, OBJ_n, RCP_na⟩ : s_causmov
Diatheses (specified by intended communicative ranks: T, O, ...):
dth_fpassive = ⟨PAGT ← SBJ, SBJ ← OBJ : T⟩ : s_psv
one of the passive profiles in English:
dth_fpassive(give)⟨SBJ_n, PAGT_na, RCP_na⟩ : s_psv
(The apple_{SBJ:T} was given to Adam_{RCP} by Eve_{PAGT})
Actants: arguments identified by thematic roles obligatory in at least one profile.
Circumstantials: all other arguments (identified by attributes).
Lexical test: circumstantials are not used in the gloss:
give(SBJ: X, OBJ: Y, RCP: Z) :: "X deliberately changes belonging of Y to X for its belonging to Z at will of Z"
Circumstantials obligatory in DPs: ILLOCUTIVE STATUS (e.g. DCL, INTERROG, CONDIT), ASP (NEUTR, PERF, PROGR), TIME (PRES, PAST)
3.3 DP examples (in place of a DPL definition)
Situation open, canonical profile: open⟨SBJ_na, OBJ_n, INSTR_n⟩ : s_eff
(DP diagram: open_{s_eff} with ILL=DCL, ASP=PERF, TNS=PRES and actants g John_na : SBJ:T, x door_n : OBJ, y key_n : INSTR, the key qualified by new_{q_prop} and the situation intensified by easily_{c_int})
3.4 A decausation diathesis
open: dth_decaus-ins = ⟨∅ ← SBJ, SBJ ← INSTR : T, OBJ ← OBJ⟩ : s_eff
(DP diagram for dth_decaus-ins(open) with ILL=DCL, ASP=PERF, TNS=PRES: x key_n : SBJ, qualified by new_{q_prop}; y door_n : OBJ; easily_{c_int})
3.5 Another decausation diathesis
open: dth_decausO = ⟨∅ ← SBJ, SBJ ← OBJ : T, ∅ ← INSTR⟩ : s_fpd
(DP diagram for dth_decausO(open) with ILL=DCL, ASP=PERF, TNS=PRES: x door_n : SBJ; easily_{c_int})
3.6 Ultimate decausation diathesis
open: dth_decaus = ⟨∅ ← SBJ, ∅ ← OBJ, ∅ ← INSTR⟩ : q_s
leave2 canonical profile: leave2⟨SBJ_na, OBJ_n, DFS_q⟩ : s_eff
(DP diagram: leave2_{s_eff} with ILL=DCL, ASP=PERF, TNS=PRES, actants g John_na : SBJ:T and x door_n : OBJ, and the fully decausated open_{q_s} as the value of DFS)
3.7 Actant's scope
Scoping rule: follows the actants' topicality order SBJ > OBJ > oblique roles (cf. [Cro03]): the SBJ actant outscopes the OBJ, which outscopes the obliques.
take2⟨SBJ_na, OBJ_n⟩ : s_nef
(DP diagram: take2_{s_nef} with ILL=DCL, ASP=PERF, TNS=PRES, all newsman_{na:n_coll} : SBJ:T, some train_n : OBJ, g Geneva : DEST_site)
Gen∀(x, newsman(x), Gen∃(y, train(y) ∧ dest(y, Geneva), take2(x, y)))
3.8 Actant's scope, continued
(DP diagram: take2_{s_nef} with all newsman_{na:n_coll} : SBJ:T, x one train_n : OBJ, g Geneva : DEST_site, aggregated with be_{s_att} — ILL=DCL, ASP=PERF, TNS=PRES — predicating late_c of x)
(represents the reading: Gen∃(y, train(y) ∧ dest(y, Geneva) ∧ late(y), Gen∀(x, newsman(x), take2(x, y))))
3.9 Abstracted DP
(DP diagram: take1_{s_eff} with the abstracted actant (x)_na 'he' : SBJ:T and candy_{n_mass} : OBJ quantified by 'more'_{q_restr}, compared through an abstracted relative diathesis dth_rel(take1) over the two origins g Tom_na : ORIG (1) and g Mary_na : ORIG (2))
3.10 Aggregation / Coordination
(DP diagram: an alias situation alias_{s_fpd} — ILL=DCL, ASP=PERF, TNS=PRES — whose topic ι : SBJ:T aggregates two instances (1) and (2) of the relative diathesis dth_rel(invite) with g John_na : SBJ:T, coordinated over the objects g Mary_na (1) and g Jane_na (2) under a contrastive DFS 'no')
Generation SEM→SYN interface: the CDG case
Γ : φ (semantic) ↦ f(φ) (syntactic)
4. DP - CDG interface
4.1 Transduction Dependency Grammar (TDG) ∆ = (Γ, G):
– G = (Σ, Cat, S, δ): a CDG,
– Γ: a top-down finite transducer from DPs π to typed strings w1 : c1 ... wn : cn (wi ∈ Σ, ci ∈ Cat)
∆(π, w1 ... wn, c1 ... cn, D) iff (π, q0) ⊢Γ w1 : c1 ... wn : cn and ((w, ∅), c1 ... cn) ⊢G (D, S)
Transitions of Γ:
– Γ(K, q) = w : c (K a leaf of π, w ∈ Σ, c ∈ δ(w))
– Γ(K(s1 : π1, ..., sm : πm), q) = Γ(w_{j1}, q_{j1}) ... Γ(π_{i1}, q_{i1}) ... Γ(K, q_k) ... Γ(π_{iu}, q_{iu}) ... Γ(w_{jv}, q_{jv})
Language of ∆: L(∆) = {x ∈ Σ* | ∃π, C, D ∆(π, x, C, D)}
4.2 Examples of transition rules
Notation: states ⟨DS, GS⟩: DS: discourse status, GS = ⟨HEAD, LEFT, RIGHT, CLS, TNS, NUM, PER⟩: grammatical status.
E.g., HEAD, LEFT and RIGHT determine categories, namely [α1\...\αl\γ/β1/.../βr] ∈ δ(GS) if LEFT = α1\...\αl, RIGHT = β1/.../βr, HEAD = γ.
EX: they have recently fired twenty people
(DP diagram: fire_{s_causeff} with ILL="dcl", ASP="perf", TNS="past", y they_{na:n_coll} : SBJ:T, people_{n_coll} : OBJ with NUM_{q_restr} twenty_{q_qnt}, and TMP_{c_restr} recently_{c_restr})
4.3 Main situation rule
Rule pattern:
Γ(π1(SBJ_n : T, OBJ) : s_eff, DS.ILL = "Dcl") = β1 sf1(SBJ, GS⟨NUM = n, PER = p, HEAD = "pred"⟩) vf1(GS′) of1(OBJ, GS⟨HEAD = "dobj"⟩)
Rule instances (under mutually exclusive conditions). Past perfect instance: for π1(SBJ_n : T, OBJ; TMP_t) : s_eff with DS.ASP = "Perf" and DS.TIME = "Past":
– sf1 = "have", with CLS = VAux, NUM = n, PER = p, TNS = DS.TIME, HEAD = "S", LEFT = sf1.HEAD, RIGHT = "aux-ppart" / "circ-time", and TMP with HEAD = "circ-time"
– vf1: FORM = "Partc", LEFT = ε, HEAD = "aux-ppart", RIGHT = "dobj"
– of1: the direct object
4.4 Main situation rule application
This rule creates the categories:
– [pred\S/aux-ppart/circ-time] for the auxiliary verb and
– [aux-ppart/dobj] for the participle.
So it defines the analysis: (dependency diagram of 'they have recently fired twenty people')
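The SEM→SYN direction can be illustrated with a drastically simplified, stateless linearizer: the real transducer Γ threads states ⟨DS, GS⟩ through its transitions, while this sketch only emits left dependents, then the head with a lexicon-assigned category, then right dependents. Tree shape, function name and lexicon are assumptions for illustration:

```python
def linearize(node, lexicon):
    """Toy top-down transduction from a dependency-plan tree to a typed
    string [(word, category), ...].  A node is (head, left_children,
    right_children); each child is itself such a node.  A drastic
    simplification of the TDG transitions (no transducer states)."""
    head, left, right = node
    out = []
    for child in left:
        out += linearize(child, lexicon)
    out.append((head, lexicon[head]))          # head with its CDG category
    for child in right:
        out += linearize(child, lexicon)
    return out

# Hypothetical lexicon echoing the categories created by the rule above.
lex = {'they': 'pred',
       'have': '[pred\\S/aux-ppart/circ-time]',
       'fired': '[aux-ppart/dobj]',
       'people': 'dobj'}
tree = ('have', [('they', [], [])],
                [('fired', [], [('people', [], [])])])
```

Running `linearize(tree, lex)` yields the typed string they:pred have:[pred\S/aux-ppart/circ-time] fired:[aux-ppart/dobj] people:dobj, which the CDG side of the TDG can then reduce to S.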
Conclusion
– Semantic underspecification serves semantic compositionality and efficiency
– Ideally, underspecified semantics expresses all and only the meaning components marked by surface-language means
– Challenge: how to capture the adequate underspecification level
– Underspecified semantics for compositional syntax grammars provides natural compositional Semantics → Grammar and Grammar → Semantics interfaces
– Both interfaces are to be coupled in a natural way with a logical reasoning formal system
References
[Bos95] Johan Bos. Predicate logic unplugged. In P. Dekker and M. Stokhof, editors, Proc. of the 10th Amsterdam Colloquium, pages 133–142, 1995.
[BP83] J. Barwise and J. Perry. Situations and Attitudes. MIT Press, Cambridge, MA, 1983.
[CFSP99] Ann Copestake, Dan Flickinger, Ivan Sag, and Carl Pollard. Minimal recursion semantics. An introduction. Draft. http://www-csli.stanford.edu/aac/newmrs.ps, 1999.
[Cro03] William Croft. Typology and Universals (2nd ed.). Cambridge University Press, Cambridge, 2003.
[Dik03] Alexander Dikovsky. Linguistic Meaning from the Language Acquisition Perspective. In Proc. of the 8th Intern. Conf. "Formal Grammar 2003" (FG 2003), pages 63–76, Vienna, Austria, August 2003.
[DS05] Alexander Dikovsky and Boris Smilga. Semantic Roles and Diatheses for Functional Discourse Plans. In Proc. of the 2nd International Conference "Meaning-Text Theory" (MTT 2005), pages 98–109, 2005.
[Fre79] Gottlob Frege. Begriffsschrift, eine der arithmetischen nachgebildete Formelsprache des reinen Denkens. Louis Nebert, Halle a.S., 1879.
[GK03] C. Gardent and L. Kallmeyer. Semantic construction in Feature-Based TAG. In Proceedings of EACL, Budapest, 2003.
[Kam81] H. Kamp. A theory of truth and semantic representation. In J. Groenendijk, T. Janssen, and M. Stokhof, editors, Formal Methods in the Study of Language. Foris, Dordrecht, 1981.
[KvGR] Hans Kamp, Josef van Genabith, and Uwe Reyle. Discourse Representation Theory. To appear in Handbook of Philosophical Logic.
[vEK97] J. van Eijck and H. Kamp. Representing Discourse in Context. In J. van Benthem and A. ter Meulen, editors, Handbook of Logic and Language, pages 179–237. North-Holland Elsevier / The MIT Press, Amsterdam, Cambridge, 1997.
Categorial minimalist grammars
Christian Retoré, Université Bordeaux 1
joint work with Alain Lecomte (U. Grenoble II) and Maxime Amblard (U. Bordeaux 1)
Università degli studi di Verona – Maggio 2006
Équipe Signes, INRIA-Futurs & LaBRI-C.N.R.S.
& dép. sciences du langage, Université Bordeaux 3 Michel de Montaigne
Contents
1. General remarks
2. Reminder on syntax and semantics in categorial grammars
3. Stabler minimalist grammars
4. Categorial Minimalist Grammars (à la Lambek)
5. Syntax/semantics
6. Results
7. In progress: CMG proof nets
8. Perspectives
1. General remarks
1.1. Syntax boundaries
• Inflectional morphology
– for: depending on the language, the same syntactic construction is realized with explicit words OR with inflection
– against: different techniques (finite state automata, transducers; although in Navajo...)
• Logical semantics (who does what)
– for: rather syntactic phenomena (logical syntax)
– against: different techniques (e.g. in the dependency approach, formal grammars ≠ dependency graphs)
• prosody (which is related to syntactic structure)
• lexical semantics (→ restricted selection)
• encyclopaedic knowledge (→ getting rid of some ambiguities)
1.2. Linguistic theories and their mathematical models
Theories: generative grammar, dependency grammar, others?
Mathematical models:
• context-free grammars, tree grammars, composition different from substitution and term rewriting (e.g. adjunction in TAGs)
• unification grammars
Algorithmic complexity of parsing
• Unification grammars DCG GPSG HPSG (undecidable parsing)
• Context sensitive unification grammars like LFG (decidable parsing)
• TAGs, Range Concatenation Grammars (polynomial)
Given a theory, is there a privileged model? (generative grammar → TAGs?) Given a model, is there an underlying theory? (cf. exegeses of HPSG by Pollard & Sag)
1.3. Modelling, Parsing, Generation
Parsing or generation?
• As far as analysis is concerned:
– word order does not matter much, sentences are grosso modo correct
– transformations and empty elements are a challenge
– what do we do with parse structures?
• As far as generation is concerned:
– word order is crucial
– transformations and empty elements are welcome
– out of what kind of object do we build (parse structures of) sentences?
1.4. Cognitive realism, empirical coverage
What do we model?
• corpora? a normed language (according to which norm?)
• examples representative of language faculty(internal language of X)
• pathological examples (like magma study for physics)
Problems
• linguistic resources
(annotated corpora, grammars)
• are the solutions to specific phenomena compatible?
• overgeneration (never addressed in mainstream NLP: corpora are a priori assumed to be correct)
1.5. Convergence towards generalized categorial grammars
(diagram: a syntactic Structure in MLL / linear λ-calculus maps both to Phonological Form (word order, ...) and to Logical Form (logical structure))
• Pollard 2004: Higher-Order Categorial Grammar
• De Groote 2001: Abstract Categorial Grammars
• Muskens 2003: Lambdas, Language and Logic
• Lecomte & Retoré 2001: Categorial Minimalist Grammars
• Perrier 2001: Interaction Grammars
1.6. Generative grammars
Usual criticisms:
• transformations are algorithmically intractable (analysis / generation)
• derivation → representation levels → conditions on each
• what is Logical Form?
Awards:
• links between languages (principles and parameters)
• transformations: links between related sentences (questions / answers)
• syntax and semantics (coreference, (generalized) quantifier scopes)
1.7. Outcome of Stabler’s formalisation of the minimalist program
• Good computability: polynomial, like LCFRS (= simple positive RCG)
• Derivational formalisation (generative-enumerative syntax)
and representational formalisation (model-theoretic syntax)
(cf. Pullum and Scholz 2001)
Mönnich, Morawietz and Michaelis (2001–2004): the set of parse trees of an MG = the image, by a binary relation definable in monadic second-order logic, of a set of regular trees definable in monadic second-order logic (hence analyzable with a pushdown-of-pushdowns automaton)
Very fine, but... it is quite difficult to write a lexicalized grammar: one must follow some linguistic theory.
2. Syntax and semantics in categorial grammars (reminder)
2.1. Syntactic categories
B = S, sn, n, ...
F ::= B | F \ F | F / F
if u : A and f : A \ B then uf : B (AB and Lambek)
if u : A and f : B / A then fu : B (AB and Lambek)
if u : A and uf : B then f : A \ B (Lambek only)
if u : A and fu : B then f : B / A (Lambek only)
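The two elimination rules support a direct CYK-style recognizer for the AB fragment (no Lambek introduction rules). The encoding below, with categories as nested tuples, is an illustrative assumption, not a standard library:

```python
def ab_reduce(cats):
    """CYK-style recognizer for AB categorial grammars (eliminations only).

    cats: list of category trees.  Atoms are strings ('S', 'np', 'n');
    ('\\', A, B) encodes A\\B (argument A on the left, result B) and
    ('/', B, A) encodes B/A (result B, argument A on the right).
    Returns the set of categories derivable for the whole string using
    only  A, A\\B => B  and  B/A, A => B."""
    n = len(cats)
    tab = {(i, i + 1): {cats[i]} for i in range(n)}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            k = i + span
            cell = set()
            for j in range(i + 1, k):
                for a in tab[(i, j)]:
                    for b in tab[(j, k)]:
                        if isinstance(b, tuple) and b[0] == '\\' and b[1] == a:
                            cell.add(b[2])        # a, a\c  =>  c
                        if isinstance(a, tuple) and a[0] == '/' and a[2] == b:
                            cell.add(a[1])        # c/b, b  =>  c
            tab[(i, k)] = cell
    return tab[(0, n)]
```

For instance, an np followed by an intransitive verb of category np\S reduces to S, and a determiner np/n, a noun n and the same verb also reduce to S.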
2.2. Semantic types
Church 1930, Curry 1940, Montague 1970
2.2.1. Logical formulae in simply typed λ-calculus with 2 basic types:
• individual e
• truth values t
• n-ary predicate : e → (e → (e → (· · · → t)))
• n-ary function : e → (e → (e → (· · · → e)))
• logical constants ∧,∨,⇒ : t → (t → t)∃,∀ : (e → t) → t
2.2.2. Syntactic categories and semantic types
S∗ = t (sentences: truth values / propositions)
sn∗ = e (individuals)
n∗ = e → t (unary predicates)
(A \ B)∗ = (B / A)∗ = A∗ → B∗ (propagation to every formula)
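The homomorphism (·)∗ is a one-line recursion. The sketch below is illustrative (it uses np for noun phrases, as in the lexicon that follows, and the same tuple encoding of categories assumed elsewhere in these notes):

```python
def sem_type(cat):
    """Map a syntactic category to its semantic type, Montague-style.

    Categories: 'S', 'np', 'n', or tuples ('\\', A, B) for A\\B and
    ('/', B, A) for B/A.  Types: 'e', 't', or ('->', a, b) for a -> b.
    Homomorphism: (A\\B)* = (B/A)* = A* -> B*."""
    base = {'S': 't', 'np': 'e', 'n': ('->', 'e', 't')}
    if isinstance(cat, str):
        return base[cat]
    op, x, y = cat
    if op == '\\':                       # cat = x\y: argument x, result y
        return ('->', sem_type(x), sem_type(y))
    return ('->', sem_type(y), sem_type(x))   # cat = x/y: argument y, result x
```

So np\S maps to e → t, and a transitive verb (np\S)/np maps to e → (e → t), as in the lexicon below.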
2.2.3. Lexicon: example
aimer (np\S)/np e → e → t λx.λy.aimer(y, x)
tout ((S/np)\S)/n (e → t) → (e → t) → t λP.λQ.∀x P(x) ⇒ Q(x)
enfant n e → t λx.enfant(x)
une (S/(np\S))/n (e → t) → (e → t) → t λP.λQ.∃x P(x) ∧ Q(x)
institutrice n e → t λx.instit(x)
2.2.4. Parsing example:
Two syntactic analysesfor two possible readings.
1. (tout enfant)(λy.(une institutrice)(λx.aimer(x, y)))
∀z (enfant(z) ⇒ ∃s (instit(s) ∧ aimer(z, s)))
2. (une institutrice)(λx.(tout enfant)(aimer x))
∃s (instit(s) ∧ ∀z (enfant(z) ⇒ aimer(z, s)))
(two syntactic trees: in the first, 'tout enfant' is the last functor applied and takes wide scope over 'une institutrice'; in the second, 'une institutrice' is applied last and takes wide scope over 'tout enfant')
2.2.5. Explanation
Why does it work?
• syntactic analysis = proof in the Lambek calculus
• forgetting directions ⊂ proof in MLL ⊂ intuitionistic logic
• type morphism → intuitionistic proof, lambda-term
• variables := lexical lambda-terms (of the same type)
• beta reduction → proof of S∗ = t, i.e. a proposition
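The whole pipeline (lexical λ-terms composed by application, then β-reduced to a proposition of type t) can be mimicked with Python closures standing in for λ-terms, formula strings as results, and function application as β-reduction. The helper names and the string encoding are assumptions for this sketch:

```python
# Illustrative mini-Montague semantics: the lexical lambda-terms of the
# table above rendered as Python closures building formula strings.
fresh = iter('xyzuvw')                 # supply of fresh variable names

def forall(P):                         # tout: λP.λQ.∀x (P(x) ⇒ Q(x))
    def q(Q):
        v = next(fresh)
        return f'∀{v}.({P(v)} ⇒ {Q(v)})'
    return q

def exists(P):                         # une: λP.λQ.∃x (P(x) ∧ Q(x))
    def q(Q):
        v = next(fresh)
        return f'∃{v}.({P(v)} ∧ {Q(v)})'
    return q

child = lambda x: f'enfant({x})'       # enfant: λx.enfant(x)
teacher = lambda x: f'instit({x})'     # institutrice: λx.instit(x)
loves = lambda o: lambda s: f'aimer({s},{o})'   # aimer: λx.λy.aimer(y, x)

# reading 1: the universal quantifier is applied last (wide scope)
r1 = forall(child)(lambda s: exists(teacher)(lambda o: loves(o)(s)))
# reading 2: the existential quantifier is applied last (wide scope)
r2 = exists(teacher)(lambda o: forall(child)(lambda s: loves(o)(s)))
```

Evaluating r1 gives '∀x.(enfant(x) ⇒ ∃y.(instit(y) ∧ aimer(x,y)))' and r2 gives '∃z.(instit(z) ∧ ∀u.(enfant(u) ⇒ aimer(u,z)))': the two readings, obtained purely by the order of application.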
2.2.6. Criticisms
• Too restricted a syntactic formalism:
discontinuous constituents: ne...pas
medial extraction: Le livre que_i [tu lis (t_i) ces jours-ci] est Samarcande
• some analyses do not have a semantic counterpart (type raising is mandatory)
e.g. Joan: (e → t) → t and not e, because of
Joan et tous les invités sont partis. ('Joan and all the guests left')
an analysis with Joan: sn has no semantic counterpart
• the syntactic category of a quantifier depends on its syntactic position:
'tout' has one syntactic type for subject position, another for object position, etc.
3. Stabler minimalist grammars
3.1. Overview
• based on the minimalist program
• lexicalised grammars
• generative capacity : MC-TAG / MCFG
• polynomial parsing
• principles and parameters approach to language variation
• relations between related sentences (by movement and transformations):
? questions
(1) Combien de livres que Tabucchi a écrits aime-t-il ?
(2) Il aime trois livres que Tabucchi a écrits.
? passive
(3) Ce livre a été écrit par Pavese.
(4) Pavese a écrit ce livre.
• raises some important syntactic/semantic questions:
possible or impossible coreference
il=Tabucchi (1) possible (2) impossible
3.2. Analysis structures
• binary trees
• leaves: lists of features
• internal nodes: "<" or ">", pointing towards the head
• maximal projection of h: the largest subtree whose head is h
(tree diagram: nested "<" / ">" nodes leading down to the head h)
3.3. Lexicon
Features
• base: d, n, v, ...
• selectors: =d for each d in base
• licensees: -case, -wh
• licensors: +case, +CASE, +wh, +WH
Lexicon: list of features /word/ (word)
3.4. Generative rules
MERGE: t1 (bearing the selector) and t2 combine into one tree; the new node is "<" (argument to the right) if t1 ∈ Lex, and ">" (argument to the left) otherwise.
MOVE: a head bearing the licensor +f attracts the maximal projection bearing the matching licensee -f into its specifier (if +f: nothing is pronounced in the moved copy's original position; if +F: only the PF moves).
3.5. Lexicon example
aimer =d +case =d v
une =n d -case
institutrice n
tout =n d -case
enfant n
infl =v +case t
comp =v c
Example to be worked out on the board
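The feature checking performed by MERGE can be made concrete with a string-level toy, a deliberately simplified sketch and not Stabler's full definition (no trees, MOVE ignored, so word order is the pre-movement one):

```python
def merge(t1, t2, t1_lexical):
    """Toy string-level MERGE.  An expression is (features, string).
    t1's first feature must be a selector '=x' matching t2's category 'x'.
    A lexical selector takes its argument to the right ('<' in the
    slides), a derived one takes it to the left ('>').  t2's remaining
    features (licensees such as -case) would drive MOVE and are simply
    dropped in this sketch."""
    (f1, s1), (f2, s2) = t1, t2
    assert f1[0] == '=' + f2[0], 'feature mismatch'
    s = f'{s1} {s2}' if t1_lexical else f'{s2} {s1}'
    return (f1[1:], s)
```

With the lexicon above, merging tout (=n d -case) with enfant (n) checks =n against n and yields the DP 'tout enfant' with remaining features d -case.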
4. Categorial Minimalist Grammars
Only elimination rules:
b/a, a ⇒ b and a, a\b ⇒ b
(AB or Lambek grammars) plus a commutative product:
Merge uses the directional eliminations; Move uses the commutative product.
[Partially commutative linear logic, de Groote, 1996]
4.1. Some differences:
• internal subject hypothesis
(as in Radford 97 and some other minimalist papers)
• commutative product
sets of features instead of lists of features
4.2. Example of a categorial minimalist lexicon
aimer (d\k\v)/d =d +case =d v
une (k × d)/n =n d -case
institutrice n n
tout (k × d)/n =n d -case
enfant n n
infl (k\v)/v =v +case t
comp v/c =v c
5. Syntax/semantics
5.1. Logical system for semantics
As usual:
• logical formulae as λ-terms, base types e and t à la Montague.
BUT moreover:
• λ-terms with explicit contexts: a list of free variables.
(derivation-tree diagram: a proof of type t whose nodes carry sequents with explicit contexts, such as e ⊢ t, e ⊢ e → t and e, e ⊢ t, down to leaves like e ⊢ e and e → e → t)
5.2. Semantic rules
• application: [→]
• abstraction in the tree hosting the move:
Γ, z : Z ⊢ u : U  ⟹  Γ ⊢ (λz.u) : Z → U   [EXTRACT]
• application, for type raising:
∆ ⊢ z : (T → U) → V and Γ ∪ {x : T} ⊢ u : U  ⟹  ∆ ∪ Γ ⊢ z(λx.u) : V   [RAISE]
• application, without type raising:
∆ ⊢ z : T and Γ, x : T ⊢ u : U  ⟹  ∆ ∪ Γ ⊢ (λx.u)z : U   [NORAISE]
5.3. Syntax/semantics
SYN, the syntactic calculus:
• connectives: ×, /, \
• only elimination rules (encoding move and merge)
SEM, the semantic calculus:
• connective: →
• semantic rules (derived rules)
Parallelism SYN ∥ SEM (syntax / semantics):
merge: [/], [\] / [→]
move / [EXTRACT]
projection / [RAISE] or [NORAISE]
• every leaf in SEM has a coindexed part in SYN
• each step and its counterpart are executed in the same order in their respective derivations
5.4. Sample lexicon
aimer =d +case =d v : (k\d\v)/d ⊢ λx.λy.aimer(y, x)
une =n d -case : (k × d)/n ⊢ λP.λQ.∃x P(x) ∧ Q(x)
institutrice n : n ⊢ λx.instit(x)
tout =n d -case : (k × d)/n ⊢ λP.λQ.∀x P(x) ⇒ Q(x)
enfant n : n ⊢ λx.enf(x)
infl =v +case t : (t/k)/v ⊢ λP.pass_comp(P)
comp =v c : v/c ⊢ λP.P
6. Results
• syntax/semantics correspondence
extended to a richer syntactic system.
• a single syntactic category for a quantifier,
whatever might be its syntactic position.
• understanding movement:
– in the structure which hosts the moved constituent: λ-abstraction
– for the moved constituent: type raising
7. In progress: CMG proof nets
Minimalist grammars without movement: bounded pushdown for partial structures, insertion only when there will be no further movement.
Word order can be reconstructed without distinct \ and /:
• first application of a lexical function: argument after the function (lexical merge)
• otherwise: argument before the function (non-lexical merge)
Proof nets (graphs):
• an equivalent formalism
• better for the product (complicated normal forms)
• avoid co-indexation of hypotheses to be cancelled simultaneously
• better algorithms for constructing analyses (e.g. minimizing axiom length, Moot 2004)
• formulae → trees taking into account the order of the operations
8. Perspectives
• possible or impossible coreference for anaphora resolution:
incremental calculus of binding principles and small clauses,
or of Reinhart/Reuland semantic binding (Bonato)
(1) Carlotta's dog thinks that he hates him.
(2) * Il_i aime trois livres que Tabucchi_i a écrits.
• semantics of questions (Maxime Amblard)
(3) Quel train Pierre prend ?
(4) Quel train prend Pierre ? (more difficult)
• clitics, clitic climbing with correct control interpretation (in progress, Amblard)
(similar to Moot/Retoré 2005 for multimodal categorial grammars, or to Stabler 2001)
(5) Je répare ma voiture.
(6) Je la répare.
(7) Je sais la réparer.
(8) Je la fais réparer. ('la' is being repaired)
(9) Je te permets de venir. ('te' comes)
(10) Je te promets de venir. ('je' comes)
Extending Montague semantics to a richer syntax.