Static and dynamic vector semantics for lambda calculus models of natural language

Mehrnoosh Sadrzadeh, Reinhard Muskens

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, compositional properties of words and how they compose to form sentences. In the truth conditional approach, the denotation of a sentence determines its truth conditions, which can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In the vector models, the degree of co-occurrence of words in context determines how similar the meanings of words are. In this paper, we put these two models together and develop a vector semantics for language based on the simply typed lambda calculus models of natural language. We provide two types of vector semantics: a static one that uses techniques familiar from the truth conditional tradition and a dynamic one based on a form of dynamic interpretation inspired by Heim’s context change potentials. We show how the dynamic model can be applied to entailment between a corpus and a sentence and provide examples.
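As a companion to the abstract's description of vector models, the sketch below is a minimal, hypothetical illustration (not taken from the paper) of how co-occurrence counts in a toy corpus induce word vectors, and how the degree of overlap between those distributions yields a similarity score via cosine. The corpus, window size, and function names are illustrative assumptions.

```python
from collections import Counter
from math import sqrt

# Toy corpus and window size are illustrative choices, not from the paper.
corpus = "dogs chase cats cats chase mice dogs eat food cats eat food".split()

def cooccurrence_vector(target, tokens, window=2):
    """Count context words within +/- window of each occurrence of target."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[tokens[j]] += 1
    return counts

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in set(u) | set(v))
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

dogs, cats, food = (cooccurrence_vector(w, corpus) for w in ("dogs", "cats", "food"))
# "dogs" and "cats" occur in more similar contexts than "dogs" and "food",
# so their co-occurrence vectors have a higher cosine.
print(cosine(dogs, cats))
print(cosine(dogs, food))
```

On this toy data the distributional hypothesis shows up directly: words sharing contexts (here, both being subjects of "chase" and "eat") end up with nearby vectors, which is the contextual side of meaning the paper combines with truth-conditional composition.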
Original language: English
Pages (from-to): 319-351
Number of pages: 33
Journal: Journal of Language Modelling
Volume: 6
Issue number: 2
DOI: 10.15398/jlm.v6i2.228
Publication status: Published - 2019

Keywords

  • Vector Semantics
  • truth conditional meaning
  • lambda calculus
  • dynamic logic
  • context potential update

Cite this

@article{a790546c86b145baa7a3b1f77d16fa28,
title = "Static and dynamic vector semantics for lambda calculus models of natural language",
abstract = "Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, compositional properties of words and how they compose to form sentences. In the truth conditional approach, the denotation of a sentence determines its truth conditions, which can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In the vector models, the degree of co-occurrence of words in context determines how similar the meanings of words are. In this paper, we put these two models together and develop a vector semantics for language based on the simply typed lambda calculus models of natural language. We provide two types of vector semantics: a static one that uses techniques familiar from the truth conditional tradition and a dynamic one based on a form of dynamic interpretation inspired by Heim’s context change potentials. We show how the dynamic model can be applied to entailment between a corpus and a sentence and provide examples.",
keywords = "Vector Semantics, truth conditional meaning, lambda calculus, dynamic logic, context potential update",
author = "Mehrnoosh Sadrzadeh and Reinhard Muskens",
year = "2019",
doi = "10.15398/jlm.v6i2.228",
language = "English",
volume = "6",
pages = "319--351",
journal = "Journal of Language Modelling",
issn = "2299-8470",
number = "2",

}

Static and dynamic vector semantics for lambda calculus models of natural language. / Sadrzadeh, Mehrnoosh; Muskens, Reinhard.

In: Journal of Language Modelling, Vol. 6, No. 2, 2019, p. 319-351.
TY - JOUR

T1 - Static and dynamic vector semantics for lambda calculus models of natural language

AU - Sadrzadeh, Mehrnoosh

AU - Muskens, Reinhard

PY - 2019

Y1 - 2019

N2 - Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, compositional properties of words and how they compose to form sentences. In the truth conditional approach, the denotation of a sentence determines its truth conditions, which can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In the vector models, the degree of co-occurrence of words in context determines how similar the meanings of words are. In this paper, we put these two models together and develop a vector semantics for language based on the simply typed lambda calculus models of natural language. We provide two types of vector semantics: a static one that uses techniques familiar from the truth conditional tradition and a dynamic one based on a form of dynamic interpretation inspired by Heim’s context change potentials. We show how the dynamic model can be applied to entailment between a corpus and a sentence and provide examples.

AB - Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, compositional properties of words and how they compose to form sentences. In the truth conditional approach, the denotation of a sentence determines its truth conditions, which can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In the vector models, the degree of co-occurrence of words in context determines how similar the meanings of words are. In this paper, we put these two models together and develop a vector semantics for language based on the simply typed lambda calculus models of natural language. We provide two types of vector semantics: a static one that uses techniques familiar from the truth conditional tradition and a dynamic one based on a form of dynamic interpretation inspired by Heim’s context change potentials. We show how the dynamic model can be applied to entailment between a corpus and a sentence and provide examples.

KW - Vector Semantics

KW - truth conditional meaning

KW - lambda calculus

KW - dynamic logic

KW - context potential update

U2 - 10.15398/jlm.v6i2.228

DO - 10.15398/jlm.v6i2.228

M3 - Article

VL - 6

SP - 319

EP - 351

JO - Journal of Language Modelling

JF - Journal of Language Modelling

SN - 2299-8470

IS - 2

ER -