How distractor objects trigger referential overspecification

Testing the effects of visual clutter and distractor distance

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

In two experiments, we investigate to what extent various visual saliency cues in realistic visual scenes cause speakers to overspecify their definite object descriptions with a redundant color attribute. The results of the first experiment demonstrate that speakers are more likely to redundantly mention color when visual clutter is present in a scene as compared to when this is not the case. In the second experiment, we found that distractor type and distractor color affect redundant color use: Speakers are most likely to overspecify if there is at least one distractor object present that has the same type, but a different color than the target referent. Reliable effects of distractor distance were not found. Taken together, our results suggest that certain visual saliency cues guide speakers in determining which objects in a visual scene are relevant distractors, and which not. We argue that this is problematic for algorithms that aim to generate human-like descriptions of objects (such as the Incremental Algorithm), since these generally select properties that help to distinguish a target from all objects that are present in a scene.
Original language: English
Pages (from-to): 1617-1647
Number of pages: 31
Journal: Cognitive Science
Volume: 40
Issue number: 7
DOI: 10.1111/cogs.12297
Publication status: Published - 2016


Keywords

  • Definite reference
  • Overspecification
  • Visual clutter
  • Distractor distance
  • Computational models

Cite this

@article{6c8041504cdf4dfdbf13a3e2422ea9af,
title = "How distractor objects trigger referential overspecification: Testing the effects of visual clutter and distractor distance",
abstract = "In two experiments, we investigate to what extent various visual saliency cues in realistic visual scenes cause speakers to overspecify their definite object descriptions with a redundant color attribute. The results of the first experiment demonstrate that speakers are more likely to redundantly mention color when visual clutter is present in a scene as compared to when this is not the case. In the second experiment, we found that distractor type and distractor color affect redundant color use: Speakers are most likely to overspecify if there is at least one distractor object present that has the same type, but a different color than the target referent. Reliable effects of distractor distance were not found. Taken together, our results suggest that certain visual saliency cues guide speakers in determining which objects in a visual scene are relevant distractors, and which not. We argue that this is problematic for algorithms that aim to generate human-like descriptions of objects (such as the Incremental Algorithm), since these generally select properties that help to distinguish a target from all objects that are present in a scene.",
keywords = "Definite reference, Overspecification, Visual clutter, Distractor distance, Computational models",
author = "Ruud Koolen and Emiel Krahmer and Marc Swerts",
year = "2016",
doi = "10.1111/cogs.12297",
language = "English",
volume = "40",
pages = "1617--1647",
journal = "Cognitive Science",
issn = "0364-0213",
publisher = "Wiley",
number = "7",
}

How distractor objects trigger referential overspecification: Testing the effects of visual clutter and distractor distance. / Koolen, Ruud; Krahmer, Emiel; Swerts, Marc.

In: Cognitive Science, Vol. 40, No. 7, 2016, p. 1617-1647.


TY - JOUR

T1 - How distractor objects trigger referential overspecification

T2 - Testing the effects of visual clutter and distractor distance

AU - Koolen, Ruud

AU - Krahmer, Emiel

AU - Swerts, Marc

PY - 2016

Y1 - 2016

N2 - In two experiments, we investigate to what extent various visual saliency cues in realistic visual scenes cause speakers to overspecify their definite object descriptions with a redundant color attribute. The results of the first experiment demonstrate that speakers are more likely to redundantly mention color when visual clutter is present in a scene as compared to when this is not the case. In the second experiment, we found that distractor type and distractor color affect redundant color use: Speakers are most likely to overspecify if there is at least one distractor object present that has the same type, but a different color than the target referent. Reliable effects of distractor distance were not found. Taken together, our results suggest that certain visual saliency cues guide speakers in determining which objects in a visual scene are relevant distractors, and which not. We argue that this is problematic for algorithms that aim to generate human-like descriptions of objects (such as the Incremental Algorithm), since these generally select properties that help to distinguish a target from all objects that are present in a scene.

AB - In two experiments, we investigate to what extent various visual saliency cues in realistic visual scenes cause speakers to overspecify their definite object descriptions with a redundant color attribute. The results of the first experiment demonstrate that speakers are more likely to redundantly mention color when visual clutter is present in a scene as compared to when this is not the case. In the second experiment, we found that distractor type and distractor color affect redundant color use: Speakers are most likely to overspecify if there is at least one distractor object present that has the same type, but a different color than the target referent. Reliable effects of distractor distance were not found. Taken together, our results suggest that certain visual saliency cues guide speakers in determining which objects in a visual scene are relevant distractors, and which not. We argue that this is problematic for algorithms that aim to generate human-like descriptions of objects (such as the Incremental Algorithm), since these generally select properties that help to distinguish a target from all objects that are present in a scene.

KW - Definite reference

KW - Overspecification

KW - Visual clutter

KW - Distractor distance

KW - Computational models

U2 - 10.1111/cogs.12297

DO - 10.1111/cogs.12297

M3 - Article

VL - 40

SP - 1617

EP - 1647

JO - Cognitive Science

JF - Cognitive Science

SN - 0364-0213

IS - 7

ER -