Abstract
We investigate the role of information-theoretic measures for compound word reading in two languages: Mandarin Chinese and English. For each language, we report the results of two analyses: a time-to-event analysis using piece-wise additive mixed models (PAMMs) and a causal inference analysis with causal additive models (CAMs). We use the PAMM analyses to gain insight into the temporal profile of the effects of information-theoretic measures in the word naming task. For both English and Mandarin Chinese, we report early effects of the entropy of both constituents, as well as temporally widespread effects of pointwise mutual information (PMI). The CAM analyses provide further insight into the relations between lexical-distributional variables. The image that emerges from the CAM analyses is that the information-theoretic measures entropy and PMI are embedded in a carefully balanced system in which lexical-distributional properties that lead to processing difficulties are offset by lexical-distributional properties that guarantee successful communication. The information-theoretic measures occupy a central position in this system, and are causally influenced not only by frequency, but also by other lower-level lexical-distributional variables such as visual complexity and phonology-to-orthography consistency.
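For reference, the two information-theoretic measures named above have the following textbook definitions. The abstract does not specify the paper's exact operationalization (for instance, which probability distribution the entropy is computed over), so the forms below are a standard sketch: entropy over a constituent's distribution across the compounds in which it occurs, and PMI over the joint occurrence of the two constituents.

```latex
% Shannon entropy of a constituent's distribution over its compound family,
% where p_i is the relative frequency of the i-th compound in that family
H = -\sum_{i=1}^{n} p_i \log_2 p_i

% Pointwise mutual information of the two constituents c_1 and c_2 of a compound
\mathrm{PMI}(c_1, c_2) = \log_2 \frac{p(c_1, c_2)}{p(c_1)\, p(c_2)}
```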
| Field | Value |
|---|---|
| Original language | English |
| Article number | 104389 |
| Journal | Cognition |
| Volume | 205 |
| DOIs | |
| Publication status | Published - 2020 |
| Externally published | Yes |
Keywords
- CAM
- Compounds
- Entropy
- Information theory
- Mandarin Chinese
- Mutual information
- PAMM