Rational Integration of Noisy Evidence and Prior Semantic Expectations in Sentence Interpretation
Sentence-processing theories typically assume that the input to our language-processing mechanisms is an error-free sequence of words. This assumption is an oversimplification, however, because noise is present in typical language use (for instance, due to a noisy environment, producer errors, or perceiver errors). A complete theory of human sentence comprehension therefore needs to explain how humans understand language given imperfect input. Indeed, like many cognitive systems, language-processing mechanisms may even be "well designed" for the task of recovering intended meaning from noisy utterances. In particular, comprehension mechanisms may be sensitive to the same kinds of information that an idealized statistical comprehender would exploit. Here, we evaluate four predictions about such a rational (Bayesian) noisy-channel language comprehender in a sentence-comprehension task: (i) semantic cues should pull sentence interpretation toward plausible meanings, especially when the wording of the more plausible meaning is only a few edits away from the observed utterance; (ii) this process should treat insertions and deletions asymmetrically, because of the Bayesian "size principle"; (iii) such nonliteral interpretation of sentences should increase with the perceived noise rate of the communicative situation; and (iv) it should decrease when semantically anomalous meanings are more likely to be communicated. These predictions are borne out, strongly suggesting that human language comprehension relies on rational statistical inference over a noisy channel.
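The noisy-channel account can be made concrete as Bayes' rule over candidate intended sentences: P(intended | perceived) ∝ P(intended) × P(perceived | intended). The sketch below is a minimal illustration under a toy noise model that allows only whole-word insertions and deletions; the vocabulary size, noise rate, and prior plausibilities are hypothetical values chosen for illustration, not the model or parameters from the paper. The size principle enters as an asymmetry: positing that noise inserted a *specific* word spreads probability over the whole vocabulary, so any particular insertion is far less likely than a deletion.

```python
import math

VOCAB_SIZE = 10_000  # hypothetical vocabulary size (illustrative)
P_EDIT = 0.01        # hypothetical per-word noise rate (illustrative)

# Size principle: a deletion requires no choice among alternatives, but an
# insertion must pick one specific word out of the vocabulary, so each
# particular inserted word carries 1/VOCAB_SIZE of the insertion mass.
LOG_P_DEL = math.log(P_EDIT)
LOG_P_INS = math.log(P_EDIT / VOCAB_SIZE)
LOG_P_KEEP = math.log(1 - P_EDIT)

def log_likelihood(intended, perceived):
    """Log P(perceived | intended) under a toy noise model permitting only
    whole-word deletions and insertions (best single alignment)."""
    a, b = intended.split(), perceived.split()
    n, m = len(a), len(b)
    NEG = float("-inf")
    dp = [[NEG] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i < n:  # word a[i] lost in transmission (deletion)
                dp[i + 1][j] = max(dp[i + 1][j], dp[i][j] + LOG_P_DEL)
            if j < m:  # word b[j] added by noise (insertion)
                dp[i][j + 1] = max(dp[i][j + 1], dp[i][j] + LOG_P_INS)
            if i < n and j < m and a[i] == b[j]:  # transmitted intact
                dp[i + 1][j + 1] = max(dp[i + 1][j + 1], dp[i][j] + LOG_P_KEEP)
    return dp[n][m]

def posterior(candidates, perceived):
    """candidates: {sentence: log prior plausibility}.
    Returns the normalized posterior P(intended | perceived)."""
    scores = {s: lp + log_likelihood(s, perceived)
              for s, lp in candidates.items()}
    z = max(scores.values())
    unnorm = {s: math.exp(v - z) for s, v in scores.items()}
    total = sum(unnorm.values())
    return {s: v / total for s, v in unnorm.items()}

# An implausible perceived utterance vs. a plausible near neighbor that is
# one posited deletion (a lost "to") away from it:
perceived = "the mother gave the candle the daughter"
candidates = {
    "the mother gave the candle the daughter": math.log(0.001),     # literal, implausible
    "the mother gave the candle to the daughter": math.log(0.999),  # nonliteral, plausible
}
post = posterior(candidates, perceived)
```

With these illustrative numbers, the plausible nonliteral reading wins the posterior even though the literal candidate matches the perceived string exactly: its strong semantic prior outweighs the small cost of positing one deleted word, exactly the trade-off described in prediction (i).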