Attention to Facial Regions in Segmental and Prosodic Visual Speech Perception Tasks

Overview
Date 1999 Jul 3
PMID 10391620
Citations 32
Abstract

Two experiments were conducted to test the hypothesis that visual information related to segmental versus prosodic aspects of speech is distributed differently on the face of the talker. In the first experiment, eye gaze was monitored for 12 observers with normal hearing while they made decisions about segmental and prosodic categories for utterances presented without sound. Observers spent more time looking at, and directed more gazes toward, the upper part of the talker's face when making decisions about intonation patterns than about the words being spoken. The second experiment tested the Gaze Direction Assumption underlying Experiment 1: that people direct their gaze to the stimulus region containing the information required for their task. In this experiment, 18 observers with normal hearing made decisions about segmental and prosodic categories under conditions in which face motion was restricted to selected areas of the face. The results indicate that information in the upper part of the talker's face is more critical for intonation pattern decisions than for decisions about word segments or primary sentence stress, thus supporting the Gaze Direction Assumption. Proficiency in visual speech perception requires learning where to direct visual attention for cues related to different aspects of speech.

Citing Articles

Audiovisual perception of interrupted speech by nonnative listeners.

Yang J, Nagaraj N, Magimairaj B. Atten Percept Psychophys. 2024; 86(5):1763-1776.

PMID: 38886302. DOI: 10.3758/s13414-024-02909-3


Visual scanning patterns of a talking face when evaluating phonetic information in a native and non-native language.

Deng X, McClay E, Jastrzebski E, Wang Y, Yeung H. PLoS One. 2024; 19(5):e0304150.

PMID: 38805447. PMC: 11132507. DOI: 10.1371/journal.pone.0304150


Primacy of mouth over eyes to perceive audiovisual Mandarin lexical tones.

Zeng B, Yu G, Hasshim N, Hong S. J Eye Mov Res. 2024; 16(4).

PMID: 38585238. PMC: 10997307. DOI: 10.16910/jemr.16.4.4


Word Learning in Deaf Adults Who Use Cochlear Implants: The Role of Talker Variability and Attention to the Mouth.

Hartman J, Saffran J, Litovsky R. Ear Hear. 2023; 45(2):337-350.

PMID: 37695563. PMC: 10920394. DOI: 10.1097/AUD.0000000000001432


Stable eye versus mouth preference in a live speech-processing task.

Viktorsson C, Valtakari N, Falck-Ytter T, Hooge I, Rudling M, Hessels R. Sci Rep. 2023; 13(1):12878.

PMID: 37553414. PMC: 10409748. DOI: 10.1038/s41598-023-40017-8