
Neural Tracking to Go: Auditory Attention Decoding and Saliency Detection with Mobile EEG

Overview
Journal J Neural Eng
Date 2021 Dec 13
PMID 34902846
Citations 8
Abstract

Neuro-steered assistive technologies have been suggested to offer a major advancement in future devices like neuro-steered hearing aids. Auditory attention decoding (AAD) methods would in that case allow for identification of an attended speaker within complex auditory environments, exclusively from neural data. Decoding the attended speaker using neural information has so far only been done in controlled laboratory settings. Yet, it is known that ever-present factors like distraction and movement are reflected in the neural signal parameters related to attention. Thus, in the current study we applied a two-competing-speaker paradigm to investigate the performance of a commonly applied electroencephalography-based AAD model outside of the laboratory during leisurely walking and distraction. Unique environmental sounds were added to the auditory scene and served as distractor events. The current study shows, for the first time, that the attended speaker can be accurately decoded during natural movement. At a temporal resolution as short as 5 s and without artifact attenuation, decoding was significantly above chance level. Further, as hypothesized, we found a decrease in attention to both the to-be-attended and the to-be-ignored speech streams after the occurrence of a salient event. Additionally, we demonstrate that it is possible to predict neural correlates of distraction with a computational model of auditory saliency based on acoustic features. Taken together, our study shows that auditory attention tracking outside of the laboratory, in ecologically valid conditions, is feasible and a step towards the development of future neuro-steered hearing aids.
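The EEG-based AAD models referenced in the abstract are commonly linear backward (stimulus-reconstruction) models: a regularized linear decoder reconstructs the speech envelope from time-lagged EEG, and the attended speaker is the one whose envelope correlates best with the reconstruction over a short decision window (here as short as 5 s). The sketch below illustrates that general scheme with synthetic data; it is not the authors' exact pipeline, and all function names, the lag range, and the ridge parameter are illustrative assumptions.

```python
import numpy as np

def lagged(eeg, lags):
    """Build a design matrix of time-lagged EEG channels (samples x (channels*lags))."""
    T, C = eeg.shape
    X = np.zeros((T, C * len(lags)))
    for i, lag in enumerate(lags):
        shifted = np.roll(eeg, lag, axis=0)
        shifted[:lag] = 0  # zero out samples that wrapped around
        X[:, i * C:(i + 1) * C] = shifted
    return X

def train_decoder(eeg, envelope, lags, reg=1e3):
    """Ridge regression decoder: w = (X'X + reg*I)^-1 X'y (reg value is illustrative)."""
    X = lagged(eeg, lags)
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ envelope)

def decode_attention(eeg, env_a, env_b, w, lags):
    """Reconstruct the envelope from EEG and pick the speaker with the higher correlation."""
    rec = lagged(eeg, lags) @ w
    r_a = np.corrcoef(rec, env_a)[0, 1]
    r_b = np.corrcoef(rec, env_b)[0, 1]
    return 0 if r_a > r_b else 1  # 0 = speaker A attended, 1 = speaker B
```

In a real application the decoder is trained on labeled single-speaker or cued two-speaker data, and `decode_attention` is applied to consecutive short windows of the ongoing EEG to track attention over time.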

Citing Articles

Covert variations of a musician's loudness during collective improvisation capture other musicians' attention and impact their interactions.

Schwarz A, Faraco A, Vincent C, Susini P, Ponsot E, Canonne C Proc Biol Sci. 2025; 292(2039):20242623.

PMID: 39837509 PMC: 11750383. DOI: 10.1098/rspb.2024.2623.


hvEEGNet: a novel deep learning model for high-fidelity EEG reconstruction.

Cisotto G, Zancanaro A, Zoppis I, Manzoni S Front Neuroinform. 2025; 18:1459970.

PMID: 39759760 PMC: 11695360. DOI: 10.3389/fninf.2024.1459970.


Preprocessing choices for P3 analyses with mobile EEG: A systematic literature review and interactive exploration.

Jacobsen N, Kristanto D, Welp S, Inceler Y, Debener S Psychophysiology. 2024; 62(1):e14743.

PMID: 39697161 PMC: 11656290. DOI: 10.1111/psyp.14743.


Neural speech tracking and auditory attention decoding in everyday life.

Straetmans L, Adiloglu K, Debener S Front Hum Neurosci. 2024; 18:1483024.

PMID: 39606787 PMC: 11599177. DOI: 10.3389/fnhum.2024.1483024.


Using mobile EEG to study auditory work strain during simulated surgical procedures.

Rosenkranz M, Haupt T, Jaeger M, Uslar V, Bleichner M Sci Rep. 2024; 14(1):24026.

PMID: 39402073 PMC: 11473642. DOI: 10.1038/s41598-024-74946-9.