
Comparison of Four Control Methods for a Five-Choice Assistive Technology

Overview
Specialty Neurology
Date 2018 Jun 22
PMID 29928196
Citations 4
Abstract

Severe motor impairments can affect the ability to communicate. The ability to see has a decisive influence on the augmentative and alternative communication (AAC) systems available to the user. To better understand the initial impressions users have of AAC systems, we asked naïve healthy participants to compare two visual systems (a visual P300 brain-computer interface (BCI) and an eye-tracker) and two non-visual systems (an auditory and a tactile P300 BCI). Eleven healthy participants performed 20 selections in a five-choice task with each system. The visual P300 BCI used face stimuli, the auditory P300 BCI used Japanese Hiragana syllables, and the tactile P300 BCI used stimulators on the left little finger, left middle finger, right thumb, right middle finger, and right little finger. The eye-tracker required a dwell time of 3 s on the target for selection. We calculated accuracies and information-transfer rates (ITRs) for each control method using the selection time that yielded the highest ITR and an accuracy above 70% for each system. Accuracies of 88% were achieved with the visual P300 BCI (4.8 s selection time, 20.9 bits/min), 70% with the auditory BCI (19.9 s, 3.3 bits/min), 71% with the tactile BCI (18 s, 3.4 bits/min), and 100% with the eye-tracker (5.1 s, 28.2 bits/min). Performance with the eye-tracker and the visual BCI correlated strongly; the correlation between tactile and auditory BCI performance was lower. Our data showed no advantage for either non-visual system in terms of ITR, but the lower correlation of performance suggests that choosing the system which suits a particular user matters more for non-visual systems than for visual systems.
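The abstract does not state how the ITRs were computed; assuming the standard Wolpaw definition common in the BCI literature, a minimal sketch is below. Note that plugging in the rounded group means from the abstract reproduces the published figures only approximately (e.g. ~27.3 vs. the reported 28.2 bits/min for the eye-tracker), presumably because the paper averaged per-participant ITRs rather than computing one ITR from the group-mean accuracy and selection time.

```python
import math

def wolpaw_itr(n_choices: int, accuracy: float, selection_time_s: float) -> float:
    """Wolpaw information-transfer rate in bits/min for an n-choice task.

    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    n, p = n_choices, accuracy
    bits = math.log2(n)
    if 0.0 < p < 1.0:  # entropy terms vanish at p == 1 (0*log 0 -> 0 by convention)
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / selection_time_s

# Rounded group means from the abstract (five-choice task):
print(f"eye-tracker: {wolpaw_itr(5, 1.00, 5.1):.1f} bits/min")  # ~27.3
print(f"visual P300: {wolpaw_itr(5, 0.88, 4.8):.1f} bits/min")  # ~19.4
```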

Citing Articles

EEG-based vibrotactile evoked brain-computer interfaces system: A systematic review.

Huang X, Liang S, Li Z, Lai C, Choi K. PLoS One. 2022; 17(6):e0269001.

PMID: 35657949 PMC: 9165854. DOI: 10.1371/journal.pone.0269001.


Wheelchair Control in a Virtual Environment by Healthy Participants Using a P300-BCI Based on Tactile Stimulation: Training Effects and Usability.

Eidel M, Kübler A. Front Hum Neurosci. 2020; 14:265.

PMID: 32754019 PMC: 7366506. DOI: 10.3389/fnhum.2020.00265.


Stimulus modality influences session-to-session transfer of training effects in auditory and tactile streaming-based P300 brain-computer interfaces.

Ziebell P, Stümpfig J, Eidel M, Kleih S, Kübler A, Latoschik M. Sci Rep. 2020; 10(1):11873.

PMID: 32681134 PMC: 7368044. DOI: 10.1038/s41598-020-67887-6.


Neural mechanisms of training an auditory event-related potential task in a brain-computer interface context.

Halder S, Leinfelder T, Schulz S, Kübler A. Hum Brain Mapp. 2019; 40(8):2399-2412.

PMID: 30693612 PMC: 6865430. DOI: 10.1002/hbm.24531.
