
Perceived Mental Workload Classification Using Intermediate Fusion Multimodal Deep Learning

Overview
Specialty: Neurology
Date: 2021 Jan 28
PMID: 33505259
Citations: 4
Authors: Dolmans T, Poel M, van 't Klooster J, Veldkamp B
Abstract

Considerable research has been done on the detection of mental workload (MWL) using various bio-signals, and deep learning has recently enabled novel methods and results. A wide range of measurement modalities have proven valuable for this task, yet current studies often use only a single modality to classify MWL. The goal of this research was to classify perceived mental workload (PMWL) using a deep neural network (DNN) that flexibly makes use of multiple modalities, allowing features to be shared between modalities. To achieve this goal, an experiment was conducted in which MWL was simulated with verbal logic puzzles. The puzzles came in five levels of difficulty and were presented in random order. Participants had 1 h to solve as many puzzles as they could. Between puzzles, they gave a difficulty rating between 1 and 7, with seven being the highest difficulty. Galvanic skin response, photoplethysmograms, functional near-infrared spectrograms, and eye movements were collected simultaneously using LabStreamingLayer (LSL). Marker information from the puzzles was also streamed over LSL. We designed and evaluated a novel intermediate fusion multimodal DNN for the classification of PMWL using these four modalities. Two main criteria guided the design and implementation of our DNN: modularity and generalisability. We were able to classify PMWL to within one level (0.985 levels) on the seven-level workload scale using these modalities. Because of its modular design, the model architecture allows modalities to be added or removed without major structural changes. Furthermore, we showed that our neural network performed better when using multiple modalities than when using a single modality. The dataset and code used in this paper are openly available.
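
To make the intermediate-fusion idea concrete, below is a minimal, illustrative PyTorch sketch: each modality (GSR, PPG, fNIRS, eye movements) is processed by its own small encoder, the intermediate feature vectors are concatenated, and a shared head predicts one of the seven perceived-workload levels. The class names (ModalityEncoder, IntermediateFusionNet), layer sizes, and per-modality feature dimensions are assumptions made for illustration; the authors' actual architecture is defined in their openly available code.

# Minimal intermediate-fusion sketch (illustrative only; not the authors' implementation).
import torch
import torch.nn as nn


class ModalityEncoder(nn.Module):
    # Small per-modality encoder mapping a raw feature vector to a fixed-size embedding.
    def __init__(self, in_features, embed_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, 64),
            nn.ReLU(),
            nn.Linear(64, embed_dim),
            nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)


class IntermediateFusionNet(nn.Module):
    # Each modality gets its own encoder; embeddings are concatenated at an
    # intermediate layer and passed to a shared classification head.
    def __init__(self, modality_dims, num_classes=7, embed_dim=32):
        super().__init__()
        self.encoders = nn.ModuleDict(
            {name: ModalityEncoder(dim, embed_dim) for name, dim in modality_dims.items()}
        )
        self.head = nn.Sequential(
            nn.Linear(embed_dim * len(modality_dims), 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, inputs):
        # Intermediate fusion: concatenate the per-modality embeddings.
        fused = torch.cat([enc(inputs[name]) for name, enc in self.encoders.items()], dim=-1)
        return self.head(fused)


if __name__ == "__main__":
    # Hypothetical per-modality feature sizes; the real preprocessing will differ.
    dims = {"gsr": 16, "ppg": 16, "fnirs": 40, "eye": 24}
    model = IntermediateFusionNet(dims)
    batch = {name: torch.randn(8, d) for name, d in dims.items()}
    logits = model(batch)  # shape (8, 7): one score per workload level
    print(logits.shape)

Because each modality has its own encoder entry in the dictionary, adding or removing a modality only changes the width of the fusion layer's input, mirroring the modularity criterion described in the abstract.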

Citing Articles

A novel signal channel attention network for multi-modal emotion recognition.

Du Z, Ye X, Zhao P. Front Neurorobot. 2024; 18:1442080.

PMID: 39323931. PMC: 11422387. DOI: 10.3389/fnbot.2024.1442080.


Artificial intelligence-driven radiomics study in cancer: the role of feature engineering and modeling.

Zhang Y, Zhang X, Cheng Y, Li B, Teng X, Zhang J. Mil Med Res. 2023; 10(1):22.

PMID: 37189155. PMC: 10186733. DOI: 10.1186/s40779-023-00458-8.


Deep learning in fNIRS: a review.

Eastmond C, Subedi A, De S, Intes X. Neurophotonics. 2022; 9(4):041411.

PMID: 35874933. PMC: 9301871. DOI: 10.1117/1.NPh.9.4.041411.


Perceived Mental Workload Classification Using Intermediate Fusion Multimodal Deep Learning.

Dolmans T, Poel M, van 't Klooster J, Veldkamp B. Front Hum Neurosci. 2021; 14:609096.

PMID: 33505259. PMC: 7829255. DOI: 10.3389/fnhum.2020.609096.
