
Leveraging the Sensitivity of Plants with Deep Learning to Recognize Human Emotions

Overview
Journal Sensors (Basel)
Publisher MDPI
Specialty Biotechnology
Date 2024 Mar 28
PMID 38544181
Abstract

Recent advances in artificial intelligence combined with behavioral sciences have led to the development of cutting-edge tools for recognizing human emotions based on text, video, audio, and physiological data. However, these data sources are expensive, intrusive, and regulated, unlike plants, which have been shown to be sensitive to human steps and sounds. A methodology for using plants as human emotion detectors is proposed. Electrical signals from plants were recorded and labeled based on video data. The labeled data were then used for classification: the MLP, biLSTM, MFCC-CNN, MFCC-ResNet, Random Forest, 1-Dimensional CNN, and biLSTM (without windowing) models were tuned using a grid search algorithm with cross-validation. Finally, the best-parameterized models were trained and evaluated on the test set. The performance of this methodology was measured in a case study with 54 participants who watched an emotionally charged video; as ground truth, their facial emotions were simultaneously measured using facial emotion analysis. The Random Forest model showed the best performance, particularly in recognizing high-arousal emotions, achieving an overall weighted accuracy of 55.2% and high weighted recall for emotions such as fear (61.0%) and happiness (60.4%). The MFCC-ResNet model offered more balanced results across classes, with an accuracy of 0.318 and a recall of 0.324; with this model, fear and anger were recognized with 75% and 50% recall, respectively. Thus, using plants as an emotion recognition tool seems worth investigating, as it addresses both cost and privacy concerns.
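
The model-selection step described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation: it assumes a pre-built feature matrix X (one row per signal window) and emotion labels y, uses a hypothetical hyperparameter grid and fold count, and shows only the Random Forest branch of the grid search with cross-validation.

# Minimal sketch (not the authors' code): grid search with cross-validation
# for a Random Forest classifier on windowed plant-signal features.
# X, y, the hyperparameter grid, and the fold count are all placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import accuracy_score, classification_report

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 40))        # placeholder feature matrix (windows x features)
y = rng.integers(0, 4, size=600)      # placeholder labels (e.g., 4 emotion classes)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

param_grid = {                        # hypothetical grid; not reported in the abstract
    "n_estimators": [100, 300],
    "max_depth": [None, 10, 20],
    "min_samples_leaf": [1, 5],
}

search = GridSearchCV(
    RandomForestClassifier(class_weight="balanced", random_state=0),
    param_grid,
    cv=5,                             # fold count assumed
    n_jobs=-1,
)
search.fit(X_train, y_train)          # select the best-parameterized model

y_pred = search.best_estimator_.predict(X_test)
print("Best parameters:", search.best_params_)
print("Accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))   # includes weighted-average recall

The same tune-then-evaluate pattern would apply to the other models listed in the abstract, each with its own model-specific parameter grid.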
