
Modeling of Recommendation System Based on Emotional Information and Collaborative Filtering

Overview
Journal: Sensors (Basel)
Publisher: MDPI
Specialty: Biotechnology
Date: 2021 Apr 3
PMID: 33808989
Citations: 5
Abstract

Emotion information represents a user's current emotional state and can be used in a variety of applications, such as cultural content services that recommend music according to user emotional states, and user emotion monitoring. To increase user satisfaction, recommendation methods must understand and reflect user characteristics and circumstances, such as individual preferences and emotions. However, most recommendation methods do not reflect these characteristics accurately and therefore fail to increase user satisfaction. In this paper, six human emotions (neutral, happy, sad, angry, surprised, and bored) are broadly defined so that user speech emotion information can be considered and matching content recommended. The "genetic algorithms as a feature selection method" (GAFS) algorithm was used to classify normalized speech according to speech emotion information. A support vector machine (SVM) algorithm was then applied, and the optimal kernel function for recognizing the six target emotions was selected. Performance evaluation of each kernel function revealed that the radial basis function (RBF) kernel yielded the highest emotion recognition accuracy, 86.98%. Additionally, content data (images and music) were classified by emotion using factor analysis, correspondence analysis, and Euclidean distance. Finally, the emotion-classified speech information and the recognized emotion information were combined through a collaborative filtering technique to predict user emotional preferences and recommend content matching user emotions in a mobile application.
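The classification stage described above (an SVM with an RBF kernel over normalized, feature-selected speech vectors, labeled with six emotions) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic Gaussian features stand in for GAFS-selected speech features, and the hyperparameters (C, gamma) are assumed defaults rather than the authors' tuned values.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Six target emotion classes, as defined in the paper.
EMOTIONS = ["neutral", "happy", "sad", "angry", "surprised", "bored"]

rng = np.random.default_rng(0)
# Stand-in for GAFS-selected speech features: 20 dimensions per utterance,
# one Gaussian cluster per emotion so the toy task is learnable.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(60, 20)) for i in range(6)])
y = np.repeat(np.arange(6), 60)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Normalize features before classification, mirroring the paper's
# use of normalized speech as SVM input.
scaler = StandardScaler().fit(X_train)

# RBF kernel: the kernel the paper found most accurate (86.98%).
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(scaler.transform(X_train), y_train)

acc = clf.score(scaler.transform(X_test), y_test)
print(f"test accuracy on synthetic data: {acc:.3f}")
```

In practice the input vectors would be acoustic features extracted from user speech (and reduced by GAFS), and kernel choice would be validated per kernel as in the paper's evaluation; the synthetic clusters here only demonstrate the six-class RBF-SVM setup.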

Citing Articles

Construction of self-learning classroom history teaching mode based on human-computer interaction emotion recognition.

Ji C, Zhao S. Front Psychol. 2022; 13:949556.

PMID: 35967699; PMC: 9364044; DOI: 10.3389/fpsyg.2022.949556.


Integration of Wearable Devices and English Teaching under Positive Psychology.

Zhang J. Comput Intell Neurosci. 2022; 2022:7650948.

PMID: 35909862; PMC: 9325595; DOI: 10.1155/2022/7650948.


Cluster Analysis Algorithm in the Analysis of College Students' Mental Health Education.

Zheng W. Appl Bionics Biomech. 2022; 2022:6394707.

PMID: 35480710; PMC: 9038429; DOI: 10.1155/2022/6394707.


Enterprise Strategic Management From the Perspective of Business Ecosystem Construction Based on Multimodal Emotion Recognition.

Bi W, Xie Y, Dong Z, Li H. Front Psychol. 2022; 13:857891.

PMID: 35310264; PMC: 8927019; DOI: 10.3389/fpsyg.2022.857891.


Efficient Graph Collaborative Filtering via Contrastive Learning.

Pan Z, Chen H. Sensors (Basel). 2021; 21(14).

PMID: 34300404; PMC: 8309583; DOI: 10.3390/s21144666.
