PMID: 27382243

Joint Patch and Multi-label Learning for Facial Action Unit Detection

Abstract

The face is one of the most powerful channels of nonverbal communication. The most commonly used taxonomy for describing facial behaviour is the Facial Action Coding System (FACS). FACS segments the visible effects of facial muscle activation into 30+ action units (AUs). AUs, which may occur alone or in thousands of combinations, can describe nearly all possible facial expressions. Most existing methods for automatic AU detection treat the problem with one-vs-all classifiers and fail to exploit dependencies among AUs and among facial features. We introduce joint patch and multi-label learning (JPML) to address these issues. JPML leverages group sparsity by selecting a sparse subset of facial patches while learning a multi-label classifier. In four of five comparisons on three diverse datasets (CK+, GFT, and BP4D), JPML produced the highest average F1 scores relative to state-of-the-art methods.
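The group-sparsity idea the abstract describes can be illustrated with a minimal sketch. This is not the authors' actual JPML objective (which additionally models positive and negative AU dependencies); it is a generic multi-label least-squares model with a group-lasso penalty, solved by proximal gradient descent. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def group_soft_threshold(w, t):
    """Block soft-thresholding: the proximal operator of t * ||w||_2."""
    norm = np.linalg.norm(w)
    if norm <= t:
        return np.zeros_like(w)
    return (1.0 - t / norm) * w

def group_sparse_multilabel(X, Y, groups, lam=0.1, lr=0.01, n_iter=500):
    """Multi-label least squares with a group-lasso penalty over patch groups.

    X: (n, d) features; Y: (n, k) binary label matrix, one column per AU;
    groups: list of feature-index lists, one per facial patch. The penalty
    ties each patch's features together across all AU labels, so whole
    patches are selected or discarded jointly.
    """
    n, d = X.shape
    k = Y.shape[1]
    W = np.zeros((d, k))
    for _ in range(n_iter):
        grad = X.T @ (X @ W - Y) / n      # gradient of the mean squared loss
        W = W - lr * grad
        for g in groups:                  # proximal step, one patch at a time
            W[g] = group_soft_threshold(W[g], lr * lam)
    return W
```

With a sufficiently large `lam`, entire patch rows of `W` are driven exactly to zero, which is the patch-selection behaviour the abstract refers to; at `lam=0` the model reduces to plain multi-output regression.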

Citing Articles

A Non-Invasive Approach for Facial Action Unit Extraction and Its Application in Pain Detection.

Bouazizi M, Feghoul K, Wang S, Yin Y, Ohtsuki T Bioengineering (Basel). 2025; 12(2).

PMID: 40001714 PMC: 11851526. DOI: 10.3390/bioengineering12020195.


Time to retire F1-binary score for action unit detection.

Hinduja S, Nourivandi T, Cohn J, Canavan S Pattern Recognit Lett. 2024; 182:111-117.

PMID: 39086494 PMC: 11290352. DOI: 10.1016/j.patrec.2024.04.016.


Understanding Naturalistic Facial Expressions with Deep Learning and Multimodal Large Language Models.

Bian Y, Kuster D, Liu H, Krumhuber E Sensors (Basel). 2024; 24(1).

PMID: 38202988 PMC: 10781259. DOI: 10.3390/s24010126.


Multi-source transfer learning for facial emotion recognition using multivariate correlation analysis.

B A, Sarkar A, Behera P, Shukla J Sci Rep. 2023; 13(1):21004.

PMID: 38017241 PMC: 10684585. DOI: 10.1038/s41598-023-48250-x.


Study on emotion recognition bias in different regional groups.

Lukac M, Zhambulova G, Abdiyeva K, Lewis M Sci Rep. 2023; 13(1):8414.

PMID: 37225756 PMC: 10209154. DOI: 10.1038/s41598-023-34932-z.


References
1.
Chu W, de la Torre F, Cohn J . Selective Transfer Machine for Personalized Facial Action Unit Detection. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit. 2014; 2013:3515-3522. PMC: 4169220. DOI: 10.1109/CVPR.2013.451. View

2.
Ding X, Chu W, de la Torre F, Cohn J, Wang Q . Facial Action Unit Event Detection by Cascade of Tasks. Proc IEEE Int Conf Comput Vis. 2014; 2013:2400-2407. PMC: 4174346. DOI: 10.1109/ICCV.2013.298. View

3.
Sayette M, Creswell K, Dimoff J, Fairbairn C, Cohn J, Heckman B . Alcohol and group formation: a multimodal investigation of the effects of alcohol on emotion and social bonding. Psychol Sci. 2012; 23(8):869-78. PMC: 5462438. DOI: 10.1177/0956797611435134. View

4.
Zhang S, Huang J, Li H, Metaxas D . Automatic image annotation and retrieval using group sparsity. IEEE Trans Syst Man Cybern B Cybern. 2012; 42(3):838-49. DOI: 10.1109/TSMCB.2011.2179533. View

5.
Zhu Y, de la Torre F, Cohn J, Zhang Y . Dynamic Cascades with Bidirectional Bootstrapping for Action Unit Detection in Spontaneous Facial Behavior. IEEE Trans Affect Comput. 2015; 2(2):79-91. PMC: 4644350. DOI: 10.1109/T-AFFC.2011.10. View