Joint Training on Multiple Datasets With Inconsistent Labeling Criteria for Facial Expression Recognition.
- Source :
- IEEE Transactions on Affective Computing; Jul-Sep 2024, Vol. 15, Issue 3, p1812-1825, 14p
- Publication Year :
- 2024
Abstract
- One potential way to enhance the performance of facial expression recognition (FER) is to augment the training set by increasing the number of samples. By incorporating multiple FER datasets, deep learning models can extract more discriminative features. However, the inconsistent labeling criteria and subjective biases found in annotated FER datasets can significantly hinder the recognition accuracy of deep learning models when handling mixed datasets. Effectively performing joint training on multiple datasets remains a challenging task. In this study, we propose a joint training method for training an FER model using multiple FER datasets. Our method consists of four steps: (1) selecting a subset from the additional dataset, (2) generating pseudo-continuous labels for the target dataset, (3) refining the labels of different datasets using continuous label mapping and discrete label relabeling according to the labeling criteria of the target dataset, and (4) jointly training the model using multi-task learning. We conduct joint training experiments on two popular in-the-wild FER benchmark databases, RAF-DB and CAER-S, while utilizing the AffectNet dataset as an additional dataset. The experimental results demonstrate that our proposed method outperforms the direct merging of different FER datasets into a single training set and achieves state-of-the-art performance on RAF-DB and CAER-S with accuracies of 92.24% and 94.57%, respectively.
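- Illustration: the abstract describes step (4) as multi-task joint training over datasets with different labeling criteria. The sketch below is a minimal, hypothetical PyTorch rendering of that idea only: a shared backbone with one classification head per dataset, trained with a weighted sum of per-dataset losses. The backbone architecture, class counts, loss weighting, and all names are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch of multi-task joint training across two FER datasets
# with different labeling criteria (not the authors' implementation).
import torch
import torch.nn as nn

class MultiTaskFER(nn.Module):
    def __init__(self, num_classes_target=7, num_classes_extra=8):
        super().__init__()
        # Shared feature extractor (stand-in for any CNN backbone).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # One head per labeling criterion (target dataset vs. additional dataset).
        self.head_target = nn.Linear(64, num_classes_target)
        self.head_extra = nn.Linear(64, num_classes_extra)

    def forward(self, x, task):
        feat = self.backbone(x)
        return self.head_target(feat) if task == "target" else self.head_extra(feat)

def joint_training_step(model, optimizer, batch_target, batch_extra, extra_weight=0.5):
    """One optimization step over one batch from each dataset; the
    extra_weight factor is an assumed hyperparameter."""
    criterion = nn.CrossEntropyLoss()
    x_t, y_t = batch_target
    x_e, y_e = batch_extra
    loss = criterion(model(x_t, "target"), y_t) \
         + extra_weight * criterion(model(x_e, "extra"), y_e)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = MultiTaskFER()
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    # Dummy batches stand in for relabeled target-dataset and AffectNet samples.
    batch_t = (torch.randn(4, 3, 64, 64), torch.randint(0, 7, (4,)))
    batch_e = (torch.randn(4, 3, 64, 64), torch.randint(0, 8, (4,)))
    print(joint_training_step(model, opt, batch_t, batch_e))
```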
Details
- Language :
- English
- ISSN :
- 1949-3045
- Volume :
- 15
- Issue :
- 3
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Affective Computing
- Publication Type :
- Academic Journal
- Accession number :
- 179509571
- Full Text :
- https://doi.org/10.1109/TAFFC.2024.3382618