1. Open-vocabulary Multimodal Emotion Recognition: Dataset, Metric, and Benchmark
- Authors
Lian, Zheng; Sun, Haiyang; Sun, Licai; Chen, Lan; Chen, Haoyu; Gu, Hao; Wen, Zhuofan; Chen, Shun; Zhang, Siyuan; Yao, Hailiang; Xu, Mingyu; Chen, Kang; Liu, Bin; Liu, Rui; Liang, Shan; Li, Ya; Yi, Jiangyan; Tao, Jianhua
- Subjects
Computer Science - Human-Computer Interaction
- Abstract
Multimodal Emotion Recognition (MER) is an important research topic. This paper advocates for a transformative paradigm in MER. The rationale behind our work is that current approaches often rely on a limited set of basic emotion labels, which do not adequately represent the rich spectrum of human emotions. These traditional, overly simplistic categories fail to capture the inherent complexity and subtlety of human emotional experience, limiting generalizability and practicality. We therefore propose a new MER paradigm called Open-vocabulary MER (OV-MER), which encompasses a broader range of emotion labels to reflect the richness of human emotions. This paradigm relaxes the label space, allowing the prediction of an arbitrary number and variety of emotion categories. To support this transition, we provide a comprehensive solution that includes a newly constructed database built through LLM-human collaborative annotation, along with corresponding metrics and a series of benchmarks. We hope this work advances emotion recognition from basic to more nuanced emotions, contributing to the development of emotional AI.
- Published
2024
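
Because OV-MER lets a model predict an arbitrary number of emotion labels per sample, evaluation has to compare variable-size label sets rather than single classes. The snippet below is a minimal, hypothetical sketch of one such set-based score (precision/recall/F1 with an optional synonym map); it is not the metric proposed in the paper, and the names `set_f1`, `normalize`, and `synonyms` are illustrative assumptions.

```python
# Hypothetical set-based metric for open-vocabulary emotion prediction.
# NOTE: this is NOT the paper's metric; it only illustrates how
# variable-size label sets could be scored against a reference set.

def normalize(label: str) -> str:
    """Lowercase and strip a predicted emotion label."""
    return label.strip().lower()

def set_f1(predicted: list[str], reference: list[str],
           synonyms: dict[str, str] | None = None) -> dict[str, float]:
    """Precision/recall/F1 between two open-vocabulary label sets.

    `synonyms` optionally maps surface forms to a canonical label
    (e.g. {"furious": "angry"}) so near-duplicates count as matches.
    """
    syn = synonyms or {}
    canon = lambda lab: syn.get(normalize(lab), normalize(lab))
    pred = {canon(lab) for lab in predicted}
    ref = {canon(lab) for lab in reference}
    overlap = len(pred & ref)
    precision = overlap / len(pred) if pred else 0.0
    recall = overlap / len(ref) if ref else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall > 0 else 0.0)
    return {"precision": precision, "recall": recall, "f1": f1}

# Example: the model may predict any number of labels per sample.
print(set_f1(["Anxious", "sad"], ["sad", "worried", "lonely"],
             synonyms={"anxious": "worried"}))
# -> {'precision': 1.0, 'recall': 0.666..., 'f1': 0.8}
```

A synonym (or embedding-similarity) mapping of some kind is the key design choice here: without it, an exact-match score would unfairly penalize predictions that are semantically correct but lexically different from the annotated labels.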