Identifying emotions in images from valence and arousal ratings.
- Source :
- Multimedia Tools & Applications; Jul 2018, Vol. 77, Issue 13, p17413-17435, 23p
- Publication Year :
- 2018
Abstract
- Experimental studies of emotion usually use datasets of normative emotional pictures to elicit specific emotional responses in human subjects. However, most of these datasets are not annotated with discriminating and reliable emotional tags, having only valence and arousal ratings for each image. Complementing this information with emotional tags would enrich the datasets by increasing the number of annotated images available and, consequently, reducing the reuse of the same images across consecutive studies. This paper describes a multi-label recognizer that combines a fuzzy approach with a Random Forest classifier to recognize both the polarity and the discrete emotions elicited by an image, using its valence and arousal ratings. Polarity indicates whether the emotional content of the image is negative, neutral, or positive, whereas emotions provide a more detailed description of the emotional content conveyed by the image. We evaluated our multi-label recognizer using pictures from four existing datasets containing images annotated with emotional content and valence and arousal ratings. Experimental results show that our recognizer is able to identify polarity with a precision of 84.8%, single emotions with 80.7%, and two emotions with 81.1%. Our recognizer can be useful to researchers who want to identify polarity and/or emotions from stimuli annotated with valence and arousal ratings. In particular, it can be used to automatically annotate existing image datasets with emotional tags, avoiding the cost of manually annotating them with human subjects. [ABSTRACT FROM AUTHOR]
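- To illustrate the kind of mapping the abstract describes, the sketch below shows how a fuzzy polarity step over a valence rating might look. This is not the authors' implementation: the triangular membership functions, the 1-9 valence scale (SAM-style), and the breakpoints are all assumptions made for illustration; the paper's recognizer additionally uses arousal and a Random Forest classifier for the discrete-emotion labels.

```python
# Hypothetical sketch: fuzzy polarity from a valence rating.
# Scale (1-9) and triangle breakpoints are illustrative assumptions,
# not the parameters used in the paper.

def tri(x, a, b, c):
    """Triangular membership function: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def polarity_memberships(valence):
    """Degree of membership of a valence rating in each polarity class."""
    return {
        "negative": tri(valence, 0.0, 1.0, 5.0),
        "neutral":  tri(valence, 3.0, 5.0, 7.0),
        "positive": tri(valence, 5.0, 9.0, 10.0),
    }

def polarity(valence):
    """Defuzzify by picking the class with the highest membership."""
    m = polarity_memberships(valence)
    return max(m, key=m.get)
```

- A low-valence image (e.g. valence 2.0) would come out `negative`, a mid-scale one (5.0) `neutral`, and a high-valence one (8.0) `positive`; the second, emotion-level stage would then feed valence and arousal into a trained multi-label classifier.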
Details
- Language :
- English
- ISSN :
- 1380-7501
- Volume :
- 77
- Issue :
- 13
- Database :
- Complementary Index
- Journal :
- Multimedia Tools & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 130842857
- Full Text :
- https://doi.org/10.1007/s11042-017-5311-8