
Multimodal analysis of personality traits on videos of self-presentation and induced behavior

Authors :
Dersu Giritlioğlu
Can Ufuk Ertenli
Emre Mutlu
Burak Mandira
Hamdi Dibeklioglu
Selim Firat Yilmaz
Aslı Gül Kurt
Merve Kiniklioglu
Şeref Can Gürel
Berhan Faruk Akgür
Source :
Journal on Multimodal User Interfaces, 15(4), 337-358. Springer Verlag
Publication Year :
2021

Abstract

Personality analysis is an important area of research in several fields, including psychology, psychiatry, and neuroscience. With the recent dramatic improvements in machine learning, it has also become a popular research area in computer science. While current computational methods are able to interpret behavioral cues (e.g., facial expressions, gestures, and voice) to estimate the level of (apparent) personality traits, accessible assessment tools remain inadequate for practical use, and fast, accurate methods for such analyses are still needed. In this study, we present multimodal deep architectures to estimate the Big Five personality traits from (temporal) audio-visual cues and transcribed speech. Furthermore, for a detailed analysis of personality traits, we have collected a new audio-visual dataset, namely, the Self-presentation and Induced Behavior Archive for Personality Analysis (SIAP). In contrast to existing datasets, SIAP introduces recordings of induced behavior in addition to self-presentation (speech) videos. Through thorough experiments on the SIAP and ChaLearn LAP First Impressions datasets, we systematically assess the reliability of individual behavioral modalities and their combined use. Furthermore, we investigate the characteristics and discriminative power of induced behavior for personality analysis, showing that induced behavior indeed carries signs of personality traits.
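The abstract describes combining several behavioral modalities (facial expressions, voice, transcribed speech) to predict the five trait scores. A common way to realize such a combination is late fusion: each modality is encoded into a feature vector, the vectors are concatenated, and a linear layer maps the fused vector to one score per trait. The sketch below illustrates only this general fusion pattern; the encoders, dimensions, and weights are illustrative placeholders and not the architecture proposed in the paper.

```python
import random

TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

def encode(signal, dim):
    """Stand-in for a modality encoder (e.g., a CNN over face crops or an
    RNN over audio frames): deterministically maps the input signal to a
    fixed-size pseudo-feature vector. Purely illustrative."""
    rng = random.Random(hash(tuple(signal)))
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]

def fuse_and_score(face, audio, text, weights, bias):
    """Late fusion: concatenate per-modality features, then apply a linear
    layer producing one score per Big Five trait."""
    fused = face + audio + text  # fusion by concatenation
    return [sum(w * x for w, x in zip(row, fused)) + b
            for row, b in zip(weights, bias)]

# Hypothetical per-modality features (dimensions chosen arbitrarily).
face  = encode([0.1, 0.2, 0.3], dim=4)   # e.g., facial-expression features
audio = encode([0.5, 0.6], dim=4)        # e.g., prosodic features
text  = encode([1.0, 0.0, 1.0], dim=4)   # e.g., transcribed-speech features

# Randomly initialized linear layer standing in for trained weights.
fused_dim = len(face) + len(audio) + len(text)
rng = random.Random(1)
weights = [[rng.uniform(-0.1, 0.1) for _ in range(fused_dim)] for _ in TRAITS]
bias = [0.0] * len(TRAITS)

scores = dict(zip(TRAITS, fuse_and_score(face, audio, text, weights, bias)))
```

In a trained system the linear layer (or a deeper fusion network) would be learned jointly with, or on top of, the modality encoders; late fusion is one of several strategies, alongside early (feature-level) and intermediate fusion.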

Details

Language :
English
ISSN :
1783-7677
Volume :
15
Issue :
4
Database :
OpenAIRE
Journal :
Journal on Multimodal User Interfaces
Accession number :
edsair.doi.dedup.....1c437a72adcc528b2b7c1d0668e2b02d
Full Text :
https://doi.org/10.1007/s12193-020-00347-7