Multimodal Analysis and Estimation of Intimate Self-Disclosure
- Author
Jan Ondras, Sin-Hwa Kang, Mohammad Soleymani, Kalin Stefanov, and Jonathan Gratch
- Subjects
Social robot, Modality (human–computer interaction), Computer science, Natural language understanding, Cognition, Mental health, Self-disclosure, Cognitive psychology, Gesture
- Abstract
Self-disclosure to others has proven benefits for one's mental health, and prior work shows that disclosure to computers can be similarly beneficial for emotional and psychological well-being. In this paper, we analyzed verbal and nonverbal behavior associated with self-disclosure in two datasets of structured human-human and human-agent interviews from more than 200 participants. Correlation analysis revealed that linguistic features, such as affective and cognitive content in verbal behavior, and nonverbal behavior, such as head gestures, are associated with intimate self-disclosure. We developed a multimodal deep neural network to automatically estimate the level of intimate self-disclosure from verbal and nonverbal behavior. Within-corpus, the verbal modality was the best predictor of self-disclosure, achieving r = 0.66; in cross-corpus evaluation, however, nonverbal behavior outperformed the language modality. Such automatic models can be deployed in interactive virtual agents or social robots to evaluate rapport and guide their conversational strategies.
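The abstract does not specify the network architecture, so the following is only a minimal sketch of one plausible design: a late-fusion regressor that encodes verbal and nonverbal feature vectors separately, concatenates them, and predicts a scalar intimacy score. The class name `MultimodalDisclosureEstimator`, the feature dimensions (300 for verbal, 64 for nonverbal), the hidden size, and the dropout rate are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MultimodalDisclosureEstimator(nn.Module):
    """Hypothetical late-fusion regressor for self-disclosure intimacy."""

    def __init__(self, verbal_dim=300, nonverbal_dim=64, hidden_dim=128):
        super().__init__()
        # Per-modality encoders: project each feature stream into a
        # shared hidden space before fusion.
        self.verbal_encoder = nn.Sequential(
            nn.Linear(verbal_dim, hidden_dim), nn.ReLU(), nn.Dropout(0.3)
        )
        self.nonverbal_encoder = nn.Sequential(
            nn.Linear(nonverbal_dim, hidden_dim), nn.ReLU(), nn.Dropout(0.3)
        )
        # Late fusion: concatenate the encoded modalities and regress a
        # single self-disclosure intimacy level.
        self.regressor = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1)
        )

    def forward(self, verbal_feats, nonverbal_feats):
        fused = torch.cat(
            [self.verbal_encoder(verbal_feats),
             self.nonverbal_encoder(nonverbal_feats)],
            dim=-1,
        )
        return self.regressor(fused).squeeze(-1)

# Usage with random stand-in features (e.g., utterance embeddings and
# per-utterance head-gesture statistics); real inputs would come from
# the interview corpora described above.
model = MultimodalDisclosureEstimator()
verbal = torch.randn(8, 300)       # batch of 8 verbal feature vectors
nonverbal = torch.randn(8, 64)     # batch of 8 nonverbal feature vectors
scores = model(verbal, nonverbal)  # predicted intimacy levels, shape (8,)
print(scores.shape)
```

A late-fusion design of this kind makes it straightforward to evaluate each modality in isolation (by masking the other encoder's input), which matches the paper's per-modality and cross-corpus comparisons.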
- Published
- 2019