1. Human-Centric Emotion Estimation Based on Correlation Maximization Considering Changes With Time in Visual Attention and Brain Activity
- Authors
Yuya Moroto, Keisuke Maeda, Takahiro Ogawa, and Miki Haseyama
- Subjects
General Computer Science, Computer Science, Brain Activity, Tensor Analysis, fNIRS, Correlation, Visual Attention, Changes With Time, General Materials Science, CCA, Pixel, Eye Gaze Data, General Engineering, Pattern Recognition, Maximization, Gaze, Multimodal Approach, Eye Tracking, Artificial Intelligence & Image Processing
- Abstract
A human-centric emotion estimation method based on correlation maximization, which considers changes with time in visual attention and brain activity while viewing images, is proposed in this paper. Owing to recent developments in biological sensors, many researchers have focused on multimodal emotion estimation that uses both eye gaze data and brain activity data to improve estimation quality. In this paper, a novel method that focuses on the following two points is introduced. First, to reduce the burden on users, we obtain brain activity data from users only in the training phase, using a projection matrix calculated by canonical correlation analysis (CCA) between gaze-based visual features and brain activity-based features. Second, to consider the changes with time in both visual attention and brain activity, we obtain novel features from a CCA-based projection in each time unit. To incorporate these two points, the proposed method analyzes a fourth-order gaze and image tensor whose modes are pixel location, color channel, and the changes with time in visual attention. Moreover, in each time unit, the proposed method performs CCA between gaze-based visual features and brain activity-based features to realize highly accurate human-centric emotion estimation. Experimental results show that accurate human emotion estimation is achieved by using our new human-centric image representation.
- Published
2021