
Facial expression recognition, a predictive tool for perceiving urban open space environments under audio-visual interaction.

Authors :
Hu, Xuejun
Meng, Qi
Yang, Da
Li, Mengmeng
Source :
Energy & Buildings. Sep2024, Vol. 318, pN.PAG-N.PAG. 1p.
Publication Year :
2024

Abstract

• The use of facial expression recognition (FER) in sound evaluation in urban open space.
• Clustering the visual environments for FER evaluation.
• The interactive effect of visual and sound environments.
• Data processing methods for dynamic FER data.

The quality of the soundscape in urban open spaces is important to wellbeing. An AI-based facial expression recognition method has the potential to be a useful tool for predicting urban sound perception, since it can accurately gather human perception data from facial photos in real time. The purpose of this study is to provide classification criteria, time-dimension data processing, and indicator screening methods for facial expression recognition-based soundscape prediction. The study captured typical urban audiovisual environments and recorded facial expression data from subjects in a laboratory setting. The results showed that the three types of visual environments and the three types of acoustic environments had significantly different effects on the subjects' facial expressions, and that the effect of acoustic environments on visual perception (η² = 0.067 on average) was generally greater than the effect of visual environments on acoustic perception (η² = 0.035 on average). A linear regression model for predicting acoustic perception was developed based on audiovisual interactions, dynamic change patterns, and indicator selection. The results of this paper help to realize another perceptual dimension in the establishment of smart cities and help achieve sustainability. [ABSTRACT FROM AUTHOR]
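The abstract describes fitting a linear regression that predicts an acoustic-perception score from FER-derived indicators. The paper's actual features and coefficients are not given here, so the sketch below is purely illustrative: the feature names (mean valence, valence range, visual-environment cluster) and all numeric values are invented assumptions, and ordinary least squares stands in for whatever fitting procedure the authors used.

```python
import numpy as np

# Hypothetical illustration: predict an acoustic-perception rating from
# facial-expression-recognition (FER) features via ordinary least squares.
# Feature names and values are invented for demonstration only.
rng = np.random.default_rng(0)
n = 60  # number of simulated subjects

# Simulated FER indicators per subject: mean valence, valence range over
# the exposure period, and a visual-environment cluster label (0, 1, 2).
valence_mean = rng.uniform(-1, 1, n)
valence_range = rng.uniform(0, 1, n)
visual_cluster = rng.integers(0, 3, n).astype(float)

# Simulated acoustic-perception rating with small additive noise.
y = (2.0 * valence_mean - 0.5 * valence_range
     + 0.3 * visual_cluster + rng.normal(0, 0.1, n))

# Design matrix with an intercept column; fit by least squares.
X = np.column_stack([np.ones(n), valence_mean, valence_range, visual_cluster])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef

# Coefficient of determination (R^2) of the fit.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"coefficients: {np.round(coef, 2)}")
print(f"R^2 = {r2:.3f}")
```

With the simulated data above, the recovered coefficients land close to the generating values, mirroring how indicator screening would keep only features with stable, interpretable weights.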

Details

Language :
English
ISSN :
03787788
Volume :
318
Database :
Academic Search Index
Journal :
Energy & Buildings
Publication Type :
Academic Journal
Accession number :
178334556
Full Text :
https://doi.org/10.1016/j.enbuild.2024.114456