
Electroencephalogram-based mental workload prediction for using Augmented Reality head mounted display in construction assembly: A deep learning approach.

Authors :
Qin, Yimin
Bulbul, Tanyel
Source :
Automation in Construction, August 2023, Vol. 152.
Publication Year :
2023

Abstract

Augmented Reality Head-Mounted Displays (AR HMD) are considered a promising technology for assisting on-site construction tasks such as assembly. However, there are concerns that such displays create information overload and distract workers, potentially outweighing the benefits to productivity and task performance. Previous research investigated workers' mental workload as impacted by AR-based displays in assembly tasks. Within that context, this paper proposes a Long Short-Term Memory (LSTM) based approach to predict mental workload when using an AR HMD for construction assembly, forecasting users' cognitive status under such complex working conditions. Thirty participants were recruited to complete a wood frame assembly using an AR HMD that projected a 3D conformal model onto the workspace. The proposed method provided a reliable prediction of the real-time electroencephalogram (EEG) signal and mental workload. The outcomes validate the feasibility of the LSTM model for EEG-based mental workload prediction and provide a usable method for evaluating AR HMD use in construction tasks.
• Discussed the benefits of predicting mental workload while using an AR HMD in construction assembly tasks.
• Conducted an AR-based assembly test with 30 subjects to explore mental workload oscillations and the feasibility of prediction.
• Deployed a deep learning approach to forecast the raw EEG data and predict subjects' mental workload.
• Compared the performance of the proposed LSTM model with other commonly used RNNs. [ABSTRACT FROM AUTHOR]
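
The abstract describes an LSTM that forecasts raw EEG signals as a basis for predicting the user's mental workload. A minimal PyTorch sketch of this kind of one-step-ahead, multichannel EEG forecasting follows; it is not the authors' implementation, and the channel count, window length, hidden size, and layer count are hypothetical placeholders.

import torch
import torch.nn as nn

class EEGForecaster(nn.Module):
    # Illustrative one-step-ahead forecaster for multichannel EEG (assumed architecture).
    def __init__(self, n_channels=14, hidden_size=64, num_layers=2):
        super().__init__()
        # The LSTM consumes a sliding window of EEG samples: (batch, window, channels)
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden_size,
                            num_layers=num_layers, batch_first=True)
        # Project the final hidden state to a prediction of the next EEG sample
        self.head = nn.Linear(hidden_size, n_channels)

    def forward(self, x):
        out, _ = self.lstm(x)         # out: (batch, window, hidden_size)
        return self.head(out[:, -1])  # predicted next sample: (batch, channels)

# Example: forecast the next sample from a 1-second window (128 samples at an assumed 128 Hz)
model = EEGForecaster()
window = torch.randn(8, 128, 14)      # (batch, time steps, EEG channels)
next_sample = model(window)           # (8, 14)

In practice, the forecast EEG would then feed a mental workload estimator (for example, band-power features or a classifier); that step is omitted here because the abstract does not detail the paper's exact pipeline.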

Details

Language :
English
ISSN :
0926-5805
Volume :
152
Database :
Academic Search Index
Journal :
Automation in Construction
Publication Type :
Academic Journal
Accession number :
164285555
Full Text :
https://doi.org/10.1016/j.autcon.2023.104892