Deep Neural Network-Based Generation of Planar CH Distribution through Flame Chemiluminescence in Premixed Turbulent Flame
- Authors
Lei Han, Qiang Gao, Dayuan Zhang, Zhanyu Feng, Zhiwei Sun, Bo Li, and Zhongshan Li
- Subjects
Turbulent flame front, Neural network, Conditional generative adversarial nets, Laser diagnostics, Chemiluminescence
- Abstract
Flame front structure is one of the most fundamental characteristics of a flame and is therefore vital for understanding combustion processes. Measuring the flame front structure in turbulent flames usually requires laser-based diagnostic techniques, most commonly planar laser-induced fluorescence (PLIF). PLIF systems, however, depend on bulky laser equipment that is often too sophisticated to deploy in harsh environments. Here, to remove this constraint, we propose a deep neural network-based method that generates flame front structures from line-of-sight CH* chemiluminescence, which can be recorded without lasers. A conditional generative adversarial network (C-GAN) was trained on simultaneously recorded CH-PLIF and chemiluminescence images of turbulent premixed methane/air flames. Two generator architectures for the C-GAN, ResNet and U-Net, were evaluated; the former performed better in this study, both for generating snapshot images and for statistics over multiple images. For chemiluminescence imaging, the choice of camera gate width entails a trade-off between the signal-to-noise ratio (SNR) and the temporal resolution. The trained C-GAN model can generate CH-PLIF images from chemiluminescence images with an accuracy of over 91% at a Reynolds number of 5000, and the flame surface density at a higher Reynolds number of 10,000 can also be effectively estimated by the model. This new method has the potential to capture flame front characteristics without lasers, significantly simplifying the diagnostic system, and may also enable high-speed flame diagnostics.
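To make the C-GAN setup concrete, the following is a minimal sketch (not the authors' code) of one training step for conditional image-to-image translation from chemiluminescence to CH-PLIF images, in the style of the pix2pix recipe commonly used with U-Net/ResNet generators. The network definitions, tensor shapes, and the L1 loss weight `lambda_l1` are assumptions for illustration; the placeholder `Generator` stands in for the ResNet or U-Net generators compared in the paper.

```python
# Hypothetical pix2pix-style C-GAN training step: chemiluminescence -> CH-PLIF.
# All architectures and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Placeholder encoder-decoder; a real U-Net or ResNet generator goes here."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Placeholder PatchGAN-style discriminator on (condition, target) pairs."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=1, padding=1),  # per-patch real/fake logits
        )
    def forward(self, cond, img):
        return self.net(torch.cat([cond, img], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()
lambda_l1 = 100.0  # assumed weight, as in the original pix2pix recipe

# Stand-ins for one batch of simultaneously recorded image pairs.
chem = torch.randn(4, 1, 128, 128)  # line-of-sight CH* chemiluminescence
plif = torch.randn(4, 1, 128, 128)  # ground-truth CH-PLIF

# --- Discriminator: distinguish real pairs from generated pairs ---
fake = G(chem).detach()
d_real, d_fake = D(chem, plif), D(chem, fake)
loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# --- Generator: fool D while staying close to the measured PLIF image ---
fake = G(chem)
d_fake = D(chem, fake)
loss_g = bce(d_fake, torch.ones_like(d_fake)) + lambda_l1 * l1(fake, plif)
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

The conditioning is what makes this a C-GAN: the discriminator sees the chemiluminescence image alongside the (real or generated) PLIF image, so the generator is penalized unless its output is consistent with the specific line-of-sight measurement, not merely PLIF-like in general.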
- Published
2023