1. Accurately Estimating Redshifts from CSST Slitless Spectroscopic Survey using Deep Learning
- Author
Zhou, Xingchen, Gong, Yan, Zhang, Xin, Li, Nan, Meng, Xian-Min, Chen, Xuelei, Wen, Run, Han, Yunkun, Zou, Hu, Zheng, Xian Zhong, Yang, Xiaohu, Guo, Hong, and Zhang, Pengjie
- Subjects
Astrophysics - Cosmology and Nongalactic Astrophysics
- Abstract
The Chinese Space Station Telescope (CSST) has the capability to conduct a slitless spectroscopic survey simultaneously with its photometric survey. The spectroscopic survey will measure slitless spectra, potentially providing more accurate estimates of galaxy properties, particularly redshifts, compared to using broadband photometry. CSST relies on these accurate redshifts to perform baryon acoustic oscillation (BAO) and other probes to constrain the cosmological parameters. However, due to the low resolution and signal-to-noise ratio of slitless spectra, measuring redshifts is significantly challenging. In this study, we employ a Bayesian neural network (BNN) to assess the accuracy of redshift estimation from slitless spectra anticipated to be observed by CSST. The simulation of slitless spectra is based on real observational data from the early data release of the Dark Energy Spectroscopic Instrument (DESI-EDR) and the 16th data release of the Baryon Oscillation Spectroscopic Survey (BOSS-DR16), combined with the 9th data release of the DESI Legacy Survey (DESI LS DR9). The BNN is constructed using a transfer learning technique: two Bayesian layers are appended after a convolutional neural network (CNN), leveraging the features the CNN has learned from the slitless spectra and corresponding redshifts. Our network provides redshift estimates along with corresponding uncertainties, achieving an accuracy of $\sigma_{\rm NMAD} = 0.00063$, an outlier percentage of $\eta = 0.92\%$, and a weighted mean uncertainty of $\bar{E} = 0.00228$. These results successfully fulfill the requirement of $\sigma_{\rm NMAD} < 0.005$ for BAO and other studies employing CSST slitless spectroscopic surveys (a sketch of these quality metrics follows this entry).
- Comment
14 pages, 12 figures, accepted for publication in ApJ
- Published
- 2024
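Below is a minimal sketch of how the quality metrics quoted in the abstract ($\sigma_{\rm NMAD}$, the outlier percentage $\eta$, and the mean uncertainty $\bar{E}$) are commonly computed for redshift estimates. The abstract does not spell out the exact definitions used in the paper; the $1.48\times$ MAD convention, the `outlier_cut` threshold, and the normalization of the predicted uncertainties below are assumptions, not the paper's confirmed choices.

```python
# Hedged sketch of common redshift-quality metrics; the exact definitions
# adopted in the paper are not given in the abstract and are assumed here.
import numpy as np

def redshift_metrics(z_true, z_pred, sigma_pred, outlier_cut=0.02):
    """Compute sigma_NMAD, outlier fraction, and a mean normalized uncertainty.

    z_true      : array of reference (catalog) redshifts
    z_pred      : array of network point estimates
    sigma_pred  : array of predicted 1-sigma uncertainties from the BNN
    outlier_cut : |dz|/(1+z) threshold defining outliers (assumed value)
    """
    dz = (z_pred - z_true) / (1.0 + z_true)           # normalized residuals
    # Normalized median absolute deviation (standard 1.48 * MAD convention).
    sigma_nmad = 1.48 * np.median(np.abs(dz - np.median(dz)))
    # Fraction of catastrophic outliers beyond the chosen cut.
    eta = np.mean(np.abs(dz) > outlier_cut)
    # Simple mean of normalized predicted uncertainties; the paper's
    # "weighted mean uncertainty" may be defined differently.
    mean_unc = np.mean(sigma_pred / (1.0 + z_true))
    return sigma_nmad, eta, mean_unc

# Example with toy numbers (not the paper's data):
rng = np.random.default_rng(0)
z_true = rng.uniform(0.1, 1.5, size=10000)
z_pred = z_true + 0.0006 * (1 + z_true) * rng.standard_normal(10000)
sigma_pred = np.full_like(z_true, 0.002)
print(redshift_metrics(z_true, z_pred, sigma_pred))
```

The toy example only illustrates the metric calculations; reproducing the quoted values would require the paper's simulated CSST spectra and trained BNN outputs.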