Photometric Redshift Estimates using Bayesian Neural Networks in the CSST Survey


Bibliographic Details
Published in: Research in Astronomy and Astrophysics, Vol. 22, no. 11, pp. 115017–115033
Main Authors: Zhou, Xingchen; Gong, Yan; Meng, Xian-Min; Chen, Xuelei; Chen, Zhu; Du, Wei; Fu, Liping; Luo, Zhijian
Format: Journal Article
Language: English
Published: Beijing: National Astronomical Observatories, CAS and IOP Publishing, 01.11.2022

Summary: Galaxy photometric redshift (photo-z) is crucial in cosmological studies such as weak gravitational lensing and galaxy angular clustering measurements. In this work, we extract photo-z information and construct its probability distribution function (PDF) using Bayesian neural networks, applied to both galaxy flux and image data expected to be obtained by the China Space Station Telescope (CSST). The mock galaxy images are generated from the Hubble Space Telescope-Advanced Camera for Surveys (HST-ACS) and COSMOS catalogs, in which the CSST instrumental effects are carefully considered. In addition, the galaxy flux data are measured from the galaxy images using aperture photometry. We construct a Bayesian multilayer perceptron (B-MLP) and a Bayesian convolutional neural network (B-CNN) to predict photo-z along with its PDF from fluxes and images, respectively. We then combine the B-MLP and B-CNN into a hybrid network and employ transfer learning techniques to investigate the improvement from including both flux and image data. For galaxy samples with signal-to-noise ratio (SNR) > 10 in the g or i band, we find the photo-z accuracy and outlier fraction reach σ_NMAD = 0.022 and η = 2.35% for the B-MLP using flux data only, and σ_NMAD = 0.022 and η = 1.32% for the B-CNN using image data only. The Bayesian hybrid network achieves σ_NMAD = 0.021 and η = 1.23%, and applying the transfer learning technique improves the results to σ_NMAD = 0.019 and η = 1.17%, which provides the most confident predictions with the lowest average uncertainty.
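The accuracy σ_NMAD and outlier fraction η quoted above are the standard photo-z quality metrics (normalized median absolute deviation of the residuals and the fraction of catastrophic outliers). The sketch below shows how these are conventionally computed; the 0.15 outlier threshold and the helper name photoz_metrics are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def photoz_metrics(z_photo, z_true, outlier_threshold=0.15):
    """Compute sigma_NMAD and outlier fraction eta for photo-z estimates.

    sigma_NMAD = 1.48 * median(|dz - median(dz)| / (1 + z_true)),
    with dz = z_photo - z_true. Sources with |dz| / (1 + z_true) above
    `outlier_threshold` are counted as outliers (0.15 is a common
    convention; the paper may adopt a different cut).
    """
    z_photo = np.asarray(z_photo, dtype=float)
    z_true = np.asarray(z_true, dtype=float)

    dz = z_photo - z_true
    sigma_nmad = 1.48 * np.median(np.abs(dz - np.median(dz)) / (1.0 + z_true))
    eta = np.mean(np.abs(dz) / (1.0 + z_true) > outlier_threshold)
    return sigma_nmad, eta


if __name__ == "__main__":
    # Toy example: Gaussian scatter of 0.02(1+z) plus a few catastrophic outliers.
    rng = np.random.default_rng(0)
    z_true = rng.uniform(0.1, 2.0, size=10_000)
    z_photo = z_true + 0.02 * (1.0 + z_true) * rng.standard_normal(z_true.size)
    z_photo[:50] += 1.0
    print(photoz_metrics(z_photo, z_true))
```

With a well-behaved estimator, σ_NMAD tracks the core scatter of the residual distribution while η isolates the tail, which is why the paper reports both numbers for each network variant.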
Bibliography: RAA-2022-0241.R1
ISSN: 1674-4527; 2397-6209
DOI: 10.1088/1674-4527/ac9578