
Distinguishing Coupled Dark Energy Models with Neural Networks

Authors:
Goh, L. W. K.
Ocampo, I.
Nesseris, S.
Pettorino, V.
Publication Year:
2024

Abstract

We investigate whether Neural Networks (NN) can accurately differentiate between growth-rate data of the Large Scale Structure (LSS) of the Universe simulated via two models: a cosmological constant and cold dark matter (ΛCDM) model and a tomographic Coupled Dark Energy (CDE) model. We build an NN classifier and test its accuracy in distinguishing between cosmological models. For our dataset, we generate fσ₈(z) growth-rate observables for both ΛCDM and a tomographic CDE model, simulating a realistic Stage IV galaxy survey-like setup, for various values of the model parameters. We then optimise and train our NN with Optuna, aiming to avoid overfitting and to maximise the accuracy of the trained model. We conduct our analysis both for a binary classification, comparing ΛCDM with a CDE model in which only one tomographic coupling bin is activated, and for a multiclass classification scenario in which all the models are combined. For the binary classification, we find that our NN can confidently (with > 86% accuracy) detect non-zero values of the tomographic coupling regardless of the redshift range at which the coupling is activated, and can detect the ΛCDM model at a 100% confidence level. For the multiclass classification task, we find that the NN performs adequately well at distinguishing between ΛCDM, a CDE model with low-redshift coupling, and a model with high-redshift coupling, with 99%, 79% and 84% accuracy, respectively. By leveraging the power of machine learning, our pipeline can be a useful tool for analysing growth-rate data and maximising the potential of current surveys to probe for deviations from General Relativity.

Comment: Accepted for publication in A&A
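To illustrate the kind of pipeline the abstract describes, the sketch below shows how one might use Optuna to tune a small NN classifier that separates two classes of simulated fσ₈(z) curves. This is a minimal, hypothetical example, not the authors' actual code: the toy data generator, network architecture, hyperparameter ranges and label assignments are all assumptions made purely for illustration.

```python
# Illustrative sketch only: tune a small NN classifier with Optuna so that it
# separates two classes of simulated fsigma8(z) growth-rate curves.
# The data below is a toy placeholder, not the paper's Stage IV survey simulation.
import numpy as np
import optuna
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
z = np.linspace(0.1, 2.0, 20)           # assumed redshift bins

def toy_fsigma8(coupling):
    """Toy growth-rate curve; a non-zero 'coupling' mimics a CDE-like shift."""
    base = 0.45 * np.exp(-0.3 * z)      # placeholder LCDM-like shape
    return base * (1.0 + coupling * z) + rng.normal(0.0, 0.01, z.size)

# Toy dataset: label 0 = LCDM-like curves, label 1 = CDE-like curves
X = np.array([toy_fsigma8(0.0) for _ in range(500)]
             + [toy_fsigma8(0.05) for _ in range(500)])
y = np.array([0] * 500 + [1] * 500)

def objective(trial):
    # Hyperparameters searched by Optuna (ranges are assumptions)
    n_units = trial.suggest_int("n_units", 16, 128)
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    alpha = trial.suggest_float("alpha", 1e-6, 1e-2, log=True)
    clf = MLPClassifier(hidden_layer_sizes=(n_units,),
                        learning_rate_init=lr,
                        alpha=alpha,
                        max_iter=500,
                        random_state=0)
    # Cross-validated accuracy helps guard against overfitting to one split
    return cross_val_score(clf, X, y, cv=3).mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print("Best accuracy:", study.best_value)
print("Best hyperparameters:", study.best_params)
```

The same pattern extends to the multiclass case by adding further labels (e.g. couplings activated in different redshift bins) and letting the classifier output one class per model.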

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2411.04058
Document Type:
Working Paper