1. Scientific Data Lake for High Luminosity LHC project and other data-intensive particle and astro-particle physics experiments
- Authors
- A Klimentov, V Mitsyn, A Smirnov, A Kiryanov, A Alekseev, T Korchuganova, S Smirnov, D Oleynik, and A Zarochentsev
- Subjects
- Nuclear physics, History, Distributed Computer Systems, Luminosity (scattering theory), Large Hadron Collider, Computer science, Particle, High Energy Physics, Virtual Organization, Particle physics experiments, Computer Science Applications, Education
- Abstract
Indexed in Scopus. The next phase of LHC operations, High Luminosity LHC (HL-LHC), which is aimed at a ten-fold increase in the luminosity of proton-proton collisions at an energy of 14 TeV, is expected to start operation in 2027-2028 and will deliver an unprecedented scientific data volume of multi-exabyte scale. This amount of data has to be stored, and the corresponding storage system must ensure fast and reliable data delivery for processing by scientific groups distributed all over the world. The present LHC computing and data processing model will not be able to provide the required infrastructure growth, even taking into account the expected evolution of hardware technology. To address this challenge, new state-of-the-art computing infrastructure technologies are now being developed and are presented here. The possibilities of applying the HL-LHC distributed data handling techniques to other particle and astro-particle physics experiments dealing with large-scale data volumes, such as DUNE, LSST, Belle II, JUNO, and SKAO, are also discussed. © Published under licence by IOP Publishing Ltd. https://iopscience-iop-org.recursosbiblioteca.unab.cl/article/10.1088/1742-6596/1690/1/012166
- Published
- 2020