Evolution of ATLAS analysis workflows and tools for the HL-LHC era
- Authors
-
Alexei Klimentov, David Cameron, Andrés Pacheco Pages, Alessandra Forti, and David South
- Subjects
Data processing, computer network, Large Hadron Collider, ATLAS, Physics, Workflow, data management, performance, Computing and Computers, Particle Physics - Experiment
- Abstract
25th International Conference on Computing in High-Energy and Nuclear Physics (vCHEP 2021), Online, France, 17-21 May 2021; The European Physical Journal / Web of Conferences 251, 02002 (2021). doi:10.1051/epjconf/202125102002

The High Luminosity LHC (HL-LHC) project at CERN, which is expected to deliver a ten-fold increase in the luminosity of proton-proton collisions over the LHC, will start operation towards the end of this decade and will deliver an unprecedented scientific data volume at the multi-exabyte scale. This vast amount of data has to be processed and analysed, and the corresponding computing facilities must ensure fast and reliable data processing for physics analyses by scientific groups distributed all over the world. The present LHC computing model will not be able to provide the required infrastructure growth, even taking into account the expected evolution in hardware technology. To address this challenge, several novel methods for conducting end-user analysis are under evaluation by the ATLAS Collaboration. State-of-the-art workflow management technologies and tools to handle these methods within the existing distributed computing system are now being evaluated and developed. In addition, the evolution of computing facilities and its impact on ATLAS analysis workflows is being closely followed.

Published by EDP Sciences, Les Ulis
- Published
- 2021