The Fermi-LAT Data Processing Pipeline
- Source :
- Computing in High Energy and Nuclear Physics (CHEP2012), May 2012, New York, United States
- Publication Year :
- 2012
- Publisher :
- HAL CCSD, 2012.
-
Abstract
- The Data Handling Pipeline ("Pipeline") was developed for the Large Area Telescope (LAT) on the Fermi Gamma-Ray Space Telescope (Fermi), launched in June 2008. Since launch it has fully automated the production of data-quality monitoring quantities, the reconstruction and routine analysis of all data received from the satellite, and the delivery of science products to the collaboration and the Fermi Science Support Center. It is also used heavily for production Monte Carlo tasks. In daily operation it receives a new data download every 3 hours and launches about 2000 jobs to process each download, typically completing the processing before the next download arrives. The need for manual intervention has been reduced to less than
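The processing cycle the abstract describes, one download fanned out into roughly 2000 independent jobs whose results are gathered before the next download arrives, can be sketched minimally as below. All names here (`process_chunk`, `process_download`, the chunk-result strings) are illustrative assumptions, not the Pipeline's actual API; the real system dispatches jobs to a batch farm rather than local threads.

```python
from concurrent.futures import ThreadPoolExecutor

# A new data download arrives from the satellite every 3 hours.
DOWNLOAD_INTERVAL_S = 3 * 3600

def process_chunk(chunk_id: int) -> str:
    # Stand-in for one reconstruction/analysis job on a slice of the
    # downloaded data; the real pipeline runs these on batch-farm nodes.
    return f"chunk-{chunk_id}-done"

def process_download(n_jobs: int = 2000, workers: int = 50) -> list:
    # Fan one download out into ~2000 independent jobs and collect
    # their results in submission order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_chunk, range(n_jobs)))

results = process_download()
```

The point of the fan-out is that the aggregate wall-clock time for one download stays well under `DOWNLOAD_INTERVAL_S`, so processing of each download finishes before the next one lands.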
- Subjects :
- [PHYS.ASTR.IM]Physics [physics]/Astrophysics [astro-ph]/Instrumentation and Methods for Astrophysics [astro-ph.IM]
- [INFO.INFO-DC]Computer Science [cs]/Distributed, Parallel, and Cluster Computing [cs.DC]
- [SDU.ASTR.IM]Sciences of the Universe [physics]/Astrophysics [astro-ph]/Instrumentation and Methods for Astrophysics [astro-ph.IM]
Details
- Language :
- English
- Database :
- OpenAIRE
- Journal :
- Computing in High Energy and Nuclear Physics (CHEP2012), May 2012, New York, United States
- Accession number :
- edsair.dedup.wf.001..486fb039d86bd2c9827c667b4edcc9c4