13,224 results
Search Results
2. The Snowmass UHECR White Paper on Ultra-High-Energy Cosmic Rays
- Author
-
Schroeder Frank G., Coleman Alan, Eser Johannes, Mayotte Eric, Sarazin Fred, Soldin Dennis, and Venters Tonia M.
- Subjects
Physics ,QC1-999 - Abstract
This proceeding summarizes the talk given at the opening of the UHECR 2022 conference in L’Aquila on the whitepaper ‘Ultra-High-Energy Cosmic Rays: The Intersection of the Cosmic and Energy Frontiers’ [Astroparticle Physics 149 (2023) 102819 - arXiv:2205.05845] that has been prepared for the Snowmass survey in the USA. The whitepaper provides an overview of recent progress and open questions regarding the particle physics and astrophysics related to ultra-high-energy cosmic rays (UHECR) and outlines the connections between the particle and astrophysics aspects of cosmic rays. It also discusses what instrumentation is needed to address the major scientific questions in ultra-high-energy cosmic-ray physics. While the upgraded Pierre Auger Observatory and Telescope Array will remain the workhorses at the highest energies in the current decade, new experiments with significantly higher exposure are needed in the coming decade. Ground arrays featuring simultaneous detection of the position of the shower maximum and the size of the muonic component will enable particle astronomy by measuring the rigidity of individual events. They should be complemented by other detectors maximizing the total exposure. This can be achieved by a few next-generation experiments using the latest developments in detection and analysis techniques: GRAND as a ground-based radio array, and POEMMA as a space-borne stereo fluorescence telescope will feature complementary approaches to provide maximum exposure; IceCube-Gen2 with its surface array, and GCOS aim at increased statistics with high accuracy for particle physics and rigidity-based galactic and extra-galactic astrophysics. While designed to discover the astrophysical cosmic-ray sources at the highest energies, the same experiments also contribute to particle physics, e.g., by studying the muon puzzle in cosmic-ray air showers, and by their discovery potential for exciting new physics, such as certain Dark Matter candidates. With the full whitepaper available as a reference, this proceeding will briefly present the science cases of the experiments, highlighting their individual strengths and outlining how they complement each other.
- Published
- 2023
- Full Text
- View/download PDF
3. Evaluating High Impact Papers: Are We Missing Something?
- Author
-
Winkelman, Sherry, Rots, Arnold, and D'Abrusco, Raffaele
- Subjects
BIBLIOGRAPHICAL citations ,CITATION analysis ,RESEARCH management ,DATA analysis ,CITATION indexes - Abstract
Science papers with high citation rates are often used as an indication of the science impact of an observatory. These high-impact papers are presented as examples of the best science being done with an observatory's data. But is the number of citations by itself a good indicator of the scientific impact of a paper, and is paper impact a good indicator of the scientific impact of the observatory? In this paper we present results from a recent study of Chandra high-impact papers and suggest some alternative methods for identifying such papers. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
4. Improved Constraints on Mergers with SZ, Hydrodynamical simulations, Optical, and X-ray (ICM-SHOX): Paper II: Galaxy cluster sample overview.
- Author
-
Silich, E.M., Bellomi, E., Sayers, J., ZuHone, J., Chadayammuri, U., Golwala, S., Hughes, D., Montaña, A., Mroczkowski, T., Nagai, D., Sánchez, D., Stanford, S.A., Wilson, G., Zemcov, M., and Zitrin, A.
- Subjects
GALAXY clusters ,HYDRODYNAMICS ,DARK matter ,PLASMA gases ,ASTRONOMICAL observations - Abstract
Galaxy cluster mergers are representative of a wide range of physics, making them an excellent probe of the properties of dark matter and the ionized plasma of the intracluster medium. To date, most studies have focused on mergers occurring in the plane of the sky, where morphological features can be readily identified. To allow study of mergers with arbitrary orientation, we have assembled multi-probe data for the eight-cluster ICM-SHOX sample, sensitive to both morphology and line-of-sight velocity. The first ICM-SHOX paper [1] provided an overview of our methodology applied to one member of the sample, MACS J0018.5+1626, in order to constrain its merger geometry. That work resulted in an exciting new discovery of a velocity-space decoupling of its gas and dark matter distributions. In this work, we describe the availability and quality of multi-probe data for the full ICM-SHOX galaxy cluster sample. These datasets will form the observational basis of an upcoming full ICM-SHOX galaxy cluster sample analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. A Bibliometric Analysis of Observatory Publications 2011-2015.
- Author
-
Crabtree, Dennis
- Subjects
BIBLIOMETRICS ,STATISTICAL methods in information science ,BIBLIOGRAPHY ,CITATION analysis ,ARTICLE-level metrics - Abstract
Bibliometrics are increasingly used to measure the performance of individuals, institutions, and countries. Refereed publications are the primary output of modern observatories. In this paper, I use bibliometric techniques to examine the performance of astronomical observatories. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
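A common bibliometric technique used in studies like the one above is the h-index. As a minimal illustration (not taken from the paper), here is how it can be computed from a list of per-paper citation counts:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers with these citation counts give an h-index of 3.
print(h_index([10, 8, 5, 2, 1]))  # -> 3
```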
7. Toward Binding Database Interfaces with Scientific Papers
- Author
-
Michel Laurent, Acar Sinan, Landais Gilles, and Schaaff André
- Subjects
Physics ,QC1-999 - Abstract
Despite a large variety of facilities helping to either select or manipulate data from Web interfaces, it remains difficult to provide users with relevant scientific or technical annotations for those data. Introducing such content by hand into a Web interface is a tedious job with a risk of providing incomplete or inadequate content. To overcome this difficulty, we are exploring the possibility of using the names of exposed quantities to index a text corpus. This index can be used to show the most relevant text snippets in a given context. The full text can be displayed on user request, automatically scrolled down to that snippet. Our approach is based on the conversion of PDF papers into machine-readable files that are indexed by a search engine. Index entries are reported as PDF annotations that are used to control the display. This workflow has been tested on the IVOA standard corpus as a proof of concept. It has then been applied to the XMM-Newton user guides for our catalog interface. Finally, it has been adapted to find resources within portals exposing a variety of data collections.
- Published
- 2018
- Full Text
- View/download PDF
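The core indexing idea in the abstract above, mapping quantity names to text snippets, can be sketched with a toy inverted index (a hypothetical simplification, not the authors' actual pipeline):

```python
from collections import defaultdict

def build_index(snippets):
    """Map each word to the indices of the snippets where it occurs."""
    index = defaultdict(set)
    for i, text in enumerate(snippets):
        for word in text.lower().split():
            index[word.strip(".,()")].add(i)
    return index

snippets = [
    "The hardness ratio is defined from source counts in two bands.",
    "Proper motion values are given in mas/yr.",
]
index = build_index(snippets)
# Look up the quantity name "hardness" to find the relevant snippet(s).
print(sorted(index["hardness"]))  # -> [0]
```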
8. Progress of Astronomy in India: A Scientometric Study base on paper published during 1991 – 1995 and 2011 – 2015
- Author
-
Kumar Rai Vijay, Senger K.P. S., and Lohiya Rajesh K
- Subjects
Physics ,QC1-999 - Abstract
Astronomy is the oldest of the natural sciences. It is well known that the period covering about seven centuries, from Aryabhata to Bhaskaracharya II (c. 476-1150), was the golden age of Indian astronomy. In the present study, we identify research trends and the growth of knowledge in the field of astronomy research. The study is an essential tool to measure the scientific publications in the field. This study analyses the publications of astronomy research in India during 1991-1995 and 2011-2015. The study assesses how astronomy has progressed in India and how its impact is reflected in the Science Citation Index over the periods 1991-1995 and 2011-2015. The publication output has been analyzed using quantitative and qualitative indicators, such as progress in research article publishing, change in authorship patterns, citation patterns during 1991-1995 and 2011-2015, articles that received the most citations, and international collaboration with other countries. Further, the study investigated highly prolific authors and highly preferred journals. The authors are from one of the FORSA institutes; therefore, the study includes a separate ranking for FORSA institutes in terms of output as well as citations.
- Published
- 2018
- Full Text
- View/download PDF
10. Summary of papers on technology
- Author
-
Plaum Burkhard
- Subjects
Physics ,QC1-999 - Abstract
The contributions on technology are summarized.
- Published
- 2017
- Full Text
- View/download PDF
11. Use Cases of the ESO Telescope Bibliography.
- Author
-
Grothkopf, Uta, Meakins, Silvia, and Bordelon, Dominic
- Subjects
DATA management ,TELESCOPES ,INFORMATION sharing ,DATA analysis ,DATA binning - Abstract
ESO's mission is to enable front-line research in astrophysics by operating and maintaining a wide range of world-class telescopes and state-of-the-art instruments. Various key performance indicators are used to understand whether ESO is achieving these ambitious goals. In order to assist the ESO Management, the Library builds and maintains the Telescope Bibliography (telbib). telbib provides insights into the performance of ESO's facilities and helps to understand publishing trends among the user community. This paper will highlight some use cases and recently added enhancements. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
14. Design and Development of a Color Picker System to Integrate in POC Device Systems
- Author
-
Serafinelli Caterina, Fantoni Alessandro, Fernandes Miguel Tavares, Alegria Elisabete C.B.A., and Vieira Manuela
- Subjects
color sensor ,rgb display ,aunps ,plasmonic paper ,Physics ,QC1-999 - Abstract
Demand is increasing for miniaturized, user-friendly, automated, and portable sensing systems able to provide a fast and reliable response. In this context, colorimetric detection has emerged for its intrinsic advantages, such as simplicity and rapidity, but also because of the outstanding development of novel materials, such as plasmonic nanoparticles, and of new technologies. Here, the Color Picker system, a system reproducing in the ba has been developed and tested on a plasmonic paper. The aim is to provide a tool for colorimetric detection that can subsequently be integrated into next-generation diagnostic devices for real-world applications.
- Published
- 2024
- Full Text
- View/download PDF
16. Demystifying and defending diversity, equity, and inclusion.
- Author
-
Cochran, Geraldine
- Subjects
DIVERSITY & inclusion policies ,CULTURAL pluralism ,SOCIAL factors ,PHYSICISTS ,SOCIODEMOGRAPHIC factors - Abstract
The constructs of diversity, equity, and inclusion are complex, and the DEI movement evokes a variety of strikingly different feelings and responses. One of the challenges to diversity, equity, and inclusion initiatives is that people often have very different views on these endeavors. In this short paper, I offer a concise, global perspective on the conceptualization of diversity, equity, and inclusion that might provide a method for communicating across differences. Though the DEI movement and associated movements, such as the DEI-and-belongingness and DEI-and-access movements, are important, the length of this paper does not allow for a discussion of these topics. I conclude this paper with a defense of diversity, equity, and inclusion and a discussion of implications for the physics community. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Measuring Research Impact of Astronomers/Astrophysicists by using Astrophysics Data System Beta: A Powerful New Interface: A case study with Special Reference to Prof. Jayant V. Narlikar.
- Author
-
Sahu, Hemant Kumar and Singh, Surya Nath
- Subjects
BIBLIOMETRICS ,CITATION analysis ,SCIENTOMETRICS ,SCIENTIFIC method ,STATISTICAL methods in information science - Abstract
This paper highlights research qualitatively and quantitatively and presents a valuable overview of new citation-enhanced databases in the context of research evaluation for the productivity of Prof. Jayant V. Narlikar. He has a total of 472 research publications in the fields of Astronomy and Astrophysics (AA) published from 1961 to 2015. In addition, this paper provides an overview of the citation-enhanced databases, viz. Astrophysics Data System Beta, a powerful new interface for performing citation analysis. Previously, scientometrics had been used to measure the publication productivity of Prof. Jayant V. Narlikar using the Astrophysics Data System (ADS). The scope of this paper is limited to Astrophysics Data System Beta. The results indicate that most of his papers are published in peer-reviewed journals with the highest Impact Factors. The average number of publications per year is 8.74, with the maximum number of papers published during 1981-1990 and 1996-2000. The total number of citations for his publications is 3516, covering 665 of his papers published during 1961-1970. He had many research collaborations, specifically with Prof. F. Hoyle (87 papers), Prof. G. Burbidge (38 papers), Prof. N. Wickramasinghe (22 papers) and Prof. T. Padmanabhan (21 papers). [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
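As a quick consistency check of the average quoted above, using only the abstract's own numbers: 472 publications over the 54-year span from 1961 to 2015 indeed gives about 8.74 papers per year.

```python
publications = 472
years = 2015 - 1961  # 54-year span
print(round(publications / years, 2))  # -> 8.74
```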
19. The reactor pressure vessel surveillance program of the Belgian nuclear power plants.
- Author
-
Wagemans, Jan and Slosse, Nicolas
- Subjects
NUCLEAR power plants ,NUCLEAR reactors ,PRESSURE vessels ,RADIATION dosimetry ,NEUTRON flux - Abstract
This paper presents an overview of the Reactor Pressure Vessel surveillance program as it is applied to the seven Belgian Pressurised Water Reactors (PWRs). The first part of the paper recalls the original objectives of the surveillance program, detailing the number of capsules installed per reactor and the number retrieved so far, and discusses the types of dosimeters installed. The second part describes the experimental techniques applied to determine the neutron fluences, work performed in the reactor dosimetry laboratory at SCK CEN and based on qualified activity measurements. Finally, the paper ends by briefly presenting the calculation scheme developed at Tractebel and by providing a statistical analysis of the C/E values over the ~400 in-core dosimeters and 800 ex-core dosimeters that have been analysed to date. The method offers satisfactory results, with an average C/E over all dosimeters close to 1.0 and a standard deviation between 8% and 10%, depending on the dosimeter considered. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. DPA calculation in a stainless steel baffle of Chooz-A PWR using auto-dosimetry measurements.
- Author
-
Bourganel, Stéphane, Galia, Antonio, Hure, Jérémy, Bonzom, Rémy, Domain, Christophe, and Gosmain, Cécile-Aline
- Subjects
RADIATION dosimetry ,STAINLESS steel ,NUCLEAR reactor cores ,NUCLEAR reactors ,NUCLEAR engineering - Abstract
This paper presents a method dedicated to the calculation of DPA (displacements per atom) in a stainless steel element irradiated inside the Chooz-A reactor core. The method is based on experimental results obtained in samples taken from the analyzed stainless steel element, combined with calculation results. The experimental results consist of the mass of ⁵⁹Co and the activity of ⁶⁰Co, both measured in 2018, i.e. 27 years after the final shutdown of the Chooz-A reactor. Calculations are carried out with the TRIPOLI-4® 3D Monte Carlo particle transport code and the DARWIN/PEPIN2 depletion code, with a simplified modelling of the Chooz-A reactor. As explained in this paper, these simplifications are corrected by the use of measurement results. Due to a lack of information about the irradiated steel element (the initial ⁵⁹Co quantity in particular), an iterative process was developed and used. It allows missing information to be evaluated and corrected DPA results to be calculated for the samples. In addition, stress tests were carried out to check the robustness of the process and the uniqueness of the DPA result for each sample. This work was carried out by CEA/SERMA with the financial support of EDF. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
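An activity measured 27 years after shutdown must be decay-corrected back to end of irradiation before it can constrain the initial cobalt inventory. A minimal sketch of that standard correction (illustrative only, not the CEA workflow; the ⁶⁰Co half-life of 5.27 years is the only assumed input):

```python
import math

T_HALF_CO60 = 5.27   # years, 60Co half-life
T_COOLING = 27.0     # years between final shutdown (1991) and measurement (2018)

def activity_at_shutdown(measured_activity_bq):
    """Back-correct a measured 60Co activity for the cooling-time decay."""
    decay_factor = math.exp(-math.log(2) * T_COOLING / T_HALF_CO60)
    return measured_activity_bq / decay_factor

# A 1 kBq measurement in 2018 corresponds to roughly 35 kBq at shutdown.
print(round(activity_at_shutdown(1000.0)))
```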
21. The Manhattan Project and the Development of Nuclear Astrophysics.
- Author
-
Wiescher, Michael
- Subjects
NUCLEAR astrophysics ,NUCLEAR reactions ,NUCLEAR weapons testing ,PROJECT management - Abstract
This paper will provide a historical analysis of the impact of the US Manhattan Project from 1942 to 1945 and the subsequent nuclear test program 1945-1970 towards the development of the field of Nuclear Astrophysics and the interpretation of nuclear reaction processes in stars and explosive stellar environments. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. Growing a Bibliography.
- Author
-
Winkelman, Sherry, Rots, Arnold, D'Abrusco, Raffaele, Becker, Glenn, Thong, Sinh, and McCollough, Michael
- Subjects
DIGITAL libraries ,BIBLIOGRAPHY ,ACCESS to information ,ELECTRONIC information resources ,BIBLIOGRAPHICAL searching - Abstract
The Chandra Data Archive (CDA) has been tracking publications based on Chandra observations in journals and on-line conference proceedings since early in the mission. Our goals are two-fold: 1) provide a means for Chandra users to search literature on Chandra-related papers to further their scientific research; and 2) provide a means for measuring the science produced from Chandra data. Over the years the database and its associated tools have expanded dramatically. In this paper I will give a history of the development of the bibliography with a focus on the human capital involved, along with the skill sets and management structures developed which allow us to maintain a very rich and extensive bibliography with a limited number of full time employees (FTEs). I will also cover how the diverse metadata collected has made the Chandra bibliography an essential resource in managing the Chandra X-ray Center. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center. It depends critically on the services provided by the ADS. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
23. The spray measurement with two different optical methods.
- Author
-
Huněk, Adam and Bartoš, Ondřej
- Subjects
OPTICAL measurements ,ATOMIZATION ,TWO-phase flow ,NOZZLES ,OPTICAL diffraction - Abstract
The aim of this paper is to compare two optical measurement methods for measuring the atomization properties of a two-phase nozzle. The measured liquid is water at room temperature, and the atomization fluid is compressed air. The first method is the Phase Doppler Anemometer (PDA) and the second is Laser Diffraction (LD). Two commercial instruments were tested, together with an in-house photogrammetric measurement of the nozzle outlet. The tested droplets have a Sauter Mean Diameter of approximately 10 µm. The presented results could be useful for estimating the advantages and disadvantages of the methods and for recommending applications in the wide field of aerosol technology. The first results of the comparison show the fine resolution of the PDA system, but tuning its setup requires several times more time. The accuracy of the LD is sufficient. An important advantage of the PDA system is knowledge of the velocity distribution for each measured droplet. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
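For context on the quantity compared above: the Sauter Mean Diameter D32 is the diameter of a droplet with the same volume-to-surface ratio as the whole spray, D32 = Σdᵢ³ / Σdᵢ². A small illustrative computation (not the instruments' software, and with made-up droplet sizes):

```python
def sauter_mean_diameter(diameters_um):
    """D32 = sum(d^3) / sum(d^2): the volume-to-surface mean droplet diameter."""
    return sum(d**3 for d in diameters_um) / sum(d**2 for d in diameters_um)

# Toy droplet sample in micrometres; note how larger droplets dominate D32.
print(round(sauter_mean_diameter([5.0, 8.0, 10.0, 12.0, 15.0]), 2))  # -> 12.08
```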
24. The Second-Factor Authentication System at CERN.
- Author
-
Ahmad, Adeel, Aguado Corman, Asier, Short, Hannah, Valsan, Liviu, Fava, Maria, Tedesco, Paolo, Lopienski, Sebastian, Lueders, Stefan, and Brillault, Vincent
- Subjects
PHISHING ,ARCHITECTURE ,COMPUTER users ,COMPUTER literacy ,DATA flow computing - Abstract
In 2022, CERN ran its annual simulated phishing campaign in which 2000 users gave away their passwords. In a real phishing incident, this would have meant 2000 compromised accounts, unless they were protected by Two-Factor Authentication (2FA). In the same year, CERN introduced 2FA for accounts with access to critical services. The new login flow requires users to always authenticate with a 2FA token, either with Time-based one-time password (TOTP) or WebAuthn. This introduces a significant security improvement for the individual and for the laboratory. The previous flow enforced 2FA to access a small number of applications. In this paper, we will discuss the rationale behind the 2FA deployment, as well as the technical setup of 2FA in the CERN Single Sign-On system, Keycloak. The paper will give a detailed overview of the architecture for this new 2FA flow and compare how it differs from the legacy 2FA system which was in place since 2019. We share statistics on how users are responding to this change in the login flow, and the actions we have taken to improve the user experience. Finally, we briefly describe our custom extensions to Keycloak for specific use cases, which include adding roles in the user token, overriding the default Keycloak session, and modifying the user login flow. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
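To make the TOTP half of the login flow described above concrete, here is a minimal sketch of the standard RFC 6238 mechanism using the pyotp library (a generic illustration, not CERN's Keycloak code):

```python
import pyotp

# The shared secret is provisioned once, e.g. via a QR code scanned into
# the user's authenticator app.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)  # 6 digits, 30-second time step by default

code = totp.now()          # what the user's authenticator app displays
print(totp.verify(code))   # -> True: what the server checks at login time
```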
25. Multicore workflow characterisation methodology for payloads running in the ALICE Grid.
- Author
-
Bertran Ferrer, Marta, Grigoras, Costin, and Badia, Rosa M.
- Subjects
WORKFLOW ,METHODOLOGY ,MOTHERBOARDS ,ROCKET payloads ,MONTE Carlo method - Abstract
For LHC Run 3 the ALICE experiment software stack has been completely refactored, incorporating support for multicore job execution. Whereas in both LHC Run 1 and 2 the Grid jobs were single-process and made use of a single CPU core, the new multicore jobs spawn multiple processes and threads within the payload. Some of these multicore jobs deploy a high number of short-lived processes, on the order of more than a dozen per second. The overhead of starting so many processes impacts the overall CPU utilization of the payloads, in particular its System component. Furthermore, the short-lived processes were not correctly accounted for by the monitoring system of the experiment. This paper presents the newly developed methodology for supervising payload execution. We also present a black-box analysis of the new multicore experiment software framework, tracing the resources used and the system function calls issued by Monte Carlo simulation jobs. Multiple sources of overhead in the lifecycle of processes and threads have thus been identified. This paper describes how the source of each was traced and what solutions were implemented to address them. These improvements have impacted the resource consumption and the overall turnaround time of these payloads, with a notable 35% reduction in execution time for a reference production job. We also introduce how this methodology will be used to further improve the efficiency of our experiment software and what other optimization avenues are currently under research. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
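One black-box way to observe payload process churn of the kind described above is to poll the process table; a minimal sketch with psutil (illustrative, not ALICE's monitoring code):

```python
import time
import psutil

def count_new_processes(interval_s=1.0):
    """Count processes that appear during one polling interval."""
    before = {p.pid for p in psutil.process_iter()}
    time.sleep(interval_s)
    after = {p.pid for p in psutil.process_iter()}
    return len(after - before)

# Short-lived processes that start *and* exit within the interval are missed,
# which is exactly why per-second process spawning is hard to account for
# by polling and calls for tracing instead.
print(count_new_processes())
```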
26. The HSF Conditions Database Reference Implementation.
- Author
-
Mashinistov, Ruslan, Gerlach, Lino, Laycock, Paul, Formica, Andrea, Govi, Giacomo, and Pinkenburg, Chris
- Subjects
DATABASES ,COMPUTING platforms ,COMPUTER architecture ,METADATA ,REDUNDANCY in engineering - Abstract
Conditions data is the subset of non-event data that is necessary to process event data. It poses a unique set of challenges, namely a heterogeneous structure and high access rates by distributed computing. The HSF Conditions Databases activity is a forum for cross-experiment discussions inviting as broad a participation as possible. It grew out of the HSF Community White Paper work to study conditions data access, where experts from ATLAS, Belle II, and CMS converged on a common language and proposed a schema that represents best practice. Following discussions with a broader community, including NP as well as HEP experiments, a core set of use cases, functionality and behaviour was defined with the aim to describe a core conditions database API. This paper will describe the reference implementation of both the conditions database service and the client which together encapsulate HSF best practice conditions data handling. Django was chosen for the service implementation, which uses an ORM instead of the direct use of SQL for all but one method. The simple relational database schema to organise conditions data is implemented in PostgreSQL. The task of storing conditions data payloads themselves is outsourced to any POSIX-compliant filesystem, allowing for transparent relocation and redundancy. Crucially this design provides a clear separation between retrieving the metadata describing which conditions data are needed for a data processing job, and retrieving the actual payloads from storage. The service deployment using Helm on OKD will be described together with scaling tests and operations experience from the sPHENIX experiment running more than 25k cores at BNL. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
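A minimal sketch of what a relational conditions-metadata schema can look like in Django's ORM, reflecting the paper's separation of metadata from payload storage (a hypothetical reduction; the actual HSF reference schema is defined by the working group, not here):

```python
from django.db import models

class GlobalTag(models.Model):
    """A named, consistent set of conditions for a processing campaign."""
    name = models.CharField(max_length=255, unique=True)

class PayloadIOV(models.Model):
    """Metadata for one payload and its interval of validity (IOV)."""
    global_tag = models.ForeignKey(GlobalTag, on_delete=models.CASCADE)
    payload_url = models.CharField(max_length=1024)  # file on a POSIX store
    iov_start = models.BigIntegerField()             # e.g. run number or timestamp
    iov_end = models.BigIntegerField()

# Queries resolve *which* payload is valid for a job; retrieving the payload
# bytes themselves is delegated to the filesystem, as the abstract describes.
```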
27. Molten Salt Research Reactor (MSRR) shielding analysis using SCALE/MAVRIC sequence with continuous energy and multigroup cross sections.
- Author
-
Petrovic, Bojan, Hirji, Rakim, and Carberry, Kyle
- Subjects
MOLTEN salt reactors ,RADIATION shielding ,NUCLEAR reactors ,NUCLEAR engineering ,ANALYSIS of variance - Abstract
It is necessary to verify (and validate, when experimental data are available) shielding codes and methodologies for Molten Salt Reactors (MSRs). This paper examines the impact of using fine and coarse multigroup cross-section libraries vs. the reference continuous-energy data. The Molten Salt Research Reactor (MSRR) being developed by the NEXT Research Alliance (NEXTRA) is used as a testbed. Shielding analyses are performed in support of MSRR development using the MAVRIC sequence of the SCALE 6.2.4 code package. The sequence uses deterministic forward and adjoint radiation transport calculations with the FW-CADIS methodology to generate variance reduction parameters that are used by MONACO, a fixed-source Monte Carlo code. Originally, MONACO was developed as a multigroup code, and 200-group and 27-group neutron shielding-focused libraries were provided as part of the package. While the 200-group library provides adequate accuracy for many applications, it is highly desirable to confirm this for any specific analysis. In SCALE 6.2, continuous-energy capability was implemented in MONACO, thus enabling direct and consistent evaluation and quantification of the impact of using a multigroup rather than the reference continuous-energy library. This paper examines the impact of the multigroup approximation on the main radiation environment parameters in MSRR shielding analysis, including the fast flux and dpa. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
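For context, the multigroup approximation being tested against continuous-energy data condenses the pointwise cross section into group constants by flux weighting over each group's energy interval (a standard textbook relation, not specific to SCALE):

```latex
\sigma_g \;=\; \frac{\int_{E_g}^{E_{g-1}} \sigma(E)\,\phi(E)\,\mathrm{d}E}
                    {\int_{E_g}^{E_{g-1}} \phi(E)\,\mathrm{d}E}
```

The accuracy of σ_g therefore depends on how well the weighting spectrum φ(E) matches the actual problem, which is why confirming the 200-group library against continuous-energy results for a specific analysis is worthwhile.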
28. Light Meson Decays at BESIII.
- Author
-
in der Wiesche, Nikolai
- Subjects
MESON decay ,PHYSICS experiments ,PAIR production ,ELECTROMAGNETIC interactions ,FIELD theory (Physics) - Abstract
The 10 billion J/ψ decays collected with the BESIII experiment offer a unique opportunity to investigate the decays of η and η′ mesons produced in the radiative J/ψ → γη(′) transitions. Using this clean production mechanism, the BESIII experiment is making important contributions to precision studies of the strong and electromagnetic interactions in η(′) decays. Three papers are presented in which the decays η → π⁺π⁻π⁰, η → π⁰π⁰π⁰, η′ → ηπ⁰π⁰π⁰, and η′ → π⁺π⁻e⁺e⁻ are studied to test C and CP symmetry, as well as the predictions of different effective field theories. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
29. Characterization of a new reduced-height CALORRE differential calorimeter for CALOR-I Irradiation in MITR.
- Author
-
Volte, A., Hauptman, S., Carpenter, D., Carette, M., Lyoussi, A., Kohse, G., and Reynard-Carette, C.
- Subjects
CALORIMETRY ,IRRADIATION ,METROLOGY ,LABORATORIES ,MEASUREMENT - Abstract
This paper presents the preliminary characterization of a new reduced-height CALORRE differential calorimeter designed and fabricated for irradiation in the MITR within the framework of the CALOR-I program. The paper begins by focusing on the preparation of the irradiation campaign, providing a concise description of the MITR core and the newly fabricated differential calorimeter assembly. Next, the paper presents the preliminary experimental characterization conducted under laboratory conditions. This section includes a detailed presentation of the updated experimental set-up and the key metrological characteristics of the sensor response obtained from the first experimental results. Furthermore, the paper estimates and highlights the responses of the calorimeter calculated by means of a 3-D numerical thermal model under real conditions, considering local heat sources determined by the NRL of MIT using the MCNP code. In conclusion, the paper offers some final remarks and prospects for realizing the irradiation campaign successfully. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
30. A novel smart rad-hard fast detection system for Radioactive Ion Beam Tagging and Diagnostics.
- Author
-
Acosta, Luis, Altana, Carmen, Cardella, Giuseppe, Castoldi, Andrea, Costa, Michele, De Filippo, Enrico, Geraci, Elena, Gnoffo, Brunilde, Guazzoni, Chiara, Maiolino, Cettina, Simona Martorana, Nunzia, Naggi, Andrea, Pagano, Angelo, Pagano, Emanuele Vincenzo, Pirrone, Sara, Politi, Giuseppe, Risitano, Fabio, Rizzo, Francesca, Russo, Antonio Domenico, and Russotto, Paolo
- Subjects
ION beams ,RADIOACTIVE substances ,ELECTRONICS ,ENERGY dissipation ,RADIO frequency - Abstract
Radioactive Ion Beams (RIBs) of large intensity (10⁶ pps or higher) are at the frontier of nuclear physics. We designed a novel detection system for RIB diagnostics and tagging based on Silicon Carbide detectors and on custom front-end electronics ready to be coupled with a Real Data Management Unit. The full detection system is designed to measure the spatial distribution of the beam intensity and trajectory with sufficient spatial resolution (of the order of 1-2 mm). In addition, the detection system has to determine the RIB composition, which can be obtained from the joint measurement of the energy loss (ΔE) of the ions passing through the sensors and the time of flight between two sensors or with respect to a given reference signal, such as the RadioFrequency signal of a Cyclotron. In this paper we present the full design of the proposed system together with the results of the first experimental qualification of the first mini-prototype. The paper also shows the steps towards the final detection system, housed in a DN160 spherical cross and able to cover an active area of 30 mm × 60 mm. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
31. Shedding light on light-flavour-particle production in small systems at the LHC with ALICE.
- Author
-
Ercolessi, Francesca
- Subjects
LARGE Hadron Collider ,HEAVY ion collisions ,QUARK-gluon plasma ,HADRONS ,PLASMA physics - Abstract
The measurement of light-flavour-particle production in small collision systems at the LHC has shown features that resemble phenomena seen in heavy-ion collisions. The historical signatures of quark–gluon plasma (QGP) formation, such as collective flow and the enhanced production of strange hadrons, were also observed in high-multiplicity proton–proton (pp) and proton–lead (p–Pb) collisions. In this article, new results on light-flavour-particle production measured in high-multiplicity triggered events are presented, reaching charged-particle multiplicities typical of semi-peripheral Pb–Pb collisions. In addition, this paper presents the first Run 3 results on the production of π, K, p, and Ω multi-strange baryons in pp collisions at √s = 13.6 TeV and √s = 900 GeV, the highest and the lowest collision energies at the LHC. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Integrating LHCb Offline Workflows on Supercomputers State of Practice.
- Author
-
Boyer, Alexandre F., Stagni, Federico, Haen, Christophe, Burr, Christopher, Romanovskiy, Vladimir, and Bozzi, Concezio
- Subjects
LARGE Hadron Collider ,SUPERCOMPUTERS ,CENTRAL processing units ,HIGH performance computing ,MONTE Carlo method - Abstract
To better understand the experimental conditions and performance of its experiment, the LHCb collaboration executes tens of thousands of loosely coupled and CPU-intensive Monte Carlo simulation workflows per hour. To meet the increasing LHC computing needs, funding agencies encourage the collaboration to exploit High-Performance Computing resources, and more specifically supercomputers, which offer a significant additional amount of computing resources but also come with higher integration challenges. This state-of-practice paper outlines years of integration of LHCb simulation workflows on several supercomputers. The main contributions of this paper are: (i) an extensive description of the gap to address to run High-Energy Physics Monte Carlo simulation workflows on supercomputers; (ii) various methods and proposals to maximize the use of allocated CPU resources; (iii) a comprehensive analysis of LHCb production workflows running on diverse supercomputers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Transformers for Generalized Fast Shower Simulation.
- Author
-
Raikwar, Piyush, Cardoso, Renato, Chernyavskaya, Nadezda, Jaruskova, Kristina, Pokorski, Witold, Salamani, Dalila, Srivatsa, Mudhakar, Tsolaki, Kalliopi, Vallecorsa, Sofia, and Zaborowska, Anna
- Subjects
COMPUTER simulation ,INTERPOLATION ,EXTRAPOLATION ,PARTICLE detectors ,CALORIMETERS - Abstract
Recently, transformer-based foundation models have proven to be a generalized architecture applicable to various data modalities, ranging from text to audio and even a combination of multiple modalities. Transformers by design should accurately model the non-trivial structure of particle showers thanks to the absence of strong inductive bias, better modeling of long-range dependencies, and interpolation and extrapolation capabilities. In this paper, we explore a transformer-based generative model for detector-agnostic fast shower simulation, where the goal is to generate synthetic particle showers, i.e., the energy depositions in the calorimeter. When trained with an adequate amount and variety of showers, these models should learn better representations compared to other deep learning models, and hence should quickly adapt to new detectors. In this work, we will show the prototype of a transformer-based generative model for fast shower simulation, as well as explore certain aspects of transformer architecture such as input data representation, sequence formation, and the learning mechanism for our unconventional shower data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
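As a rough illustration of the kind of architecture discussed above (hypothetical shapes and hyperparameters, not the prototype from the paper), a transformer encoder over a sequence of embedded calorimeter energy depositions in PyTorch:

```python
import torch
import torch.nn as nn

# Each shower is a sequence of "hit" tokens embedded in d_model dimensions.
d_model, n_hits, batch = 64, 128, 8
encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=3)

hits = torch.randn(batch, n_hits, d_model)  # embedded energy depositions
features = encoder(hits)                    # contextualized hit representations
print(features.shape)                       # -> torch.Size([8, 128, 64])
```

The absence of a strong inductive bias mentioned in the abstract corresponds to the fact that attention layers impose no fixed geometric structure on the hit sequence.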
34. Baler - Machine Learning Based Compression of Scientific Data.
- Author
-
Bengtsson Folkesson, Fritjof, Doglioni, Caterina, Ekman, Per Alexander, Gallén, Axel, Jawahar, Pratik, Camps Santasmasas, Marta, and Skidmore, Nicola
- Subjects
MACHINE learning ,DATA compression ,INFORMATION retrieval ,INFORMATION sharing ,ARTIFICIAL intelligence - Abstract
A common and growing issue in scientific research and industry is that of storing and sharing ever-increasing datasets. In this paper we document the development and applications of Baler - a Machine Learning based tool for tailored compression of data across multiple disciplines. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
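Baler's approach is built around training an autoencoder whose bottleneck serves as the compressed representation. A minimal generic sketch of that idea in PyTorch (hypothetical layer sizes, not Baler's code):

```python
import torch
import torch.nn as nn

class TinyAutoencoder(nn.Module):
    """Compress 100 input features into an 8-dimensional latent code."""
    def __init__(self, n_in=100, n_latent=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, 32), nn.ReLU(), nn.Linear(32, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 32), nn.ReLU(), nn.Linear(32, n_in))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinyAutoencoder()
x = torch.randn(16, 100)
loss = nn.functional.mse_loss(model(x), x)  # training minimizes reconstruction error
print(loss.item())
```

Only the latent codes (and the trained decoder) need to be stored, which is what makes the compression lossy but tailored to the dataset.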
35. End-to-end deep learning inference with CMSSW via ONNX using Docker.
- Author
-
Chaudhari, Purva, Chaudhari, Shravan, Chudasama, Ruchi, and Gleyzer, Sergei
- Subjects
PARTICLE physics ,PARTICLE detectors ,DEEP learning ,COMPACT muon solenoid experiment ,GRAPHICS processing units - Abstract
Deep learning techniques have been proven to provide excellent performance for a variety of high-energy physics applications, such as particle identification, event reconstruction and trigger operations. Recently, we developed an end-to-end deep learning approach to identify various particles using low-level detector information from high-energy collisions. These models will be incorporated in the CMS software framework (CMSSW) to enable their use for particle reconstruction or for trigger operation in real time. Incorporating these computational tools in the experimental framework presents new challenges. This paper reports an implementation of the end-to-end deep learning inference with the CMS software framework. The inference has been implemented on GPU for faster computation using ONNX. We have benchmarked the ONNX inference with GPU and CPU using NERSC's Perlmutter cluster by building a Docker image of the CMS software framework. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
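The inference path described above can be reproduced generically with the onnxruntime package; a minimal sketch (hypothetical model file and input shape, not the CMSSW integration):

```python
import numpy as np
import onnxruntime as ort

# Falls back to CPU if no GPU build of onnxruntime is available.
session = ort.InferenceSession(
    "model.onnx",  # hypothetical exported model
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
batch = np.random.rand(1, 3, 125, 125).astype(np.float32)  # toy detector image
outputs = session.run(None, {input_name: batch})           # list of output arrays
print(outputs[0].shape)
```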
36. Software Citation in HEP: Current State and Recommendations for the Future.
- Author
-
Feickert, Matthew, Katz, Daniel S., Neubauer, Mark S., Sexton-Kennedy, Elizabeth, and Stewart, Graeme A.
- Subjects
PARTICLE physics ,LARGE Hadron Collider ,COMPACT muon solenoid experiment ,OPEN source software ,SPECIAL relativity experiments - Abstract
In November 2022, the HEP Software Foundation and the Institute for Research and Innovation for Software in High-Energy Physics organized a workshop on the topic of Software Citation and Recognition in HEP. The goal of the workshop was to bring together different types of stakeholders whose roles relate to software citation, and the associated credit it provides, in order to engage the community in a discussion on: the ways HEP experiments handle citation of software, recognition for software efforts that enable physics results disseminated to the public, and how the scholarly publishing ecosystem supports these activities. Reports were given from the publication board leadership of the ATLAS, CMS, and LHCb experiments and HEP open source software community organizations (ROOT, Scikit-HEP, MCnet), and perspectives were given from publishers (Elsevier, JOSS) and related tool providers (INSPIRE, Zenodo). This paper summarizes key findings and recommendations from the workshop as presented at the 26th International Conference on Computing in High Energy and Nuclear Physics (CHEP 2023). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. Facilitating the preservation of LHCb Analyses with APD.
- Author
-
Burr, Chris, Couturier, Ben, and O'Neil, Ryunosuke
- Subjects
LARGE Hadron Collider ,PARTICLE physics ,PARTICLE detectors ,ELECTRONIC data processing ,PYTHON programming language - Abstract
High Energy Physics experiments at the Large Hadron Collider generate petabytes of data per year that go through multiple transformations before final analysis and paper publication. Recording the provenance of these data is therefore crucial to maintain the quality of the final results. While tools are in place within LHCb to keep this information for the common experiment-wide transforms, analysts have had to implement their own solutions for the steps dealing with ntuples. This gap between centralised and interactive processing can become problematic. In order to facilitate the task, ntuples extracted by LHCb analysts via so-called "Analysis Productions" are tracked in the experiment bookkeeping database and can be enriched with extra information about their meaning and intended use. This information can then be used to access ntuples more easily: a set of Python tools allows querying of ntuple file locations with associated metadata and integrates their processing within analysis workflows. The tools are designed with the intention of ensuring analysis code continues to be functional into the future and are robust against evolutions in how data is accessed. This paper presents the integration of these new tools into the LHCb codebase and demonstrates how they will be used in LHCb data processing and analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
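A sketch of how such metadata-driven ntuple lookup can feel to an analyst (a hypothetical interface loosely modelled on the paper's description, not the actual APD API; all names and paths below are invented):

```python
def get_ntuple_paths(catalogue, working_group, analysis, **tags):
    """Return file locations whose metadata match all requested tags."""
    return [
        entry["path"]
        for entry in catalogue[(working_group, analysis)]
        if all(entry["tags"].get(key) == value for key, value in tags.items())
    ]

catalogue = {  # hypothetical bookkeeping extract
    ("charm", "d02kk"): [
        {"path": "root://eos/.../magdown_2018.root", "tags": {"polarity": "magdown", "year": 2018}},
        {"path": "root://eos/.../magup_2018.root", "tags": {"polarity": "magup", "year": 2018}},
    ],
}
print(get_ntuple_paths(catalogue, "charm", "d02kk", polarity="magdown"))
```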
38. BABAR's Experience with the Preservation of Data and Analysis Capabilities.
- Author
-
Ebert, Marcus, Roney, Michael, and Sobie, Randall
- Subjects
ELECTRON-positron interactions ,GRID computing ,COMPUTER files ,DATA analysis - Abstract
The BABAR experiment collected electron-positron collisions at the SLAC National Accelerator Laboratory (SLAC) from 1999 to 2008. Although data taking stopped 15 years ago, the collaboration is still actively doing data analyses, publishing results, and giving presentations at international conferences. Special considerations were needed to do analyses using a computing environment that was developed decades ago. A framework is required that preserves the data, data access, and the capability of doing analyses in a well-defined and preserved environment. In addition, BABAR's support by SLAC ended at the beginning of 2021. Fortunately, the High Energy Physics Research Computing group at the University of Victoria (UVic), Canada, offered to provide the new home for the main BABAR computing infrastructure; the Grid Computing Centre Karlsruhe offered to host all data for access by analyses running at UVic; and CERN and the IN2P3 Computing Centre offered to store a backup of all data. This paper presents what was done at BABAR to preserve the data and analysis capabilities and what was needed to move the whole computing infrastructure, including collaboration tools and data files, away from SLAC. It is shown how BABAR preserved the ability to continue doing data analyses and to keep a working collaboration-tools infrastructure. The paper describes BABAR's experience with such a big change in its infrastructure and what was learned from it, which may be useful to other experiments interested in long-term analysis support and data preservation in general. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Overcoming obstacles to IPv6 on WLCG.
- Author
-
Babik, Marian, Bly, Martin, Buraglio, Nick, Chown, Tim, Christidis, Dimitrios, Chudoba, Jiri, DeMar, Phil, Flix Molina, José, Grigoras, Costin, Hoeft, Bruno, Ito, Hiro, Kelsey, David, Martelli, Edoardo, McKee, Shawn, Misa Moreira, Carmen, Nandakumar, Raja, Ohrenberg, Kars, Prelz, Francesco, Rand, Duncan, and Sciabà, Andrea
- Subjects
INTERNET protocol version 6 ,LARGE Hadron Collider ,INFORMATION sharing ,GRID computing ,INFORMATION retrieval - Abstract
The transition of the Worldwide Large Hadron Collider Computing Grid (WLCG) storage services to dual-stack IPv6/IPv4 is almost complete; all Tier-1 and 94% of Tier-2 storage are IPv6 enabled. While most data transfers now use IPv6, a significant number of IPv4 transfers still occur even when both endpoints support IPv6. This paper presents the ongoing efforts of the HEPiX IPv6 working group to steer WLCG toward IPv6-only services by investigating and fixing the obstacles to the use of IPv6 and identifying cases where IPv4 is used when IPv6 is available. Removing IPv4 use is essential for the long-term agreed goal of IPv6-only access to resources within WLCG, thus eliminating the complexity and security concerns associated with dual-stack services. We present our achievements and ongoing challenges as we navigate the final stages of the transition from IPv4 to IPv6 within WLCG. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
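One simple way to spot the kind of avoidable IPv4 use described above is to check whether an endpoint resolves over IPv6 at all; a small generic check (illustrative, not the working group's tooling):

```python
import socket

def has_ipv6(host, port=443):
    """True if DNS returns at least one IPv6 (AAAA-derived) address for host:port."""
    try:
        infos = socket.getaddrinfo(host, port, socket.AF_INET6, socket.SOCK_STREAM)
        return len(infos) > 0
    except socket.gaierror:
        return False

print(has_ipv6("www.cern.ch"))
```

If this returns True for both endpoints yet a transfer still runs over IPv4, the cause lies in the application or middleware layer, which is the class of obstacle the paper investigates.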
40. Making Likelihood Calculations Fast: Automatic Differentiation Applied to RooFit.
- Author
-
Singh, Garima, Rembser, Jonas, Moneta, Lorenzo, Lange, David, and Vassilev, Vassil
- Subjects
NUCLEAR physics ,STATISTICAL models ,PERFORMANCE evaluation ,BENCHMARK testing (Engineering) ,FINANCIAL leverage - Abstract
With the growing datasets of current and next-generation High-Energy and Nuclear Physics (HEP/NP) experiments, statistical analysis has become more computationally demanding. These increasing demands elicit improvements and modernizations in existing statistical analysis software. One way to address these issues is to improve parameter estimation performance and numeric stability using Automatic Differentiation (AD). AD's computational efficiency and accuracy are superior to the preexisting numerical differentiation techniques, and it offers significant performance gains when calculating the derivatives of functions with a large number of inputs, making it particularly appealing for statistical models with many parameters. For such models, many HEP/NP experiments use RooFit, a toolkit for statistical modeling and fitting that is part of ROOT. In this paper, we report on the effort to support the AD of RooFit likelihood functions. Our approach is to extend RooFit with a tool that generates overhead-free C++ code for a full likelihood function built from RooFit functional models. Gradients are then generated using Clad, a compiler-based source-code-transformation AD tool, using this C++ code. We present our results from applying AD to the entire minimization pipeline and profile likelihood calculations of several RooFit and HistFactory models at the LHC-experiment scale. We show significant reductions in calculation time and memory usage for the minimization of such likelihood functions. We also elaborate on this approach's current limitations and explain our plans for the future. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
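The core idea, differentiating a likelihood exactly rather than numerically, can be shown in a few lines. Here is an analogy in JAX on a toy Gaussian negative log-likelihood (an illustration of AD in general, not the RooFit/Clad machinery, which operates on generated C++):

```python
import jax
import jax.numpy as jnp

data = jnp.array([4.9, 5.2, 5.1, 4.8, 5.0])

def nll(params):
    """Negative log-likelihood of i.i.d. Gaussian data (constants dropped)."""
    mu, sigma = params
    return jnp.sum(jnp.log(sigma) + 0.5 * ((data - mu) / sigma) ** 2)

grad_nll = jax.grad(nll)  # exact derivatives via automatic differentiation
print(grad_nll(jnp.array([5.0, 0.2])))  # gradient w.r.t. (mu, sigma)
```

A minimizer fed with exact gradients needs no finite-difference evaluations per parameter, which is where the speedups for many-parameter models come from.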
41. Iterative and incremental development of the ATLAS Publication Tracking System.
- Author
-
Loureiro Cruz, Ana Clara, Niklaus Moreira Da Rocha Rodrigues, Carolina, de Aragão Aleksandravicius, Gabriel, Lemos Lúcidi Pinhão, Gabriela, Goes Afonso, Pedro Henrique, Coura Torres, Rodrigo, and Seixas, José Manoel
- Subjects
WORKFLOW ,COMPUTER software ,DATABASES ,COMPUTER files ,PROJECT management - Abstract
The ATLAS experiment is a particle physics experiment situated at the Large Hadron Collider (LHC) at CERN. It involves almost 6000 members from approximately 300 institutes spread all over the globe and more than 100 papers published every year. This dynamic environment brings some challenges such as how to ensure publication deadlines, communication between the groups involved, and the continuity of workflows. The solution found for those challenges was automation, which was achieved through the Glance project, more specifically through the Glance Analysis systems, developed to support the analysis and publications life cycle in 2011. Now, after twelve years, in order to satisfy the experiments' most recent needs, the systems need code refactoring and database remodelling. The goal is to have only one system to accommodate all the analysis and publications workflows, the so-called ATLAS Publication Tracking system, an evolution of the current Analysis systems. This project includes a database remodelling that reflects the hierarchical relation between analyses and publications; a code base that supports non-standard workflows; the expansion of the current API so all the authorized ATLAS members can access ATLAS publication data programmatically; a service-oriented architecture for integration with external software, such as GitLab; the creation of an automatic test environment, which assures the quality of the systems on each update. The ATLAS Publication Tracking system is a long-term project being developed with an iterative and incremental approach, which ensures that the most valuable tools are implemented with priority while allowing a smooth transition between the old systems and the new one. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. JUNO distributed computing system.
- Author
-
Zhang, Xiaomei
- Subjects
NEUTRINOS ,NEUTRONS ,DISTRIBUTED computing ,DATA management ,COMPUTER networks - Abstract
The Jiangmen Underground Neutrino Observatory (JUNO) [1] is a multipurpose neutrino experiment whose primary physics goal is the determination of the neutrino mass hierarchy. JUNO is going to start data taking in 2024 and plans to use a distributed computing infrastructure for its data processing and analysis tasks. The JUNO distributed computing system has been designed and built based on DIRAC [2]. Since last year, the official Monte Carlo (MC) production has been running on the system, and petabytes of MC data have been shared among the JUNO data centers through it. In this paper, an overview of the JUNO distributed computing system is presented, including the workload management system, data management, and the condition data access system. Moreover, the progress in adapting the system to support token-based AAI [3] and HTTP-TPC [4] is reported. Finally, the paper describes the preparations for the upcoming JUNO data-taking. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
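The abstract above names DIRAC as the system's foundation; the sketch below shows what submitting a production-style job through DIRAC's Python API looks like. The job name, steering script, and sandbox contents are hypothetical placeholders, and a configured DIRAC client with a valid proxy is assumed.

```python
# Minimal DIRAC job-submission sketch (hypothetical job; assumes a
# configured DIRAC client environment and a valid grid proxy).
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)   # standard DIRAC initialization

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName("juno_mc_example")               # hypothetical job name
job.setExecutable("run_detsim.sh")           # hypothetical MC steering script
job.setInputSandbox(["run_detsim.sh"])       # ship the script with the job
job.setOutputSandbox(["std.out", "std.err"]) # retrieve logs afterwards

result = Dirac().submitJob(job)
if result["OK"]:
    print("Submitted job", result["Value"])  # DIRAC job ID
else:
    print("Submission failed:", result["Message"])
```

In a production system like JUNO's, such submissions are driven by the workload management layer rather than by hand, but the underlying API pattern is the same.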
43. Streaming Readout and Data-Stream Processing With ERSAP.
- Author
-
Gyurjyan, Vardan, Abbott, David, Goodrich, Michael, Heyes, Graham, Jastrzembski, Ed, Lawrence, David, Raydo, Benjamin, and Timmer, Carl
- Subjects
EXPONENTIAL functions ,ELECTRONIC data processing ,COMPUTER programming ,CALORIMETERS ,CALORIMETRY - Abstract
With the exponential growth in the volume and complexity of data generated at high-energy physics and nuclear physics research facilities, there is an imperative demand for innovative strategies to process these data in real or near-real time. Given the surge in the requirement for high-performance computing, it becomes pivotal to reassess how well current data processing architectures can integrate new technologies and manage streaming data. This paper introduces the ERSAP framework, a modern solution that combines flow-based programming with the reactive actor model, paving the way for distributed, reactive, high-performance data-stream processing applications. Additionally, we unveil a novel algorithm focused on time-based clustering and event identification in data streams. The efficacy of this approach is further exemplified through the data-stream processing results obtained from recent beam tests of the EIC prototype calorimeter at DESY. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
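The time-based clustering idea mentioned in the abstract above is easy to show in miniature. The sketch below is an illustrative reimplementation in Python, not ERSAP's actual code (ERSAP is a Java framework); the `Hit` fields and the 50 ns coincidence window are invented. Time-ordered hits are grouped into an event candidate as long as the gap to the previous hit stays below the window.

```python
# Toy time-based clustering of a hit stream (illustrative, not ERSAP code).
from dataclasses import dataclass
from typing import Iterable, Iterator, List

@dataclass
class Hit:
    t_ns: float      # hit timestamp in nanoseconds
    channel: int     # readout channel id
    adc: int         # pulse amplitude

def cluster_by_time(hits: Iterable[Hit],
                    window_ns: float = 50.0) -> Iterator[List[Hit]]:
    """Group time-ordered hits separated by gaps < window_ns into events."""
    event: List[Hit] = []
    for hit in hits:
        if event and hit.t_ns - event[-1].t_ns > window_ns:
            yield event          # gap exceeded: close the current event
            event = []
        event.append(hit)
    if event:
        yield event              # flush the last open event

# Example: three hits form one event candidate, the fourth starts a new one.
stream = [Hit(0, 1, 120), Hit(12, 3, 80), Hit(40, 2, 200), Hit(500, 1, 95)]
for ev in cluster_by_time(stream):
    print([h.t_ns for h in ev])
```

Because the grouping depends only on the running gap, it works in a single pass over an unbounded stream, which is what makes this style of algorithm suitable for streaming readout.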
44. Using MQTT and Node-RED to monitor the ATLAS Meta-data Interface (AMI) stack and define metadata aggregation tasks in a pipelined way.
- Author
-
Odier, Jérôme, Lambert, Fabian, Fulachier, Jérôme, Jaume, Maxime, and Delsart, Pierre-Antoine
- Subjects
METADATA ,TELEMETRY ,GEODESY ,WORKFLOW ,ALGORITHMS - Abstract
ATLAS Metadata Interface (AMI) is a generic ecosystem for metadata aggregation, transformation and cataloging. Each sub-system of the stack has recently been improved in order to acquire messaging/telemetry capabilities. This paper describes the monitoring of the whole stack with the Message Queuing Telemetry Transport (MQTT) protocol and Node-RED, a tool for wiring together hardware and software devices. Finally, it shows how Node-RED is used to graphically define metadata aggregation tasks, in a pipelined way, without introducing any single point of failure. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
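For readers unfamiliar with MQTT, the publishing side of the telemetry described above can be sketched in a few lines with the paho-mqtt client. The broker host, topic, and payload schema below are hypothetical; the abstract does not specify the AMI stack's actual message format.

```python
# Minimal MQTT telemetry-publishing sketch (hypothetical broker/topic).
import json
import time
import paho.mqtt.client as mqtt

# paho-mqtt 1.x style constructor; with paho-mqtt >= 2.0 pass
# mqtt.CallbackAPIVersion.VERSION2 as the first argument.
client = mqtt.Client()
client.connect("mqtt.example.org", 1883)     # hypothetical broker
client.loop_start()                          # background network loop

payload = json.dumps({
    "service": "ami-task-server",            # hypothetical service name
    "status": "up",
    "timestamp": time.time(),
})
# QoS 1 = at-least-once delivery, a common choice for monitoring events.
info = client.publish("ami/telemetry/heartbeat", payload, qos=1)
info.wait_for_publish()                      # block until broker acknowledges

client.loop_stop()
client.disconnect()
```

A Node-RED flow would typically sit on the subscriber side of such topics, routing messages into dashboards or triggering aggregation tasks.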
45. A web workbench system for the Slurm cluster at IHEP.
- Author
-
Du, Ran, Shi, Jingyan, Jiang, Xiaowei, and Guo, Chaoqi
- Subjects
WORKBENCHES ,JASMINE ,DASHBOARDS (Management information systems) ,COLLEGE administrators ,CLUSTER analysis (Statistics) - Abstract
Slurm REST APIs have been available since version 20.02. With those REST APIs one can interact with the slurmctld and slurmdbd daemons in a RESTful way. As a result, job submission and cluster status queries can be achieved through a web system. To take advantage of the Slurm REST APIs, a web workbench system has been developed for the Slurm cluster at IHEP. The workbench system consists of four subsystems: dashboard, tomato, jasmine, and cosmos. The dashboard subsystem displays cluster status, including nodes and jobs. The tomato subsystem submits special HTCondor glidein jobs in the Slurm cluster. The jasmine subsystem generates and submits batch jobs based on workload parameters. The cosmos subsystem is an accounting system, which not only generates statistical charts but also provides REST APIs to query jobs. This paper presents the design and implementation details of the Slurm workbench. With the help of the workbench, administrators and researchers can get their work done effectively. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
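To illustrate the kind of interaction the abstract above describes, the sketch below queries job status from slurmrestd over HTTP. The host and token are placeholders, and the version segment in the path depends on the slurmrestd build; v0.0.38 is used here as an example.

```python
# Minimal Slurm REST API query sketch (hypothetical host; JWT required).
import requests

SLURMRESTD = "http://slurm-rest.example.org:6820"   # hypothetical endpoint
HEADERS = {
    "X-SLURM-USER-NAME": "alice",                   # authenticating user
    "X-SLURM-USER-TOKEN": "<jwt-token>",            # e.g. from `scontrol token`
}

# List current jobs; the versioned path follows the slurmrestd convention.
resp = requests.get(f"{SLURMRESTD}/slurm/v0.0.38/jobs", headers=HEADERS)
resp.raise_for_status()
for job in resp.json().get("jobs", []):
    print(job.get("job_id"), job.get("job_state"))
```

A web workbench like the one described wraps calls of this shape behind dashboards and forms, so users never issue raw requests themselves.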
46. Experimental characterization of the TCV dual-frequency gyrotron and validation of numerical codes including the effect of After Cavity Interaction.
- Author
-
Genoud, Jérémy, Alberti, Stefano, Hogge, Jean-Philippe, Avramidis, Konstantinos, Braunmüller, Falk, Bruschi, Alessandro, Bin, Wiliam, Dubray, Jérémie, Fasel, Damien, Gantenbein, Gerd, Garavaglia, Saul, Goodman, Timothy, Illy, Stefan, Jin, Jianbo, Legrand, François, Marchesin, Rodolphe, Pagonakis, Ioannis, Siravo, Ugo, and Toussaint, Matthieu
- Subjects
GYROTRONS ,MICROWAVE tubes ,RADIO frequency ,TOKAMAKS ,COMPUTER simulation - Abstract
A dual-frequency gyrotron has been developed within the context of the recent Tokamak à Configuration Variable (TCV) upgrade. The gyrotron is designed to generate a 1 MW, 2 s RF wave at 84 or 126 GHz. Before integrating the gyrotrons into the TCV tokamak ECRH system, an extensive characterization of their behaviour has been performed. This paper focuses on presenting the results of these experiments at the two operating frequencies. The power measurements are systematically compared with numerical simulations. This comparison highlights the validation of the numerical codes and the effect of After Cavity Interaction (ACI), a crucial factor that must be considered for achieving good agreement between theoretical predictions and experimental results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
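As a back-of-the-envelope check on the two design frequencies quoted above (standard physics, not taken from the paper): the magnetic field required for fundamental electron cyclotron resonance follows from f_ce = eB / (2π m_e).

```python
# Cavity field needed for fundamental cyclotron resonance (textbook formula).
import math

E_CHARGE = 1.602176634e-19     # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron rest mass, kg

def b_field_for_fce(f_hz: float) -> float:
    """Magnetic field (T) giving a fundamental cyclotron frequency f_hz."""
    return 2.0 * math.pi * f_hz * M_ELECTRON / E_CHARGE

for f_ghz in (84.0, 126.0):
    print(f"{f_ghz:.0f} GHz -> B = {b_field_for_fce(f_ghz * 1e9):.2f} T")
# ~3.0 T and ~4.5 T: roughly the cavity fields the gyrotron magnet must
# provide (the relativistic mass of the beam electrons shifts this slightly).
```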
47. Fabrication and assembly of the gyrotron multi-stage depressed collector prototype at KIT.
- Author
-
Ell, Benjamin, Wu, Chuanren, Feuerstein, Lukas, Gantenbein, Gerd, Illy, Stefan, Ruess, Tobias, Rzesnicki, Tomasz, Stanculovic, Sebastian, Thumm, Manfred, Weggen, Jörg, and Jelonnek, John
- Subjects
GYROTRONS ,ELECTRON beams ,MICROWAVE tubes ,ELECTRODES - Abstract
In this paper details of the fabrication and assembly of the first Multi-stage Depressed Collector (MDC) prototype developed at KIT for megawatt-class gyrotrons are presented. Utilizing the E×B drift concept for electron trajectory separation, the cylindrical Short-Pulse (SP) MDC prototype features a design compatible with applications in different fusion gyrotrons at KIT, for W7-X, ITER, and DEMO. The fabrication process includes inner electrodes out of copper-chromium-zirconium (CuCr1Zr) with a triple helix isolation design, modular vacuum housing consisting of four parts, and additional external coils for electron beam confinement. Detailed assembly procedures are provided for two configurations: the 170 GHz 2 MW coaxial cavity gyrotron and the W7-X upgrade SP gyrotron. The successful manufacturing of the modular collector and a robust design for vacuum tightness are demonstrated. The prototype is now primed for verification in the KIT FULGOR test stand with the W7-X gyrotron for validation of the E×B drift concept and improvement of the gyrotron efficiency. This comprehensive work bridges theoretical concepts with practical implementation, offering insights crucial for refining and advancing MDCs for megawatt-class gyrotrons for fusion applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
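The E×B separation concept named above rests on the guiding-center drift v_d = (E × B) / |B|². The snippet below evaluates this textbook formula for illustrative field values; the numbers are hypothetical, not the KIT design parameters.

```python
# E x B drift velocity for given field vectors (textbook formula;
# field magnitudes below are hypothetical illustration values).
import numpy as np

def exb_drift(e_field: np.ndarray, b_field: np.ndarray) -> np.ndarray:
    """Guiding-center drift velocity (m/s) for E (V/m) and B (T)."""
    return np.cross(e_field, b_field) / np.dot(b_field, b_field)

E = np.array([5.0e5, 0.0, 0.0])   # hypothetical transverse field, V/m
B = np.array([0.0, 0.0, 0.2])     # hypothetical axial field, T
print(exb_drift(E, B))            # -> [0, -2.5e6, 0] m/s, perpendicular to both
```

Because the drift is perpendicular to both fields and independent of charge sign and energy to leading order, it displaces the spent-beam electrons sideways onto separate electrode stages, which is the trajectory-separation mechanism the collector exploits.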
48. ITER ECH&CD Transmission Line Layout Development.
- Author
-
Wolfe, Zachary C., Kaufman, Michael C., Shanmugasundaram, Aravind, and Hanson, Gregory R.
- Subjects
ELECTRON cyclotron resonance heating ,ELECTRIC lines ,OPTICAL waveguides ,OPTICAL distortion - Abstract
A primary contributor to the effectiveness of the ITER electron cyclotron (EC) heating & current drive (H&CD) transmission line (TL) is the distortion of the waveguide, which has a strong, direct correlation with power transmission efficiency and electromagnetic mode purity. Two main sources of distortion are the deflection of the TL under operational loads and misalignment of the waveguide supports. To address these and other interdependent variables, the EC TL design followed a holistic method providing iterative, simultaneous development of the system and its microwave components. The process provided close, rapid interconnectivity between the prescribed subsystem requirements and the design activities being performed, including a detailed 3D configuration management model, a system structural finite element analysis, a thorough multi-sample Monte-Carlo system functional performance assessment, and individual microwave component designs and analyses. This paper provides an overview of the first of these: the design development process of the TL layout. This includes investigating the inputs into layout optimization, the overall integration process, the interconnectivity with the other design activities, and their impacts on TL subsystem performance that ultimately resulted in the current baseline layout. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
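The "multi-sample Monte-Carlo functional performance assessment" mentioned above follows a generic pattern: sample random misalignments, evaluate a performance figure, and aggregate the distribution. The toy sketch below shows only that pattern; the quadratic loss model and every number in it are hypothetical placeholders, not ITER design data.

```python
# Toy Monte-Carlo performance assessment (all numbers hypothetical).
import random

N_SUPPORTS = 40          # hypothetical number of waveguide supports
SIGMA_MM = 0.5           # hypothetical 1-sigma alignment error per support
LOSS_PER_MM2 = 2.0e-4    # hypothetical fractional power loss per mm^2 offset

def sample_efficiency() -> float:
    """Draw one random system realization and return its toy efficiency."""
    eff = 1.0
    for _ in range(N_SUPPORTS):
        offset = random.gauss(0.0, SIGMA_MM)          # misalignment draw
        eff *= 1.0 - LOSS_PER_MM2 * offset * offset   # toy quadratic loss
    return eff

samples = sorted(sample_efficiency() for _ in range(10_000))
print("median efficiency:", samples[len(samples) // 2])
print("5th percentile  :", samples[len(samples) // 20])
```

The value of such an assessment is the low percentile, not the mean: it tells the designers what transmission efficiency the layout still guarantees under unfavorable tolerance stack-ups.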
49. Towards absolutely calibrated ECE Michelson measurements in EC heated plasmas at W7-X.
- Author
-
Oosterbeek, Johan Willem, Stern, Mathias, Chaudhary, Neha, Arnaiz, J.F. Guerrero, Hirsch, Matthias, Kasparek, Walter, Lechte, Carsten, Plaum, Burkhard, Schmuck, Stefan, Stange, Torsten, Steffen, Matthias, and Wolf, R.C.
- Subjects
CYCLOTRON resonance ,MICHELSON interferometer ,POWER density ,NOTCH filters ,PLASMA gases - Abstract
A Michelson interferometer is in use at Wendelstein 7-X (W7-X) to probe the Electron Cyclotron Emission (ECE) spectrum [1], [2], [3]. During the past operational campaign (OP2.1), 2nd and 3rd harmonic ECE power density spectra have been routinely recorded in the presence of X2- and O2-mode Electron Cyclotron Resonance Heating (ECRH). However, the combination of the particular notch filter arrangement and high transmission line losses has thus far prevented overall calibration using a hot-source/cold-source exposure at the input antenna. As an alternative, the response of the individual components is measured and summed. While reasonable electron temperatures are obtained in X2-mode polarisation, interaction between front-end components is neglected and large error bars must be assumed. However, the information on the individual components, together with synthetic modelling and data from experiment (OP2.1), has been used to design a new front-end with improved S/N. This optimisation is discussed in this paper with a focus on notch filter selection, a new transmission line (Tx-line), and a novel combined quasi-optical taper / polarizer tuner. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
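For context on the harmonic bands mentioned above (standard physics, not from the paper): the cold-resonance nth-harmonic ECE frequency is f_n = n·eB / (2π m_e). Evaluating it at 2.5 T, the usual quoted W7-X on-axis field, places the 2nd and 3rd harmonics at the bands the instrument records.

```python
# Cold nth-harmonic electron cyclotron frequencies (textbook formula;
# 2.5 T is the commonly quoted W7-X on-axis field, used as an example).
import math

E_CHARGE = 1.602176634e-19     # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron rest mass, kg

def ece_harmonic_ghz(b_tesla: float, n: int) -> float:
    """Cold nth-harmonic electron cyclotron frequency in GHz."""
    return n * E_CHARGE * b_tesla / (2.0 * math.pi * M_ELECTRON) / 1e9

for n in (2, 3):
    print(f"n={n}: {ece_harmonic_ghz(2.5, n):.0f} GHz")
# n=2 -> ~140 GHz (the X2 heating/emission band), n=3 -> ~210 GHz
```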
50. KSTAR ECH System Development Progress.
- Author
-
Joung, Mi, Wang, Sonjong, Kim, Sunggug, Han, Jongwon, Rhee, Inhyuk, and Kwak, Jonggu
- Subjects
CYCLOTRON resonance ,GYROTRONS ,PLASMA confinement ,PHYSICS experiments ,MICROWAVE tubes - Abstract
The KSTAR ECH system has been upgraded to ensure steady-state advanced operation of KSTAR. The planned ECHs include a total of six 1 MW, 300 s ECH systems, of which four 105/140 GHz dual frequency systems have been commissioned and are being used in KSTAR experiments. Additionally, a 170 GHz Russian gyrotron has been commissioned and is ready for injection. The final unit, a multifrequency gyrotron similar to Japan's ITER gyrotron, is scheduled for installation and operation in early 2025. To achieve the KSTAR mission of high-performance long-pulse operation using a plasma operation scenario with a high poloidal beta, ECH requires stable and long pulses, which typically results in the reduction of ECH power during campaigns. Consequently, one ECH unit with power ranging from 0.5 to 0.7 MW was utilized in the KSTAR experiments. The ECH power and launching angle can be controlled by the Plasma Control System (PCS) which maximizes the ECH effect at startup and at the flat top of the discharge. This paper describes the development status of the KSTAR ECH system and reports the long-pulse operation results of the ECH system for a dummy load and KSTAR. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF