8 results for "neuroimaging pipelines"
Search Results
2. Running Neuroimaging Applications on Amazon Web Services: How, When, and at What Cost?
- Author
- Tara M. Madhyastha, Natalie Koh, Trevor K. M. Day, Moises Hernández-Fernández, Austin Kelley, Daniel J. Peterson, Sabreena Rajan, Karl A. Woelfer, Jonathan Wolf, and Thomas J. Grabowski
- Subjects
- cloud computing, neuroimaging pipelines, workflow, reproducibility, Neurosciences. Biological psychiatry. Neuropsychiatry (RC321-571)
- Abstract
The contribution of this paper is to identify and describe current best practices for using Amazon Web Services (AWS) to execute neuroimaging workflows “in the cloud.” Neuroimaging offers a vast set of techniques by which to interrogate the structure and function of the living brain. However, many of the scientists for whom neuroimaging is an extremely important tool have limited training in parallel computation. At the same time, the field is experiencing a surge in computational demands, driven by a combination of data-sharing efforts, improvements in scanner technology that allow acquisition of higher-resolution images, and the desire to use statistical techniques that stress processing requirements. Most neuroimaging workflows can be executed as independent parallel jobs and are therefore excellent candidates for running on AWS, but the overhead of learning to do so and determining whether it is worth the cost can be prohibitive. In this paper we describe how to identify neuroimaging workloads that are appropriate for running on AWS, how to benchmark execution time, and how to estimate the cost of running on AWS. By benchmarking common neuroimaging applications, we show that cloud computing can be a viable alternative to on-premises hardware. We present guidelines that neuroimaging labs can use to provide a cluster-on-demand type of service that should be familiar to users, and scripts to estimate cost and create such a cluster.
- Published
- 2017
3. Running Neuroimaging Applications on Amazon Web Services: How, When, and at What Cost?
- Author
- Madhyastha, Tara M., Koh, Natalie, Day, Trevor K. M., Hernández-Fernández, Moises, Kelley, Austin, Peterson, Daniel J., Rajan, Sabreena, Woelfer, Karl A., Wolf, Jonathan, and Grabowski, Thomas J.
- Subjects
- BRAIN imaging, CLOUD computing
- Abstract
The contribution of this paper is to identify and describe current best practices for using Amazon Web Services (AWS) to execute neuroimaging workflows "in the cloud." Neuroimaging offers a vast set of techniques by which to interrogate the structure and function of the living brain. However, many of the scientists for whom neuroimaging is an extremely important tool have limited training in parallel computation. At the same time, the field is experiencing a surge in computational demands, driven by a combination of data-sharing efforts, improvements in scanner technology that allow acquisition of higher-resolution images, and the desire to use statistical techniques that stress processing requirements. Most neuroimaging workflows can be executed as independent parallel jobs and are therefore excellent candidates for running on AWS, but the overhead of learning to do so and determining whether it is worth the cost can be prohibitive. In this paper we describe how to identify neuroimaging workloads that are appropriate for running on AWS, how to benchmark execution time, and how to estimate the cost of running on AWS. By benchmarking common neuroimaging applications, we show that cloud computing can be a viable alternative to on-premises hardware. We present guidelines that neuroimaging labs can use to provide a cluster-on-demand type of service that should be familiar to users, and scripts to estimate cost and create such a cluster.
- Published
- 2017
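The two AWS records above both note that the practical prerequisites are benchmarking execution time and estimating cost before committing a workload to the cloud. A minimal sketch of that cost arithmetic for an embarrassingly parallel workload follows; the job count, runtime, and hourly rate are invented for illustration, and it assumes simple per-started-hour billing rather than any particular AWS pricing model:

```python
import math

def estimate_cost(n_jobs, hours_per_job, hourly_rate, n_instances):
    """Rough cost and wall-clock estimate for n_jobs independent jobs
    spread across n_instances identical instances, billed per started hour."""
    waves = math.ceil(n_jobs / n_instances)      # sequential batches per instance
    wall_clock_h = waves * hours_per_job         # elapsed time for the whole run
    # Every active instance is billed for the full (rounded-up) wall clock.
    billed_hours = math.ceil(wall_clock_h) * min(n_jobs, n_instances)
    return billed_hours * hourly_rate, wall_clock_h

# Hypothetical workload: 100 subjects at 2 h each on 25 instances at $0.50/h.
cost, hours = estimate_cost(100, 2.0, 0.50, 25)
print(f"${cost:.2f} over {hours:.1f} h")   # $100.00 over 8.0 h
```

Varying `n_instances` in a sketch like this makes the basic cloud trade-off visible: more instances shorten wall-clock time while the billed total stays roughly flat, until per-hour rounding starts to dominate.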
4. Using Make for Reproducible and Parallel Neuroimaging Workflow and Quality Assurance
- Author
- Mary K. Askren, Trevor K. McAllister-Day, Natalie Koh, Zoe Mestre, Jennifer N. Dines, Benjamin A. Korman, Susan J. Melhorn, Daniel J. Peterson, Matthew Peverill, Xiaoyan Qin, Swati D. Rane, Melissa A. Reilly, Maya A. Reiter, Kelly A. Sambrook, Karl A. Woelfer, Thomas J. Grabowski, and Tara M. Madhyastha
- Subjects
- Neuroimaging methods, Quality Assurance, reproducibility, workflow, neuroimaging pipelines, Neurosciences. Biological psychiatry. Neuropsychiatry (RC321-571)
- Abstract
The contribution of this paper is to describe how we can program neuroimaging workflow using Make, a software development tool designed for describing how to build executables from source files. We show that we can achieve many of the features of more sophisticated neuroimaging pipeline systems, including reproducibility, parallelization, fault tolerance, and quality assurance reports. We suggest that Make represents a large step towards these features with only a modest increase in programming demands over shell scripts. This approach reduces the technical skill and time required to write, debug, and maintain neuroimaging workflows in a dynamic environment, where pipelines are often modified to accommodate new best practices or to study the effect of alternative preprocessing steps, and where the underlying packages change frequently. This paper has a comprehensive accompanying manual with lab practicals and examples (see Supplemental Materials) and all data, scripts and makefiles necessary to run the practicals and examples are available in the makepipelines project at NITRC.
- Published
- 2016
5. Using Make for Reproducible and Parallel Neuroimaging Workflow and Quality Assurance.
- Author
- Askren, Mary K., McAllister-Day, Trevor K., Koh, Natalie, Mestre, Zoé, Dines, Jennifer N., Korman, Benjamin A., Melhorn, Susan J., Peterson, Daniel J., Peverill, Matthew, Qin, Xiaoyan, Rane, Swati D., Reilly, Melissa A., Reiter, Maya A., Sambrook, Kelly A., Woelfer, Karl A., Grabowski, Thomas J., Madhyastha, Tara M., Smith, David V., and Denker, Michael
- Subjects
- BRAIN imaging, QUALITY assurance, MATHEMATICAL programming
- Abstract
The contribution of this paper is to describe how we can program neuroimaging workflow using Make, a software development tool designed for describing how to build executables from source files. A makefile (or a file of instructions for Make) consists of a set of rules that create or update target files if they have not been modified since their dependencies were last modified. These rules are processed to create a directed acyclic dependency graph that allows multiple entry points from which to execute the workflow. We show that using Make we can achieve many of the features of more sophisticated neuroimaging pipeline systems, including reproducibility, parallelization, fault tolerance, and quality assurance reports. We suggest that Make permits a large step toward these features with only a modest increase in programming demands over shell scripts. This approach reduces the technical skill and time required to write, debug, and maintain neuroimaging workflows in a dynamic environment, where pipelines are often modified to accommodate new best practices or to study the effect of alternative preprocessing steps, and where the underlying packages change frequently. This paper has a comprehensive accompanying manual with lab practicals and examples (see Supplemental Materials) and all data, scripts, and makefiles necessary to run the practicals and examples are available in the "makepipelines" project at NITRC.
- Published
- 2016
6. Using Make for Reproducible and Parallel Neuroimaging Workflow and Quality Assurance
- Author
- Trevor K. McAllister-Day, Melissa A. Reilly, Daniel J. Peterson, Maya A. Reiter, Tara M. Madhyastha, Matthew Peverill, Mary K. Askren, Xiaoyan Qin, Zoe Mestre, Kelly A. Sambrook, Benjamin A. Korman, Jennifer N. Dines, Swati Rane, Karl A. Woelfer, Natalie Koh, Susan J. Melhorn, and Thomas J. Grabowski
- Subjects
- Computer science, Biomedical Engineering, Neuroscience (miscellaneous), Computer Science Applications, neuroimaging pipelines, workflow, quality assurance, reproducibility, Neuroimaging methods, Software development, Shell script, Scripting language, Executable, Debugging, Fault tolerance, Pipeline (software), Software engineering, Neurosciences. Biological psychiatry. Neuropsychiatry (RC321-571)
- Abstract
The contribution of this paper is to describe how we can program neuroimaging workflow using Make, a software development tool designed for describing how to build executables from source files. A makefile (or a file of instructions for Make) consists of a set of rules that create or update target files if they have not been modified since their dependencies were last modified. These rules are processed to create a directed acyclic dependency graph that allows multiple entry points from which to execute the workflow. We show that using Make we can achieve many of the features of more sophisticated neuroimaging pipeline systems, including reproducibility, parallelization, fault tolerance, and quality assurance reports. We suggest that Make permits a large step toward these features with only a modest increase in programming demands over shell scripts. This approach reduces the technical skill and time required to write, debug, and maintain neuroimaging workflows in a dynamic environment, where pipelines are often modified to accommodate new best practices or to study the effect of alternative preprocessing steps, and where the underlying packages change frequently. This paper has a comprehensive accompanying manual with lab practicals and examples (see Supplemental Materials) and all data, scripts, and makefiles necessary to run the practicals and examples are available in the "makepipelines" project at NITRC.
- Published
- 2016
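The Make-based records above all describe the same mechanism: each processing step is a rule that rebuilds its target only when a dependency is newer, and the resulting dependency graph can be run in parallel. A minimal sketch of that pattern follows; the subject names and the skull-stripping recipe are hypothetical examples, not files or rules from the actual makepipelines project:

```make
# Each rule names a target, its dependencies, and a recipe.
# Make rebuilds a target only if a dependency is newer, so an
# interrupted run resumes where it left off (fault tolerance).

SUBJECTS := sub01 sub02 sub03

all: $(SUBJECTS:%=%_brain.nii.gz)

# Pattern rule: produce a skull-stripped brain from each T1 image
# ($< is the dependency, $@ the target; bet is FSL's brain extractor).
%_brain.nii.gz: %_T1.nii.gz
	bet $< $@
```

Running `make -j 4` walks the dependency graph and executes up to four independent recipes concurrently, which is the parallelization the abstracts refer to.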
7. A Framework To Evaluate Pipeline Reproducibility Across Operating Systems
- Subjects
- Big Data, Neuroinformatics, Neuroimaging Pipelines, Neuroimaging, Reproducibility, Human Connectome Project
8. The effect of Computational Environments on Big Data Processing Pipelines in Neuroimaging
- Subjects
- Numerical quantification, Computing environment, Neuroimaging pipelines, Computational reproducibility, Monte Carlo arithmetic
Discovery Service for Jio Institute Digital Library