10 results on "Todd D. Halter"
Search Results
2. A Multi-Tier Provenance Model for Global Climate Research.
- Author
- Eric G. Stephan, Todd D. Halter, Tara Gibson, Nathaniel Beagley, and Karen Schuchardt
- Published
- 2009
3. Using Domain Requirements to Achieve Science-Oriented Provenance.
- Author
- Eric G. Stephan, Todd D. Halter, Terence Critchlow, Paulo Pinheiro da Silva, and Leonardo Salayandia
- Published
- 2010
4. Applications in Data-Intensive Computing.
- Author
- Anuj R. Shah, Joshua N. Adkins, Douglas J. Baxter, William R. Cannon, Daniel G. Chavarría-Miranda, Sutanay Choudhury, Ian Gorton, Deborah K. Gracio, Todd D. Halter, Navdeep Jaitly, John R. Johnson, Richard T. Kouzes, Matthew C. Macduff, Andrés Márquez, Matthew E. Monroe, Christopher S. Oehmen, William A. Pike, Chad Scherrer, Oreste Villa, Bobbie-Jo M. Webb-Robertson, Paul D. Whitney, and Nino Zuljevic
- Published
- 2010
5. An Overview of ARM Program Climate Research Facility Data Quality Assurance
- Author
- C. P. Bahrmann, Sigurd W. Christensen, Justin Monroe, D. L. Sisterson, J. H. Mather, Kenneth E. Kehoe, M. C. Macduff, K. L. Nitschke, N. N. Keck, Robin C. Perez, K. J. Doty, Karen L. Sonntag, Raymond A. McCord, R. C. Eagan, Mark D. Ivey, David D. Turner, Chuck A. Long, B. D. Perkins, Sean T. Moore, Richard Wagener, J. C. Liljegren, Scott J. Richardson, B. W. Orr, Todd D. Halter, Randy A. Peppler, and Jimmy W. Voyles
- Subjects
- Data stream, Atmospheric Science, Data processing, Data collection, Operations research, Computer science, business.industry, Weather and climate, Data set, Software deployment, Data quality, Systems engineering, business, Quality assurance
- Abstract
We present an overview of key aspects of the Atmospheric Radiation Measurement (ARM) Program Climate Research Facility (ACRF) data quality assurance program. Processes described include instrument deployment and calibration; instrument and facility maintenance; data collection and processing infrastructure; data stream inspection and assessment; problem reporting, review and resolution; data archival, display and distribution; data stream reprocessing; engineering and operations management; and the roles of value-added data processing and targeted field campaigns in specifying data quality and characterizing field measurements. The paper also includes a discussion of recent directions in ACRF data quality assurance. A comprehensive, end-to-end data quality assurance program is essential for producing a high-quality data set from measurements made by automated weather and climate networks. The processes developed during the ARM Program offer a possible framework for use by other instrumentation- and geographically-diverse data collection networks and highlight the myriad aspects that go into producing research-quality data.
- Published
- 2008
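Entry 5 above describes data stream inspection and assessment as one layer of the ACRF quality assurance program. As a rough illustration of that kind of automated check, the sketch below applies threshold and rate-of-change tests to a time series and records the results as bitwise flags; the variable, the limit values, and the flag meanings are invented for the example and are not ARM's actual configuration or tooling.

```python
import numpy as np

def qc_flags(values, valid_min, valid_max, max_delta):
    """Flag each sample of a 1-D time series: bit 1 = below valid_min,
    bit 2 = above valid_max, bit 4 = jump from the previous sample exceeds
    max_delta. Bits are combined so several tests can trip on one sample."""
    values = np.asarray(values, dtype=float)
    flags = np.zeros(values.shape, dtype=int)
    flags[values < valid_min] |= 1
    flags[values > valid_max] |= 2
    delta = np.abs(np.diff(values, prepend=values[0]))
    flags[delta > max_delta] |= 4
    return flags

# Hypothetical 1-minute temperature stream with one spike and one dropout value.
temps = [21.3, 21.4, 21.5, 45.0, 21.6, -99.0, 21.7]
print(qc_flags(temps, valid_min=-40.0, valid_max=40.0, max_delta=5.0))
# -> [0 0 0 6 4 5 4]
```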
6. Applications in Data-Intensive Computing
- Author
- William R. Cannon, Matthew E. Monroe, Sutanay Choudhury, Andres Marquez, Todd D. Halter, Chad Scherrer, Ian Gorton, Matthew C. Macduff, Paul D. Whitney, Joshua N. Adkins, Navdeep Jaitly, William A. Pike, Deborah K. Gracio, Daniel Chavarría-Miranda, Anuj R. Shah, Christopher S. Oehmen, Douglas J. Baxter, Nino Zuljevic, Bobbie-Jo M. Webb-Robertson, Oreste Villa, Richard T. Kouzes, and John R. Johnson
- Subjects
- Data processing, Data collection, Software, Analytics, business.industry, Computer science, Scale (chemistry), Volume (computing), Data-intensive computing, Instrumentation (computer programming), business, Data science
- Abstract
The total quantity of digital information in the world is growing at an alarming rate. Scientists and engineers are contributing heavily to this data “tsunami” by gathering data using computing and instrumentation at incredible rates. As data volumes and complexity grow, it is increasingly arduous to extract valuable information from the data and derive knowledge from that data. Addressing these demands of ever-growing data volumes and complexity requires game-changing advances in software, hardware, and algorithms. Solution technologies also must scale to handle the increased data collection and processing rates and simultaneously accelerate timely and effective analysis results. This need for ever faster data processing and manipulation as well as algorithms that scale to high-volume data sets has given birth to a new paradigm or discipline known as “data-intensive computing.” In this chapter, we define data-intensive computing, identify the challenges of massive data, outline solutions for hardware, software, and analytics, and discuss a number of applications in the areas of biology, cyber security, and atmospheric research.
- Published
- 2010
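Entry 6 frames data-intensive computing around software and algorithms that keep pace with data volume by scaling out rather than loading everything at once. The toy sketch below shows the map-then-combine pattern that idea usually reduces to: each chunk is reduced to a small partial result in parallel, and the partials are combined afterwards. The chunk source and worker count are placeholders, not anything taken from the chapter.

```python
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def partial_stats(chunk):
    """Reduce one chunk to (count, sum) so chunks can be combined later."""
    a = np.asarray(chunk, dtype=float)
    return a.size, a.sum()

def mean_of_chunks(chunks, workers=4):
    """Global mean without holding the full data set in one process: map a
    reduction over the chunks in parallel, then fold the partial results."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        parts = list(pool.map(partial_stats, chunks))
    n = sum(count for count, _ in parts)
    total = sum(s for _, s in parts)
    return total / n

if __name__ == "__main__":
    # Stand-in for chunks streamed from disk or a distributed store.
    chunks = [np.random.default_rng(i).normal(size=1_000_000) for i in range(8)]
    print(mean_of_chunks(chunks))
```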
7. Leveraging the Open Provenance Model as a Multi-tier Model for Global Climate Research
- Author
- Todd D. Halter, Brian Ermold, and Eric G. Stephan
- Subjects
- Atmospheric radiation, Provenance, Database, Data stream mining, Global climate, Computer science, Component (UML), Construct (python library), Multi tier, computer.software_genre, Sensor fusion, Data science, computer
- Abstract
Global climate researchers rely upon many forms of sensor data and analytical methods to help profile subtle changes in climate conditions. The U.S. Department of Energy’s Atmospheric Radiation Measurement (ARM) program provides researchers with a collection of curated Value Added Products (VAPs) resulting from continuous sensor data streams, data fusion, and modeling. We are leveraging the Open Provenance Model as a foundational construct that serves the needs of both the VAP producers and consumers. We are organizing the provenance in different tiers of granularity to model VAP lineage, causality at the component level within a VAP, and the causality for each time step as samples are being assembled within the VAP. This paper shares our implementation strategy and how the ARM operations staff and the climate research community can greatly benefit from this approach to more effectively assess and quantify VAP provenance.
- Published
- 2010
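Entry 7 organizes provenance into tiers of granularity on top of the Open Provenance Model (OPM), whose graph relates artifacts and processes through edges such as used and wasGeneratedBy. The sketch below shows one minimal way tier-tagged OPM nodes and edges could be represented so that questions at either granularity reduce to filtering by tier; the tier numbering and the datastream and component names are made up for illustration, not the paper's schema.

```python
from dataclasses import dataclass, field

@dataclass
class OPMGraph:
    """Tiny in-memory stand-in for a tier-annotated OPM graph."""
    nodes: dict = field(default_factory=dict)   # id -> (kind, tier)
    edges: list = field(default_factory=list)   # (label, from_id, to_id, tier)

    def artifact(self, node_id, tier):
        self.nodes[node_id] = ("Artifact", tier)

    def process(self, node_id, tier):
        self.nodes[node_id] = ("Process", tier)

    def used(self, process_id, artifact_id, tier):
        self.edges.append(("used", process_id, artifact_id, tier))

    def was_generated_by(self, artifact_id, process_id, tier):
        self.edges.append(("wasGeneratedBy", artifact_id, process_id, tier))

g = OPMGraph()
# Tier 1 (VAP lineage): which input data stream a VAP run consumed and produced.
g.artifact("met.b1", tier=1)
g.process("vap_run_2010_06_01", tier=1)
g.artifact("vap.c1", tier=1)
g.used("vap_run_2010_06_01", "met.b1", tier=1)
g.was_generated_by("vap.c1", "vap_run_2010_06_01", tier=1)
# Tier 2 (component level): causality between components inside that VAP run.
g.process("interpolate_step", tier=2)
g.artifact("interpolated_temps", tier=2)
g.was_generated_by("interpolated_temps", "interpolate_step", tier=2)

print([e for e in g.edges if e[3] == 1])   # lineage-level view only
```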
8. Using Domain Requirements to Achieve Science-Oriented Provenance
- Author
- Terence Critchlow, Paulo Pinheiro da Silva, Eric G. Stephan, Todd D. Halter, and Leonardo Salayandia
- Subjects
- World Wide Web, Provenance, Workflow, Computer science, Aggregate (data warehouse), Information system, Context (language use), Product (category theory), Data science, Filter (software), Domain (software engineering)
- Abstract
The US Department of Energy (DOE) Atmospheric Radiation Measurement Program (ARM) is adopting the use of formalized provenance to support observational data products produced by ARM operations and relied upon by researchers. Because of the diversity of needs in the climate community, provenance will need to be conveyed in a domain-oriented context. This paper explores a use case where semantic abstract workflows (SAW) are employed as a means to filter, aggregate, and contextually describe the historical events responsible for the ARM data product the scientist is relying upon.
- Published
- 2010
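Entry 8 relies on semantic abstract workflows to filter, aggregate, and contextually describe provenance events in terms a domain scientist recognizes. The sketch below is only a schematic of that filter-aggregate-describe idea: the event fields and the mapping from low-level pipeline steps to domain-level activities are invented for illustration and are not the paper's SAW machinery.

```python
from collections import defaultdict

# Hypothetical mapping from pipeline step names to domain-level activities.
DOMAIN_VIEW = {
    "read_netcdf": "ingest raw measurements",
    "apply_qc_limits": "screen bad samples",
    "interpolate_gaps": "screen bad samples",
    "compute_vap_field": "derive the data product",
}

def summarize(events, product_id):
    """Keep only events for one data product and roll them up by domain activity."""
    summary = defaultdict(int)
    for e in events:                              # e: {"product", "step", "n_samples"}
        if e["product"] != product_id:
            continue                              # filter: other products are noise here
        activity = DOMAIN_VIEW.get(e["step"], "other processing")
        summary[activity] += e["n_samples"]       # aggregate within the domain term
    return dict(summary)

events = [
    {"product": "vap.c1", "step": "read_netcdf", "n_samples": 1440},
    {"product": "vap.c1", "step": "apply_qc_limits", "n_samples": 37},
    {"product": "vap.c1", "step": "compute_vap_field", "n_samples": 1403},
    {"product": "other.b1", "step": "read_netcdf", "n_samples": 1440},
]
print(summarize(events, "vap.c1"))
```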
9. A Multi-Tier Provenance Model for Global Climate Research
- Author
- Tara Gibson, Todd D. Halter, Nathaniel Beagley, Eric G. Stephan, and Karen Schuchardt
- Subjects
- Database, business.industry, Computer science, Team software process, Software development, Climate change, computer.software_genre, Sensor fusion, Data science, Information system, Profiling (information science), Climate model, business, computer, Quality assurance
- Abstract
Global climate researchers rely upon many forms of sensor data and analytical methods to help profile subtle changes in climate conditions. The U.S. Department of Energy Atmospheric Radiation Measurement (ARM) program provides researchers with curated Value Added Products (VAPs) resulting from continuous instrumentation streams, data fusion, and analytical profiling. The ARM operational staff and software development teams (data producers) rely upon a number of techniques to ensure strict quality control (QC) and quality assurance (QA) standards are maintained. Climate researchers (data consumers) are highly interested in obtaining as much provenance evidence as possible to establish data trustworthiness. Currently, all the evidence is not easily attainable or identifiable without significant effort to extract and piece together information from configuration files, log files, codes, or status information on the ARM website. Our objective is to identify a provenance model that serves the needs of both the VAP producers and consumers. This paper shares our initial results: a comprehensive multi-tier provenance model. We describe how both ARM operations staff and the climate research community can greatly benefit from this approach to more effectively assess and quantify the data historical record.
- Published
- 2009
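Entry 9 notes that the provenance evidence consumers want is currently scattered across configuration files, log files, code, and status pages, and has to be pieced together by hand. A minimal sketch of that consolidation step, pulling whatever evidence exists in a run directory into one record per product, is shown below; the file names, directory layout, and field names are hypothetical, not ARM's actual layout.

```python
from pathlib import Path

def gather_evidence(product_id, run_dir):
    """Collect scattered provenance evidence for one product into a single record."""
    run_dir = Path(run_dir)
    record = {"product": product_id, "evidence": {}}
    cfg = run_dir / "vap.conf"                    # hypothetical configuration file
    if cfg.exists():
        record["evidence"]["configuration"] = cfg.read_text().strip()
    log = run_dir / "run.log"                     # hypothetical processing log
    if log.exists():
        # Keep lines that record processing decisions, not debug chatter.
        record["evidence"]["log_events"] = [
            line for line in log.read_text().splitlines()
            if "WARN" in line or "INFO" in line
        ]
    return record

# Usage against a hypothetical run directory:
# gather_evidence("vap.c1", "/data/vap_runs/2009-06-01")
```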
10. Quality Assurance of ARM Program Climate Research Facility Data
- Author
- M. C. Macduff, Kenneth E. Kehoe, J. H. Mather, Scott J. Richardson, J. C. Liljegren, D. L. Nitschke, C. P. Bahrmann, D. L. Sisterson, Jimmy W. Voyles, Todd D. Halter, Karen L. Sonntag, S. W. Christensen, R. C. Eagan, Mark D. Ivey, Randy A. Peppler, B. W. Orr, David D. Turner, N. N. Keck, R. C. Perez, Charles N. Long, D. J. Doty, Raymond A. McCord, Richard Wagener, B. D. Perkins, and Sean T. Moore
- Subjects
- Data processing, Engineering, Data collection, Database, business.industry, Data management plan, Usability, computer.software_genre, Data governance, Data quality, Data system, business, computer, Quality assurance
- Abstract
This report documents key aspects of the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ACRF) data quality assurance program as it existed in 2008. The performance of ACRF instruments, sites, and data systems is measured in terms of the availability, usability, and accessibility of the data to a user. First, the data must be available to users; that is, the data must be collected by instrument systems, processed, and delivered to a central repository in a timely manner. Second, the data must be usable; that is, the data must be inspected and deemed of sufficient quality for scientific research purposes, and data users must be able to readily tell where there are known problems in the data. Finally, the data must be accessible; that is, data users must be able to easily find, obtain, and work with the data from the central repository. The processes described in this report include instrument deployment and calibration; instrument and facility maintenance; data collection and processing infrastructure; data stream inspection and assessment; the roles of value-added data processing and field campaigns in specifying data quality and characterizing the basic measurement; data archival, display, and distribution; data stream reprocessing; and engineering and operations management processes and procedures. Future directions in ACRF data quality assurance also are presented.
- Published
- 2008
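Entry 10 measures ACRF performance in terms of data availability, usability, and accessibility. As a toy illustration of the first of those, the sketch below computes availability as the fraction of samples an instrument was expected to deliver in a day that actually reached the repository; the 1-minute cadence and the counting rule are assumptions for the example, not the report's actual definition.

```python
def availability(expected_per_day, received_timestamps):
    """Fraction of expected samples that were actually delivered (capped at 1.0)."""
    received = len(set(received_timestamps))      # ignore duplicate deliveries
    return min(received / expected_per_day, 1.0)

# A day of 1-minute data with a two-hour outage: 1440 expected, 1320 received.
timestamps = [f"2008-06-01T{h:02d}:{m:02d}" for h in range(24) for m in range(60)][:1320]
print(f"{availability(1440, timestamps):.1%}")    # -> 91.7%
```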