79 results for "Bert Debusschere"
Search Results
2. Co-design Center for Exascale Machine Learning Technologies (ExaLearn)
- Author
-
Shinjae Yoo, Logan Ward, Nikoli Dryden, Ramakrishnan Kannan, Rajeev Thakur, Bert Debusschere, Ganesh Sivaraman, Sutanay Choudhury, Zhengchun Liu, Neeraj Kumar, Peter Nugent, Francis J. Alexander, Sudip K. Seal, Shantenu Jha, James A. Ang, David Pugmire, Li Tan, Ian Foster, Yunzhi Huang, Paul M. Welch, Cristina Garcia Cardona, Sivasankaran Rajamanickam, Thomas Proffen, Ai Kagawa, Malachi Schram, Byung-Jun Yoon, Jamaludin Mohd-Yusof, Erin McCarthy, Tiernan Casey, Sotiris S. Xantheas, Vinay Ramakrishniah, Jan Balewski, Sayan Ghosh, Brian Van Essen, Michael M. Wolf, Christine Sweeney, J. Austin Ellis, Peter Harrington, Jong Choi, Yosuke Oyama, Naoya Maruyama, Satoshi Matsuoka, Jenna A. Bilbrey, Kevin G. Yager, Anthony M. DeGennaro, Travis Johnston, and Ryan Chard
- Subjects
Co-design, Active learning (machine learning), Statistical learning, Computer science, Machine learning, Exascale computing, Theoretical Computer Science, Hardware and Architecture, Reinforcement learning, Artificial intelligence, Software - Abstract
Rapid growth in data, computational methods, and computing power is driving a remarkable revolution in what variously is termed machine learning (ML), statistical learning, computational learning, and artificial intelligence. In addition to highly visible successes in machine-based natural language translation, playing the game Go, and self-driving cars, these new technologies also have profound implications for computational and experimental science and engineering, as well as for the exascale computing systems that the Department of Energy (DOE) is developing to support those disciplines. Not only do these learning technologies open up exciting opportunities for scientific discovery on exascale systems, they also appear poised to have important implications for the design and use of exascale computers themselves, including high-performance computing (HPC) for ML and ML for HPC. The overarching goal of the ExaLearn co-design project is to provide exascale ML software for use by Exascale Computing Project (ECP) applications, other ECP co-design centers, and DOE experimental facilities and leadership class computing facilities.
- Published
- 2021
- Full Text
- View/download PDF
3. GDSA Framework Development and Process Model Integration FY2022
- Author
-
Paul Mariner, Bert Debusschere, David Fukuyama, Jacob Harvey, Tara LaForce, Rosemary Leone, Frank Perry, Laura Swiler, and Anna Taconi
- Published
- 2022
- Full Text
- View/download PDF
4. Probabilistic Nanomagnetic Memories for Uncertain and Robust Machine Learning
- Author
-
Christopher Bennett, Tianyao Xiao, Samuel Liu, Leonard Humphrey, Jean Incorvia, Bert Debusschere, Daniel Ries, and Sapan Agarwal
- Published
- 2022
- Full Text
- View/download PDF
5. EXPLORATION OF MULTIFIDELITY UQ SAMPLING STRATEGIES FOR COMPUTER NETWORK APPLICATIONS
- Author
-
Jonathan Crussell, Bert Debusschere, Gianluca Geraci, and Laura Painton Swiler
- Subjects
Statistics and Probability, Control and Optimization, Computer science, Modeling and Simulation, Monte Carlo method, Discrete Mathematics and Combinatorics, Sampling (statistics), Data mining, Uncertainty quantification - Published
- 2021
- Full Text
- View/download PDF
6. UQTk Version 3.1.2 User Manual
- Author
-
Khachik Sargsyan, Cosmin Safta, Luke Boll, Katherine Johnston, Mohammad Khalil, Kamaljit Chowdhary, Prashant Rai, Tiernan Casey, Xiaoshu Zeng, and Bert Debusschere
- Published
- 2022
- Full Text
- View/download PDF
7. GDSA Framework Development and Process Model Integration FY2021
- Author
-
Paul Mariner, Timothy Berg, Bert Debusschere, Aubrey Eckert, Jacob Harvey, Tara LaForce, Rosemary Leone, Melissa Mills, Michael Nole, Heeho Park, F. Perry, Daniel Seidl, Laura Swiler, and Kyung Chang
- Published
- 2021
- Full Text
- View/download PDF
8. Science and Engineering of Cybersecurity by Uncertainty quantification and Rigorous Experimentation (SECURE) (Final Report)
- Author
-
Ali Pinar, Thomas Tarman, Laura Swiler, Jared Gearhart, Derek Hart, Eric Vugrin, Gerardo Cruz, Bryan Arguello, Gianluca Geraci, Bert Debusschere, Seth Hanson, Alexander Outkin, Jamie Thorpe, William Hart, Meghan Sahakian, Kasimir Gabert, Casey Glatter, Emma Johnson, and She'ifa Punla-Green
- Published
- 2021
- Full Text
- View/download PDF
9. How a Quantum Computer Could Quantify Uncertainty in Microkinetic Models
- Author
-
Eric A. Walker, Bert Debusschere, Alejandro Becerra, Sri Charan Simha Velpur, Mary Sharmila Rongali, and Anand Prabhu
- Subjects
Physics, Quantum circuit, Qubit, Linear system, Block matrix, General Materials Science, Statistical physics, Physical and Theoretical Chemistry, Uncertainty quantification, Scaling, Quantum, Quantum computer - Abstract
A method of uncertainty quantification on a quantum circuit using three samples for the Rh(111)-catalyzed CO oxidation reaction is demonstrated. Three parametrized samples of a reduced, linearized microkinetic model populate a single block diagonal matrix for a quantum circuit. This approach leverages the logarithmic scaling of the number of qubits with respect to matrix size. The Harrow, Hassidim, and Lloyd (HHL) algorithm for solving linear systems is employed, and the results are compared with the classical results. This application area of uncertainty quantification in chemical kinetics can experience a quantum advantage using the method reported here, although issues related to larger systems are discussed.
- Published
- 2021
10. Inference of Hydrogen RedOx Reactions Models using Bayesian Compressive Sensing
- Author
-
Luke Boll, Katherine Johnston, Anthony McDaniel, Ellen Stechel, and Bert Debusschere
- Published
- 2021
- Full Text
- View/download PDF
11. Multifidelity UQ sampling for Stochastic Simulations
- Author
-
Gianluca Geraci, Laura Swiler, and Bert Debusschere
- Published
- 2021
- Full Text
- View/download PDF
12. UQTk Version 3.1.1 User Manual
- Author
-
Khachik Sargsyan, Cosmin Safta, Katherine Johnston, Mohammad Khalil, Kamaljit Chowdhary, Prashant Rai, Tiernan Casey, Xiaoshu Zeng, Luke Boll, and Bert Debusschere
- Published
- 2021
- Full Text
- View/download PDF
13. Bayesian Model Selection for Thermodynamic Models of Redox Active Metal Oxides
- Author
-
Katherine Johnston, Bert Debusschere, and Ellen Stechel
- Published
- 2021
- Full Text
- View/download PDF
14. Polynomial Chaos Expansions for Discrete Random Variables in Cyber Security Emulytics Experiments
- Author
-
Bert Debusschere, Gianluca Geraci, John Jakeman, Cosmin Safta, and Laura Swiler
- Published
- 2021
- Full Text
- View/download PDF
15. Advances in GDSA Framework Development and Process Model Integration
- Author
-
Rosemary Leone, Melissa Mills, Michael Nole, Frank V. Perry, Spencer Jordan, Bert Debusschere, Timothy Berg, Glenn E. Hammond, HeeHo Park, Laura Painton Swiler, Paul Mariner, Daniel Thomas Seidl, Michael Gross, Aubrey Eckert, Jacob Harvey, Emily Stein, Mohamed S. Ebeida, Alex Salazar, William C. McLendon, Kyung Won Chang, Kristopher L. Kuhlman, Tara C. LaForce, Sevougian David, and Eduardo Basurto
- Subjects
Model integration, Development, Computer science, Process (engineering), Systems engineering - Published
- 2020
- Full Text
- View/download PDF
16. Surrogate Model Development of Spent Fuel Degradation for Repository Performance Assessment
- Author
-
Bert Debusschere, Timothy Berg, Daniel Thomas Seidl, Kyung Won Chang, Rosemary Leone, and Paul Mariner
- Subjects
Surrogate model, Waste management, Environmental science, Degradation (geology), Spent nuclear fuel - Published
- 2020
- Full Text
- View/download PDF
17. Characterization of Partially Observed Epidemics - Application to COVID-19
- Author
-
Erin Acquesta, Teresa Portone, Thomas A. Catanach, Mohammad Khalil, Gianluca Geraci, Bert Debusschere, Cosmin Safta, Edgar Galvan, Kenny Chowdhary, and Jaideep Ray
- Subjects
Coronavirus disease 2019 (COVID-19), Biophysics, Characterization - Published
- 2020
- Full Text
- View/download PDF
18. UQTk User Manual (V.3.1.0)
- Author
-
Prashant Rai, Mohammad Khalil, Khachik Sargsyan, Bert Debusschere, Xiaoshu Zeng, Kenny Chowdhary, Cosmin Safta, Tiernan Casey, and Katherine Johnston
- Published
- 2020
- Full Text
- View/download PDF
19. Exploring the interplay of resilience and energy consumption for a task-based partial differential equations preconditioner
- Author
-
Khachik Sargsyan, Bert Debusschere, O. P. Le Maître, Francesco Rizzi, Omar M. Knio, Cosmin Safta, Karla Morris, and Paul Mycek
- Subjects
Distributed computing, Partial differential equation, Computer Networks and Communications, Computer science, Preconditioner, Domain decomposition methods, Numerical & computational mathematics, Energy consumption, Parallel computing, Computer Graphics and Computer-Aided Design, Theoretical Computer Science, Artificial Intelligence, Hardware and Architecture, Server, Scalability, Overhead (computing), Fault model, Frequency scaling, Software - Abstract
We discuss algorithm-based resilience to silent data corruptions (SDCs) in a task-based domain-decomposition preconditioner for partial differential equations (PDEs). The algorithm exploits a reformulation of the PDE as a sampling problem, followed by a solution update through data manipulation that is resilient to SDCs. The implementation is based on a server-client model where all state information is held by the servers, while clients are designed solely as computational units. Scalability tests on up to ~51K cores show a parallel efficiency greater than 90%. We use a 2D elliptic PDE and a fault model based on random single and double bit-flips to demonstrate the resilience of the application to synthetically injected SDCs. We discuss two fault scenarios: one based on the corruption of all data of a target task, and the other involving the corruption of a single data point. We show that for our application, given the test problem considered, a four-fold increase in the number of faults only yields a 2% change in the overhead to overcome their presence, from 7% to 9%. We then discuss potential savings in energy consumption via dynamic voltage/frequency scaling, and its interplay with fault rates and application overhead.
- Published
- 2018
- Full Text
- View/download PDF
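The single- and double-bit-flip fault model described in the abstract above is easy to illustrate: flipping one bit in the binary representation of a floating-point value can produce anything from a catastrophic error to a nearly invisible perturbation. The `flip_bit` helper below is a minimal illustrative sketch, not code from the paper's implementation:

```python
import struct

def flip_bit(x: float, bit: int) -> float:
    """Flip one bit of the IEEE-754 binary64 representation of x."""
    (as_int,) = struct.unpack("<Q", struct.pack("<d", x))
    (flipped,) = struct.unpack("<d", struct.pack("<Q", as_int ^ (1 << bit)))
    return flipped

x = 1.0
print(flip_bit(x, 62))   # high exponent bit: a catastrophic, easy-to-spot error
print(flip_bit(x, 0))    # lowest mantissa bit: a tiny, silent perturbation
```

The second case is what makes silent data corruption hard: the corrupted value is indistinguishable from legitimate numerical noise, which is why the paper's algorithm tolerates such faults rather than trying to detect every one.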
20. How Do We Create More Equitable, Diverse, and Inclusive Organizations, and Why Does it Matter? A White Male’s Perspective
- Author
-
Bert Debusschere
- Subjects
General Computer Science, Perspective, White male, General Engineering, Public relations, Diversity training, Organizational change, Sociology, Inclusion (education), History of computing, Diversity (politics) - Abstract
Diversity and inclusion must be personally meaningful so that they continue to be on our minds in everything we do long after diversity training has ended. When we create deep connections, we build community and inspire others to follow suit, gently but boldly leading a wave of change throughout our organizations. This is how we make computational science more equitable, diverse, and inclusive.
- Published
- 2018
- Full Text
- View/download PDF
21. A resilient domain decomposition polynomial chaos solver for uncertain elliptic PDEs
- Author
-
Bert Debusschere, Olivier Le Maitre, Omar M. Knio, Francesco Rizzi, Khachik Sargsyan, Cosmin Safta, Karla Morris, Andres A. Contreras, and Paul Mycek
- Subjects
Mathematical optimization, Polynomial chaos, Computation, General Physics and Astronomy, Domain decomposition methods, Solver, Dirichlet distribution, Elliptic curve, Test case, Hardware and Architecture, Applied mathematics, Uncertainty quantification, Mathematics - Abstract
A resilient method is developed for the solution of uncertain elliptic PDEs on extreme scale platforms. The method is based on a hybrid domain decomposition, polynomial chaos (PC) framework that is designed to address soft faults. Specifically, parallel and independent solves of multiple deterministic local problems are used to define PC representations of local Dirichlet boundary-to-boundary maps that are used to reconstruct the global solution. A LAD-lasso type regression is developed for this purpose. The performance of the resulting algorithm is tested on an elliptic equation with an uncertain diffusivity field. Different test cases are considered in order to analyze the impacts of correlation structure of the uncertain diffusivity field, the stochastic resolution, as well as the probability of soft faults. In particular, the computations demonstrate that, provided sufficiently many samples are generated, the method effectively overcomes the occurrence of soft faults.
- Published
- 2017
- Full Text
- View/download PDF
22. Analysis of Neural Network Combustion Surrogate Models
- Author
-
Tiernan Casey and Bert Debusschere
- Subjects
Artificial neural network, Computer science, Control engineering, Combustion - Published
- 2019
- Full Text
- View/download PDF
23. Discrete A Priori Bounds for the Detection of Corrupted PDE Solutions in Exascale Computations
- Author
-
Omar M. Knio, Karla Morris, Francesco Rizzi, Olivier Le Maitre, Bert Debusschere, Paul Mycek, Cosmin Safta, and Khachik Sargsyan
- Subjects
Diffusion equation, Applied Mathematics, Mathematical analysis, Finite difference, Domain decomposition methods, Solver, Statistics & probability, Computational Mathematics, Maximum principle, Elliptic partial differential equation, A priori and a posteriori, Boundary value problem, Mathematics - Abstract
A priori bounds are derived for the discrete solution of second-order elliptic partial differential equations (PDEs). The bounds have two contributions. First, the influence of boundary conditions is taken into account through a discrete maximum principle. Second, the contribution of the source field is evaluated in a fashion similar to that used in the treatment of the continuous a priori operators. Closed form expressions are, in particular, obtained for the case of a conservative, second-order finite difference approximation of the diffusion equation with variable scalar diffusivity. The bounds are then incorporated into a resilient domain decomposition framework, in order to verify the admissibility of local PDE solutions. The computations demonstrate that the bounds are able to detect most system faults, and thus considerably enhance the resilience and the overall performance of the solver.
- Published
- 2017
- Full Text
- View/download PDF
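The detection idea in the abstract above, checking local PDE solutions against a discrete maximum principle, can be sketched for the simplest case: a 1D Laplace equation with zero source, whose interior values must lie between the boundary values. The helper names and problem setup below are illustrative assumptions, not the paper's code:

```python
import numpy as np

def solve_laplace_1d(a, b, n=50):
    # Standard second-order finite-difference solve of -u'' = 0 on (0, 1)
    # with Dirichlet data u(0) = a, u(1) = b (interior unknowns only).
    A = (np.diag(2.0 * np.ones(n))
         + np.diag(-np.ones(n - 1), 1)
         + np.diag(-np.ones(n - 1), -1))
    rhs = np.zeros(n)
    rhs[0], rhs[-1] = a, b
    return np.linalg.solve(A, rhs)

def violates_bounds(u, a, b, tol=1e-10):
    # Discrete maximum principle: with zero source, every interior value
    # must lie between the boundary values; a violation flags corruption.
    return u.min() < min(a, b) - tol or u.max() > max(a, b) + tol

u = solve_laplace_1d(0.0, 1.0)
print(violates_bounds(u, 0.0, 1.0))   # healthy solution passes the check
u[25] += 10.0                         # synthetically injected fault
print(violates_bounds(u, 0.0, 1.0))   # corruption is detected
```

The paper generalizes this to variable-coefficient diffusion with a nonzero source, where the bound also needs a term accounting for the source field.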
24. EXPLORATION OF MULTIFIDELITY APPROACHES FOR UNCERTAINTY QUANTIFICATION IN NETWORK APPLICATIONS
- Author
-
Jonathan Crussell, Gianluca Geraci, Bert Debusschere, and Laura Painton Swiler
- Subjects
Computer science, Artificial intelligence, Uncertainty quantification, Machine learning - Published
- 2019
- Full Text
- View/download PDF
25. Quadrature Methods for the Calculation of Subgrid Microphysics Moments
- Author
-
Maher Salloum, Bert Debusschere, Kenny Chowdhary, and Vincent E. Larson
- Subjects
Physics, Atmospheric Science, Cloud microphysics, Microphysics, Meteorology, Latin hypercube sampling, Numerical grid, Monte Carlo method, Slow convergence, Probability density function, Statistical physics, Quadrature (mathematics) - Abstract
Many cloud microphysical processes occur on a much smaller scale than a typical numerical grid box can resolve. In such cases, a probability density function (PDF) can act as a proxy for subgrid variability in these microphysical processes. This method is known as the assumed PDF method. By placing a density on the microphysical fields, one can use samples from this density to estimate microphysics averages. In the assumed PDF method, the calculation of such microphysical averages has primarily been done using classical Monte Carlo methods and Latin hypercube sampling. Although these techniques are fairly easy to implement and ubiquitous in the literature, they suffer from slow convergence rates as a function of the number of samples. This paper proposes using deterministic quadrature methods instead of traditional random sampling approaches to compute the microphysics statistical moments for the assumed PDF method. For smooth functions, the quadrature-based methods can achieve much greater accuracy with fewer samples by choosing tailored quadrature points and weights instead of random samples. Moreover, these techniques are fairly easy to implement and conceptually similar to Monte Carlo–type methods. As a prototypical microphysical formula, Khairoutdinov and Kogan’s autoconversion and accretion formulas are used to illustrate the benefit of using quadrature instead of Monte Carlo or Latin hypercube sampling.
- Published
- 2015
- Full Text
- View/download PDF
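The central claim of the abstract above, that deterministic quadrature converges far faster than Monte Carlo sampling for smooth integrands, is easy to check on a toy moment calculation. Here `np.exp` is just a smooth stand-in for a microphysics formula; the sample counts are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.exp                      # smooth stand-in for a microphysics formula
exact = np.exp(0.5)             # E[f(X)] for X ~ N(0, 1), known in closed form

# Monte Carlo: error decays only like 1/sqrt(N), even with 10,000 samples.
mc = f(rng.standard_normal(10_000)).mean()

# Gauss-Hermite quadrature (probabilists' weight): 10 tailored points suffice.
pts, wts = np.polynomial.hermite_e.hermegauss(10)
quad = (wts / wts.sum()) @ f(pts)

print(abs(mc - exact), abs(quad - exact))
```

For this smooth integrand the 10-point quadrature error is near machine precision, while the Monte Carlo error is orders of magnitude larger, which is the trade-off the paper exploits for the assumed PDF method.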
26. Parallel Domain Decomposition Strategies for Stochastic Elliptic Equations. Part A: Local Karhunen--Loève Representations
- Author
-
Bert Debusschere, Francesco Rizzi, Olivier Le Maitre, Omar M. Knio, Paul Mycek, and Andres A. Contreras
- Subjects
Karhunen–Loève theorem, Covariance function, Stochastic process, Applied Mathematics, Domain decomposition methods, Applied mathematics, Computational Mathematics, Mathematics - Abstract
This work presents a method to efficiently determine the dominant Karhunen--Loeve (KL) modes of a random process with known covariance function. The truncated KL expansion is one of the most common...
- Published
- 2018
- Full Text
- View/download PDF
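The truncated KL expansion mentioned in this (truncated) abstract has a simple discrete form: the dominant KL modes of a process with known covariance are the leading eigenpairs of the covariance matrix. The exponential covariance, correlation length, and 95% variance target below are illustrative assumptions, not values from the paper:

```python
import numpy as np

n = 200
x = np.linspace(0.0, 1.0, n)
# Exponential covariance with unit variance and correlation length 0.2.
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)

# Discrete KL decomposition: eigenpairs of the covariance matrix,
# sorted by decreasing eigenvalue (captured variance).
evals, evecs = np.linalg.eigh(C)
evals, evecs = evals[::-1], evecs[:, ::-1]

# Truncation: number of modes capturing 95% of the total variance.
frac = np.cumsum(evals) / evals.sum()
k = int(np.searchsorted(frac, 0.95)) + 1
print(k, "modes out of", n)

# Sample the truncated process: X ~ sum_i sqrt(lambda_i) * xi_i * phi_i.
xi = np.random.default_rng(0).standard_normal(k)
sample = evecs[:, :k] @ (np.sqrt(evals[:k]) * xi)
```

The paper's contribution is doing this efficiently on local subdomains in parallel rather than via one global eigensolve, but the truncation logic is the same.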
27. Parallel Domain Decomposition Strategies for Stochastic Elliptic Equations Part B: Accelerated Monte Carlo Sampling with Local PC Expansions
- Author
-
Andres A. Contreras, Bert Debusschere, Francesco Rizzi, Omar M. Knio, Paul Mycek, and Olivier Le Maitre
- Subjects
Polynomial chaos, Applied Mathematics, Monte Carlo method, Domain decomposition methods, Input field, Stochastic partial differential equation, Computational Mathematics, Applied mathematics, Parametrization, Mathematics - Abstract
Solving stochastic partial differential equations (SPDEs) can be a computationally intensive task, particularly when the underlying parametrization of the stochastic input field involves a large nu...
- Published
- 2018
- Full Text
- View/download PDF
28. Hybrid discrete/continuum algorithms for stochastic reaction networks
- Author
-
Habib N. Najm, Bert Debusschere, Cosmin Safta, and Khachik Sargsyan
- Subjects
Numerical Analysis, Finite volume method, Continuum (measurement), Discretization, Physics and Astronomy (miscellaneous), Applied Mathematics, Discrete Poisson equation, Computer Science Applications, Discrete system, Computational Mathematics, Modeling and Simulation, Master equation, Fokker–Planck equation, Statistical physics, Order of magnitude, Mathematics - Abstract
Direct solutions of the Chemical Master Equation (CME) governing Stochastic Reaction Networks (SRNs) are generally prohibitively expensive due to excessive numbers of possible discrete states in such systems. To enhance computational efficiency, we develop a hybrid approach where the evolution of states with low molecule counts is treated with the discrete CME model while that of states with large molecule counts is modeled by the continuum Fokker-Planck equation. The Fokker-Planck equation is discretized using a second-order finite volume approach with appropriate treatment of flux components. The numerical construction at the interface between the discrete and continuum regions implements the transfer of probability reaction by reaction according to the stoichiometry of the system. The performance of this novel hybrid approach is explored for a two-species circadian model, with computational efficiency gains of about one order of magnitude.
- Published
- 2015
- Full Text
- View/download PDF
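The discrete (CME) side of the hybrid scheme described above is typically realized by exact stochastic simulation. A minimal Gillespie SSA sketch for a birth-death network is shown below; this toy network and its rate constants are my illustrative choices, not the paper's two-species circadian model:

```python
import numpy as np

def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=50.0, seed=0):
    """Exact stochastic simulation (Gillespie SSA) of a birth-death network:
    0 -> X at rate k_birth; X -> 0 at rate k_death * x."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    while True:
        rates = (k_birth, k_death * x)
        total = rates[0] + rates[1]
        t += rng.exponential(1.0 / total)   # exponential waiting time
        if t > t_end:
            return x                        # state held at t_end
        # Choose which reaction fires, proportionally to its rate.
        x += 1 if rng.random() < rates[0] / total else -1

# The stationary distribution is Poisson with mean k_birth / k_death = 100.
samples = [gillespie_birth_death(seed=s) for s in range(100)]
print(np.mean(samples))
```

At copy numbers like 100 this event-by-event simulation is already expensive, which motivates the paper's switch to a continuum Fokker-Planck description for high-count states.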
29. Quantifying Sampling Noise and Parametric Uncertainty in Atomistic-to-Continuum Simulations Using Surrogate Models
- Author
-
Reese E. Jones, Maher Salloum, Habib N. Najm, Bert Debusschere, and Khachik Sargsyan
- Subjects
Mathematical optimization, Polynomial chaos, Continuum (measurement), Computer science, Ecological Modeling, General Physics and Astronomy, General Chemistry, Bayesian inference, Computer Science Applications, Molecular dynamics, Surrogate model, Modeling and Simulation, Boundary value problem, Statistical physics, Couette flow, Parametric statistics - Abstract
We present a methodology to assess the predictive fidelity of multiscale simulations by incorporating uncertainty in the information exchanged between the components of an atomistic-to-continuum simulation. We account for both the uncertainty due to finite sampling in molecular dynamics (MD) simulations and the uncertainty in the physical parameters of the model. Using Bayesian inference, we represent the expensive atomistic component by a surrogate model that relates the long-term output of the atomistic simulation to its uncertain inputs. We then present algorithms to solve for the variables exchanged across the atomistic-continuum interface in terms of polynomial chaos expansions (PCEs). We consider a simple Couette flow where velocities are exchanged between the atomistic and continuum components, while accounting for uncertainty in the atomistic model parameters and the continuum boundary conditions. Results show convergence of the coupling algorithm at a reasonable number of iterations. The uncertainty in the obtained variables significantly depends on the amount of data sampled from the MD simulations and on the width of the time averaging window used in the MD simulations.
- Published
- 2015
- Full Text
- View/download PDF
30. Probabilistic Methods for Sensitivity Analysis and Calibration in the NASA Challenge Problem
- Author
-
Cosmin Safta, Laura Painton Swiler, Habib N. Najm, Khachik Sargsyan, Michael S. Eldred, Bert Debusschere, and Kenny Chowdhary
- Subjects
Mathematical model, Computer science, Bayesian probability, Aerospace Engineering, Variance, Computer Science Applications, Probabilistic method, Black box, Data mining, Sensitivity (control systems), Electrical and Electronic Engineering, Uncertainty quantification, Algorithm, Nested sampling algorithm - Abstract
In this paper, a series of algorithms is proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.
- Published
- 2015
- Full Text
- View/download PDF
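The variance-based global sensitivity analysis mentioned above is usually computed with a pick-freeze (Saltelli-type) estimator of first-order Sobol indices. The linear two-input toy model below is an illustrative stand-in for the challenge problem's black-box model, with a known analytic answer to check against:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Linear toy model: output variance splits as a^2 : b^2 between inputs,
    # so the first-order Sobol index of x1 is 9 / (9 + 1) = 0.9.
    a, b = 3.0, 1.0
    return a * x[:, 0] + b * x[:, 1]

n = 200_000
A = rng.standard_normal((n, 2))
B = rng.standard_normal((n, 2))
AB = A.copy()
AB[:, 0] = B[:, 0]              # freeze x1 from B, resample the rest from A

# Pick-freeze estimate of the first-order Sobol index of x1.
yA, yB, yAB = model(A), model(B), model(AB)
S1 = np.mean(yB * (yAB - yA)) / np.var(np.concatenate([yA, yB]))
print(S1)                       # analytic value is 0.9
```

Ranking parameters then amounts to sorting these indices; the paper does this over both aleatory and epistemic inputs.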
31. Fault Resilient Domain Decomposition Preconditioner for PDEs
- Author
-
Bert Debusschere, Omar M. Knio, Habib N. Najm, Cosmin Safta, Francesco Rizzi, Olivier Le Maitre, Khachik Sargsyan, Paul Mycek, and Karla Morris
- Subjects
Theoretical computer science, Computer science, Preconditioner, Applied Mathematics, Probabilistic logic, Fidelity, Domain decomposition methods, Fault detection and isolation, Computational Mathematics, Computer engineering, State (computer science), Uncertainty quantification - Abstract
The move towards extreme-scale computing platforms challenges scientific simulations in many ways. Given the recent tendencies in computer architecture development, one needs to reformulate legacy codes in order to cope with large amounts of communication, system faults, and requirements of low-memory usage per core. In this work, we develop a novel framework for solving PDEs via domain decomposition that reformulates the solution as a state of knowledge with a probabilistic interpretation. Such reformulation allows resiliency with respect to potential faults without having to apply fault detection, avoids unnecessary communication, and is generally well-suited for rigorous uncertainty quantification studies that target improvements of predictive fidelity of scientific models. We demonstrate our algorithm for one-dimensional PDE examples where artificial faults have been implemented as bit flips in the binary representation of subdomain solutions.
- Published
- 2015
- Full Text
- View/download PDF
32. UQTk Version 3.0.4 User Manual
- Author
-
Khachik Sargsyan, Cosmin Safta, Kamaljit Chowdhary, Sarah Castorena, Sarah De Bord, and Bert Debusschere
- Published
- 2017
- Full Text
- View/download PDF
33. DIMENSIONALITY REDUCTION FOR COMPLEX MODELS VIA BAYESIAN COMPRESSIVE SENSING
- Author
-
Peter E. Thornton, Habib N. Najm, Bert Debusschere, Daniel M. Ricciuto, Cosmin Safta, and Khachik Sargsyan
- Subjects
Statistics and Probability, Control and Optimization, Polynomial chaos, Dimensionality reduction, Machine learning, Bayesian inference, Surrogate model, Modeling and Simulation, Discrete Mathematics and Combinatorics, Sensitivity (control systems), Artificial intelligence, Uncertainty quantification, Cluster analysis, Algorithm, Curse of dimensionality, Mathematics - Abstract
Uncertainty quantification in complex physical models is often challenged by the computational expense of these models. One often needs to operate under the assumption of sparsely available model simulations. This issue is even more critical when models include a large number of input parameters. This “curse of dimensionality,” in particular, leads to a prohibitively large number of basis terms in spectral methods for uncertainty quantification, such as polynomial chaos (PC) methods. In this work, we implement a PC-based surrogate model construction that “learns” and retains only the most relevant basis terms of the PC expansion, using sparse Bayesian learning. This dramatically reduces the dimensionality of the problem, making it more amenable to further analysis such as sensitivity or calibration studies. The model of interest is the community land model with about 80 input parameters, which also exhibits nonsmooth input-output behavior. We enhanced the methodology with a clustering and classification procedure that leads to a piecewise PC surrogate, thereby dealing with nonlinearity. We then obtain global sensitivity information for five outputs with respect to all input parameters using fewer than 10,000 model simulations, a very small number for an 80-dimensional input parameter space.
- Published
- 2014
- Full Text
- View/download PDF
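The core idea of the abstract above, learning and retaining only the most relevant PC basis terms, can be sketched with a greedy sparse solver. Orthogonal matching pursuit is used below as a simple stand-in for the sparse Bayesian learning the paper employs; the 3-input Hermite basis and the sparse "truth" are illustrative assumptions:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(2)

# Total-order-3 Hermite PC basis in d = 3 dimensions (20 terms).
d, order = 3, 3
multi_idx = [m for m in np.ndindex(*(order + 1,) * d) if sum(m) <= order]

def basis_matrix(X):
    cols = []
    for m in multi_idx:
        col = np.ones(len(X))
        for j, deg in enumerate(m):
            c = np.zeros(deg + 1)
            c[deg] = 1.0
            col *= hermeval(X[:, j], c)   # He_deg(x_j)
        cols.append(col)
    return np.column_stack(cols)

# Sparse "truth": only 3 of the 20 basis terms are active.
X = rng.standard_normal((2000, d))
Psi = basis_matrix(X)
true = {(0, 0, 0): 1.0, (2, 0, 0): 0.5, (0, 1, 1): -0.7}
y = sum(c * Psi[:, multi_idx.index(m)] for m, c in true.items())

# Greedy sparse recovery (orthogonal matching pursuit): at each step pick
# the basis column most correlated with the residual, then refit.
support, r = [], y.copy()
for _ in range(3):
    support.append(int(np.argmax(np.abs(Psi.T @ r))))
    coef, *_ = np.linalg.lstsq(Psi[:, support], y, rcond=None)
    r = y - Psi[:, support] @ coef
print(sorted(multi_idx[k] for k in support))
```

The recovered support matches the three active terms, illustrating how a sparse learner collapses a 20-term (or, in the paper, vastly larger) basis down to the handful of terms the data actually support.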
34. DATA-FREE INFERENCE OF UNCERTAIN PARAMETERS IN CHEMICAL MODELS
- Author
-
Cosmin Safta, Bert Debusschere, Habib N. Najm, Khachik Sargsyan, and Robert Dan Berry
- Subjects
Statistics and Probability, Control and Optimization, Markov chain, Bayesian probability, Inference, Robustness (computer science), Modeling and Simulation, Convergence, Discrete Mathematics and Combinatorics, Data mining, Uncertainty quantification, Mathematics, Parametric statistics - Abstract
We outline the use of a data-free inference procedure for estimation of uncertain model parameters for a chemical model of methane-air ignition. The method involves a nested pair of Markov chains, exploring both the data and parametric spaces, to discover a pooled joint posterior consistent with available information. We describe the highlights of the method, and detail its particular implementation in the system at hand. We examine the performance of the procedure, focusing on the robustness and convergence of the estimated joint parameter posterior with increasing number of data chain samples. We also comment on comparisons of this posterior with the missing reference posterior density.
- Published
- 2014
- Full Text
- View/download PDF
35. Fundamental issues in the representation and propagation of uncertain equation of state information in shock hydrodynamics
- Author
-
John H. Carpenter, Ann E. Mattsson, Robert Dan Berry, William J. Rider, Allen C. Robinson, Bert Debusschere, and Richard Roy Drake
- Subjects
Equation of state, General Computer Science, Scale (ratio), General Engineering, Bayesian inference, Shock (mechanics), Phase space, Applied mathematics, Data mining, Uncertainty quantification, Representation (mathematics), Mathematics - Abstract
Uncertainty quantification (UQ) deals with providing reasonable estimates of the uncertainties associated with an engineering model and propagating them to final engineering quantities of interest. We present a conceptual UQ framework for the case of shock hydrodynamics with Euler’s equations where the uncertainties are assumed to lie principally in the equation of state (EOS). In this paper we consider experimental data as providing both data and an estimate of data uncertainty. We propose a specific Bayesian inference approach for characterizing EOS uncertainty in thermodynamic phase space. We show how this approach provides a natural and efficient methodology for transferring data uncertainty to engineering outputs through an EOS representation that understands and deals consistently with parameter correlations as sensed in the data. Historically, complex multiphase EOSs have been built utilizing tables as the delivery mechanism in order to amortize the cost of creation of the tables over many subsequent continuum scale runs. Once UQ enters into the picture, however, the proper operational paradigm for multiphase tables becomes much less clear. Using a simple single-phase Mie–Grüneisen model we experiment with several approaches and demonstrate how uncertainty can be represented. We also show how the quality of the tabular representation is of key importance. As a first step, we demonstrate a particular tabular approach for the Mie–Grüneisen model which, when extended to multiphase tables, should have value for designing a UQ-enabled shock hydrodynamic modeling approach that is not only theoretically sound but also robust, useful, and acceptable to the modeling community. We also propose an approach to separate data uncertainty from modeling error in the EOS.
- Published
- 2013
- Full Text
- View/download PDF
36. Uncertainty Quantification Toolkit (UQTk)
- Author
-
Bert Debusschere, Khachik Sargsyan, Cosmin Safta, and Kenny Chowdhary
- Published
- 2017
37. Intrusive Polynomial Chaos Methods for Forward Uncertainty Propagation
- Author
-
Bert Debusschere
- Subjects
010101 applied mathematics ,010103 numerical & computational mathematics ,0101 mathematics ,01 natural sciences - Published
- 2017
38. UQTk Version 3.0 User Manual
- Author
-
Khachik Sargsyan, Cosmin Safta, Kamaljit Singh Chowdhary, Sarah Castorena, Sarah De Bord, and Bert Debusschere
- Published
- 2016
39. ULFM-MPI Implementation of a Resilient Task-Based Partial Differential Equations Preconditioner
- Author
-
Francesco Rizzi, Bert Debusschere, Paul Mycek, Omar M. Knio, Karla Morris, Cosmin Safta, Khachik Sargsyan, and Olivier LeMaitre
- Subjects
020203 distributed computing ,Partial differential equation ,Scale (ratio) ,Computer science ,Preconditioner ,Distributed computing ,010103 numerical & computational mathematics ,02 engineering and technology ,Parallel computing ,01 natural sciences ,Task (computing) ,Server ,0202 electrical engineering, electronic engineering, information engineering ,Overhead (computing) ,0101 mathematics ,Resilience (network) ,Scaling - Abstract
We present a task-based domain-decomposition preconditioner for partial differential equations (PDEs) that is resilient to both silent data corruption (SDC) and hard faults. The algorithm exploits a reformulation of the PDE as a sampling problem, followed by a regression-based solution update that is resilient to SDC. We adopt a server-client model implemented using User Level Failure Mitigation MPI (ULFM-MPI). All state information is held by the servers, while clients serve only as computational units. The task-based nature of the algorithm and the capabilities of ULFM are complemented at the algorithm level to support missing tasks, making the application resilient to hard faults affecting the clients. Weak and strong scaling tests up to ~115k cores show excellent performance, with efficiencies above 90%, demonstrating the application's suitability for runs at large scale. We demonstrate the resilience of the application for a 2D elliptic PDE by injecting SDC using a random single-bit-flip model, as well as hard faults in the form of clients crashing. We show that in all cases the application converges to the correct solution. We analyze the overhead caused by the faults and show that, for the test problem considered, the overhead incurred due to SDC is minimal compared to that from the hard faults.
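The sampling-plus-regression idea can be sketched in miniature. Here a 1D Laplace "subdomain solve" is an affine map of its two interface values; one corrupted sample mimics SDC, and a residual-based robust regression rejects it. The toy solve, the rejection threshold, and the fitting details are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Subdomain solve": the midpoint of the 1D Laplace solution is an affine
# map of the two interface values (a, b) -> (a + b) / 2.
def subdomain_solve(a, b):
    return 0.5 * a + 0.5 * b

# Sample interface values and collect (input, output) pairs from the solves.
X = rng.uniform(-1.0, 1.0, size=(50, 2))
y = np.array([subdomain_solve(a, b) for a, b in X])

y[7] += 1.0e3   # inject silent data corruption into one solve

def robust_fit(X, y, thresh=3.0):
    """Least-squares fit of the interface map, rejecting SDC outliers
    by a simple residual test (threshold choice is illustrative)."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = np.abs(y - X @ coef)
    keep = resid < thresh * np.median(resid + 1e-12)
    coef, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    return coef

coef = robust_fit(X, y)
print(coef)   # close to [0.5, 0.5] despite the corrupted sample
```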
- Published
- 2016
40. Application Fault Tolerance for Shrinking Resources via the Sparse Grid Combination Technique
- Author
-
Bert Debusschere, Mohsin Ali, and Peter Strazdins
- Subjects
020203 distributed computing ,Computer science ,business.industry ,Distributed computing ,Sparse grid ,Cloud computing ,Context (language use) ,Fault tolerance ,010103 numerical & computational mathematics ,02 engineering and technology ,Parallel computing ,Solver ,Grid ,01 natural sciences ,Exascale computing ,Elasticity (cloud computing) ,0202 electrical engineering, electronic engineering, information engineering ,0101 mathematics ,business - Abstract
The need to make large-scale scientific simulations resilient to the shrinking and growing of compute resources arises from exascale computing and from adverse operating conditions (fault tolerance). It can also arise in the cloud-computing context, where the cost of these resources can fluctuate. In this paper, we describe how the Sparse Grid Combination Technique can make such applications resilient to shrinking compute resources. We describe solutions to the non-trivial issues of data redistribution and of on-the-fly malleability of process grid information and ULFM MPI communicators. Results on a 2D advection solver indicate that process recovery time is significantly reduced compared with the alternative strategy of replacing failed resources, that overall execution time actually improves over both that case and checkpointing, and that the execution error remains small even when multiple failures occur.
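The combination technique itself is easy to sketch: solutions on a small set of anisotropic component grids are summed with +1/−1 coefficients to approximate the full-grid solution, which is what makes recovery after losing a component grid cheap. The bilinear-interpolation toy below, with function samples standing in for PDE solves, is an illustration of the classical 2D formula, not the paper's solver.

```python
import numpy as np

def grid_interpolant(f, li, lj):
    """Bilinear interpolant of f on a (2^li + 1) x (2^lj + 1) grid over [0, 1]^2."""
    xs = np.linspace(0.0, 1.0, 2**li + 1)
    ys = np.linspace(0.0, 1.0, 2**lj + 1)
    vals = f(xs[:, None], ys[None, :])
    def u(x, y):
        i = int(np.clip(np.searchsorted(xs, x) - 1, 0, len(xs) - 2))
        j = int(np.clip(np.searchsorted(ys, y) - 1, 0, len(ys) - 2))
        tx = (x - xs[i]) / (xs[i + 1] - xs[i])
        ty = (y - ys[j]) / (ys[j + 1] - ys[j])
        return ((1 - tx) * (1 - ty) * vals[i, j] + tx * (1 - ty) * vals[i + 1, j]
                + (1 - tx) * ty * vals[i, j + 1] + tx * ty * vals[i + 1, j + 1])
    return u

def combination(f, n, x, y):
    """Classical 2D combination: sum over |l| = n minus sum over |l| = n - 1.

    Losing one component grid only requires redoing that one cheap solve
    (or adjusting the coefficients), not the whole computation.
    """
    total = 0.0
    for i in range(1, n):            # component grids with i + j = n
        total += grid_interpolant(f, i, n - i)(x, y)
    for i in range(1, n - 1):        # component grids with i + j = n - 1
        total -= grid_interpolant(f, i, n - 1 - i)(x, y)
    return total

f = lambda x, y: np.sin(np.pi * x) * np.sin(np.pi * y)
print(combination(f, 7, 0.3, 0.7), f(0.3, 0.7))   # combination is close to exact
```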
- Published
- 2016
41. Exploring the Interplay of Resilience and Energy Consumption for a Task-Based Partial Differential Equations Preconditioner
- Author
-
Francesco Rizzi, Karla Vanessa Morris, Khachik Sargsyan, Paul Mycek, Cosmin Safta, Olivier Le Maitre, Omar Knio, and Bert Debusschere
- Published
- 2016
42. Multiparameter Spectral Representation of Noise-Induced Competence in Bacillus Subtilis
- Author
-
Bert Debusschere, Khachik Sargsyan, Habib N. Najm, and Cosmin Safta
- Subjects
Stochastic Processes ,Mathematical optimization ,Models, Statistical ,Polynomial chaos ,biology ,Stochastic process ,Applied Mathematics ,Posterior probability ,Computational Biology ,Bayes Theorem ,Probability and statistics ,Observable ,Bacillus subtilis ,biology.organism_classification ,DNA Transformation Competence ,Models, Biological ,Genetics ,Applied mathematics ,Response surface methodology ,Spectral method ,Biotechnology ,Mathematics - Abstract
In this work, the problem of representing a stochastic forward model output with respect to a large number of input parameters is considered. The methodology is applied to a stochastic reaction network of competence dynamics in Bacillus subtilis bacterium. In particular, the dependence of the competence state on rate constants of underlying reactions is investigated. We base our methodology on Polynomial Chaos (PC) spectral expansions that allow effective propagation of input parameter uncertainties to outputs of interest. Given a number of forward model training runs at sampled input parameter values, the PC modes are estimated using a Bayesian framework. As an outcome, these PC modes are described with posterior probability distributions. The resulting expansion can be regarded as an uncertain response function and can further be used as a computationally inexpensive surrogate instead of the original reaction model for subsequent analyses such as calibration or optimization studies. Furthermore, the methodology is enhanced with a classification-based mixture PC formulation that overcomes the difficulties associated with representing potentially nonsmooth input-output relationships. Finally, the global sensitivity analysis based on the multiparameter spectral representation of an observable of interest provides biological insight and reveals the most important reactions and their couplings for the competence dynamics.
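The final step above, global sensitivity analysis from the spectral representation, follows directly from the orthogonality of the PC basis: main-effect Sobol indices are ratios of sums of squared coefficients. A least-squares sketch for two uniform inputs is given below; the stand-in "model output" and training data are invented for illustration and are not the competence-dynamics observable.

```python
import numpy as np
from numpy.polynomial.legendre import legval
from itertools import product

def legendre_design(X, order):
    """Tensor Legendre basis (total order <= order) for inputs in [-1, 1]."""
    multis = [m for m in product(range(order + 1), repeat=X.shape[1])
              if sum(m) <= order]
    cols = []
    for m in multis:
        col = np.ones(len(X))
        for d, k in enumerate(m):
            c = np.zeros(k + 1); c[k] = 1.0
            col *= legval(X[:, d], c)       # unnormalized Legendre P_k
        cols.append(col)
    return np.array(cols).T, multis

# Training runs at sampled input values; the output is a cheap stand-in model.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(200, 2))
y = 1.0 + 2.0 * X[:, 0] + 0.5 * X[:, 0] * X[:, 1]

A, multis = legendre_design(X, order=3)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Sobol main-effect indices straight from the PC coefficients:
# Var[P_k] = 1/(2k+1) under the uniform measure on [-1, 1].
norms = np.array([np.prod([1.0 / (2 * k + 1) for k in m]) for m in multis])
var = np.sum(coef[1:] ** 2 * norms[1:])
S1 = [sum(c**2 * n for c, n, m in zip(coef, norms, multis)
          if m[d] > 0 and sum(m) == m[d]) / var for d in range(2)]
print(S1)   # first input dominates; the interaction term carries the rest
```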
- Published
- 2012
43. Data-free inference of the joint distribution of uncertain model parameters
- Author
-
Robert Dan Berry, Helgi Adalsteinsson, Habib N. Najm, Bert Debusschere, and Youssef M. Marzouk
- Subjects
Numerical Analysis ,Propagation of uncertainty ,Physics and Astronomy (miscellaneous) ,Applied Mathematics ,Pooling ,Inference ,Statistical model ,Missing data ,Computer Science Applications ,Bayesian statistics ,Computational Mathematics ,Joint probability distribution ,Modeling and Simulation ,Econometrics ,Uncertainty quantification ,Mathematics - Abstract
A critical problem in accurately estimating uncertainty in model predictions is the lack of details in the literature on the correlation (or full joint distribution) of uncertain model parameters. In this paper we describe a framework and a class of algorithms for analyzing such "missing data" problems in the setting of Bayesian statistics. The analysis focuses on the family of posterior distributions consistent with given statistics (e.g. nominal values, confidence intervals). The combining of consistent distributions is addressed via techniques from the opinion pooling literature. The developed approach allows subsequent propagation of uncertainty in model inputs consistent with reported statistics, in the absence of data.
- Published
- 2012
44. Uncertainty Quantification given Discontinuous Model Response and a Limited Number of Model Runs
- Author
-
Bert Debusschere, Cosmin Safta, Habib N. Najm, and Khachik Sargsyan
- Subjects
Computational Mathematics ,Discontinuity (linguistics) ,Polynomial chaos ,Applied Mathematics ,Orthographic projection ,Probabilistic logic ,Sensitivity (control systems) ,Uncertainty quantification ,Bayesian inference ,Representation (mathematics) ,Algorithm ,Mathematics - Abstract
We outline a methodology for forward uncertainty quantification in systems with uncertain parameters, discontinuous model response, and a limited number of model runs. Our approach involves two stages. First we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve for arbitrarily distributed input parameters. Then, employing the Rosenblatt transform, we construct spectral representations of the uncertain model output, using polynomial chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged PC representation of the forward model response that allows efficient uncertainty quantification. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference. The uncertain model output is then computed by taking an ensemble average over PC expansions corresponding to sampled realizations of the discontinuity curve. The methodology is demonstrated on synthetic examples of discontinuous model response with adjustable sharpness and structure.
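The two-sided construction can be sketched for one uncertain input with a known jump location; in the paper the discontinuity curve is itself inferred by Bayesian means and the surrogate is averaged over its sampled realizations, which this toy omits. The response function, jump location, and expansion order below are illustrative assumptions.

```python
import numpy as np

x0 = 0.3                        # jump location (known here; inferred in the paper)
def model(x):
    return np.sin(np.pi * x) + 2.0 * (x >= x0)   # illustrative discontinuous response

def side_fit(xs, ys, deg=6):
    """Legendre expansion fitted by least squares on one side of the jump."""
    return np.polynomial.legendre.Legendre.fit(xs, ys, deg, domain=[-1, 1])

rng = np.random.default_rng(3)
x = rng.uniform(-1.0, 1.0, 300)                  # limited set of model runs
y = model(x)
left = side_fit(x[x < x0], y[x < x0])
right = side_fit(x[x >= x0], y[x >= x0])

def surrogate(x):
    """Piecewise PC surrogate: pick the expansion for the relevant side."""
    x = np.asarray(x, dtype=float)
    return np.where(x < x0, left(x), right(x))

xt = np.linspace(-1.0, 1.0, 101)
err = np.max(np.abs(surrogate(xt) - model(xt)))
print(err)   # small: each side is smooth, so a low-order expansion suffices
```

A single global expansion of the same order would exhibit Gibbs oscillations across the jump; splitting at the discontinuity is what restores spectral efficiency.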
- Published
- 2012
45. Uncertainty Quantification in MD Simulations. Part II: Bayesian Inference of Force-Field Parameters
- Author
-
Omar M. Knio, Maher Salloum, Bert Debusschere, Habib N. Najm, Helgi Adalsteinsson, Khachik Sargsyan, and Francesco Rizzi
- Subjects
Mathematical optimization ,Polynomial chaos ,Ecological Modeling ,Bayesian probability ,General Physics and Astronomy ,Inference ,General Chemistry ,Bayesian inference ,Force field (chemistry) ,Computer Science Applications ,Nondeterministic algorithm ,Surrogate model ,Modeling and Simulation ,Statistical physics ,Uncertainty quantification ,Mathematics - Abstract
This paper explores the inference of small-scale, atomistic parameters based on the specification of large-scale, or macroscale, observables. Specifically, we focus on estimating a set of force-field parameters for the four-site TIP4P water model, based on a synthetic problem involving isothermal, isobaric molecular dynamics (MD) simulations of water at ambient conditions. We exploit the polynomial chaos (PC) expansions developed in Part I as surrogate representations of three macroscale observables, namely density, self-diffusion, and enthalpy, as a function of the force-field parameters. We analyze and discuss the use of two different PC representations in a Bayesian framework for the inference of atomistic parameters, based on synthetic observations of three macroscale observables. The first surrogate is a deterministic PC representation, constructed in Part I using nonintrusive spectral projection (NISP). An alternative strategy exploits a nondeterministic PC representation obtained using Bayesian inference.
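The calibration loop described above, MCMC over parameters with a PC surrogate replacing the expensive MD code, reduces to a few lines once a surrogate exists. The quadratic surrogate, noise level, prior, and "truth" below are illustrative stand-ins, not the TIP4P setup.

```python
import numpy as np

rng = np.random.default_rng(4)

# Cheap surrogate standing in for a PC expansion of an MD observable
# (e.g., density as a function of one force-field parameter); all values
# here are invented for illustration.
def surrogate(theta):
    return 1.0 + 0.8 * theta - 0.3 * theta**2

theta_true = 0.4
data = surrogate(theta_true) + rng.normal(0.0, 0.01, size=20)  # synthetic obs

def log_post(theta):
    if not -1.0 <= theta <= 1.0:          # uniform prior on [-1, 1]
        return -np.inf
    r = data - surrogate(theta)
    return -0.5 * np.sum(r**2) / 0.01**2  # Gaussian likelihood, known noise

# Random-walk Metropolis on the surrogate: no MD runs inside the loop.
chain, theta = [], 0.0
lp = log_post(theta)
for _ in range(20000):
    prop = theta + 0.02 * rng.normal()
    lpp = log_post(prop)
    if np.log(rng.uniform()) < lpp - lp:
        theta, lp = prop, lpp
    chain.append(theta)
post = np.array(chain[5000:])
print(post.mean(), post.std())   # posterior concentrates near theta_true
```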
- Published
- 2012
46. A Stochastic Multiscale Coupling Scheme to Account for Sampling Noise in Atomistic-to-Continuum Simulations
- Author
-
Helgi Adalsteinsson, Habib N. Najm, Bert Debusschere, Khachik Sargsyan, Reese E. Jones, and Maher Salloum
- Subjects
Coupling ,Mathematical optimization ,Polynomial chaos ,Continuum (topology) ,Ecological Modeling ,General Physics and Astronomy ,Sampling (statistics) ,General Chemistry ,Bayesian inference ,Noise (electronics) ,Computer Science Applications ,Surrogate model ,Modeling and Simulation ,Convergence (routing) ,Statistical physics ,Mathematics - Abstract
We present a methodology to assess the predictive fidelity of multiscale simulations by incorporating uncertainty in the information exchanged between the atomistic and continuum simulation components. Focusing on uncertainty due to finite sampling in molecular dynamics (MD) simulations, we present an iterative stochastic coupling algorithm that relies on Bayesian inference to build polynomial chaos expansions for the variables exchanged across the atomistic-continuum interface. We consider a simple Couette flow model where velocities are exchanged between the atomistic and continuum components. To alleviate the burden of running expensive MD simulations at every iteration, a surrogate model is constructed from which samples can be efficiently drawn as data for the Bayesian inference. Results show convergence of the coupling algorithm at a reasonable number of iterations. The uncertainty associated with the exchanged variables significantly depends on the amount of data sampled from the MD simulations and...
- Published
- 2012
47. Uncertainty Quantification in MD Simulations. Part I: Forward Propagation
- Author
-
Habib N. Najm, Maher Salloum, Omar M. Knio, Bert Debusschere, Helgi Adalsteinsson, Khachik Sargsyan, and Francesco Rizzi
- Subjects
Mathematical optimization ,Polynomial chaos ,Computer science ,Ecological Modeling ,General Physics and Astronomy ,General Chemistry ,Bayesian inference ,Noise (electronics) ,Computer Science Applications ,Nondeterministic algorithm ,Modeling and Simulation ,A priori and a posteriori ,Statistical physics ,Uncertainty quantification ,Random variable ,Parametric statistics - Abstract
This work focuses on quantifying the effect of intrinsic (thermal) noise and parametric uncertainty in molecular dynamics (MD) simulations. We consider isothermal, isobaric MD simulations of TIP4P (or four-site) water at ambient conditions, $T=298$ K and $P=1$ atm. Parametric uncertainty is assumed to originate from three force-field parameters that are parametrized in terms of standard uniform random variables. The thermal fluctuations inherent in MD simulations combine with parametric uncertainty to yield nondeterministic, noisy MD predictions of bulk water properties. Relying on polynomial chaos (PC) expansions, we develop a framework that enables us to isolate the impact of parametric uncertainty on the MD predictions and control the effect of the intrinsic noise. We construct the PC representations of quantities of interest (QoIs) using two different approaches: nonintrusive spectral projection (NISP) and Bayesian inference. We verify a priori the legitimacy of the NISP approach by confirming that the...
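The NISP construction referenced above is a quadrature-based projection onto orthogonal polynomials. Here is a sketch for a scalar toy observable $f(\xi)=e^{a\xi}$ of one standard-normal germ, chosen because its probabilists'-Hermite coefficients are known in closed form; the observable is an illustrative stand-in for an MD quantity of interest.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi, exp

def nisp_coeffs(f, order, nquad=20):
    """PC coefficients of f(xi), xi ~ N(0, 1), by Gauss-Hermite projection:
    c_k = E[f(xi) He_k(xi)] / E[He_k(xi)^2], with E[He_k^2] = k!."""
    x, w = hermegauss(nquad)        # weight exp(-x^2/2); weights sum to sqrt(2*pi)
    w = w / sqrt(2.0 * pi)          # normalize to a probability measure
    coeffs = []
    for k in range(order + 1):
        c = np.zeros(k + 1); c[k] = 1.0
        psi = hermeval(x, c)        # probabilists' Hermite He_k
        coeffs.append(np.sum(w * f(x) * psi) / factorial(k))
    return np.array(coeffs)

a = 0.3
c = nisp_coeffs(lambda x: np.exp(a * x), order=4)
# Closed form: e^{a xi} = e^{a^2/2} * sum_k (a^k / k!) He_k(xi)
exact = np.array([exp(a**2 / 2) * a**k / factorial(k) for k in range(5)])
print(np.max(np.abs(c - exact)))   # quadrature is near-exact for this integrand
```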
- Published
- 2012
48. Eigenvalues of the Jacobian of a Galerkin-Projected Uncertain ODE System
- Author
-
Robert Dan Berry, Benjamin Sonday, Habib N. Najm, and Bert Debusschere
- Subjects
Polynomial chaos ,Applied Mathematics ,Mathematical analysis ,Ode ,Projection (linear algebra) ,Computational Mathematics ,symbols.namesake ,Mathematics::Algebraic Geometry ,Ordinary differential equation ,Jacobian matrix and determinant ,Orthogonal polynomials ,symbols ,Numerical range ,Eigenvalues and eigenvectors ,Mathematics - Abstract
Projection onto polynomial chaos (PC) basis functions is often used to reformulate a system of ordinary differential equations (ODEs) with uncertain parameters and initial conditions as a deterministic ODE system that describes the evolution of the PC modes. The deterministic Jacobian of this projected system is different and typically much larger than the random Jacobian of the original ODE system. This paper shows that the location of the eigenvalues of the projected Jacobian is largely determined by the eigenvalues of the original Jacobian, regardless of PC order or choice of orthogonal polynomials. Specifically, the eigenvalues of the projected Jacobian always lie in the convex hull of the numerical range of the Jacobian of the original system.
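The projected Jacobian analyzed above can be assembled explicitly for the scalar toy problem $du/dt = -a\,u$ with a Gaussian-uncertain rate $a = a_0 + a_1\xi$, computing the Galerkin triple products by quadrature. The basis order and the values of $a_0$, $a_1$ are arbitrary illustrative choices.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Galerkin projection of du/dt = -a*u, a = a0 + a1*xi with xi ~ N(0,1),
# onto a Hermite PC basis of order p. The projected Jacobian is
# J[j, k] = -<a He_k He_j> / <He_j^2>, with <He_j^2> = j!.
def projected_jacobian(a0, a1, p, nquad=40):
    x, w = hermegauss(nquad)
    w = w / sqrt(2.0 * pi)              # normalize to the Gaussian measure
    He = np.array([hermeval(x, np.eye(p + 1)[k]) for k in range(p + 1)])
    a = a0 + a1 * x
    J = np.empty((p + 1, p + 1))
    for j in range(p + 1):
        for k in range(p + 1):
            J[j, k] = -np.sum(w * a * He[k] * He[j]) / factorial(j)
    return J

J = projected_jacobian(a0=2.0, a1=0.5, p=6)
eig = np.linalg.eigvals(J)
print(np.sort(eig.real))   # real eigenvalues spread around -a0 = -2
```

Consistent with the paper's result, the (p + 1) eigenvalues are real and cluster around the nominal value −a0, spreading with the uncertainty a1 rather than with the PC order.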
- Published
- 2011
49. Spectral Representation and Reduced Order Modeling of the Dynamics of Stochastic Reaction Networks via Adaptive Data Partitioning
- Author
-
Habib N. Najm, Khachik Sargsyan, Olivier Le Maitre, and Bert Debusschere
- Subjects
Polynomial ,Polynomial chaos ,Stochastic process ,Stochastic modelling ,Applied Mathematics ,Markov process ,Markov model ,Mixture model ,Computational Mathematics ,symbols.namesake ,Calculus ,symbols ,Statistical physics ,Representation (mathematics) ,Mathematics - Abstract
Dynamical analysis tools are well established for deterministic models. However, for many biochemical phenomena in cells the molecule count is low, leading to stochastic behavior that causes deterministic macroscale reaction models to fail. The main mathematical framework representing these phenomena is based on jump Markov processes that model the underlying stochastic reaction network. Conventional dynamical analysis tools do not readily generalize to the stochastic setting due to nondifferentiability and absence of explicit state evolution equations. We developed a reduced order methodology for dynamical analysis that relies on the Karhunen–Loève decomposition and polynomial chaos expansions. The methodology relies on adaptive data partitioning to obtain an accurate representation of the stochastic process, especially in the case of multimodal behavior. As a result, a mixture model is obtained that represents the reduced order dynamics of the system. The Schlögl model is used as a prototype bistable process that exhibits time scale separation and leads to multimodality in the reduced order model.
- Published
- 2010
50. Uncertainty quantification in chemical systems
- Author
-
Habib N. Najm, Bert Debusschere, O. P. Le Maître, Youssef M. Marzouk, and S. Widmer
- Subjects
Numerical Analysis ,Polynomial chaos ,Applied Mathematics ,General Engineering ,Time evolution ,Probabilistic logic ,Probability density function ,Bayesian inference ,law.invention ,Ignition system ,law ,Econometrics ,Applied mathematics ,Uncertainty quantification ,Representation (mathematics) ,Mathematics - Abstract
We demonstrate the use of multiwavelet spectral polynomial chaos techniques for uncertainty quantification in non-isothermal ignition of a methane–air system. We employ Bayesian inference for identifying the probabilistic representation of the uncertain parameters and propagate this uncertainty through the ignition process. We analyze the time evolution of moments and probability density functions of the solution. We also examine the role and significance of dependence among the uncertain parameters. We finish with a discussion of the role of non-linearity and the performance of the algorithm. Copyright © 2009 John Wiley & Sons, Ltd.
- Published
- 2009