14 results for "Bilionis, Ilias"
Search Results
2. Bayesian model calibration and optimization of surfactant-polymer flooding
- Author
- Naik, Pratik, Pandita, Piyush, Aramideh, Soroush, Bilionis, Ilias, and Ardekani, Arezoo M.
- Published
- 2019
- Full Text
3. Multi-objective optimization under uncertainty using the hyper-volume expected improvement
- Author
- Figura, Martin, Pandita, Piyush, Tripathy, Rohit K, and Bilionis, Ilias
- Subjects
- Bayesian global optimization, uncertainty quantification, Computational Engineering, expected improvement, Numerical Analysis and Computation, Gaussian process regression, expected improvement over the dominated hyper-volume
- Abstract
The design of real engineering systems requires the optimization of multiple quantities of interest. In electric motor design, one wants to maximize the average torque and minimize the torque variation. A study has shown that these attributes vary for different geometries of the rotor teeth. However, simulations of a large number of designs cannot be performed due to their high cost. In many problems, design optimization of multi-objective functions is a very challenging task due to the difficulty of evaluating the expectation of the objectives. Current multi-objective optimization (MOO) techniques, e.g., evolutionary algorithms, cannot solve such problems because they require hundreds of thousands of function evaluations. Therefore, an alternative methodology must be used to identify a Pareto front, the set of optimal designs of an MOO problem. Recent extensions of Bayesian global optimization are able to do exactly that. The idea is to replace the expensive objective functions with cheap-to-evaluate probabilistic surrogates trained using a few input-output pairs, and to sequentially query designs that maximize the improvement of the Pareto front. For these purposes, we developed SMOOT, a Rappture tool built on the nanoHUB platform. It enables experimentalists to optimize their expensive processes without needing to understand the optimization methodology and guides them to make better decisions in order to find optimal designs.
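The criterion named in the subject terms rewards a candidate design by how much it would grow the hyper-volume dominated by the current Pareto front. A minimal sketch of that bookkeeping for two minimization objectives (function names are illustrative, not taken from the SMOOT tool):

```python
def pareto_front(points):
    """Keep the non-dominated points (both objectives minimized)."""
    return sorted(p for p in points
                  if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                             for q in points))

def hypervolume_2d(front, ref):
    """Area dominated by a 2-D Pareto front, bounded by reference point ref."""
    hv, prev_y = 0.0, ref[1]
    for x, y in sorted(front):
        hv += (ref[0] - x) * (prev_y - y)
        prev_y = y
    return hv

def hv_improvement(front, candidate, ref):
    """Gain in dominated hyper-volume if candidate were added to the front."""
    new_front = pareto_front(list(front) + [candidate])
    return hypervolume_2d(new_front, ref) - hypervolume_2d(front, ref)

front = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
gain = hv_improvement(front, (1.5, 1.5), (4.0, 4.0))
```

In the Bayesian setting, this deterministic improvement is replaced by its expectation under the Gaussian process posterior of the objectives.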
- Published
- 2016
- Full Text
- View/download PDF
4. Efficient Exploration of Quantified Uncertainty in Granular Crystals
- Author
- Lopez Ramirez, Juan C, Gonzalez, Marcial, Bilionis, Ilias, and Tripathy, Rohit K
- Subjects
- Applied Mechanics, Acoustics, Dynamics, and Controls, Uncertainty Quantification, Granular Crystals
- Abstract
Granular crystals present unique nonlinear properties that support standing waves, which depend on precompression and impurities. Thus, they can be used for applications such as impact and shock dissipation. There are different models which rely on reasonable approximations and assumptions. While experimental results show good agreement with theory, there are experimental errors that are not easily explained and are usually attributed to the approximations made and to phenomena that are not accounted for. This might be the result of not quantifying the uncertainty, since variables such as the grain size, position, mass, and Young's modulus of each particle are uncertain. Building a response surface is computationally expensive, because the underlying mapping to be learned is high-dimensional. This work presents a way of quantifying uncertainty in granular crystals in a computationally efficient way. To accomplish this, a low-dimensional response surface is approximated through the method of active subspaces. Within this framework, special structure within the inputs is exploited to project them onto a lower-dimensional manifold. The problem of subspace approximation is then treated as an optimization problem, with the use of the Bayesian information criterion (BIC). We treat the underlying function to be learned as a Gaussian process and use Gaussian process regression to generate predictive distributions for test inputs. Distributions obtained through these methods present a model for uncertainty propagation and could potentially be used to better understand the experimental errors of different models.
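The classic gradient-based construction of an active subspace, which the abstract's approach builds on, eigendecomposes a Monte Carlo estimate of the gradient outer-product matrix. A toy sketch on an invented test function with a known one-dimensional active direction (names and sizes are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([3.0, 4.0]) / 5.0        # the response varies only along w_true

def grad_f(x):
    # f(x) = sin(w_true @ x), so the gradient always points along w_true
    return np.cos(w_true @ x) * w_true

X = rng.standard_normal((200, 2))          # samples from the input distribution
G = np.stack([grad_f(x) for x in X])
C = G.T @ G / len(G)                       # Monte Carlo estimate of E[grad grad^T]

eigvals, eigvecs = np.linalg.eigh(C)       # eigenvalues in ascending order
active_dir = eigvecs[:, -1]                # dominant eigenvector spans the AS
```

Projecting inputs onto `active_dir` reduces the regression to one dimension, which is the structure the Gaussian process surrogate then exploits.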
- Published
- 2015
5. Bayesian Optimal Design of Experiments for Inferring the Statistical Expectation of Expensive Black-Box Functions.
- Author
- Pandita, Piyush, Bilionis, Ilias, and Panchal, Jitesh
- Subjects
- EXPERIMENTAL design, EXPECTED returns, MATHEMATICAL formulas, STEEL manufacture, STEEL wire
- Abstract
Bayesian optimal design of experiments (BODE) has been successful in acquiring information about a quantity of interest (QoI) which depends on a black-box function. BODE is characterized by sequentially querying the function at specific designs selected by an infill-sampling criterion. However, most current BODE methods operate in specific contexts like optimization, or learning a universal representation of the black-box function. The objective of this paper is to design a BODE for estimating the statistical expectation of a physical response surface. This QoI is omnipresent in uncertainty propagation and design under uncertainty problems. Our hypothesis is that an optimal BODE should maximize the expected information gain in the QoI. We represent the information gain from a hypothetical experiment as the Kullback-Leibler (KL) divergence between the prior and the posterior probability distributions of the QoI. The prior distribution of the QoI is conditioned on the observed data, and the posterior distribution of the QoI is conditioned on the observed data and a hypothetical experiment. The main contribution of this paper is the derivation of a semi-analytic mathematical formula for the expected information gain about the statistical expectation of a physical response. The developed BODE is validated on synthetic functions with varying numbers of input dimensions. We demonstrate the performance of the methodology on a steel wire manufacturing problem.
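Under a Gaussian process surrogate, the statistical expectation of the response is itself Gaussian, so the KL divergence between prior and posterior beliefs about the QoI has a closed form. A one-dimensional sketch of that formula (a simplification for illustration, not the paper's semi-analytic result):

```python
import math

def kl_gaussian(mu1, s1, mu0, s0):
    """KL( N(mu1, s1^2) || N(mu0, s0^2) ): information gained when the
    belief about the QoI moves from (mu0, s0) to (mu1, s1)."""
    return math.log(s0 / s1) + (s1**2 + (mu1 - mu0)**2) / (2.0 * s0**2) - 0.5

# a hypothetical experiment that halves the standard deviation of the QoI
gain = kl_gaussian(0.0, 0.5, 0.0, 1.0)
```

Averaging this gain over the GP's predictive distribution of hypothetical outcomes gives an expected-information-gain acquisition criterion.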
- Published
- 2019
- Full Text
6. Deep UQ: Learning deep neural network surrogate models for high dimensional uncertainty quantification.
- Author
- Tripathy, Rohit K. and Bilionis, Ilias
- Subjects
- DEEP learning, ARTIFICIAL neural networks, MONTE Carlo method, NONLINEAR functions, MANIFOLDS (Mathematics)
- Abstract
State-of-the-art computer codes for simulating real physical systems are often characterized by a vast number of input parameters. Performing uncertainty quantification (UQ) tasks with Monte Carlo (MC) methods is almost always infeasible because of the need to perform hundreds of thousands or even millions of forward model evaluations in order to obtain convergent statistics. One thus tries to construct a cheap-to-evaluate surrogate model to replace the forward model solver. For systems with large numbers of input parameters, one has to address the curse of dimensionality through suitable dimensionality reduction techniques. A popular class of dimensionality reduction methods are those that attempt to recover a low-dimensional representation of the high-dimensional feature space. However, such methods often tend to overestimate the intrinsic dimensionality of the input feature space. In this work, we demonstrate the use of deep neural networks (DNN) to construct surrogate models for numerical simulators. We parameterize the structure of the DNN in a manner that lends the DNN surrogate the interpretation of recovering a low-dimensional nonlinear manifold. The model response is a parameterized nonlinear function of the low-dimensional projections of the input. We think of this low-dimensional manifold as a nonlinear generalization of the notion of the active subspace. Our approach is demonstrated with a problem on uncertainty propagation in a stochastic elliptic partial differential equation (SPDE) with uncertain diffusion coefficient. We deviate from traditional formulations of the SPDE problem by lifting the assumption of fixed lengthscales of the uncertain diffusion field. Instead, we attempt to solve a more challenging problem of learning a map between an arbitrary snapshot of the diffusion field and the response.
Highlights:
• A DNN-based surrogate for UQ tasks which is interpretable as a generalization of the AS.
• A parameterization of the structure of the DNN to avoid selecting the number of layers and sizes of individual layers.
• An easy-to-implement and parallelizable approach to model selection for the DNN surrogate.
• Application of the proposed approach to a novel and challenging SPDE problem.
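The structural idea described above, a linear encoder into a few dimensions followed by a nonlinear link, can be sketched in a few lines of numpy. The weights below are random and untrained, and the layer sizes are arbitrary choices, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(1)
D, r = 100, 2                               # input dimension, assumed manifold dimension

# linear encoder onto r dimensions (the "generalized active subspace" layer)
W_proj = rng.standard_normal((r, D)) / np.sqrt(D)
# small nonlinear link network acting only on the r projections
W1, b1 = rng.standard_normal((16, r)), np.zeros(16)
w2 = rng.standard_normal(16)

def surrogate(x):
    z = W_proj @ x                          # low-dimensional projection of the input
    h = np.tanh(W1 @ z + b1)                # nonlinear link function
    return float(w2 @ h)                    # scalar model response
```

By construction, the surrogate's output depends on the high-dimensional input only through its r-dimensional projection, which is what gives the model its active-subspace interpretation.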
- Published
- 2018
- Full Text
7. Physics-informed information field theory for modeling physical systems with uncertainty quantification.
- Author
- Alberts, Alex and Bilionis, Ilias
- Subjects
- INFORMATION theory, NONLINEAR differential equations, PHYSICAL laws, INVERSE problems, MODEL theory, SINE-Gordon equation
- Abstract
Data-driven approaches coupled with physical knowledge are powerful techniques to model engineering systems. The goal of such models is to efficiently solve for the underlying physical field by combining measurements with known physical laws. As many physical systems contain unknown elements, such as missing parameters, noisy measurements, or incomplete physical laws, this is widely approached as an uncertainty quantification problem. The common techniques to handle all of these variables typically depend on the specific numerical scheme used to approximate the posterior, and it is desirable to have a method which is independent of any such discretization. Information field theory (IFT) provides the tools necessary to perform statistics over fields that are not necessarily Gaussian. The objective of this paper is to extend IFT to physics-informed information field theory (PIFT) by encoding the functional priors with information about the physical laws which describe the field. The posteriors derived from this PIFT remain independent of any numerical scheme and can capture multiple modes, which allows for the solution of problems which are not well-posed. We demonstrate our approach through an analytical example involving the Klein-Gordon equation. We then develop a variant of stochastic gradient Langevin dynamics (SGLD) to draw samples from the field posterior and from the posterior of any model parameters. We apply our method to several numerical examples with various degrees of model-form error and to inverse problems involving non-linear differential equations. As an addendum, the method is equipped with a metric which allows the posterior to automatically quantify model-form uncertainty. Because of this, our numerical experiments show that the method remains robust to even an incorrect representation of the physics given sufficient data. We numerically demonstrate that the method correctly identifies when the physics cannot be trusted, in which case it automatically treats learning the field as a regression problem.
• Derive physics-informed functional priors from PDEs with an analytical example.
• Develop an SGLD scheme for sampling the field prior and posterior.
• Develop a nested SGLD scheme for sampling parameters for inverse problems.
• Demonstrate that other classic approaches are limiting cases of this methodology.
• Applications to a variety of problems including inverse and ill-defined problems.
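Stochastic gradient Langevin dynamics perturbs a gradient step on the log-posterior with Gaussian noise whose variance matches the step size, so the iterates become approximate posterior samples. A toy sketch on an assumed one-dimensional Gaussian posterior, with the exact gradient standing in for a stochastic one:

```python
import math, random

random.seed(0)

def grad_log_post(theta):
    """Gradient of the log-posterior of an assumed toy model: N(2, 0.5^2)."""
    return -(theta - 2.0) / 0.25

eps = 1e-2                     # step size (kept fixed here for simplicity)
theta, samples = 0.0, []
for step in range(20000):
    noise = random.gauss(0.0, math.sqrt(eps))
    theta += 0.5 * eps * grad_log_post(theta) + noise   # Langevin update
    if step >= 5000:
        samples.append(theta)                           # keep post-burn-in draws

post_mean = sum(samples) / len(samples)
```

With a fixed step size this is the unadjusted Langevin algorithm and carries a small discretization bias; SGLD proper decays the step size and replaces the gradient with a minibatch estimate.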
- Published
- 2023
- Full Text
8. Optimizing autoinjector devices using physics-based simulations and Gaussian processes.
- Author
- Sree, Vivek, Zhong, Xiaoxu, Bilionis, Ilias, Ardekani, Arezoo, and Tepole, Adrian Buganza
- Subjects
- GAUSSIAN processes, TISSUE mechanics, MEDICAL equipment design, MODULUS of rigidity, FRACTURE toughness
- Abstract
Autoinjectors are becoming a primary drug delivery option to the subcutaneous space. These devices need to work robustly and autonomously to maximize drug bio-availability. However, current designs ignore the coupling between autoinjector dynamics and tissue biomechanics. Here we present a Bayesian framework for optimization of autoinjector devices that can account for the coupled autoinjector-tissue biomechanics and uncertainty in tissue mechanical behavior. The framework relies on replacing the high-fidelity model of tissue insertion with a Gaussian process (GP). The GP model is accurate yet computationally affordable, enabling a thorough sensitivity analysis, which identified tissue properties, which are not part of the autoinjector design space, as important variables for the injection process. Higher fracture toughness decreases the crack depth, while tissue shear modulus has the opposite effect. The sensitivity analysis also shows that drug viscosity and spring force, which are part of the design space, affect the location and timing of drug delivery. Low viscosity could lead to premature delivery, but this can be prevented with smaller spring forces; higher viscosity could prevent premature delivery while demanding larger spring forces and increasing the time of injection. Increasing the spring force guarantees penetration to the desired depth, but it can result in undesirably high accelerations. The Bayesian optimization framework tackles the challenge of designing devices with performance metrics coupled to uncertain tissue properties. This work is important for the design of other medical devices for which optimization in the presence of material behavior uncertainty is needed.
- Published
- 2023
- Full Text
9. Gaussian processes with built-in dimensionality reduction: Applications to high-dimensional uncertainty propagation.
- Author
- Tripathy, Rohit, Bilionis, Ilias, and Gonzalez, Marcial
- Subjects
- GAUSSIAN processes, DIMENSION reduction (Statistics), HEISENBERG uncertainty principle, MATHEMATICAL optimization, COMPUTER software, RESPONSE surfaces (Statistics), ARTIFICIAL neural networks
- Abstract
Uncertainty quantification (UQ) tasks, such as model calibration, uncertainty propagation, and optimization under uncertainty, typically require several thousand evaluations of the underlying computer codes. To cope with the cost of simulations, one replaces the real response surface with a cheap surrogate based, e.g., on polynomial chaos expansions, neural networks, support vector machines, or Gaussian processes (GP). However, the number of simulations required to learn a generic multivariate response grows exponentially as the input dimension increases. This curse of dimensionality can only be addressed if the response exhibits some special structure that can be discovered and exploited. A wide range of physical responses exhibit a special structure known as an active subspace (AS). An AS is a linear manifold of the stochastic space characterized by maximal response variation. The idea is that one should first identify this low-dimensional manifold, project the high-dimensional input onto it, and then link the projection to the output. If the dimensionality of the AS is low enough, then learning the link function is a much easier problem than the original problem of learning a high-dimensional function. The classic approach to discovering the AS requires gradient information, a fact that severely limits its applicability. Furthermore, and partly because of its reliance on gradients, it is not able to handle noisy observations. The latter is an essential trait if one wants to be able to propagate uncertainty through stochastic simulators, e.g., through molecular dynamics codes. In this work, we develop a probabilistic version of AS which is gradient-free and robust to observational noise. Our approach relies on a novel Gaussian process regression with built-in dimensionality reduction. In particular, the AS is represented as an orthogonal projection matrix that serves as yet another covariance function hyper-parameter to be estimated from the data. To train the model, we design a two-step maximum likelihood optimization procedure that ensures the orthogonality of the projection matrix by exploiting recent results on the Stiefel manifold, i.e., the manifold of matrices with orthonormal columns. The additional benefit of our probabilistic formulation is that it allows us to select the dimensionality of the AS via the Bayesian information criterion. We validate our approach by showing that it can discover the right AS in synthetic examples without gradient information, using both noiseless and noisy observations. We demonstrate that our method is able to discover the same AS as the classical approach in a challenging one-hundred-dimensional problem involving an elliptic stochastic partial differential equation with random conductivity. Finally, we use our approach to study the effect of geometric and material uncertainties on the propagation of solitary waves in a one-dimensional granular system.
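The covariance construction described above, an RBF kernel composed with a projection whose matrix lives on the Stiefel manifold, can be sketched as follows (here the projection is a random Stiefel point rather than one learned by maximum likelihood, and all sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
D, d = 10, 2                               # input dimension, active-subspace dimension

# a random point on the Stiefel manifold: W has orthonormal columns
W, _ = np.linalg.qr(rng.standard_normal((D, d)))

def kernel(X, ell=1.0):
    """RBF covariance acting on the projections X @ W, so the GP varies
    only along the d-dimensional subspace spanned by the columns of W."""
    Z = X @ W
    sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / ell**2)

X = rng.standard_normal((30, D))
K = kernel(X)
```

Treating the entries of W as kernel hyper-parameters, constrained to keep `W.T @ W` equal to the identity, is what turns active-subspace discovery into ordinary GP training.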
- Published
- 2016
- Full Text
10. Multi-output separable Gaussian process: Towards an efficient, fully Bayesian paradigm for uncertainty quantification
- Author
- Bilionis, Ilias, Zabaras, Nicholas, Konomi, Bledar A., and Lin, Guang
- Subjects
- GAUSSIAN processes, BAYESIAN analysis, COMPUTER simulation, BOUNDARY value problems, ALGORITHMS, MODULES (Algebra)
- Abstract
Computer codes simulating physical systems usually have responses that consist of a set of distinct outputs (e.g., velocity and pressure) that also evolve in space and time and depend on many unknown input parameters (e.g., physical constants, initial/boundary conditions, etc.). Furthermore, essential engineering procedures such as uncertainty quantification, inverse problems or design are notoriously difficult to carry out, mostly due to the limited simulations available. The aim of this work is to introduce a fully Bayesian approach for treating these problems which accounts for the uncertainty induced by the finite number of observations. Our model is built on a multi-dimensional Gaussian process that explicitly treats correlations between distinct output variables as well as space and/or time. The proper use of a separable covariance function enables us to describe the huge covariance matrix as a Kronecker product of smaller matrices, leading to efficient algorithms for carrying out inference and predictions. The novelty of this work is the recognition that the Gaussian process model defines a posterior probability measure on the function space of possible surrogates for the computer code, and the derivation of an algorithmic procedure that allows us to sample it efficiently. We demonstrate how the scheme can be used in uncertainty quantification tasks in order to obtain error bars for the statistics of interest that account for the finite number of observations.
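The Kronecker trick mentioned above can be checked on a tiny grid: a linear solve with the separable covariance never requires forming the full matrix, only two solves with the small factors. A sketch with invented kernels and grid sizes:

```python
import numpy as np

def rbf(t, ell):
    """Squared-exponential covariance on a 1-D set of points, with jitter."""
    d = t[:, None] - t[None, :]
    return np.exp(-0.5 * (d / ell) ** 2) + 1e-8 * np.eye(len(t))

Ks = rbf(np.linspace(0.0, 1.0, 5), 0.3)    # spatial covariance, 5 points
Kt = rbf(np.linspace(0.0, 1.0, 4), 0.5)    # temporal covariance, 4 points
Y = np.random.default_rng(3).standard_normal((5, 4))   # data on the 5x4 grid

# (Ks kron Kt)^{-1} vec(Y) via two small solves: Ks^{-1} Y Kt^{-1}
alpha = np.linalg.solve(Kt, np.linalg.solve(Ks, Y).T).T
```

The identity used is `kron(Ks, Kt) @ vec(Y) = vec(Ks @ Y @ Kt.T)` for a row-major `vec`, so the cost scales with the factor sizes rather than with their product.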
- Published
- 2013
- Full Text
11. Multidimensional Adaptive Relevance Vector Machines for Uncertainty Quantification.
- Author
- Bilionis, Ilias and Zabaras, Nicholas
- Subjects
- BAYESIAN analysis, STATISTICAL decision making, UNCERTAINTY, REASONING, PROBABILITY theory
- Abstract
We develop a Bayesian uncertainty quantification framework using a local binary tree surrogate model that is able to make use of arbitrary Bayesian regression methods. The tree is adaptively constructed using information about the sensitivity of the response and is biased by the underlying input probability distribution. The local Bayesian regressions are based on a reformulation of the relevance vector machine model that accounts for the multiple output dimensions. A fast algorithm for training the local models is provided. The methodology is demonstrated with examples in the solution of stochastic differential equations.
- Published
- 2012
- Full Text
12. Multi-output local Gaussian process regression: Applications to uncertainty quantification
- Author
- Bilionis, Ilias and Zabaras, Nicholas
- Subjects
- GAUSSIAN processes, REGRESSION analysis, BAYESIAN analysis, EXPERIMENTAL design, NUMERICAL analysis, NUMERICAL solutions to stochastic differential equations
- Abstract
We develop an efficient Bayesian uncertainty quantification framework using a novel treed Gaussian process model. The tree is adaptively constructed using information conveyed by the observed data about the length scales of the underlying process. On each leaf of the tree, we utilize Bayesian experimental design techniques in order to learn a multi-output Gaussian process. The constructed surrogate can provide analytical point estimates, as well as error bars, for the statistics of interest. We numerically demonstrate the effectiveness of the suggested framework in identifying discontinuities, local features and unimportant dimensions in the solution of stochastic differential equations.
- Published
- 2012
- Full Text
13. Bayesian Methods for Uncertainty Quantification
- Author
- Bilionis, Ilias
- Subjects
- Bayesian, uncertainty quantification, computer surrogate, probability, limited simulations, expensive solvers
- Abstract
Computer codes simulating physical systems usually have responses that consist of a set of distinct outputs (e.g., velocity and pressure) that also evolve in space and time and depend on many unknown input parameters (e.g., physical constants, initial/boundary conditions, etc.). Furthermore, essential engineering procedures such as uncertainty quantification, inverse problems or design are notoriously difficult to carry out, mostly due to the limited simulations available. The aim of this work is to introduce a fully Bayesian approach for treating these problems which accounts for the uncertainty induced by the finite number of observations.
- Published
- 2013
14. Uncertainty propagation using infinite mixture of Gaussian processes and variational Bayesian inference.
- Author
- Chen, Peng, Zabaras, Nicholas, and Bilionis, Ilias
- Subjects
- GAUSSIAN processes, UNCERTAINTY (Information theory), THEORY of wave motion, INFINITY (Mathematics), MIXTURES, BAYESIAN analysis, POROUS materials, PERMEABILITY
- Abstract
Uncertainty propagation in flow through porous media problems is challenging. This is due to the high dimensionality of the random property fields, e.g., permeability and porosity, as well as the computational complexity of the models involved. The usual approach is to construct a surrogate response surface and then use it instead of the expensive model to carry out the uncertainty propagation task. However, the construction of the surrogate surface is hampered by various aspects, such as the limited number of model evaluations one can afford, the curse of dimensionality, multi-variate responses with non-trivial correlations, and potential localized features and/or discontinuities of the response. In this work, we extend the concept of the multi-output Gaussian process (MGP) to deal with all of these difficulties simultaneously. This non-trivial extension involves an infinite mixture of MGPs that is trained using variational Bayesian inference. Prior to observing any data, a Dirichlet process is used to generate the components of the MGP mixture. The Bayesian nature of the model allows for the quantification of the uncertainties due to the limited number of simulations, i.e., we can derive error bars for the statistics of interest. The automatic detection of the mixture components by the variational inference algorithm is able to capture discontinuities and localized features without adhering to ad hoc constructions. Finally, correlations between the components of multi-variate responses are captured by the underlying MGP model in a natural way.
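The Dirichlet-process prior over mixture components mentioned above is commonly realized through the stick-breaking construction, sketched below with the standard library only (the truncation level and concentration parameter are arbitrary choices for illustration):

```python
import random

random.seed(4)

def stick_breaking(alpha, truncation):
    """Truncated stick-breaking draw of Dirichlet-process mixture weights."""
    weights, remaining = [], 1.0
    for _ in range(truncation):
        frac = random.betavariate(1.0, alpha)   # fraction of the remaining stick
        weights.append(remaining * frac)
        remaining *= 1.0 - frac
    return weights

w = stick_breaking(2.0, 50)    # concentration 2.0, 50 components
```

Each weight is the share of an MGP mixture component; variational inference then prunes components whose weights the data do not support.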
- Published
- 2015
- Full Text
Discovery Service for Jio Institute Digital Library