10 results for "Fabian Franzelin"
Search Results
2. Comparison of data-driven uncertainty quantification methods for a carbon dioxide storage benchmark scenario.
- Author
- Markus Köppel, Fabian Franzelin, Ilja Kröker, Sergey Oladyshkin, Gabriele Santin, Dominik Wittwar, Andrea Barth, Bernard Haasdonk, Wolfgang Nowak, Dirk Pflüger, and Christian Rohde
- Published
- 2018
3. Comparison of data-driven uncertainty quantification methods for a carbon dioxide storage benchmark scenario
- Author
- Christian Rohde, Markus Köppel, Dirk Pflüger, Fabian Franzelin, Ilja Kröker, Wolfgang Nowak, Andrea Barth, Bernard Haasdonk, Gabriele Santin, Dominik Wittwar, and Sergey Oladyshkin
- Subjects
FOS: Computer and information sciences, Mathematical optimization, Computer science, Monte Carlo method, Computational Engineering, Finance, and Science (cs.CE), FOS: Mathematics, Mathematics - Numerical Analysis (math.NA), Computers in Earth Sciences, Uncertainty quantification, Polynomial chaos, Sparse grid, Computer Science Applications, Computational Mathematics, Computational Theory and Mathematics, Kernel (statistics), Benchmark (computing), Probability distribution, Interpolation
- Abstract
A variety of methods is available to quantify uncertainties arising in the modeling of flow and transport in carbon dioxide storage, but thorough comparisons are lacking. Raw data from such storage sites can usually not be described well by theoretical statistical distributions, since only very limited data are available; exact information on distribution shapes for all uncertain parameters is therefore rare in realistic applications. We discuss and compare four methods for data-driven uncertainty quantification based on a benchmark scenario of carbon dioxide storage. In the benchmark, for which we provide data and code, carbon dioxide is injected into a saline aquifer modeled by the nonlinear capillarity-free fractional flow formulation for two incompressible fluid phases, namely carbon dioxide and brine. To cover different aspects of uncertainty quantification, we incorporate various sources of uncertainty, such as uncertainty in the boundary conditions, in the parameters of the constitutive relations, and in the material properties. We consider recent versions of the following non-intrusive and intrusive uncertainty quantification methods: arbitrary polynomial chaos, spatially adaptive sparse grids, kernel-based greedy interpolation, and hybrid stochastic Galerkin. The performance of each approach is assessed by comparing the expected value and standard deviation of the carbon dioxide saturation against a reference statistic obtained from Monte Carlo sampling. We compare the convergence of all methods, reporting accuracy with respect to the number of model runs and the resolution. Finally, we offer suggestions about the methods’ advantages and disadvantages to guide modelers in uncertainty quantification for carbon dioxide storage and beyond.
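As context for the reference statistic mentioned in the abstract above, the following is a minimal sketch of a Monte Carlo estimate of the mean and standard deviation of a quantity of interest. The model function and the uniform input distribution are placeholders, not the benchmark's actual flow simulator or its data-driven distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(xi):
    """Placeholder for the CO2 saturation returned by the flow simulator
    for one realization xi of the uncertain parameters."""
    return np.exp(-np.sum(xi**2))

# Draw samples of the uncertain inputs (here: 3 parameters with an assumed
# uniform distribution; data-driven distributions would replace this).
n_samples = 10_000
xi = rng.uniform(0.0, 1.0, size=(n_samples, 3))

# Monte Carlo reference statistics: sample mean and standard deviation
# of the quantity of interest.
q = np.array([model(x) for x in xi])
mean_mc = q.mean()
std_mc = q.std(ddof=1)

print(f"E[q] ~ {mean_mc:.4f}, Std[q] ~ {std_mc:.4f}")
```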
- Published
- 2018
4. B-splines on sparse grids for surrogates in uncertainty quantification
- Author
- Michael Rehme, Fabian Franzelin, and Dirk Pflüger
- Subjects
Mathematical optimization, Probabilistic risk assessment, Computer science, B-spline, Sparse grid, Basis function, Industrial and Manufacturing Engineering, Tensor product, Benchmark (computing), Uncertainty quantification, Safety, Risk, Reliability and Quality, Curse of dimensionality
- Abstract
Robust prediction of the behavior of complex physical and engineering systems relies on approximating solutions in terms of physical and stochastic domains. For higher resolution and accuracy, simulation models must increase the number of deterministic and stochastic variables and therefore further increase the dimensionality of the problem. Sparse grids are an established technique to tackle higher-dimensional problems. Their efficient tensor product structure allows the creation of accurate surrogates from few model evaluations. Classical approaches use hat functions, resulting in non-differentiable surrogates, or global basis functions, resulting in potential instabilities. Therefore, we propose using modified not-a-knot B-splines to overcome both problems. Additionally, we use established spatially adaptive refinement criteria to reduce the number of model evaluations even further. We compare our technique to other data-driven uncertainty quantification methods in a real-world benchmark for probabilistic risk assessment for carbon dioxide storage in geological formations.
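The contrast drawn in the abstract above, hat functions yielding non-differentiable surrogates while (not-a-knot) spline bases yield smooth ones, can be illustrated in one dimension. The sketch below is not the paper's hierarchical sparse grid construction; it only compares a piecewise-linear interpolant with a not-a-knot cubic spline on a toy function.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Model response sampled at a handful of grid points (toy example).
x = np.linspace(0.0, 1.0, 9)
y = np.sin(2 * np.pi * x) * np.exp(-x)

# Piecewise-linear ("hat function") surrogate: continuous, but not
# differentiable at the grid points.
def hat_surrogate(xq):
    return np.interp(xq, x, y)

# Cubic spline with not-a-knot boundary conditions: twice continuously
# differentiable, analogous in spirit to the modified not-a-knot B-splines
# used on sparse grids.
spline = CubicSpline(x, y, bc_type="not-a-knot")

xq = np.linspace(0.0, 1.0, 201)
ref = np.sin(2 * np.pi * xq) * np.exp(-xq)
err_hat = np.max(np.abs(hat_surrogate(xq) - ref))
err_spl = np.max(np.abs(spline(xq) - ref))
print(f"max error: piecewise linear {err_hat:.3e}, not-a-knot cubic {err_spl:.3e}")
```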
- Published
- 2021
5. Polynomial chaos expansions for dependent random variables
- Author
- John Jakeman, Fabian Franzelin, Akil Narayan, Dirk Pflüger, and Michael Eldred
- Published
- 2019
6. Polynomial chaos expansions for dependent random variables
- Author
- John D. Jakeman, Fabian Franzelin, Akil Narayan, Michael S. Eldred, and Dirk Pflüger
- Subjects
Polynomial, Computational Mechanics, General Physics and Astronomy, Collocation method, FOS: Mathematics, Applied mathematics, Orthonormal basis, Mathematics - Numerical Analysis (math.NA), Independence (probability theory), Polynomial chaos, Variables, Mechanical Engineering, Mathematics - Probability (math.PR), Computer Science Applications, Mechanics of Materials, Random variable, Orthogonalization
- Abstract
Polynomial chaos expansions (PCE) are well-suited to quantifying uncertainty in models parameterized by independent random variables. The assumption of independence leads to simple strategies for evaluating PCE coefficients. In contrast, the application of PCE to models of dependent variables is much more challenging. Three approaches can be used. The first approach uses mapping methods, where measure transformations, such as the Nataf and Rosenblatt transformations, map dependent random variables to independent ones; however, we show that this can significantly degrade performance since the Jacobian of the map must be approximated. A second strategy is the class of dominating support methods, which build PCE using independent random variables whose distributional support dominates the support of the true dependent joint density; we provide evidence that this approach appears to produce approximations with suboptimal accuracy. A third approach, the novel method proposed here, uses Gram-Schmidt orthogonalization (GSO) to numerically compute orthonormal polynomials for the dependent random variables. This approach has been used successfully when solving differential equations using the intrusive stochastic Galerkin method, and in this paper we use GSO to build PCE using a non-intrusive stochastic collocation method. The stochastic collocation method treats the model as a black box and builds approximations of model output from a set of samples. Building PCE from samples can introduce ill-conditioning which does not plague stochastic Galerkin methods. To mitigate this ill-conditioning we generate weighted Leja sequences, which are nested sample sets, to build accurate polynomial interpolants. We show that our proposed approach produces PCE which are orders of magnitude more accurate than PCE constructed using mapping or dominating support methods.
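The Gram-Schmidt construction described above builds polynomials that are orthonormal with respect to the dependent joint measure; in a sample-based setting this reduces to orthogonalizing a Vandermonde matrix. The following is a minimal sketch under that assumption (monomial starting basis, sample-based inner product, QR factorization as a numerically stable stand-in for Gram-Schmidt); it is not the authors' implementation, which additionally uses weighted Leja sequences for collocation.

```python
import numpy as np
from itertools import combinations_with_replacement

def monomial_basis(X, degree):
    """Evaluate all multivariate monomials up to total degree `degree`
    at the sample points X (shape: n_samples x dim)."""
    n, d = X.shape
    cols = [np.ones(n)]
    for p in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), p):
            cols.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(cols)

# Samples from a *dependent* joint distribution (toy example: correlated Gaussian).
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=5000)

# Orthonormalization w.r.t. the sample-based inner product
# <f, g> ~ (1/n) sum_i f(x_i) g(x_i), done via a thin QR factorization of the
# monomial Vandermonde matrix (numerically equivalent to Gram-Schmidt).
V = monomial_basis(X, degree=3)
Q, R = np.linalg.qr(V)
n = X.shape[0]
Phi = np.sqrt(n) * Q                       # orthonormal basis values at the samples
coeff_map = np.sqrt(n) * np.linalg.inv(R)  # maps monomial to orthonormal coefficients

# Check: the empirical Gram matrix of the new basis is (close to) the identity.
G = Phi.T @ Phi / n
print("max deviation from identity:", np.max(np.abs(G - np.eye(G.shape[0]))))
```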
- Published
- 2019
7. Bond-based peridynamics: a quantitative study of Mode I crack opening
- Author
- Dirk Pflüger, Patrick Diehl, Fabian Franzelin, and Georg Ganzenmüller
- Subjects
Engineering, Discretization, Sparse grids, Constitutive equation, Traction (engineering), Computational Mechanics, Critical traction, Continuum mechanics, Computer simulation, Peridynamics, EMU-ND, Mathematical analysis, Structural engineering, Quadrature (mathematics), Mechanics of Materials, Modeling and Simulation, Bond-based peridynamics
- Abstract
This paper presents a new approach to estimate, by numerical simulation, the critical traction for Mode I crack opening before crack growth. For quasi-static loading, Linear Elastic Fracture Mechanics predicts the critical traction before crack growth. To simulate the crack growth, we use bond-based peridynamics, a non-local generalization of continuum mechanics. We discretize the peridynamic equation of motion with a collocation-in-space approach, the so-called EMU nodal discretization (EMU-ND). As the constitutive law, we employ the improved prototype micro brittle (IPMB) material model. This bond-based material model is verified against the Young's modulus from classical theory for a homogeneous deformation and for different quadrature rules. For the EMU-ND, we studied the behavior for different ratios of horizon to nodal spacing to find a value that is robust for a large variety of materials. To cover this wide range of materials, we applied sparse grids, a technique for building high-dimensional surrogate models. Sparse grids significantly reduce the number of simulation runs compared to a full-grid approach while maintaining similar approximation accuracy. For the validation of the quasi-static loading process, we show that the critical traction is independent of the material density for most material parameters. The bond-based IPMB model with EMU nodal discretization appears very robust at the ratio $\delta/\Delta X = 3$ for a wide range of materials, if an error of 5 % is acceptable.
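For orientation, the following sketches one force evaluation of a basic bond-based prototype microelastic brittle (PMB) model on an EMU-style nodal discretization: bonds within the horizon carry a pairwise force proportional to the bond stretch and fail irreversibly once a critical stretch is exceeded. The micromodulus c and critical stretch s0 are taken as given parameters; the paper's improved PMB model and any surface or volume corrections are omitted.

```python
import numpy as np

def pmb_internal_forces(x_ref, u, vol, horizon, c, s0, broken):
    """One force evaluation of a basic bond-based PMB model on an EMU-style
    nodal discretization: every node interacts with all nodes inside its
    horizon. `broken` tracks irreversibly failed bonds. This is a plain
    sketch; the paper's improved PMB model and corrections are omitted."""
    n = x_ref.shape[0]
    f = np.zeros_like(x_ref)
    for i in range(n):
        for j in range(i + 1, n):
            xi = x_ref[j] - x_ref[i]                        # reference bond vector
            r = np.linalg.norm(xi)
            if r > horizon or broken[i, j]:
                continue
            eta_xi = (x_ref[j] + u[j]) - (x_ref[i] + u[i])  # deformed bond vector
            d = np.linalg.norm(eta_xi)
            s = (d - r) / r                                 # bond stretch
            if s > s0:                                      # brittle bond failure
                broken[i, j] = broken[j, i] = True
                continue
            pair = c * s * eta_xi / d                       # pairwise force density
            f[i] += pair * vol[j]
            f[j] -= pair * vol[i]
    return f

# Toy usage: three collinear nodes stretched uniformly by 1 %.
x_ref = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
u = 0.01 * x_ref
vol = np.ones(3)
broken = np.zeros((3, 3), dtype=bool)
print(pmb_internal_forces(x_ref, u, vol, horizon=2.5, c=1.0, s0=0.05, broken=broken))
```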
- Published
- 2016
8. Limiting Ranges of Function Values of Sparse Grid Surrogates
- Author
- Fabian Franzelin and Dirk Pflüger
- Subjects
Computer science, Search algorithm, Sparse grid, Range (statistics), Probability density function, Basis function, Function (mathematics), Density estimation, Limit (mathematics), Algorithm
- Abstract
Sparse grid interpolants of high-dimensional functions do not preserve the range of function values. This is a core problem when dealing with probability density functions, for example. We present a novel approach to limiting the range of function values of sparse grid surrogates. It is based on computing minimal sets of sparse grid indices that extend the original sparse grid with properly chosen coefficients such that the function value range of the resulting surrogate is limited to a given interval. We state the prerequisites for the existence of minimal extension sets and formally derive the intersection search algorithm that computes them efficiently. The main advantage of this approach is that the surrogate remains a linear combination of basis functions; therefore, any problem-specific post-processing operation, such as evaluation, quadrature, differentiation, regression, or density estimation, can remain unchanged. Our approach is applicable to arbitrarily refined sparse grids.
- Published
- 2018
9. From Data to Uncertainty: An Efficient Integrated Data-Driven Sparse Grid Approach to Propagate Uncertainty
- Author
- Dirk Pflüger and Fabian Franzelin
- Subjects
Flexibility (engineering), Propagation of uncertainty, Polynomial chaos, Sparse grid, Density estimation, Data-driven, Collocation method, Data mining, Uncertainty quantification, Algorithm
- Abstract
We present a novel data-driven approach to propagating uncertainty, based on a highly efficient, integrated adaptive sparse grid method. We close the gap between subjective assumptions about the input's uncertainty and the unknown true distribution by applying sparse grid density estimation to the given measurements, and we link this estimate to the adaptive sparse grid collocation method for the propagation of uncertainty. This integrated approach gives us two main advantages: first, linking the density estimation and the stochastic collocation method is straightforward because they rest on the same fundamental principles; second, we can efficiently estimate moments of the quantity of interest without any additional approximation errors, which includes the challenging task of evaluating higher-dimensional integrals. We applied the new approach to a complex subsurface flow problem and showed that it can compete with state-of-the-art methods. Our sparse grid approach excels in efficiency, accuracy, and flexibility and can thus be applied in many fields, from finance to the environmental sciences.
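The "no additional approximation errors" claim above rests on the fact that, once both the surrogate and the estimated density are linear combinations of the same basis, moments reduce to bilinear forms with a precomputable mass matrix. Below is a one-dimensional sketch with nodal hat functions; the coefficients are toy values, not the adaptive sparse grid construction of the paper.

```python
import numpy as np

# If both the density estimate and the surrogate use the same (here: nodal hat)
# basis, moments reduce to alpha^T M beta with a precomputable mass matrix M.
nodes = np.linspace(0.0, 1.0, 9)
h = nodes[1] - nodes[0]

def hat(i, x):
    """Nodal piecewise-linear ("hat") basis function centered at nodes[i]."""
    return np.clip(1.0 - np.abs(x - nodes[i]) / h, 0.0, None)

# Coefficients of the surrogate u(x) = sum_i alpha_i phi_i(x) and of the
# estimated input density p(x) = sum_j beta_j phi_j(x). In the paper both come
# from sparse grid regression/density estimation; here they are simply nodal
# values of toy functions.
alpha = np.sin(np.pi * nodes)        # toy model response
beta = 6.0 * nodes * (1.0 - nodes)   # toy (unnormalized) density, Beta(2,2)-like

# Mass matrix M_ij = integral of phi_i(x) * phi_j(x) dx. For hat functions this
# has a closed form; a fine quadrature grid keeps the sketch short.
xq = np.linspace(0.0, 1.0, 20001)
dx = xq[1] - xq[0]
Phi = np.array([hat(i, xq) for i in range(len(nodes))])
M = (Phi * dx) @ Phi.T

# Expectation of the quantity of interest without any extra sampling step.
mean_u = alpha @ M @ beta
print(f"E[u] ~ {mean_u:.4f}")
```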
- Published
- 2016
10. Classification with Probability Density Estimation on Sparse Grids
- Author
- Hans-Joachim Bungartz, Benjamin Peherstorfer, Dirk Pflüger, and Fabian Franzelin
- Subjects
Computer science, Sparse grid, Pattern recognition, Probability density function, Density estimation, Sparse approximation, Cross-validation, Multivariate kernel density estimation, Set (abstract data type), Point (geometry), Artificial intelligence, Algorithm
- Abstract
We present a novel method to tackle the multi-class classification problem with sparse grids and show how the computational procedure can be split into an Offline phase (pre-processing) and a very rapid Online phase. For each class of the training data, the underlying probability density function is estimated on a sparse grid. The class of a new data point is determined by the values of the density functions at this point. Our classification method can deal with more than two classes in a natural way, and it provides a stochastically motivated confidence value that indicates how to rate the response to a new point. Furthermore, the underlying density estimation method allows us to pre-compute the system matrix and store it in an appropriate format. This so-called Offline/Online splitting of the computational procedure allows an Online phase in which only a few matrix-vector products are necessary to learn a new, previously unseen training data set; in particular, we no longer have to solve a system of linear equations. We show that speed-ups by a factor of several hundred are possible. A typical application for such an Offline/Online splitting is cross-validation. We present the algorithm and the computational procedure for our classification method, report on the employed sparse grid density estimation method, and show by means of artificial and real-world data sets that we obtain results competitive with the classical, regression-based sparse grid classification method.
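The Offline/Online idea above hinges on the fact that the regularized system matrix of the density estimation depends only on the basis, not on the data. Below is a minimal one-dimensional sketch with a fixed hat-function basis; the actual method uses adaptive sparse grids and a different regularization operator, and the class labels, data, and lambda value here are toy choices.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Offline/Online splitting: the regularized system matrix is data-independent,
# so it is factorized once (offline) and reused for every class and every new
# training set (online); only right-hand sides and cheap solves change.
nodes = np.linspace(0.0, 1.0, 17)
h = nodes[1] - nodes[0]

def basis(x):
    """Evaluate all hat basis functions at points x (returns len(x) x len(nodes))."""
    return np.clip(1.0 - np.abs(x[:, None] - nodes[None, :]) / h, 0.0, None)

# ---- Offline phase: build and factorize the data-independent system matrix.
xq = np.linspace(0.0, 1.0, 2001)
Phi_q = basis(xq)
R = (Phi_q.T * (xq[1] - xq[0])) @ Phi_q        # basis Gram ("mass") matrix
lam = 1e-3                                     # toy regularization parameter
system = cho_factor(R + lam * np.eye(len(nodes)))

# ---- Online phase: per class, estimate a density and classify by its value.
rng = np.random.default_rng(0)
train = {0: rng.beta(2, 5, 300), 1: rng.beta(5, 2, 300)}   # toy 1D two-class data

coeffs = {}
for label, x in train.items():
    b = basis(x).mean(axis=0)                  # data term: mean basis evaluation
    coeffs[label] = cho_solve(system, b)       # cheap solve with the offline factor

def classify(x_new):
    scores = {label: basis(np.atleast_1d(x_new)) @ a for label, a in coeffs.items()}
    return max(scores, key=lambda label: scores[label][0])

print(classify(0.2), classify(0.8))   # likely 0 and 1 for these toy classes
```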
- Published
- 2014