15 results for "Pankavich, Stephen"
Search Results
2. Reactive particle-tracking solutions to a benchmark problem on heavy metal cycling in lake sediments
- Authors: Schmidt, Michael J., Pankavich, Stephen D., Navarre-Sitchler, Alexis, Engdahl, Nicholas B., Bolster, Diogo, and Benson, David A.
- Published: 2020
3. Estimating the reproductive number, total outbreak size, and reporting rates for Zika epidemics in South and Central America
- Authors: Shutt, Deborah P., Manore, Carrie A., Pankavich, Stephen, Porter, Aaron T., and Del Valle, Sara Y.
- Published: 2017
4. On the separate treatment of mixing and spreading by the reactive-particle-tracking algorithm: An example of accurate upscaling of reactive Poiseuille flow.
- Authors: Benson, David A., Pankavich, Stephen, and Bolster, Diogo
- Subjects: Poiseuille flow; particle tracking velocimetry; diffusion; particle methods (numerical analysis); mass transfer
- Highlights:
  - Accurate upscaling of reactive-transport Poiseuille flow with the Reactive Particle Tracking (RPT) model.
  - The RPT model separately simulates mixing by local molecular diffusion and spreading by Taylor macro-dispersion.
  - Comparison to two semi-analytic upscaling techniques: volume averaging and ensemble streamtube.
  - The separate treatment of mixing and spreading makes the Lagrangian RPT model more representative than the Eulerian advection-dispersion-reaction equation.
- Abstract: The Eulerian advection-dispersion-reaction equation (ADRE) suffers from the well-known scale effect of reduced apparent reaction rates between chemically dissimilar fluids at larger scales (or upon dimensional averaging). The dispersion tensor in the ADRE must equally and simultaneously account for both solute mixing and spreading. Recent reactive-particle-tracking (RPT) algorithms can, by separate mechanisms, simulate (1) smaller-scale mixing by inter-particle mass transfer and (2) mass spreading by traditional random walks. To test the supposition that the RPT can accurately track these separate mechanisms, we upscale reactive transport in Hagen-Poiseuille flow between two plates. The simple upscaled 1-D RPT model, with one velocity value, an upscaled Taylor macro-dispersivity, and the local molecular diffusion coefficient, matches the results obtained from a detailed 2-D model with fully described velocity and diffusion. Both models use the same thermodynamic reaction rate, because the rate is not forced to absorb the loss of information upon upscaling. Analytic and semi-analytic upscaling is also performed using volume-averaging and ensemble-streamtube techniques. Volume averaging does not perform as well as the RPT, while the streamtube approach (using an effective dispersion coefficient along with macro-dispersion) performs almost exactly the same as the RPT. [ABSTRACT FROM AUTHOR]
- Published: 2019
5. On the accuracy of simulating mixing by random-walk particle-based mass-transfer algorithms.
- Authors: Schmidt, Michael J., Pankavich, Stephen D., and Benson, David A.
- Subjects: mass transfer; algorithms; diffusion; matrices (mathematics); computational fluid dynamics
- Abstract: Several algorithms have been used for mass transfer between particles undergoing advective and macro-dispersive random walks. The mass transfer between particles is required for general reactions on, and among, particles. The mass transfer is shown to be diffusive and may be simulated using implicit, explicit, or mixed methods. All algorithms investigated are accurate to O(Δt). For N particles, the implicit and semi-implicit methods require inverse matrix solutions and O(N³) calculations. The explicit methods use forward matrix multiplications and require only O(N²) calculations. Practically, this means that naïve implementations with more than about 5000 particles run more reliably using explicit methods.
- Published: 2018
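The O(N²) explicit update that this abstract contrasts with O(N³) implicit solves can be sketched in a few lines of NumPy. This is an illustrative sketch under simplifying assumptions (1-D particles, a Gaussian co-location kernel, roughly uniform particle spacing), not the authors' implementation; the function name and the scaling by a per-particle support volume are hypothetical choices:

```python
import numpy as np

def explicit_mass_transfer(x, m, D, dt):
    """One explicit mass-transfer step among 1-D particles (sketch).

    A pairwise exchange proportional to (m_j - m_i), weighted by a
    symmetric Gaussian co-location matrix, is diffusive and conserves
    total mass exactly. Building and applying the dense weight matrix
    costs O(N^2) for N particles, versus O(N^3) for a naive implicit
    (matrix-inverse) method.
    """
    h2 = 4.0 * D * dt                                  # squared kernel bandwidth
    ds = (x.max() - x.min()) / (len(x) - 1)            # support volume per particle
    W = np.exp(-(x[:, None] - x[None, :]) ** 2 / h2) / np.sqrt(np.pi * h2)
    return m + 0.5 * ds * (W @ m - W.sum(axis=1) * m)  # symmetric pairwise exchange
```

Because W is symmetric, every gain by particle i from particle j is matched by an equal loss from j to i, so total mass is conserved to machine precision while the spatial variance of the mass distribution grows, which is the diffusive behavior the abstract describes.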
6. Spatially-heterogeneous embedded stochastic SEIR models for the 2014–2016 Ebola outbreak in West Africa.
- Authors: Martinez, Kaitlyn, Brown, Grant, and Pankavich, Stephen
- Abstract: The dynamics of human infectious diseases are challenging to understand, particularly when a pathogen spreads spatially over a large region. We present a stochastic, spatially-heterogeneous model framework derived from the foundational SEIR compartmental model. These models utilize a graph structure of spatial locations, facilitating mobility via random walks while progressing through disease states, parameterized by the net probability flux between locations. The analysis is bolstered by Approximate Bayesian Computation, by which epidemiological and mobility parameter distributions are estimated, including an empirically adjusted reproductive number, while model structure proposals are compared using Bayes Factors. The utility of this novel class of models is demonstrated through application to the 2014–2016 Ebola outbreak in West Africa. The flexibility of such models, whose complexity may be adjusted as desired, and complementary methods of analysis enable the exploration of various spatial divisions and mobility schema, while maintaining the essential spatiotemporal disease dynamics.
- Published: 2022
7. A particle method for a collisionless plasma with infinite mass
- Author: Pankavich, Stephen
- Subjects: collisionless plasmas; mathematical models; numerical analysis; electrostatics; plasma gases; particles
- Abstract: The one-dimensional Vlasov–Poisson system is considered, and a particle method is developed to approximate solutions without compact support which tend to a fixed background of charge as |x| → ∞. Such a system of equations can be used to model kinetic phenomena occurring in plasma physics. A localized particle method is constructed and implemented using the fact that solutions to the Vlasov–Poisson system propagate at finite speeds. Finally, the numerical method is utilized to ascertain information regarding the time asymptotics of the generated electrostatic field.
- Published: 2012
8. Entropy: (1) The former trouble with particle-tracking simulation, and (2) A measure of computational information penalty.
- Authors: Benson, David A., Pankavich, Stephen, Schmidt, Michael J., and Sole-Mari, Guillem
- Subjects: information measurement; entropy (information theory); probability density functions; random variables; Akaike information criterion; topological entropy
- Highlights:
  - We provide a consistent definition of entropy between discrete and continuous random variables and models.
  - This definition reveals the entropy associated with computational models; the extra entropy of computationally complex models is a penalty added to Akaike's information criterion.
  - The additional accuracy gained by adding nodes or particles to a model may not be justified when the extra entropy is taken into account; we show optimal model discretizations for a simple 1-D diffusion problem.
  - We examine the classical particle-tracking algorithm (which does not track entropy until post-processing) versus newer mass-transfer algorithms (which automatically and continuously track entropy) in terms of mixing.
  - The mass-transfer methods correctly represent the initial condition and the dependence of mixing on particle number; the SPH methods are better able to solve the well-mixed equation, in which mixing and spreading are assumed to be equal.
- Abstract: Traditional random-walk particle-tracking (PT) models of advection and dispersion do not track entropy, because particle masses remain constant. However, newer mass-transfer particle-tracking (MTPT) models have the ability to do so because the masses of all compounds may change along trajectories. Additionally, the probability mass functions (PMFs) of these MTPT models may be compared to continuous solutions with probability density functions when a consistent definition of entropy (or, similarly, the dilution index) is constructed. This definition reveals that every discretized numerical model incurs a computational entropy. Similar to Akaike's (1974, 1992) entropic penalty for larger numbers of adjustable parameters, the computational complexity of a model (e.g., the number of nodes or particles) adds to the entropy and, as such, must be penalized. Application of a new computational information criterion reveals that increased accuracy is not always justified relative to increased computational complexity.
The MTPT method can use a particle-collision-based kernel or an adaptive kernel derived from smoothed-particle hydrodynamics (SPH). The latter is more representative of a locally well-mixed system (i.e., one in which the dispersion tensor equally represents mixing and solute spreading), while the former better represents the separate processes of mixing versus spreading. We use computational means to demonstrate the fitness of each of these methods for simulating 1-D advective-dispersive transport with uniform coefficients.
- Published: 2020
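The discrete entropy that this abstract says MTPT models can track is simple to compute from particle masses. The following is an illustrative sketch only (the function name is hypothetical, and the paper's consistent definition also covers the continuous case and the related dilution index exp(H)):

```python
import numpy as np

def particle_entropy(m):
    """Shannon entropy H = -sum(p ln p) of a set of particle masses.

    Normalizing the masses to a probability mass function gives the
    discrete quantity that mass-transfer particle-tracking models can
    track as masses change along trajectories; constant-mass random
    walks cannot track it until post-processing.
    """
    p = np.asarray(m, dtype=float)
    p = p[p > 0] / p.sum()          # drop empty particles, normalize to a PMF
    return -np.sum(p * np.log(p))
```

A point mass has H = 0 and a uniform spread over N particles has H = ln N; diffusive mass transfer pushes H monotonically upward, which is why these models track entropy "automatically and continuously."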
9. Preface of 2nd Annual Meeting of SIAM Central States Section.
- Authors: He, Xiaoming, Pankavich, Stephen, Van Vleck, Erik, Wang, Zhu, and Ye, Xiu
- Subjects: annual meetings; mathematics
- Published: 2018
10. Numerical equivalence between SPH and probabilistic mass transfer methods for Lagrangian simulation of dispersion.
- Authors: Sole-Mari, Guillem, Schmidt, Michael J., Pankavich, Stephen D., and Benson, David A.
- Subjects: mass transfer; mass transfer coefficients; dispersion (chemistry); finite-difference time-domain method
- Abstract: Several Lagrangian methodologies have been proposed in recent years to simulate the advection-dispersion of solutes in fluids as a mass exchange between numerical particles carrying the fluid. In this paper, we unify these methodologies, showing that mass transfer particle tracking (MTPT) algorithms can be framed within the context of smoothed particle hydrodynamics (SPH), provided the choice of a Gaussian smoothing kernel whose bandwidth depends on the dispersion and the time discretization. Numerical simulations are performed for a simple dispersion problem, and they are compared to an analytical solution. Based on the results, we advocate for the use of a kernel bandwidth equal to the characteristic dispersion length ℓ = √(2DΔt), at least given a "dense enough" distribution of particles, for in this case the mass transfer operation is not just an approximation but in fact the exact solution of the solute's displacement by dispersion in a time step.
- Published: 2019
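The exactness claim for the Gaussian kernel with bandwidth ℓ = √(2DΔt) can be checked numerically. The sketch below uses made-up discretization values, not the paper's experiment: convolving a concentration profile with a Gaussian kernel of that bandwidth reproduces the analytical Green's-function solution of 1-D diffusion over one time step, because Gaussian variances add.

```python
import numpy as np

D, dt = 1e-3, 0.5
l = np.sqrt(2.0 * D * dt)                 # characteristic dispersion length
x = np.linspace(-1.0, 1.0, 801)
dx = x[1] - x[0]

# Initial condition: a Gaussian pulse of variance l^2
c0 = np.exp(-x**2 / (2 * l**2)) / np.sqrt(2 * np.pi * l**2)

# One "mass transfer" step: convolution with a Gaussian kernel of bandwidth l
K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * l**2)) / np.sqrt(2 * np.pi * l**2)
c1 = (K @ c0) * dx

# Analytical solution after dt: variances add, l^2 + 2*D*dt = 2*l^2
exact = np.exp(-x**2 / (4 * l**2)) / np.sqrt(4 * np.pi * l**2)
err = np.max(np.abs(c1 - exact))          # only small discretization error remains
```

With a "dense enough" grid of points standing in for particles, the residual error comes from discretizing the convolution, not from the kernel itself, which is the sense in which the step is exact.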
11. Parallelization of particle-mass-transfer algorithms on shared-memory, multi-core CPUs.
- Authors: Benson, David A., Pribec, Ivan, Engdahl, Nicholas B., Pankavich, Stephen, and Schauer, Lucas
- Subjects: conservation of mass; matrix decomposition; requirements engineering; scalability; memory
- Abstract: Simulating the transfer of mass between particles is not straightforwardly parallelized because it involves calculating the influence of many particles on each other. Engdahl et al. (2019) intuited that the number of matrix operations used for mass transfer grows quadratically with the number of particles, so that dividing the domain geometrically into sub-domains gives speed and memory advantages, even on a single processing thread. Those authors also showed the speed scalability of several one-dimensional examples on multiple cores. Here, we extend those results to more general cases, both in terms of spatial dimensions and algorithmic implementation. We show that there is an optimal subdivision scheme for naive, full-matrix calculations on a multi-processor or multi-threaded shared-memory machine. A similar sparse-matrix implementation that also uses row-and-column-sum normalization often greatly reduces the memory requirements. We also introduce a completely new mass-transfer algorithm that uses a non-geometric domain decomposition and only matrix row-sum normalization. This allows the mass-transfer "matrix" to be constructed and solved one row at a time in parallel, so it is faster and vastly more memory-efficient than previous methods, but it requires more care for suitable accuracy.
- Highlights:
  - Analysis of memory requirements for three mass-transfer particle-tracking (MTPT) algorithms.
  - Theoretical and empirical analysis of parallel speedup of MTPT algorithms on shared-memory, multi-core CPUs.
  - Introduction of a novel MTPT algorithm with many orders of magnitude in memory savings.
  - Derivation of constraints on MTPT matrix construction for mass conservation.
- Published: 2024
12. Nonparametric, data-based kernel interpolation for particle-tracking simulations and kernel density estimation.
- Authors: Benson, David A., Bolster, Diogo, Pankavich, Stephen, and Schmidt, Michael J.
- Subjects: Green's functions; interpolation; density; kernel functions; interpolation algorithms; machine learning
- Highlights:
  - Traditional kernel density estimation uses an assumed kernel function form (e.g., Gaussian).
  - A more intuitive approach uses a kernel that is the shape of the underlying data density.
  - An iterative, machine-learning approach learns the shape of the kernel by using the evolving estimated density.
  - Our approach has lower error than traditional methods when applied to a range of known densities.
  - When applied to particle arrival times, the new approach smoothly interpolates and extrapolates the breakthrough curve (BTC).
- Abstract: Traditional interpolation techniques for particle tracking include binning and convolutional formulas that use pre-determined (i.e., closed-form, parametric) kernels. In many instances, the particles are introduced as point sources in time and space, so the cloud of particles (either in space or time) is a discrete representation of the Green's function of an underlying PDE. As such, each particle is a sample from the Green's function; therefore, each particle should be distributed according to the Green's function. In short, the kernel of a convolutional interpolation of the particle sample "cloud" should be a replica of the cloud itself. This idea gives rise to an iterative method by which the form of the kernel may be discerned in the process of interpolating the Green's function. When the Green's function is a density, this method is broadly applicable to interpolating a kernel density estimate based on random data drawn from a single distribution. We formulate and construct the algorithm and demonstrate its ability to perform kernel density estimation of skewed and/or heavy-tailed data, including breakthrough curves.
- Published: 2021
13. A mass-transfer particle-tracking method for simulating transport with discontinuous diffusion coefficients.
- Authors: Schmidt, Michael J., Engdahl, Nicholas B., Pankavich, Stephen D., and Bolster, Diogo
- Subjects: discontinuous coefficients; random walks; analytical solutions; mass transfer; reaction-diffusion equations; diffusion
- Highlights:
  - A mass-transfer particle-tracking method is developed for the problem of a spatially discontinuous diffusion coefficient.
  - The method employs a semi-analytical solution that we derive, which may be employed for complicated subdomain interfaces and in higher dimensions.
  - Solutions generated by this method closely agree with analytical solutions or trusted numerical results.
- Abstract: The problem of a spatially discontinuous diffusion coefficient D(x) is one that may be encountered in hydrogeologic systems due to natural geological features or as a consequence of the numerical discretization of flow properties. To date, mass-transfer particle-tracking (MTPT) methods, a family of Lagrangian methods in which diffusion is jointly simulated by random walks and diffusive mass transfers, have been unable to solve this problem. This manuscript presents a new mass-transfer (MT) algorithm that enables MTPT methods to accurately solve the problem of discontinuous D(x). To achieve this, we derive a semi-analytical solution to the discontinuous D(x) problem by employing a predictor-corrector approach, and we use this semi-analytical solution as the weighting function in a reformulated MT algorithm. This semi-analytical solution is generalized for cases with multiple 1-D interfaces as well as for 2-D cases, including a 2 × 2 tiling of 4 subdomains that corresponds to a numerically generated diffusion field. The solutions generated by this new mass-transfer algorithm closely agree with an analytical 1-D solution or, in more complicated cases, trusted numerical results, demonstrating the success of our proposed approach.
- Published: 2020
14. A modified SEIR model for the spread of Ebola in Western Africa and metrics for resource allocation.
- Authors: Diaz, Paul, Constantine, Paul, Kalmbach, Kelsey, Jones, Eric, and Pankavich, Stephen
- Subjects: epidemiology; Ebola viral disease transmission; stability theory; mathematical models of uncertainty
- Abstract: A modified, deterministic SEIR model is developed for the 2014 Ebola epidemic in the West African nations of Guinea, Liberia, and Sierra Leone. The model describes the dynamical interaction of susceptible and infected populations while accounting for the effects of hospitalization and the spread of disease through interactions with deceased, but infectious, individuals. Using data from the World Health Organization (WHO), parameters within the model are fit to recent estimates of infected and deceased cases from each nation. The model is then analyzed using these parameter values. Finally, several metrics are proposed to determine which of these nations is in greatest need of additional resources to combat the spread of infection. These include local and global sensitivity metrics of both the infected population and the basic reproduction number with respect to the rates of hospitalization and proper burial.
- Published: 2018
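The SEIR skeleton that this modified model builds on can be sketched as a small forward-Euler integration. This is illustrative only: the rates below are made up, and the paper's model additionally includes hospitalization and a deceased-but-infectious compartment, with parameters fit to WHO data:

```python
import numpy as np

def seir_step(state, beta, sigma, gamma, dt):
    """One forward-Euler step of a basic SEIR model.

    beta: transmission rate, sigma: incubation rate (E -> I),
    gamma: recovery rate (I -> R). The four derivatives sum to zero,
    so the total population is conserved exactly at every step.
    """
    S, E, I, R = state
    N = S + E + I + R
    new_inf = beta * S * I / N
    return state + dt * np.array([
        -new_inf,                 # susceptibles become exposed
        new_inf - sigma * E,      # exposed progress to infectious
        sigma * E - gamma * I,    # infectious recover
        gamma * I,                # recovered accumulate
    ])

# Illustrative run with basic reproduction number R0 = beta / gamma = 3
state = np.array([0.99, 0.0, 0.01, 0.0])
for _ in range(1000):             # integrate to t = 100
    state = seir_step(state, beta=0.3, sigma=0.2, gamma=0.1, dt=0.1)
```

Sensitivity metrics like those proposed in the paper can then be approximated by perturbing a rate (e.g., beta) and re-running the integration to see how the infected population and R0 respond.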
15. A Computational Information Criterion for Particle-Tracking with Sparse or Noisy Data.
- Authors: Tran, Nhat Thanh V., Benson, David A., Schmidt, Michael J., and Pankavich, Stephen D.
- Subjects: advection-diffusion equations; Akaike information criterion; computational complexity; parameter estimation; data distribution
- Highlights:
  - Review the COMputational Information Criterion (COMIC) as an adjustment to the AIC.
  - Demonstrate the tradeoff between the complexity and accuracy of a computational model.
  - Investigate the influence of the COMIC on particle-tracking methods.
  - Study the effects of sparse, noisy, and spatially heterogeneous data on the COMIC.
  - Reformulate the COMIC to account for a non-uniform discretization volume.
  - Provide a generalization of the COMIC for non-Gaussian errors and non-uniform variance.
- Abstract: Traditional probabilistic methods for the simulation of advection-diffusion equations (ADEs) often overlook the entropic contribution of the discretization, e.g., the number of particles, within associated numerical methods. Many times, the gain in accuracy of a highly discretized numerical model is outweighed by its associated computational costs or the noise within the data. We address the question of how many particles are needed in a simulation to best approximate and estimate parameters in one-dimensional advective-diffusive transport. To do so, we use the well-known Akaike Information Criterion (AIC) and a recently developed correction called the COMputational Information Criterion (COMIC) to guide the model-selection process. Random-walk and mass-transfer particle-tracking methods are employed to solve the model equations at various levels of discretization. Numerical results demonstrate that the COMIC provides an optimal number of particles that can describe a more efficient model in terms of parameter estimation and model prediction, compared to the model selected by the AIC, even when the data are sparse or noisy, the sampling volume is not uniform throughout the physical domain, or the error distribution of the data is non-IID Gaussian.
- Published: 2021
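For reference, the AIC that the COMIC adjusts is straightforward to compute for a Gaussian-error fit. This is a sketch with a hypothetical function name; the COMIC's additional discretization-dependent (e.g., particle-number) entropy term is defined in the paper and not reproduced here:

```python
import numpy as np

def aic_gaussian(residuals, k):
    """Akaike Information Criterion, AIC = 2k - 2 ln(L_hat), for a model
    with k fitted parameters and IID Gaussian residuals, using the
    maximum-likelihood estimate of the error variance."""
    r = np.asarray(residuals, dtype=float)
    n = r.size
    sigma2 = np.mean(r ** 2)                                # MLE of error variance
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0) # Gaussian log-likelihood
    return 2 * k - 2 * log_lik
```

Holding the fit fixed, each extra parameter adds exactly 2 to the criterion; the COMIC augments this parameter penalty with a computational-entropy penalty, so that adding particles, like adding parameters, must buy enough accuracy to justify itself.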
Discovery Service for Jio Institute Digital Library