91 results for "Kirby RM"
Search Results
2. Tradeoffs in automated financial regulation of decentralized finance due to limits on mutable Turing machines.
- Author
-
Charoenwong B, Kirby RM, and Reiter J
- Abstract
We examine which decentralized finance architectures enable meaningful regulation by combining financial and computational theory. We show via deduction that a decentralized and permissionless Turing-complete system cannot provably comply with regulations concerning anti-money laundering, know-your-client obligations, some securities restrictions and forms of exchange control. Any system that claims to follow regulations must choose either a form of permission or a less-than-Turing-complete update facility. Compliant decentralized systems can be constructed only by compromising on the richness of permissible changes. Regulatory authorities must accept new tradeoffs that limit their enforcement powers if they want to approve permissionless platforms formally. Our analysis demonstrates that the fundamental constraints of computation theory have direct implications for financial regulation. By mapping regulatory requirements onto computational models, we characterize which types of automated compliance are achievable and which are provably impossible. This framework allows us to move beyond traditional debates about regulatory effectiveness to establish concrete boundaries for automated enforcement., Competing Interests: Declarations. Competing interests: The authors declare no competing interests., (© 2025. The Author(s).)
- Published
- 2025
- Full Text
- View/download PDF
3. Kolmogorov n-widths for multitask physics-informed machine learning (PIML) methods: Towards robust metrics.
- Author
-
Penwarden M, Owhadi H, and Kirby RM
- Subjects
- Algorithms, Physics, Neural Networks, Computer, Machine Learning
- Abstract
Physics-informed machine learning (PIML) as a means of solving partial differential equations (PDEs) has garnered much attention in the Computational Science and Engineering (CS&E) world. This topic encompasses a broad array of methods and models aimed at solving a single or a collection of PDE problems, called multitask learning. PIML is characterized by the incorporation of physical laws into the training process of machine learning models in lieu of large data when solving PDE problems. Despite the overall success of this collection of methods, it remains incredibly difficult to analyze, benchmark, and generally compare one approach to another. Using Kolmogorov n-widths as a measure of effectiveness of approximating functions, we judiciously apply this metric in the comparison of various multitask PIML architectures. We compute lower accuracy bounds and analyze the model's learned basis functions on various PDE problems. This is the first objective metric for comparing multitask PIML architectures and helps remove uncertainty in model validation from selective sampling and overfitting. We also identify avenues of improvement for model architectures, such as the choice of activation function, which can drastically affect model generalization to "worst-case" scenarios, which is not observed when reporting task-specific errors. We also incorporate this metric into the optimization process through regularization, which improves the models' generalizability over the multitask PDE problem., Competing Interests: Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper., (Copyright © 2024 Elsevier Ltd. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
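For reference, the Kolmogorov n-width used as the comparison metric in the preceding entry measures how well a set of target functions (e.g., solutions of a family of PDE tasks) can be approximated by the best possible n-dimensional linear subspace. A standard definition, with the norm and the solution set left generic since the entry does not specify them, is

    d_n(A; X) = \inf_{\dim X_n = n} \; \sup_{f \in A} \; \inf_{g \in X_n} \| f - g \|_X,

where A ⊂ X is the set of solutions across tasks and the outer infimum ranges over all n-dimensional subspaces X_n of X. Slow decay of d_n with n indicates that no small linear basis, and hence no small shared representation, can approximate all tasks well.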
4. Multi-Omic Integration of Blood-Based Tumor-Associated Genomic and Lipidomic Profiles Using Machine Learning Models in Metastatic Prostate Cancer.
- Author
-
Fang S, Zhe S, Lin HM, Azad AA, Fettke H, Kwan EM, Horvath L, Mak B, Zheng T, Du P, Jia S, Kirby RM, and Kohli M
- Subjects
- Male, Humans, Androgen Antagonists therapeutic use, Lipidomics, Multiomics, Retrospective Studies, Genomics, Prostatic Neoplasms, Castration-Resistant diagnosis, Prostatic Neoplasms, Castration-Resistant genetics, Prostatic Neoplasms, Castration-Resistant therapy
- Abstract
Purpose: To determine prognostic and predictive clinical outcomes in metastatic hormone-sensitive prostate cancer (mHSPC) and metastatic castrate-resistant prostate cancer (mCRPC) on the basis of a combination of plasma-derived genomic alterations and lipid features in a longitudinal cohort of patients with advanced prostate cancer., Methods: A multifeature classifier was constructed to predict clinical outcomes using plasma-based genomic alterations detected in 120 genes and 772 lipidomic species as informative features in a cohort of 71 patients with mHSPC and 144 patients with mCRPC. Outcomes of interest were collected over 11 years of follow-up. These included, in the mHSPC state, early failure of androgen-deprivation therapy (ADT) and exceptional response to ADT, and, in the mCRPC state, early death (poor prognosis) and long-term survival. The approach was to build binary classification models that identified discriminative candidates with optimal weights to predict outcomes. To achieve this, we built multi-omic feature-based classifiers using traditional machine learning (ML) methods, including logistic regression with sparse regularization, multi-kernel Gaussian process regression, and support vector machines., Results: The levels of specific ceramides (d18:1/14:0 and d18:1/17:0), and the presence of CHEK2 mutations, AR amplification, and RB1 deletion were identified as the most crucial factors associated with clinical outcomes. The optimal multi-omic feature combination determined by the ML models resulted in AUC scores of 0.751 for predicting mHSPC survival and 0.638 for predicting ADT failure, and, in the mCRPC state, 0.687 for prognostication and 0.727 for exceptional survival. These models were superior to models built from a limited number of candidate features for developing multi-omic prognostic and predictive signatures., Conclusion: Using an ML approach that incorporates multiple omic features significantly improves the prediction accuracy for metastatic prostate cancer outcomes. Validation of these models in independent data sets will be needed in the future.
- Published
- 2023
- Full Text
- View/download PDF
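As a rough illustration of the kind of multi-omic binary classifier described in the preceding entry, the sketch below fits an L1-regularized logistic regression (one of the traditional ML methods named in the abstract) to combined genomic and lipidomic features and reports a cross-validated AUC. All data, shapes, and hyperparameters here are placeholders, not the study's.

    # Illustrative sketch only: binary outcome classification from combined
    # genomic and lipidomic features with sparse (L1) logistic regression.
    # Feature values, labels, and hyperparameters are hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_patients = 144
    X_genomic = rng.integers(0, 2, size=(n_patients, 120))   # alteration present/absent in 120 genes
    X_lipid = rng.lognormal(size=(n_patients, 772))          # levels of 772 lipid species
    X = np.hstack([X_genomic, X_lipid])
    y = rng.integers(0, 2, size=n_patients)                  # e.g., early death vs. long-term survival

    model = make_pipeline(
        StandardScaler(),
        LogisticRegression(penalty="l1", solver="liblinear", C=0.1, max_iter=1000),
    )
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print("cross-validated AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))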
5. Particle Merging-and-Splitting.
- Author
-
Truong N, Yuksel C, Watcharopas C, Levine JA, and Kirby RM
- Abstract
Robustly handling collisions between individual particles in a large particle-based simulation has been a challenging problem. We introduce particle merging-and-splitting, a simple scheme for robustly handling collisions between particles that prevents inter-penetrations of separate objects without introducing numerical instabilities. This scheme merges colliding particles at the beginning of the time-step and then splits them at the end of the time-step. Thus, collisions last for the duration of a time-step, allowing neighboring particles of the colliding particles to influence each other. We show that our merging-and-splitting method is effective in robustly handling collisions and avoiding penetrations in particle-based simulations. We also show how our merging-and-splitting approach can be used for coupling different simulation systems using different and otherwise incompatible integrators. We present simulation tests involving complex solid-fluid interactions, including solid fractures generated by fluid interactions.
- Published
- 2022
- Full Text
- View/download PDF
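A minimal sketch of the "merge" half of the merging-and-splitting idea in the preceding entry: two colliding particles are replaced, for the duration of the time step, by a single particle that conserves mass and linear momentum. The splitting rule applied at the end of the step (and any treatment of energy) follows the paper and is not reproduced here.

    # Merge two colliding particles into one mass- and momentum-conserving
    # particle for the duration of a time step (illustrative only).
    import numpy as np

    def merge(m1, x1, v1, m2, x2, v2):
        """Combine two particles into a single particle (mass-weighted)."""
        m = m1 + m2
        x = (m1 * x1 + m2 * x2) / m   # center of mass
        v = (m1 * v1 + m2 * v2) / m   # conserves linear momentum
        return m, x, v

    m, x, v = merge(1.0, np.array([0.0, 0.0]), np.array([1.0, 0.0]),
                    2.0, np.array([0.1, 0.0]), np.array([-0.5, 0.0]))
    print(m, x, v)   # 3.0, center of mass, momentum-conserving velocity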
6. Vector Field Decompositions Using Multiscale Poisson Kernel.
- Author
-
Bhatia H, Kirby RM, Pascucci V, and Bremer PT
- Abstract
Extraction of multiscale features using scale-space is one of the fundamental approaches to analyze scalar fields. However, similar techniques for vector fields are much less common, even though it is well known that, for example, turbulent flows contain cascades of nested vortices at different scales. The challenge is that the ideas related to scale-space are based upon iteratively smoothing the data to extract features at progressively larger scale, making it difficult to extract overlapping features. Instead, we consider spatial regions of influence in vector fields as scale, and introduce a new approach for the multiscale analysis of vector fields. Rather than smoothing the flow, we use the natural Helmholtz-Hodge decomposition to split it into small-scale and large-scale components using progressively larger neighborhoods. Our approach creates a natural separation of features by extracting local flow behavior, for example, a small vortex, from large-scale effects, for example, a background flow. We demonstrate our technique on large-scale, turbulent flows, and show multiscale features that cannot be extracted using state-of-the-art techniques.
- Published
- 2021
- Full Text
- View/download PDF
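For context, the Helmholtz-Hodge decomposition used in the preceding entry splits a vector field v into a curl-free part, a divergence-free part, and a harmonic remainder, with the potentials obtained from Poisson problems; schematically, in 3D and under suitable boundary conditions,

    v = \nabla \phi + \nabla \times \Psi + h, \qquad \Delta \phi = \nabla \cdot v, \qquad \Delta \Psi = -\nabla \times v.

The paper's multiscale variant localizes these Poisson solves to neighborhoods of progressively larger radius to separate small-scale from large-scale flow components; those details are specific to the paper and are not captured by the formula above.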
7. The Effect of Data Transformations on Scalar Field Topological Analysis of High-Order FEM Solutions.
- Author
-
Jallepalli A, Levine JA, and Kirby RM
- Abstract
High-order finite element methods (HO-FEM) are gaining popularity in the simulation community due to their success in solving complex flow dynamics. There is an increasing need to analyze the data produced as output by these simulations. Simultaneously, topological analysis tools are emerging as powerful methods for investigating simulation data. However, most of the current approaches to topological analysis have had limited application to HO-FEM simulation data for two reasons. First, the current topological tools are designed for linear data (polynomial degree one), but the polynomial degree of the data output by these simulations is typically higher (routinely up to polynomial degree six). Second, the simulation data and derived quantities of the simulation data have discontinuities at element boundaries, and these discontinuities do not match the input requirements for the topological tools. One solution to both issues is to transform the high-order data to achieve low-order, continuous inputs for topological analysis. Nevertheless, there has been little work evaluating the possible transformation choices and their downstream effect on the topological analysis. We perform an empirical study to evaluate two commonly used data transformation methodologies along with the recently introduced L-SIAC filter for processing high-order simulation data. Our results show diverse behaviors are possible. We offer some guidance about how best to consider a pipeline of topological analysis of HO-FEM simulations with the currently available implementations of topological analysis.
- Published
- 2020
- Full Text
- View/download PDF
8. Visualization in Meteorology-A Survey of Techniques and Tools for Data Analysis Tasks.
- Author
-
Rautenhaus M, Bottinger M, Siemen S, Hoffman R, Kirby RM, Mirzargar M, Rober N, and Westermann R
- Abstract
This article surveys the history and current state of the art of visualization in meteorology, focusing on visualization techniques and tools used for meteorological data analysis. We examine characteristics of meteorological data and analysis tasks, describe the development of computer graphics methods for visualization in meteorology from the 1960s to today, and visit the state of the art of visualization techniques and tools in operational weather forecasting and atmospheric research. We approach the topic from both the visualization and the meteorological side, showing visualization techniques commonly used in meteorological practice, and surveying recent studies in visualization research aimed at meteorological applications. Our overview covers visualization techniques from the fields of display design, 3D visualization, flow dynamics, feature-based visualization, comparative visualization and data fusion, uncertainty and ensemble visualization, interactive visual analysis, efficient rendering, and scalability and reproducibility. We discuss demands and challenges for visualization research targeting meteorological data analysis, highlighting aspects in demonstration of benefit, interactive visual analysis, seamless visualization, ensemble visualization, 3D visualization, and technical issues.
- Published
- 2018
- Full Text
- View/download PDF
9. Isolated metachronous breast metastasis from renal cell carcinoma: A report of two cases.
- Author
-
Tandon M, Panwar P, Kirby RM, Narayanan S, Soumian S, and Stephens M
- Subjects
- Aged, Aged, 80 and over, Breast pathology, Breast surgery, Breast Neoplasms pathology, Breast Neoplasms surgery, Carcinoma, Renal Cell pathology, Carcinoma, Renal Cell surgery, Female, Humans, Breast Neoplasms diagnosis, Breast Neoplasms secondary, Carcinoma, Renal Cell diagnosis, Carcinoma, Renal Cell secondary, Kidney Neoplasms pathology
- Abstract
Metastases to the breast are very uncommon compared with primary tumours, and the breast is an unusual site for metastasis from renal cell carcinoma; only occasional cases are reported in the literature. These metastases must be clearly diagnosed, as the treatment of primary breast cancer and that of metastases differ markedly. Treatment of isolated metastases from renal cell carcinoma is usually surgical resection. We report two cases of isolated metachronous metastases to the breast from renal cell carcinoma.
- Published
- 2018
- Full Text
- View/download PDF
10. On the Treatment of Field Quantities and Elemental Continuity in FEM Solutions.
- Author
-
Jallepalli A, Docampo-Sanchez J, Ryan JK, Haimes R, and Kirby RM
- Abstract
As the finite element method (FEM) and the finite volume method (FVM), both traditional and high-order variants, continue their proliferation into various applied engineering disciplines, it is important that the visualization techniques and corresponding data analysis tools that act on the results produced by these methods faithfully represent the underlying data. To state this in another way: the interpretation of data generated by simulation needs to be consistent with the numerical schemes that underpin the specific solver technology. As the verifiable visualization literature has demonstrated: visual artifacts produced by the introduction of either explicit or implicit data transformations, such as data resampling, can sometimes distort or even obfuscate key scientific features in the data. In this paper, we focus on the handling of elemental continuity, which is often only continuous or piecewise discontinuous, when visualizing primary or derived fields from FEM or FVM simulations. We demonstrate that traditional data handling and visualization of these fields introduce visual errors. In addition, we show how the use of the recently proposed line-SIAC filter provides a way of handling elemental continuity issues in an accuracy-conserving manner with the added benefit of casting the data in a smooth context even if the representation is element discontinuous.
- Published
- 2018
- Full Text
- View/download PDF
11. Association of One-Step Nucleic Acid Amplification Detected Micrometastases with Tumour Biology and Adjuvant Chemotherapy.
- Author
-
Goussous G, Jafferbhoy S, Smyth N, Hammond L, Narayanan S, Kirby RM, and Soumian S
- Abstract
One-step nucleic acid amplification (OSNA) is an intraoperative technique with a high sensitivity and specificity for sentinel node assessment. The aim of this study was to assess the impact of OSNA on micrometastases detection rates and use of adjuvant chemotherapy. A retrospective review of patients with sentinel node micrometastases over a five-year period was carried out and a comparison of micrometastases detection using OSNA and H&E techniques was made. Out of 1285 patients who underwent sentinel node (SLN) biopsy, 76 patients had micrometastases. Using H&E staining, 36 patients were detected with SLN micrometastases (9/year), in contrast to 40 patients in the OSNA year (40/year) (p < 0.0001), demonstrating a fourfold increase with the use of OSNA. In the OSNA group, there was also a proportional increase in Grade III, triple-negative, ER-negative, and HER-2-positive tumours being diagnosed with micrometastases. In addition, on the interactive PREDICT tool, the proportion of patients with a predicted 10-year survival benefit of more than 3% from adjuvant chemotherapy increased from 52 to 70 percent. OSNA has resulted in an increased detection rate of micrometastases, especially in patients with aggressive tumour biology. This increased the number of patients who had a predicted survival benefit from adjuvant chemotherapy.
- Published
- 2017
- Full Text
- View/download PDF
12. A Radial Basis Function (RBF)-Finite Difference (FD) Method for Diffusion and Reaction-Diffusion Equations on Surfaces.
- Author
-
Shankar V, Wright GB, Kirby RM, and Fogelson AL
- Abstract
In this paper, we present a method based on Radial Basis Function (RBF)-generated Finite Differences (FD) for numerically solving diffusion and reaction-diffusion equations (PDEs) on closed surfaces embedded in ℝ^d. Our method uses a method-of-lines formulation, in which surface derivatives that appear in the PDEs are approximated locally using RBF interpolation. The method requires only scattered nodes representing the surface and normal vectors at those scattered nodes. All computations use only extrinsic coordinates, thereby avoiding coordinate distortions and singularities. We also present an optimization procedure that allows for the stabilization of the discrete differential operators generated by our RBF-FD method by selecting shape parameters for each stencil that correspond to a global target condition number. We show the convergence of our method on two surfaces for different stencil sizes, and present applications to nonlinear PDEs simulated both on implicit/parametric surfaces and more general surfaces represented by point clouds.
- Published
- 2016
- Full Text
- View/download PDF
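The sketch below illustrates the basic RBF-FD weight computation underlying the method in the preceding entry, reduced to a planar Laplacian with a Gaussian RBF: weights w are chosen so that a weighted sum of nearby nodal values approximates the differential operator at a stencil center. The paper's actual method operates on embedded surfaces, uses surface normals, and selects shape parameters per stencil to meet a target condition number; none of that is reproduced here.

    # Illustrative RBF-FD sketch (planar, Gaussian RBF): compute stencil weights
    # so that sum_j w_j * u(x_j) approximates the Laplacian of u at the center.
    import numpy as np

    def rbf_fd_laplacian_weights(nodes, center, eps=2.0):
        r = np.linalg.norm(nodes[:, None, :] - nodes[None, :, :], axis=-1)
        A = np.exp(-(eps * r) ** 2)                   # RBF interpolation matrix
        rc = np.linalg.norm(center - nodes, axis=-1)
        b = (4 * eps**4 * rc**2 - 4 * eps**2) * np.exp(-(eps * rc) ** 2)  # Laplacian of each RBF at the center
        return np.linalg.solve(A, b)

    nodes = np.array([[0.0, 0.0], [0.1, 0.0], [-0.1, 0.0], [0.0, 0.1], [0.0, -0.1]])
    w = rbf_fd_laplacian_weights(nodes, nodes[0])
    u = nodes[:, 0] ** 2 + nodes[:, 1] ** 2           # u = x^2 + y^2, exact Laplacian = 4
    print(w @ u)   # roughly 4; accuracy improves with more nodes or polynomial augmentation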
13. Evaluating Shape Alignment via Ensemble Visualization.
- Author
-
Raj M, Mirzargar M, Preston JS, Kirby RM, and Whitaker RT
- Subjects
- Imaging, Three-Dimensional methods
- Abstract
The visualization of variability in surfaces embedded in 3D, which is a type of ensemble uncertainty visualization, provides a means of understanding the underlying distribution of a collection or ensemble of surfaces. This work extends the contour boxplot technique to 3D and evaluates it against an enumeration-style visualization of the ensemble members and other conventional visualizations used by atlas builders. The authors demonstrate the efficacy of using the 3D contour boxplot ensemble visualization technique to analyze shape alignment and variability in atlas construction and analysis as a real-world application.
- Published
- 2016
- Full Text
- View/download PDF
14. Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.
- Author
-
Mirzargar M, Whitaker RT, and Kirby RM
- Subjects
- Algorithms, Hydrodynamics, Meteorology, Neuroimaging, Computer Graphics, Image Processing, Computer-Assisted methods, Models, Statistical
- Abstract
In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.
- Published
- 2014
- Full Text
- View/download PDF
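The curve boxplot in the preceding entry builds on a data-depth (center-outward) ordering of ensemble members. The sketch below computes a plain simplified band depth for sampled 1D curves to illustrate such an ordering; it is not the paper's exact depth definition.

    # Simplified band depth (j = 2) for an ensemble of sampled 1D curves,
    # giving a center-outward ranking like that used by curve boxplots.
    import numpy as np
    from itertools import combinations

    def band_depth(curves):
        """curves: array of shape (n_members, n_samples). Returns a depth per member."""
        n = curves.shape[0]
        depth = np.zeros(n)
        for a, b in combinations(range(n), 2):
            lo = np.minimum(curves[a], curves[b])
            hi = np.maximum(curves[a], curves[b])
            inside = np.all((curves >= lo) & (curves <= hi), axis=1)  # member lies in the band everywhere
            depth += inside
        return depth / (n * (n - 1) / 2)

    t = np.linspace(0, 1, 50)
    ensemble = np.array([np.sin(2 * np.pi * t) + 0.1 * k for k in range(-3, 4)])
    d = band_depth(ensemble)
    print(np.argsort(-d))   # most central (median-like) curves first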
15. From h to p efficiently: optimal implementation strategies for explicit time-dependent problems using the spectral/hp element method.
- Author
-
Bolis A, Cantwell CD, Kirby RM, and Sherwin SJ
- Abstract
We investigate the relative performance of a second-order Adams-Bashforth scheme and second-order and fourth-order Runge-Kutta schemes when time stepping a 2D linear advection problem discretised using a spectral/hp element technique for a range of different mesh sizes and polynomial orders. Numerical experiments explore the effects of short (two wavelengths) and long (32 wavelengths) time integration for sets of uniform and non-uniform meshes. The choice of time-integration scheme and discretisation together fixes a CFL limit that imposes a restriction on the maximum time step which can be taken to ensure numerical stability. The number of steps, together with the order of the scheme, affects not only the runtime but also the accuracy of the solution. Through numerical experiments, we systematically highlight the relative effects of spatial resolution and choice of time integration on performance and provide general guidelines on how best to achieve the minimal execution time in order to obtain a prescribed solution accuracy. The significant role played by higher polynomial orders in reducing CPU time while preserving accuracy becomes more evident, especially for uniform meshes, compared with what has been typically considered when studying this type of problem., (© 2014. The Authors. International Journal for Numerical Methods in Fluids published by John Wiley & Sons, Ltd.)
- Published
- 2014
- Full Text
- View/download PDF
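For reference, the second-order Adams-Bashforth scheme compared in the preceding entry advances u_t = f(u) via

    u^{n+1} = u^n + \Delta t \left( \tfrac{3}{2} f(u^n) - \tfrac{1}{2} f(u^{n-1}) \right),

and for an advection problem the stability (CFL) restriction ties the admissible time step to the mesh spacing h, the polynomial order P, and the advection speed a, schematically \Delta t \le C\, h / (a P^2). The constant and the exact scaling depend on the scheme and discretisation; quantifying their combined effect on cost versus accuracy is what the paper's experiments address.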
16. How short is too short for the interactions of a water potential? Exploring the parameter space of a coarse-grained water model using uncertainty quantification.
- Author
-
Jacobson LC, Kirby RM, and Molinero V
- Abstract
Coarse-grained models are becoming increasingly popular due to their ability to access time and length scales that are prohibitively expensive with atomistic models. However, as a result of decreasing the degrees of freedom, coarse-grained models often have diminished accuracy, representability, and transferability compared with their finer grained counterparts. Uncertainty quantification (UQ) can help alleviate this challenge by providing an efficient and accurate method to evaluate the effect of model parameters on the properties of the system. This method is useful in finding parameter sets that fit the model to several experimental properties simultaneously. In this work we use UQ as a tool for the evaluation and optimization of a coarse-grained model. We efficiently sample the five-dimensional parameter space of the coarse-grained monatomic water (mW) model to determine what parameter sets best reproduce experimental thermodynamic, structural and dynamical properties of water. Generalized polynomial chaos (gPC) was used to reconstruct the analytical surfaces of density, enthalpy of vaporization, radial and angular distribution functions, and diffusivity of liquid water as a function of the input parameters. With these surfaces, we evaluated the sensitivity of these properties to perturbations of the model input parameters and the accuracy and representability of the coarse-grained models. In particular, we investigated what is the optimum length scale of the water-water interactions needed to reproduce the properties of liquid water with a monatomic model with two- and three-body interactions. We found that there is an optimum cutoff length of 4.3 Å, barely longer than the size of the first neighbor shell in water. As cutoffs deviate from this optimum value, the ability of the model to simultaneously reproduce the structure and thermodynamics is severely diminished.
- Published
- 2014
- Full Text
- View/download PDF
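For reference, a generalized polynomial chaos (gPC) surrogate of the kind used in the preceding entry represents a simulated property Q as a polynomial expansion in the (rescaled) model parameters ξ,

    Q(\xi) \approx \sum_{k=0}^{N} c_k \, \Psi_k(\xi), \qquad c_k = \frac{\langle Q \, \Psi_k \rangle}{\langle \Psi_k^2 \rangle},

where the Ψ_k are polynomials orthogonal with respect to the assumed parameter distribution (e.g., Legendre polynomials for uniform parameter ranges) and the coefficients are estimated from sampled model evaluations. The specific five-dimensional parameterization of the mW model, the sampling design, and the reconstructed property surfaces follow the paper.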
17. Architecting the Finite Element Method Pipeline for the GPU.
- Author
-
Fu Z, Lewis TJ, Kirby RM, and Whitaker RT
- Abstract
The finite element method (FEM) is a widely employed numerical technique for approximating the solution of partial differential equations (PDEs) in various science and engineering applications. Many of these applications benefit from fast execution of the FEM pipeline. One way to accelerate the FEM pipeline is by exploiting advances in modern computational hardware, such as the many-core streaming processors like the graphical processing unit (GPU). In this paper, we present the algorithms and data-structures necessary to move the entire FEM pipeline to the GPU. First we propose an efficient GPU-based algorithm to generate local element information and to assemble the global linear system associated with the FEM discretization of an elliptic PDE. To solve the corresponding linear system efficiently on the GPU, we implement a conjugate gradient method preconditioned with a geometry-informed algebraic multi-grid (AMG) method preconditioner. We propose a new fine-grained parallelism strategy, a corresponding multigrid cycling stage and efficient data mapping to the many-core architecture of GPU. Comparison of our on-GPU assembly versus a traditional serial implementation on the CPU achieves up to an 87 × speedup. Focusing on the linear system solver alone, we achieve a speedup of up to 51 × versus use of a comparable state-of-the-art serial CPU linear system solver. Furthermore, the method compares favorably with other GPU-based, sparse, linear solvers.
- Published
- 2014
- Full Text
- View/download PDF
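The linear-system stage of the FEM pipeline in the preceding entry uses a preconditioned conjugate gradient (PCG) solver, there preconditioned with a GPU algebraic multigrid method. The sketch below shows a plain PCG iteration with a simple diagonal (Jacobi) preconditioner on a stand-in 1D Poisson matrix; it illustrates the solver structure only, not the paper's GPU data layout or AMG preconditioner.

    # Preconditioned conjugate gradient (PCG) sketch with a Jacobi preconditioner.
    import numpy as np
    import scipy.sparse as sp

    def pcg(A, b, M_inv, tol=1e-8, max_iter=500):
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv(r)
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_inv(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # 1D Poisson stiffness matrix as a stand-in for an assembled FEM system.
    n = 100
    A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
    b = np.ones(n)
    diag = A.diagonal()
    x = pcg(A, b, lambda r: r / diag)
    print(np.linalg.norm(A @ x - b))   # residual norm, small after convergence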
18. High-order spectral/hp element discretisation for reaction-diffusion problems on surfaces: Application to cardiac electrophysiology.
- Author
-
Cantwell CD, Yakovlev S, Kirby RM, Peters NS, and Sherwin SJ
- Abstract
We present a numerical discretisation of an embedded two-dimensional manifold using high-order continuous Galerkin spectral/hp elements, which provide exponential convergence of the solution with increasing polynomial order, while retaining geometric flexibility in the representation of the domain. Our work is motivated by applications in cardiac electrophysiology where sharp gradients in the solution benefit from the high-order discretisation, while the computational cost of anatomically-realistic models can be significantly reduced through the surface representation and use of high-order methods. We describe and validate our discretisation and provide a demonstration of its application to modelling electrochemical propagation across a human left atrium.
- Published
- 2014
- Full Text
- View/download PDF
19. Verifying volume rendering using discretization error analysis.
- Author
-
Etiene T, Jönsson D, Ropinski T, Scheidegger C, Comba JL, Nonato LG, Kirby RM, Ynnerman A, and Silva CT
- Abstract
We propose an approach for verification of volume rendering correctness based on an analysis of the volume rendering integral, the basis of most DVR algorithms. With respect to the most common discretization of this continuous model (Riemann summation), we make assumptions about the impact of parameter changes on the rendered results and derive convergence curves describing the expected behavior. Specifically, we progressively refine the number of samples along the ray, the grid size, and the pixel size, and evaluate how the errors observed during refinement compare against the expected approximation errors. We derive the theoretical foundations of our verification approach, explain how to realize it in practice, and discuss its limitations. We also report the errors identified by our approach when applied to two publicly available volume rendering packages.
- Published
- 2014
- Full Text
- View/download PDF
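For reference, the volume rendering integral analyzed in the preceding entry and its common Riemann-sum (front-to-back compositing) discretization can be written, in simplified emission-absorption form, as

    I = \int_0^{D} c(t)\, \tau(t)\, \exp\!\Big(-\int_0^{t} \tau(u)\, du\Big)\, dt \;\approx\; \sum_{i} c_i\, \alpha_i \prod_{j<i} (1 - \alpha_j), \qquad \alpha_i = 1 - e^{-\tau_i \Delta t},

where c is the transfer-function-mapped color, τ the extinction coefficient along the ray, and Δt the sampling step. The verification procedure in the paper refines Δt, the grid size, and the pixel size, and compares the observed error decay against the convergence rates expected from this discretization.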
20. GPU-based volume visualization from high-order finite element fields.
- Author
-
Nelson B, Kirby RM, and Haimes R
- Abstract
This paper describes a new volume rendering system for spectral/hp finite-element methods that has as its goal to be both accurate and interactive. Even though high-order finite element methods are commonly used by scientists and engineers, there are few visualization methods designed to display this data directly. Consequently, visualizations of high-order data are generally created by first sampling the high-order field onto a regular grid and then generating the visualization via traditional methods based on linear interpolation. This approach, however, introduces error into the visualization pipeline and requires the user to balance image quality, interactivity, and resource consumption. We first show that evaluation of the volume rendering integral, when applied to the composition of piecewise-smooth transfer functions with the high-order scalar field, typically exhibits second-order convergence for a wide range of high-order quadrature schemes, and has worst case first-order convergence. This result provides bounds on the ability to achieve high-order convergence to the volume rendering integral. We then develop an algorithm for optimized evaluation of the volume rendering integral, based on the categorization of each ray according to the local behavior of the field and transfer function. We demonstrate the effectiveness of our system by running performance benchmarks on several high-order fluid-flow simulations.
- Published
- 2014
- Full Text
- View/download PDF
21. Contour boxplots: a method for characterizing uncertainty in feature sets from simulation ensembles.
- Author
-
Whitaker RT, Mirzargar M, and Kirby RM
- Subjects
- Computer Simulation, Algorithms, Computer Graphics, Data Interpretation, Statistical, Models, Statistical, User-Computer Interface
- Abstract
Ensembles of numerical simulations are used in a variety of applications, such as meteorology or computational solid mechanics, in order to quantify the uncertainty or possible error in a model or simulation. Deriving robust statistics and visualizing the variability of an ensemble is a challenging task and is usually accomplished through direct visualization of ensemble members or by providing aggregate representations such as an average or pointwise probabilities. In many cases, the interesting quantities in a simulation are not dense fields, but are sets of features that are often represented as thresholds on physical or derived quantities. In this paper, we introduce a generalization of boxplots, called contour boxplots, for visualization and exploration of ensembles of contours or level sets of functions. Conventional boxplots have been widely used as an exploratory or communicative tool for data analysis, and they typically show the median, mean, confidence intervals, and outliers of a population. The proposed contour boxplots are a generalization of functional boxplots, which build on the notion of data depth. Data depth approximates the extent to which a particular sample is centrally located within its density function. This produces a center-outward ordering that gives rise to the statistical quantities that are essential to boxplots. Here we present a generalization of functional data depth to contours and demonstrate methods for displaying the resulting boxplots for two-dimensional simulation data in weather forecasting and computational fluid dynamics.
- Published
- 2013
- Full Text
- View/download PDF
22. Visualization collaborations: what works and why.
- Author
-
Kirby RM and Meyer M
- Abstract
For over 25 years, the visualization community has grown and evolved as a function of collaboration with other areas. It's now commonplace for visualization scientists to engage with other researchers in scientific teams. Commonplace, however, doesn't mean easy. Two visualization researchers' years of experience have led to a set of observations and recommendations on what works (and what doesn't) and why in visualization collaborations. These insights can help guide the visualization community as it moves forward.
- Published
- 2013
- Full Text
- View/download PDF
23. Inverse Electrocardiographic Source Localization of Ischemia: An Optimization Framework and Finite Element Solution.
- Author
-
Wang D, Kirby RM, Macleod RS, and Johnson CR
- Abstract
With the goal of non-invasively localizing cardiac ischemic disease using body-surface potential recordings, we attempted to reconstruct the transmembrane potential (TMP) throughout the myocardium with the bidomain heart model. The task is an inverse source problem governed by partial differential equations (PDE). Our main contribution is solving the inverse problem within a PDE-constrained optimization framework that enables various physically-based constraints in both equality and inequality forms. We formulated the optimality conditions rigorously in the continuum before deriving finite element discretization, thereby making the optimization independent of discretization choice. Such a formulation was derived for the L2-norm Tikhonov regularization and the total variation minimization. The subsequent numerical optimization was fulfilled by a primal-dual interior-point method tailored to our problem's specific structure. Our simulations used realistic, fiber-included heart models consisting of up to 18,000 nodes, much finer than any inverse models previously reported. With synthetic ischemia data we localized ischemic regions with roughly a 10% false-negative rate or a 20% false-positive rate under conditions up to 5% input noise. With ischemia data measured from animal experiments, we reconstructed TMPs with roughly 0.9 correlation with the ground truth. While precisely estimating the TMP in general cases remains an open problem, our study shows the feasibility of reconstructing TMP during the ST interval as a means of ischemia localization.
- Published
- 2013
- Full Text
- View/download PDF
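Schematically, the PDE-constrained formulation in the preceding entry seeks the transmembrane potential field u minimizing a data-misfit-plus-regularization objective of the form

    \min_{u} \; \| y - F(u) \|_2^2 + \lambda \, \| L u \|_2^2 \quad \text{subject to the forward (bidomain/torso) PDE and inequality constraints on } u,

where y are the body-surface measurements, F is the forward map implied by the PDE constraint, and L is the identity for L2-norm Tikhonov regularization; the total variation variant replaces the quadratic penalty with an integral of |∇u|. The precise constraint set, discretization, and the primal-dual interior-point solver follow the paper.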
24. A study of different modeling choices for simulating platelets within the immersed boundary method.
- Author
-
Shankar V, Wright GB, Fogelson AL, and Kirby RM
- Abstract
The Immersed Boundary (IB) method is a widely-used numerical methodology for the simulation of fluid-structure interaction problems. The IB method utilizes an Eulerian discretization for the fluid equations of motion while maintaining a Lagrangian representation of structural objects. Operators are defined for transmitting information (forces and velocities) between these two representations. Most IB simulations represent their structures with piecewise linear approximations and utilize Hookean spring models to approximate structural forces. Our specific motivation is the modeling of platelets in hemodynamic flows. In this paper, we study two alternative representations - radial basis functions (RBFs) and Fourier-based (trigonometric polynomials and spherical harmonics) representations - for the modeling of platelets in two and three dimensions within the IB framework, and compare our results with the traditional piecewise linear approximation methodology. For different representative shapes, we examine the geometric modeling errors (position and normal vectors), force computation errors, and computational cost and provide an engineering trade-off strategy for when and why one might select to employ these different representations.
- Published
- 2013
- Full Text
- View/download PDF
25. A FAST ITERATIVE METHOD FOR SOLVING THE EIKONAL EQUATION ON TETRAHEDRAL DOMAINS.
- Author
-
Fu Z, Kirby RM, and Whitaker RT
- Abstract
Generating numerical solutions to the eikonal equation and its many variations has a broad range of applications in both the natural and computational sciences. Efficient solvers on cutting-edge, parallel architectures require new algorithms that may not be theoretically optimal, but that are designed to allow asynchronous solution updates and have limited memory access patterns. This paper presents a parallel algorithm for solving the eikonal equation on fully unstructured tetrahedral meshes. The method is appropriate for the type of fine-grained parallelism found on modern massively-SIMD architectures such as graphics processors and takes into account the particular constraints and capabilities of these computing platforms. This work builds on previous work for solving these equations on triangle meshes; in this paper we adapt and extend previous two-dimensional strategies to accommodate three-dimensional, unstructured, tetrahedralized domains. These new developments include a local update strategy with data compaction for tetrahedral meshes that provides solutions on both serial and parallel architectures, with a generalization to inhomogeneous, anisotropic speed functions. We also propose two new update schemes, specialized to mitigate the natural data increase observed when moving to three dimensions, and the data structures necessary for efficiently mapping data to parallel SIMD processors in a way that maintains computational density. Finally, we present descriptions of the implementations for a single CPU, as well as multicore CPUs with shared memory and SIMD architectures, with comparative results against state-of-the-art eikonal solvers.
- Published
- 2013
- Full Text
- View/download PDF
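The sketch below illustrates the active-list idea behind fast-iterative-method (FIM) style eikonal solvers, reduced to a 2D regular grid with unit speed. The preceding entry's contributions (unstructured tetrahedral meshes, anisotropic speed functions, data compaction, and SIMD-friendly layouts) are not reproduced here.

    # Simplified FIM-style eikonal solver on a 2D regular grid with unit speed.
    import numpy as np

    def local_update(u, i, j, h=1.0, f=1.0):
        a = min(u[i - 1, j], u[i + 1, j])
        b = min(u[i, j - 1], u[i, j + 1])
        if abs(a - b) >= h / f:
            return min(a, b) + h / f
        return 0.5 * (a + b + np.sqrt(2 * (h / f) ** 2 - (a - b) ** 2))

    def fim(n=64, source=(32, 32), tol=1e-9):
        u = np.full((n + 2, n + 2), np.inf)   # padded border stays at +inf
        u[source] = 0.0
        active = {(source[0] + di, source[1] + dj)
                  for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]}
        while active:
            nxt = set()
            for (i, j) in active:
                if not (1 <= i <= n and 1 <= j <= n):
                    continue
                new = local_update(u, i, j)
                if new < u[i, j] - tol:           # value improved: keep node active, wake neighbors
                    u[i, j] = new
                    nxt.update([(i, j), (i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)])
            active = nxt
        return u[1:-1, 1:-1]

    u = fim()
    print(u[31, 41])   # ~10.0, the distance 10 cells from the source along a grid axis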
26. Influence of tumor histology on preoperative staging accuracy of breast metastases to the axilla.
- Author
-
Hackney L, Williams S, Bajwa S, Morley-Davies AJ, Kirby RM, and Britton I
- Subjects
- Adult, Aged, Aged, 80 and over, Axilla, Biopsy, Needle, Breast Neoplasms surgery, Carcinoma, Ductal, Breast surgery, Carcinoma, Lobular surgery, Female, Humans, Lymph Nodes surgery, Lymphatic Metastasis, Medical Audit, Middle Aged, Neoplasm Micrometastasis diagnostic imaging, Neoplasm Micrometastasis pathology, Neoplasm Staging, Predictive Value of Tests, Preoperative Care, Prospective Studies, Ultrasonography, Breast Neoplasms pathology, Carcinoma, Ductal, Breast secondary, Carcinoma, Lobular secondary, Lymph Nodes diagnostic imaging, Lymph Nodes pathology
- Abstract
Histologic confirmation of axillary nodal metastases preoperatively avoids a sentinel node biopsy and enables a one-step surgical procedure. The aim of this study was to establish the local positive predictive value of axillary ultrasound (AUS) and guided needle core biopsy (NCB) in axillary staging of breast cancer, and to identify factors influencing yield. A prospective audit of 142 consecutive patients (screening and symptomatic) presenting from 1st December 2008 to 31st May 2009 with breast lesions categorized R4-R5, who underwent a preoperative AUS and proceeded to surgery, was undertaken. Ultrasound-guided NCB was performed on nodes radiologically classified R3-R5. Lymph node size, number, and morphological features were documented. Yield was correlated with tumor size, grade, and histologic type. AUS/NCB findings were correlated with postsurgical pathologic findings to determine the sensitivity, specificity, and positive and negative predictive values of AUS and NCB. A total of 142 patients underwent surgery, of whom 52 (37%) had lymph node metastases on histology. All had a preoperative AUS; 51 (36%) had abnormal ultrasound findings, and 46 (90%) of these underwent axillary node NCB, of which 24 (52%) were positive. The smallest tumor size associated with positive nodes at surgery was 11.5 mm. The sensitivity of AUS was 65% and specificity was 81%, with a positive predictive value (PPV) of 67% and a negative predictive value (NPV) of 80%. Sensitivity of ultrasound-guided NCB was 75%, with a specificity of 100%, PPV of 100% and NPV of 64%. Sensitivity of AUS for lobular carcinoma was 36% versus 76% for all other histologies; sensitivity of NCB for lobular cancer was 33% versus 79% for all other histologies. The most significant factor producing discordance between preoperative AUS and definitive histologic evidence of lymph node metastasis was tumor type. Accurate preoperative lymph node staging was prejudiced by lobular histology (p < 0.0019)., (© 2012 Wiley Periodicals, Inc.)
- Published
- 2013
- Full Text
- View/download PDF
27. Epileptic seizure abolition with aromatase inhibition.
- Author
-
Chan KW, Kalra S, Kirby RM, Brunt AM, and Hawkins CP
- Subjects
- Antineoplastic Agents, Hormonal therapeutic use, Breast Neoplasms complications, Electroencephalography, Female, Humans, Magnetic Resonance Imaging, Middle Aged, Multiple Sclerosis complications, Tamoxifen therapeutic use, Tomography, X-Ray Computed, Anticonvulsants, Aromatase Inhibitors therapeutic use, Seizures drug therapy
- Published
- 2012
- Full Text
- View/download PDF
28. ElVis: A System for the Accurate and Interactive Visualization of High-Order Finite Element Solutions.
- Author
-
Nelson B, Liu E, Kirby RM, and Haimes R
- Abstract
This paper presents the Element Visualizer (ElVis), a new, open-source scientific visualization system for use with high-order finite element solutions to PDEs in three dimensions. This system is designed to minimize visualization errors of these types of fields by querying the underlying finite element basis functions (e.g., high-order polynomials) directly, leading to pixel-exact representations of solutions and geometry. The system interacts with simulation data through runtime plugins, which only require users to implement a handful of operations fundamental to finite element solvers. The data in turn can be visualized through the use of cut surfaces, contours, isosurfaces, and volume rendering. These visualization algorithms are implemented using NVIDIA's OptiX GPU-based ray-tracing engine, which provides accelerated ray traversal of the high-order geometry, and CUDA, which allows for effective parallel evaluation of the visualization algorithms. The direct interface between ElVis and the underlying data differentiates it from existing visualization tools. Current tools assume the underlying data is composed of linear primitives; high-order data must be interpolated with linear functions as a result. In this work, examples drawn from aerodynamic simulations-high-order discontinuous Galerkin finite element solutions of aerodynamic flows in particular-will demonstrate the superiority of ElVis' pixel-exact approach when compared with traditional linear-interpolation methods. Such methods can introduce a number of inaccuracies in the resulting visualization, making it unclear if visual artifacts are genuine to the solution data or if these artifacts are the result of interpolation errors. Linear methods additionally cannot properly visualize curved geometries (elements or boundaries) which can greatly inhibit developers' debugging efforts. As we will show, pixel-exact visualization exhibits none of these issues, removing the visualization scheme as a source of uncertainty for engineers using ElVis.
- Published
- 2012
- Full Text
- View/download PDF
29. A rare cystic breast lump - schwannoma of the breast.
- Author
-
Casey P, Stephens M, and Kirby RM
- Subjects
- Aged, Breast Neoplasms diagnostic imaging, Breast Neoplasms pathology, Female, Humans, Mammography, Mastectomy, Segmental, Neurilemmoma diagnostic imaging, Neurilemmoma pathology, Ultrasonography, Breast Neoplasms diagnosis, Neurilemmoma diagnosis
- Published
- 2012
- Full Text
- View/download PDF
30. Topology verification for isosurface extraction.
- Author
-
Etiene T, Nonato LG, Scheidegger C, Tierny J, Peters TJ, Pascucci V, Kirby RM, and Silva CT
- Abstract
The broad goals of verifiable visualization rely on correct algorithmic implementations. We extend a framework for verification of isosurfacing implementations to check topological properties. Specifically, we use stratified Morse theory and digital topology to design algorithms which verify topological invariants. Our extended framework reveals unexpected behavior and coding mistakes in popular publicly available isosurface codes.
- Published
- 2012
- Full Text
- View/download PDF
31. Direct isosurface visualization of hex-based high-order geometry and attribute representations.
- Author
-
Martin T, Cohen E, and Kirby RM
- Abstract
In this paper, we present a novel isosurface visualization technique that guarantees the accurate visualization of isosurfaces with complex attribute data defined on (un)structured (curvi)linear hexahedral grids. Isosurfaces of high-order hexahedral-based finite element solutions on both uniform grids (including MRI and CT scans) and more complex geometry representing a domain of interest can be rendered using our algorithm. Additionally, our technique can be used to directly visualize solutions and attributes in isogeometric analysis, an area based on trivariate high-order NURBS (Non-Uniform Rational B-splines) geometry and attribute representations for the analysis. Furthermore, our technique can be used to visualize isosurfaces of algebraic functions. Our approach combines subdivision and numerical root finding to form a robust and efficient isosurface visualization algorithm that does not miss surface features, while finding all intersections between a view frustum and desired isosurfaces. This allows the use of view-independent transparency in the rendering process. We demonstrate our technique through a straightforward CPU implementation on both complex-structured and complex-unstructured geometries with high-order simulation solutions, isosurfaces of medical data sets, and isosurfaces of algebraic functions.
- Published
- 2012
- Full Text
- View/download PDF
32. INTERACTIVE VISUALIZATION OF PROBABILITY AND CUMULATIVE DENSITY FUNCTIONS.
- Author
-
Potter K, Kirby RM, Xiu D, and Johnson CR
- Abstract
The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.
- Published
- 2012
- Full Text
- View/download PDF
33. GPU-based interactive cut-surface extraction from high-order finite element fields.
- Author
-
Nelson B, Haimes R, and Kirby RM
- Abstract
We present a GPU-based ray-tracing system for the accurate and interactive visualization of cut-surfaces through 3D simulations of physical processes created from spectral/hp high-order finite element methods. When used by the numerical analyst to debug the solver, the ability for the imagery to precisely reflect the data is critical. In practice, the investigator interactively selects from a palette of visualization tools to construct a scene that can answer a query of the data. This is effective as long as the implicit contract of image quality between the individual and the visualization system is upheld. OpenGL rendering of scientific visualizations has worked remarkably well for exploratory visualization for most solver results. This is due to the consistency between the use of first-order representations in the simulation and the linear assumptions inherent in OpenGL (planar fragments and color-space interpolation). Unfortunately, the contract is broken when the solver discretization is of higher-order. There have been attempts to mitigate this through the use of spatial adaptation and/or texture mapping. These methods do a better job of approximating what the imagery should be but are not exact and tend to be view-dependent. This paper introduces new rendering mechanisms that specifically deal with the kinds of native data generated by high-order finite element solvers. The exploratory visualization tools are reassessed and cast in this system with the focus on image accuracy. This is accomplished in a GPU setting to ensure interactivity., (© 2011 IEEE)
- Published
- 2011
- Full Text
- View/download PDF
34. Cardiac position sensitivity study in the electrocardiographic forward problem using stochastic collocation and boundary element methods.
- Author
-
Swenson DJ, Geneser SE, Stinstra JG, Kirby RM, and MacLeod RS
- Subjects
- Adult, Computer Simulation, Female, Humans, Male, Myocardial Ischemia diagnosis, Posture physiology, Sensitivity and Specificity, Stochastic Processes, Electrocardiography methods, Finite Element Analysis, Heart physiology
- Abstract
The electrocardiogram (ECG) is ubiquitously employed as a diagnostic and monitoring tool for patients experiencing cardiac distress and/or disease. It is widely known that changes in heart position resulting from, for example, posture of the patient (sitting, standing, lying) and respiration significantly affect the body-surface potentials; however, few studies have quantitatively and systematically evaluated the effects of heart displacement on the ECG. The goal of this study was to evaluate the impact of positional changes of the heart on the ECG in the specific clinical setting of myocardial ischemia. To carry out the necessary comprehensive sensitivity analysis, we applied a relatively novel and highly efficient statistical approach, the generalized polynomial chaos-stochastic collocation method, to a boundary element formulation of the electrocardiographic forward problem, and we drove these simulations with measured epicardial potentials from whole-heart experiments. Results of the analysis identified regions on the body-surface where the potentials were especially sensitive to realistic heart motion. The standard deviation (STD) of ST-segment voltage changes caused by the apex of a normal heart, swinging forward and backward or side-to-side was approximately 0.2 mV. Variations were even larger, 0.3 mV, for a heart exhibiting elevated ischemic potentials. These variations could be large enough to mask or to mimic signs of ischemia in the ECG. Our results suggest possible modifications to ECG protocols that could reduce the diagnostic error related to postural changes in patients possibly suffering from myocardial ischemia.
- Published
- 2011
- Full Text
- View/download PDF
35. Quantifying variability in radiation dose due to respiratory-induced tumor motion.
- Author
-
Geneser SE, Hinkle JD, Kirby RM, Wang B, Salter B, and Joshi S
- Subjects
- Humans, Motion, Radiation Dosage, Radiographic Image Interpretation, Computer-Assisted methods, Radiotherapy, Computer-Assisted methods, Reproducibility of Results, Sensitivity and Specificity, Body Burden, Lung Neoplasms diagnostic imaging, Lung Neoplasms radiotherapy, Radiation Protection methods, Respiratory Mechanics, Respiratory-Gated Imaging Techniques methods, Tomography, X-Ray Computed methods
- Abstract
State of the art radiation treatment methods such as hypo-fractionated stereotactic body radiation therapy (SBRT) can successfully destroy tumor cells and avoid damaging healthy tissue by delivering high-level radiation dose that precisely conforms to the tumor shape. Though these methods work well for stationary tumors, SBRT dose delivery is particularly susceptible to organ motion, and few techniques capable of resolving and compensating for respiratory-induced organ motion have reached clinical practice. The current treatment pipeline cannot accurately predict nor account for respiratory-induced motion in the abdomen that may result in significant displacement of target lesions during the breathing cycle. Sensitivity of dose deposition to respiratory-induced organ motion represents a significant challenge and may account for observed discrepancies between predictive treatment plan indicators and clinical patient outcomes. Improved treatment-planning and delivery of SBRT requires an accurate prediction of dose deposition uncertainties resulting from respiratory motion. To accomplish this goal, we developed a framework that models both organ displacement in response to respiration and the underlying random variations in patient-specific breathing patterns. Our organ deformation model is a four-dimensional maximum a posteriori (MAP) estimation of tissue deformation as a function of chest wall amplitudes computed from clinically obtained respiratory-correlated computed tomography (RCCT) images. We characterize patient-specific respiration as the probability density function (PDF) of chest wall amplitudes and model patient breathing patterns as a random process. We then combine the patient-specific organ motion and stochastic breathing models to calculate the resulting variability in radiation dose accumulation. This process allows us to predict uncertainties in dose delivery in the presence of organ motion and identify tissues at risk of receiving insufficient or harmful levels of radiation., (Copyright © 2010 Elsevier B.V. All rights reserved.)
- Published
- 2011
- Full Text
- View/download PDF
36. Finite-element-based discretization and regularization strategies for 3-D inverse electrocardiography.
- Author
-
Wang D, Kirby RM, and Johnson CR
- Subjects
- Animals, Computer Simulation, Dogs, Heart physiology, Humans, Phantoms, Imaging, Algorithms, Body Surface Potential Mapping methods, Finite Element Analysis, Models, Cardiovascular, Signal Processing, Computer-Assisted
- Abstract
We consider the inverse electrocardiographic problem of computing epicardial potentials from a body-surface potential map. We study how to improve numerical approximation of the inverse problem when the finite-element method is used. Being ill-posed, the inverse problem requires different discretization strategies from its corresponding forward problem. We propose refinement guidelines that specifically address the ill-posedness of the problem. The resulting guidelines necessitate the use of hybrid finite elements composed of tetrahedra and prism elements. Also, in order to maintain consistent numerical quality when the inverse problem is discretized into different scales, we propose a new family of regularizers using the variational principle underlying finite-element methods. These variational-formed regularizers serve as an alternative to the traditional Tikhonov regularizers, but preserves the L(2) norm and thereby achieves consistent regularization in multiscale simulations. The variational formulation also enables a simple construction of the discrete gradient operator over irregular meshes, which is difficult to define in traditional discretization schemes. We validated our hybrid element technique and the variational regularizers by simulations on a realistic 3-D torso/heart model with empirical heart data. Results show that discretization based on our proposed strategies mitigates the ill-conditioning and improves the inverse solution, and that the variational formulation may benefit a broader range of potential-based bioelectric problems.
- Published
- 2011
- Full Text
- View/download PDF
37. An optimization framework for inversely estimating myocardial transmembrane potentials and localizing ischemia.
- Author
-
Wang D, Kirby RM, Macleod RS, and Johnson CR
- Subjects
- Computer Simulation, Humans, Muscle Cells, Reproducibility of Results, Sensitivity and Specificity, Body Surface Potential Mapping methods, Diagnosis, Computer-Assisted methods, Heart Conduction System physiopathology, Membrane Potentials, Models, Cardiovascular, Myocardial Ischemia diagnosis, Myocardial Ischemia physiopathology
- Abstract
By combining a static bidomain heart model with a torso conduction model, we studied the inverse electrocardiographic problem of computing the transmembrane potentials (TMPs) throughout the myocardium from a body-surface potential map, and then used the recovered potentials to localize myocardial ischemia. Our main contribution is solving the inverse problem within a constrained optimization framework, which is a generalization of previous methods for calculating transmembrane potentials. The framework offers ample flexibility for users to apply various physiologically-based constraints, and is well supported by mature algorithms and solvers developed by the optimization community. By avoiding the traditional inverse ECG approach of building the lead-field matrix, the framework greatly reduces computation cost and, by setting the associated forward problem as a constraint, the framework enables one to flexibly set individualized resolutions for each physical variable, a desirable feature for balancing model accuracy, ill-conditioning and computation tractability. Although the task of computing myocardial TMPs at an arbitrary time instance remains an open problem, we showed that it is possible to obtain TMPs with moderate accuracy during the ST segment by assuming all cardiac cells are at the plateau phase. Moreover, the calculated TMPs yielded a good estimate of ischemic regions, which was of more clinical interest than the voltage values themselves. We conducted finite element simulations of a phantom experiment over a 2D torso model with synthetic ischemic data. Preliminary results indicated that our approach is feasible and suitably accurate for the common case of transmural myocardial ischemia.
- Published
- 2011
- Full Text
- View/download PDF
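The abstract above treats the forward model as an optimization constraint instead of forming a lead-field matrix. The following is a minimal sketch of that idea for a purely synthetic linear model: a data-mismatch objective minimized subject to a linear forward constraint via a KKT system. The operators K, G, and H are random placeholders, not a bidomain or torso model.

```python
# Equality-constrained least-squares sketch in the spirit of the framework
# above: recover a source vector u while enforcing a (synthetic, linear)
# forward model as a constraint. Minimize ||H v - d||^2 + beta ||u||^2
# subject to K v = G u, solved via the KKT system.
import numpy as np

rng = np.random.default_rng(1)
n_v, n_u, n_d = 40, 10, 15

K = np.eye(n_v) + 0.1 * rng.standard_normal((n_v, n_v))   # "forward" operator on field v
G = rng.standard_normal((n_v, n_u))                        # couples sources u to the field
H = rng.standard_normal((n_d, n_v))                        # measurement operator
u_true = rng.standard_normal(n_u)
v_true = np.linalg.solve(K, G @ u_true)
d = H @ v_true + 1e-3 * rng.standard_normal(n_d)
beta = 1e-6

# Stack z = [v; u] and solve the KKT system of the equality-constrained QP.
Q = np.block([[H.T @ H, np.zeros((n_v, n_u))],
              [np.zeros((n_u, n_v)), beta * np.eye(n_u)]])
c = np.concatenate([H.T @ d, np.zeros(n_u)])
C = np.hstack([K, -G])                                     # constraint  C z = 0
KKT = np.block([[Q, C.T], [C, np.zeros((n_v, n_v))]])
rhs = np.concatenate([c, np.zeros(n_v)])
z = np.linalg.solve(KKT, rhs)
u_hat = z[n_v:n_v + n_u]

print("source recovery error:", np.linalg.norm(u_hat - u_true) / np.linalg.norm(u_true))
```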
38. A fast iterative method for solving the Eikonal equation on triangulated surfaces.
- Author
-
Fu Z, Jeong WK, Pan Y, Kirby RM, and Whitaker RT
- Abstract
This paper presents an efficient, fine-grained parallel algorithm for solving the Eikonal equation on triangular meshes. The Eikonal equation, and the broader class of Hamilton-Jacobi equations to which it belongs, have a wide range of applications from geometric optics and seismology to biological modeling and analysis of geometry and images. The ability to solve such equations accurately and efficiently provides new capabilities for exploring and visualizing parameter spaces and for solving inverse problems that rely on such equations in the forward model. Efficient solvers on state-of-the-art, parallel architectures require new algorithms that are not, in many cases, optimal, but are better suited to synchronous updates of the solution. In previous work [W. K. Jeong and R. T. Whitaker, SIAM J. Sci. Comput., 30 (2008), pp. 2512-2534], the authors proposed the fast iterative method (FIM) to efficiently solve the Eikonal equation on regular grids. In this paper we extend the fast iterative method to solve Eikonal equations efficiently on triangulated domains on the CPU and on parallel architectures, including graphics processors. We propose a new local update scheme that provides solutions of first-order accuracy for both architectures. We also propose a novel triangle-based update scheme and its corresponding data structure for efficient irregular data mapping to parallel single-instruction multiple-data (SIMD) processors. We provide detailed descriptions of the implementations on a single CPU, a multicore CPU with shared memory, and SIMD architectures with comparative results against state-of-the-art Eikonal solvers.
- Published
- 2011
- Full Text
- View/download PDF
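As a simplified illustration of the fast iterative method's active-list strategy, the sketch below solves the Eikonal equation with unit speed on a regular 2D grid (the setting of the earlier Jeong-Whitaker paper cited in the abstract), not the triangle-based update scheme this paper proposes.

```python
# Simplified fast-iterative-method sketch: solve |grad u| = 1 on a regular
# 2D grid from a point source, maintaining an active list of nodes that are
# updated until their values stop changing.
import numpy as np

N, h = 64, 1.0 / 63
INF = 1e10
u = np.full((N, N), INF)
src = (0, 0)
u[src] = 0.0

def local_solve(i, j):
    """Upwind quadratic solve for u[i, j] from its grid neighbours."""
    a = min(u[i - 1, j] if i > 0 else INF, u[i + 1, j] if i < N - 1 else INF)
    b = min(u[i, j - 1] if j > 0 else INF, u[i, j + 1] if j < N - 1 else INF)
    if abs(a - b) >= h:
        return min(a, b) + h
    return 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))

def neighbours(i, j):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= i + di < N and 0 <= j + dj < N:
            yield i + di, j + dj

active = list(neighbours(*src))
eps = 1e-12
while active:
    next_active = []
    for (i, j) in active:
        new = local_solve(i, j)
        if abs(u[i, j] - new) > eps:           # value still changing: keep active
            u[i, j] = new
            next_active.append((i, j))
        else:                                   # converged: activate improvable neighbours
            for (p, q) in neighbours(i, j):
                if local_solve(p, q) < u[p, q] - eps:
                    next_active.append((p, q))
    active = list(set(next_active))

# The exact solution is the Euclidean distance from the source.
x = np.linspace(0, 1, N)
X, Y = np.meshgrid(x, x, indexing="ij")
print("max error:", np.abs(u - np.hypot(X, Y)).max())
```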
39. Passage of intestinal (small bowel) cast--an unusual complication of neutropenic sepsis.
- Author
-
Samee A, Kirby RM, and Brunt AM
- Subjects
- Anti-Bacterial Agents therapeutic use, Antifungal Agents therapeutic use, Breast Neoplasms drug therapy, Defecation, Disease Progression, Female, Humans, Middle Aged, Neutropenia drug therapy, Neutropenia etiology, Sepsis drug therapy, Sepsis etiology, Chemotherapy, Adjuvant adverse effects, Intestinal Mucosa microbiology, Intestine, Small, Mycoses diagnosis
- Abstract
A 52-year-old woman was admitted with neutropenic sepsis 3 days after the final cycle of adjuvant chemotherapy for breast cancer. Her condition deteriorated with progressive abdominal distension, bilious vomiting, and diarrhoea. Abdominal examination revealed a mild degree of peritonism. Five days later she passed a small-bowel cast per rectum; histology showed gross fungal contamination. She was managed conservatively with antibiotics, antifungal medications, and nutritional support.
- Published
- 2010
- Full Text
- View/download PDF
40. Resolution strategies for the finite-element-based solution of the ECG inverse problem.
- Author
-
Wang D, Kirby RM, and Johnson CR
- Subjects
- Algorithms, Computer Simulation, Humans, Electrocardiography methods, Finite Element Analysis, Image Processing, Computer-Assisted methods, Models, Biological, Signal Processing, Computer-Assisted
- Abstract
Successful employment of numerical techniques for the solution of forward and inverse ECG problems requires the ability to both quantify and minimize approximation errors introduced as part of the discretization process. Our objective is to develop discretization and refinement strategies involving hybrid-shaped finite elements so as to minimize approximation errors for the ECG inverse problem. We examine both the ill-posedness of the mathematical inverse problem and the ill-conditioning of the discretized system in order to propose strategies specifically designed for the ECG inverse problem. We demonstrate that previous discretization and approximation strategies may worsen the properties of the inverse problem approximation. We then demonstrate the efficacy of our strategies on both a simplified and a realistic 2-D torso model.
- Published
- 2010
- Full Text
- View/download PDF
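The abstract above emphasizes the ill-conditioning of the discretized inverse system. The toy example below illustrates that behaviour with a generic discretized smoothing operator standing in for a transfer matrix: its condition number grows rapidly as the discretization is refined, which is one reason naive refinement can worsen the inverse-problem approximation.

```python
# Tiny illustration of the discretization/ill-conditioning interplay discussed
# above: a generic discretized first-kind smoothing operator, used only as a
# stand-in for a torso transfer matrix, becomes increasingly ill-conditioned
# as the discretization is refined.
import numpy as np

def smoothing_operator(n, width=0.05):
    t = np.linspace(0, 1, n)
    S, T = np.meshgrid(t, t, indexing="ij")
    return np.exp(-((S - T) ** 2) / (2 * width ** 2)) / n

for n in (10, 20, 40, 80):
    print(f"n = {n:3d}   cond(A) = {np.linalg.cond(smoothing_operator(n)):.3e}")
```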
41. Verifiable visualization for isosurface extraction.
- Author
-
Etiene T, Scheidegger C, Nonato LG, Kirby RM, and Silva CT
- Abstract
Visual representations of isosurfaces are ubiquitous in the scientific and engineering literature. In this paper, we present techniques to assess the behavior of isosurface extraction codes. Where applicable, these techniques allow us to distinguish whether anomalies in isosurface features can be attributed to the underlying physical process or to artifacts from the extraction process. Such scientific scrutiny is at the heart of verifiable visualization--subjecting visualization algorithms to the same verification process that is used in other components of the scientific pipeline. More concretely, we derive formulas for the expected order of accuracy (or convergence rate) of several isosurface features, and compare them to experimentally observed results in the selected codes. This technique is practical: in two cases, it exposed actual problems in implementations. We provide the reader with the range of responses they can expect to encounter with isosurface techniques, both under "normal operating conditions" and also under adverse conditions. Armed with this information--the results of the verification process--practitioners can judiciously select the isosurface extraction technique appropriate for their problem of interest, and have confidence in its behavior.
- Published
- 2009
- Full Text
- View/download PDF
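Verification of the kind described above hinges on comparing expected and observed orders of accuracy. The snippet below shows the standard way to estimate an observed convergence rate from errors at successive resolutions, using a toy quadrature error rather than an isosurface-extraction code.

```python
# Estimate the observed order of accuracy from errors at successive grid
# resolutions: p = log(e_coarse / e_fine) / log(h_coarse / h_fine).
# A midpoint-rule quadrature error stands in for an isosurface feature error.
import numpy as np

def midpoint_integral(f, a, b, n):
    x = a + (np.arange(n) + 0.5) * (b - a) / n
    return np.sum(f(x)) * (b - a) / n

exact = 1.0 - np.cos(1.0)                     # integral of sin(x) on [0, 1]
hs, errs = [], []
for n in (8, 16, 32, 64, 128):
    hs.append(1.0 / n)
    errs.append(abs(midpoint_integral(np.sin, 0.0, 1.0, n) - exact))

for k in range(1, len(hs)):
    p = np.log(errs[k - 1] / errs[k]) / np.log(hs[k - 1] / hs[k])
    print(f"h = {hs[k]:.5f}   observed order = {p:.3f}")   # midpoint rule: ~2
```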
42. Using the stochastic collocation method for the uncertainty quantification of drug concentration due to depot shape variability.
- Author
-
Preston JS, Tasdizen T, Terry CM, Cheung AK, and Kirby RM
- Subjects
- Algorithms, Anastomosis, Surgical, Animals, Hyperplasia prevention & control, Models, Biological, Models, Statistical, Monte Carlo Method, Polytetrafluoroethylene chemistry, Prostheses and Implants, Swine, Antibiotics, Antineoplastic pharmacokinetics, Computer Simulation, Finite Element Analysis, Models, Animal, Sirolimus pharmacokinetics
- Abstract
Numerical simulations entail modeling assumptions that impact outcomes. Therefore, characterizing, in a probabilistic sense, the relationship between the variability of model selection and the variability of outcomes is important. Under certain assumptions, the stochastic collocation method offers a computationally feasible alternative to traditional Monte Carlo approaches for assessing the impact of model and parameter variability. We propose a framework that combines component shape parameterization with the stochastic collocation method to study the effect of drug depot shape variability on the outcome of drug diffusion simulations in a porcine model. We use realistic geometries segmented from MR images and employ level-set techniques to create two alternative univariate shape parameterizations. We demonstrate that once the underlying stochastic process is characterized, quantification of the introduced variability is quite straightforward and provides an important step in the validation and verification process.
- Published
- 2009
- Full Text
- View/download PDF
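For readers unfamiliar with stochastic collocation, the sketch below propagates a single uniformly distributed parameter through a toy analytic model using Gauss-Legendre collocation nodes and recovers the mean and standard deviation from the quadrature weights, with a Monte Carlo check. The model is a stand-in, not a drug-diffusion simulation.

```python
# Minimal univariate stochastic-collocation sketch: sample the model only at
# Gauss-Legendre collocation nodes and recover moments from the quadrature
# weights, instead of running many Monte Carlo simulations.
import numpy as np

def model(theta):
    """Toy 'simulation outcome' as a function of a shape parameter theta."""
    return np.exp(-theta) + 0.5 * theta ** 2

# theta ~ Uniform(0.5, 1.5): map Gauss-Legendre nodes from [-1, 1].
nodes, weights = np.polynomial.legendre.leggauss(8)
theta = 1.0 + 0.5 * nodes
w = weights / 2.0                      # weights of the uniform density on [0.5, 1.5]

outputs = np.array([model(t) for t in theta])   # one "simulation" per node
mean = np.sum(w * outputs)
var = np.sum(w * (outputs - mean) ** 2)
print(f"collocation: mean = {mean:.6f}, std = {np.sqrt(var):.6f}")

rng = np.random.default_rng(2)
mc = model(rng.uniform(0.5, 1.5, 200_000))
print(f"Monte Carlo: mean = {mc.mean():.6f}, std = {mc.std():.6f}")
```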
43. Incorporating patient breathing variability into a stochastic model of dose deposition for stereotactic body radiation therapy.
- Author
-
Geneser SE, Kirby RM, Wang B, Salter B, and Joshi S
- Subjects
- Algorithms, Computer Simulation, Humans, Models, Biological, Models, Statistical, Radiographic Image Enhancement methods, Radiographic Image Interpretation, Computer-Assisted methods, Reproducibility of Results, Sensitivity and Specificity, Artifacts, Imaging, Three-Dimensional methods, Radiosurgery methods, Respiratory Mechanics, Respiratory-Gated Imaging Techniques methods, Surgery, Computer-Assisted methods, Tomography, X-Ray Computed methods
- Abstract
Hypofractionated stereotactic body radiation therapy (SBRT) employs precisely conforming, high-level radiation dose delivery to improve tumor control probabilities and sparing of healthy tissue. However, the delivery precision and conformity of SBRT render dose accumulation particularly susceptible to organ motion, and respiratory-induced motion in the abdomen may result in significant displacement of lesion targets during the breathing cycle. Given the maturity of the technology, the sensitivity of dose deposition to respiratory-induced organ motion is a significant factor in observed discrepancies between predictive treatment-plan indicators and clinical patient outcome statistics, and it remains one of the major unsolved problems in SBRT. Techniques intended to compensate for respiratory-induced organ motion have been investigated, but very few have yet reached clinical practice. To improve SBRT, it is necessary to overcome the challenge that uncertainties in dose deposition due to organ motion present. This requires incorporating an accurate prediction of the effects of the random nature of the respiratory process on SBRT dose deposition into treatment planning and delivery. We introduce a means of characterizing the underlying day-to-day variability of patient breathing and calculate the resulting stochasticity in dose accumulation.
- Published
- 2009
- Full Text
- View/download PDF
44. Particle-based sampling and meshing of surfaces in multimaterial volumes.
- Author
-
Meyer M, Whitaker R, Kirby RM, Ledergerber C, and Pfister H
- Abstract
Methods that faithfully and robustly capture the geometry of complex material interfaces in labeled volume data are important for generating realistic and accurate visualizations and simulations of real-world objects. The generation of such multimaterial models from measured data poses two unique challenges: first, the surfaces must be well sampled with regular, efficient tessellations that are consistent across material boundaries; and second, the resulting meshes must respect the nonmanifold geometry of the multimaterial interfaces. This paper proposes a strategy for sampling and meshing multimaterial volumes using dynamic particle systems, including a novel, differentiable representation of the material junctions that allows the particle system to explicitly sample corners, edges, and surfaces of material intersections. The distributions of particles are controlled by fundamental sampling constraints, allowing Delaunay-based meshing algorithms to reliably extract watertight meshes of consistently high quality.
- Published
- 2008
- Full Text
- View/download PDF
45. The need for verifiable visualization.
- Author
-
Kirby RM and Silva CT
- Subjects
- Computer Graphics, Computer Simulation, Models, Theoretical, User-Computer Interface
- Abstract
Visualization is often employed as part of the simulation science pipeline: it's the window through which scientists examine their data for deriving new science, and the lens used to view modeling and discretization interactions within their simulations. We advocate that, as a component of the simulation science pipeline, visualization must be explicitly considered as part of the validation and verification (V&V) process. In this article, the authors define V&V in the context of computational science, discuss the role of V&V in the scientific process, and present arguments for the need for verifiable visualization.
- Published
- 2008
- Full Text
- View/download PDF
46. Patient choice significantly affects mastectomy rates in the treatment of breast cancer.
- Author
-
Kirby RM, Basit A, and Manimaran N
- Abstract
Mastectomy rates may be affected by patient choice. A total of 203 patients who had undergone total mastectomy for breast cancer were invited to complete questionnaires at routine follow-up clinics to ascertain whether they had been offered the choice of breast-conserving surgery (BCS) and to establish the reasons for their preference. Questionnaires were checked against medical and nursing records to confirm the reasons for the patients' choice of mastectomy. Of these, 130 patients (64%) chose to have a mastectomy, reporting that they felt safer (n = 119), wanted to decrease the risk of further surgery (n = 87), and/or wished to avoid radiotherapy (n = 34). Some were advised against BCS because of large tumour size, central or multifocal tumours, and/or extensive associated microcalcification on mammography (n = 29). Twenty-four patients had BCS as their first operation but required repeat surgery for involved or narrow excision margins. Despite being advised that survival rates after mastectomy and breast-conserving surgery do not differ, many patients still feel safer with mastectomy.
- Published
- 2008
- Full Text
- View/download PDF
47. Investigation of smoothness-increasing accuracy-conserving filters for improving streamline integration through discontinuous fields.
- Author
-
Steffen M, Curtis S, Kirby RM, and Ryan JK
- Subjects
- Computer Simulation, Finite Element Analysis, Models, Theoretical, Numerical Analysis, Computer-Assisted, Systems Integration, Algorithms, Computer Graphics, Image Enhancement methods, Image Interpretation, Computer-Assisted methods, Imaging, Three-Dimensional methods, Rheology methods, Signal Processing, Computer-Assisted
- Abstract
Streamline integration of fields produced by computational fluid mechanics simulations is a commonly used tool for the investigation and analysis of fluid flow phenomena. Integration is often accomplished through the application of ordinary differential equation (ODE) integrators whose error characteristics are predicated on the smoothness of the field through which the streamline is being integrated--smoothness which is not available at the inter-element level of finite volume and finite element data. Adaptive error control techniques are often used to ameliorate the challenge posed by inter-element discontinuities. As the root of the difficulties is the discontinuous nature of the data, we present a complementary approach of applying smoothness-increasing accuracy-conserving filters to the data prior to streamline integration. We investigate whether such an approach applied to uniform quadrilateral discontinuous Galerkin (high-order finite volume) data can be used to augment current adaptive error control approaches. We discuss and demonstrate through numerical example the computational trade-offs exhibited when one applies such a strategy.
- Published
- 2008
- Full Text
- View/download PDF
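The pipeline described above is: filter the discontinuous field first, then integrate streamlines with an adaptive ODE integrator. The sketch below shows only that workflow; a plain moving-average kernel stands in for the actual B-spline-based smoothness-increasing accuracy-conserving filter, and the cell-wise-constant velocity field is synthetic.

```python
# Sketch of the "filter first, then integrate" workflow discussed above. A
# uniform (moving-average) kernel is a stand-in for the real SIAC B-spline
# convolution; the quantised velocity field mimics inter-element jumps.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.ndimage import uniform_filter
from scipy.interpolate import RegularGridInterpolator

n = 64
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")

# Cell-wise-constant samples of a smooth rotational field: the quantisation
# plays the role of inter-element discontinuities.
u = np.round(-(Y - 0.5) * 8) / 8
v = np.round((X - 0.5) * 8) / 8

u_s = uniform_filter(u, size=5)        # stand-in for the SIAC convolution
v_s = uniform_filter(v, size=5)

interp_u = RegularGridInterpolator((x, x), u_s, bounds_error=False, fill_value=0.0)
interp_v = RegularGridInterpolator((x, x), v_s, bounds_error=False, fill_value=0.0)

def rhs(t, p):
    q = np.clip(p, 0.0, 1.0)
    return [interp_u(q)[0], interp_v(q)[0]]

sol = solve_ivp(rhs, (0.0, 5.0), [0.75, 0.5], method="RK45", rtol=1e-6, atol=1e-9)
print("streamline points computed:", sol.y.shape[1])
```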
48. Application of stochastic finite element methods to study the sensitivity of ECG forward modeling to organ conductivity.
- Author
-
Geneser SE, Kirby RM, and MacLeod RS
- Subjects
- Computer Simulation, Electric Conductivity, Finite Element Analysis, Humans, Models, Statistical, Stochastic Processes, Body Surface Potential Mapping methods, Diagnosis, Computer-Assisted methods, Electrocardiography methods, Heart Conduction System physiopathology, Models, Cardiovascular
- Abstract
Because numerical simulation parameters may significantly influence the accuracy of the results, evaluating the sensitivity of simulation results to variations in parameters is essential. Although the field of sensitivity analysis is well developed, systematic application of such methods to complex biological models is limited due to the associated high computational costs and the substantial technical challenges for implementation. In the specific case of the forward problem in electrocardiography, the lack of robust, feasible, and comprehensive sensitivity analysis has left many aspects of the problem unresolved and subject to empirical and intuitive evaluation rather than sound, quantitative investigation. In this study, we have developed a systematic, stochastic approach to the analysis of sensitivity of the forward problem of electrocardiography to the parameter of inhomogeneous tissue conductivity. We apply this approach to a two-dimensional, inhomogeneous, geometric model of a slice through the human thorax. We assigned probability density functions for various organ conductivities and applied stochastic finite elements based on the generalized polynomial chaos-stochastic Galerkin (gPC-SG) method to obtain the standard deviation of the resulting stochastic torso potentials. This method utilizes a spectral representation of the stochastic process to obtain numerically accurate stochastic solutions in a fraction of the time required when employing classic Monte Carlo methods. We have shown that a systematic study of sensitivity is not only easily feasible with the gPC-SG approach but can also provide valuable insight into characteristics of the specific simulation.
- Published
- 2008
- Full Text
- View/download PDF
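The study above uses the intrusive generalized polynomial chaos-stochastic Galerkin formulation within a finite-element model. As a lightweight illustration of the underlying spectral representation, the sketch below builds a non-intrusive polynomial chaos expansion of a toy scalar model with one Gaussian "conductivity" parameter and reads off the mean and standard deviation from the coefficients, with a Monte Carlo check.

```python
# Non-intrusive polynomial-chaos sketch illustrating the spectral idea behind
# gPC: expand a toy model output in probabilists' Hermite polynomials of a
# single Gaussian parameter. The entry above uses the intrusive stochastic-
# Galerkin variant on a finite-element torso model; this is only a scalar
# illustration of the representation.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial

def model(sigma):
    """Toy 'torso potential' as a function of an uncertain conductivity."""
    return 1.0 / (1.0 + np.exp(sigma))

mu, std = 0.2, 0.05                      # conductivity ~ N(mu, std^2)
order = 6
nodes, weights = hermegauss(order + 4)   # quadrature for weight exp(-x^2/2)
weights = weights / np.sqrt(2.0 * np.pi) # normalise to the standard normal pdf

# Coefficients c_k = E[model * He_k] / E[He_k^2], with E[He_k^2] = k!.
samples = model(mu + std * nodes)
coeffs = []
for k in range(order + 1):
    basis = hermeval(nodes, [0.0] * k + [1.0])
    coeffs.append(np.sum(weights * samples * basis) / factorial(k))

mean = coeffs[0]
var = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(f"gPC estimate: mean = {mean:.6f}, std = {np.sqrt(var):.6f}")

rng = np.random.default_rng(3)
mc = model(rng.normal(mu, std, 200_000))
print(f"Monte Carlo:  mean = {mc.mean():.6f}, std = {mc.std():.6f}")
```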
49. Three stage axillary lymphatic massage optimizes sentinel lymph node localisation using blue dye.
- Author
-
Kirby RM, Basit A, Nguyen QT, Jaipersad A, and Billingham R
- Abstract
Aims: This paper describes a simple technique of axillary and breast massage which improves the successful identification of blue sentinel nodes using patent blue dye alone. Methods: Patent blue dye was injected into the subdermal part of the retroareolar area in 167 patients undergoing surgical treatment for invasive breast cancer. Three stage axillary lymphatic massage was performed prior to making the axillary incision for sentinel lymph node biopsy. All patients had completion axillary sampling or clearance. Results: A blue lymphatic duct leading to lymph nodes of the first drainage was identified in 163 (97%) of the patients. Results are compared with 168 patients who had sentinel lymph node biopsy using blue dye without axillary massage. Allergic reactions were observed in four patients (1.2%). Conclusion: Three stage axillary lymphatic massage improves the successful identification of a blue sentinel lymph node in breast cancer patients.
- Published
- 2007
- Full Text
- View/download PDF
50. Topology, accuracy, and quality of isosurface meshes using dynamic particles.
- Author
-
Meyer M, Kirby RM, and Whitaker R
- Abstract
This paper describes a method for constructing isosurface triangulations of sampled, volumetric, three-dimensional scalar fields. The resulting meshes consist of triangles that are of consistently high quality, making them well suited for accurate interpolation of scalar and vector-valued quantities, as required for numerous applications in visualization and numerical simulation. The proposed method does not rely on a local construction or adjustment of triangles as is done, for instance, in advancing wavefront or adaptive refinement methods. Instead, a system of dynamic particles optimally samples an implicit function such that the particles' relative positions can produce a topologically correct Delaunay triangulation. Thus, the proposed method relies on a global placement of triangle vertices. The main contributions of the paper are the integration of dynamic particle systems with surface sampling theory and PDE-based methods for controlling the local variability of particle densities, as well as detailing a practical method that accommodates Delaunay sampling requirements to generate sparse sets of points for the production of high-quality tessellations.
- Published
- 2007
- Full Text
- View/download PDF
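The core ingredient of the particle-based methods in entries 44 and 50 is a system of particles that repel one another while staying on an implicit surface. The sketch below reduces this idea to 2D: particles are repeatedly pushed apart and re-projected onto the zero set of a circle. The papers' density control, junction handling, and Delaunay-based meshing are not reproduced; this only illustrates the particle dynamics.

```python
# Dynamic-particle sketch in 2D: particles repel each other with a Gaussian
# kernel and are re-projected onto an implicit zero set (a unit circle),
# producing an approximately uniform sampling of the "surface".
import numpy as np

rng = np.random.default_rng(4)

def F(p):                        # implicit function: unit circle
    return np.sum(p ** 2, axis=-1) - 1.0

def gradF(p):
    return 2.0 * p

def project(p, steps=5):
    """A few Newton-like steps moving each point onto the zero set of F."""
    for _ in range(steps):
        g = gradF(p)
        p = p - (F(p) / np.sum(g * g, axis=-1))[:, None] * g
    return p

n, sigma, dt = 40, 0.4, 0.02
pts = rng.normal(size=(n, 2))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)            # start on the circle

for _ in range(300):
    diff = pts[:, None, :] - pts[None, :, :]                  # pairwise differences
    dist = np.linalg.norm(diff, axis=-1) + np.eye(n) + 1e-12  # +I avoids self-division
    weight = np.exp(-(dist / sigma) ** 2)
    force = np.sum((weight / dist)[..., None] * diff, axis=1)
    pts = project(pts + dt * force)                            # repel, then re-project

order = np.argsort(np.arctan2(pts[:, 1], pts[:, 0]))
ring = pts[order]
gaps = np.linalg.norm(ring - np.roll(ring, 1, axis=0), axis=1)
print("particle spacing on the circle: min", gaps.min(), "max", gaps.max())
```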