17 results for "Michele Guindani"
Search Results
2. Generalized Species Sampling Priors With Latent Beta Reinforcements
- Author
- Edoardo M. Airoldi, Federico Bassetti, Thiago Costa, Fabrizio Leisen, and Michele Guindani
- Published
- 2015
- Full Text
- View/download PDF
4. Nonparametric Bayes approach for a semi-mechanistic pharmacokinetic and pharmacodynamic model
- Author
- Ronald Christensen, Michele Guindani, Gabriel Huerta, Erik Barry Erhardt, and Yan Dong
- Subjects
- Dirichlet Process; Nonparametric Bayes; Pharmacokinetics; Pharmacodynamics; Ordinary differential equations
- Abstract
Both frequentist and Bayesian approaches have been used to characterize population pharmacokinetic and pharmacodynamic (PK/PD) models. These methods focus on estimating the population parameters and assessing the association between PK/PD characteristics and subject covariates. In this work, we propose a Dirichlet process mixture model to classify patients based on their individualized pharmacokinetic and pharmacodynamic profiles. We can then predict new patients' dose-response curves given their concentration-time profiles. Additionally, we implement a modern Markov chain Monte Carlo algorithm for sampling-based inference on the parameters. The detailed sampling procedures and the results are discussed for a simulated data example and a real data example. We also evaluate an approximate solution of a system of nonlinear differential equations obtained by Euler's method and compare the results with a general numerical solver, the ode function from the R package deSolve.
- Published
- 2015
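The abstract above compares a fixed-step Euler approximation with a general-purpose ODE solver (deSolve's ode in R). A minimal sketch of the same comparison, here in Python with SciPy and a hypothetical one-compartment elimination model (the model, rate constant, and step count are illustrative, not from the dissertation):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical one-compartment elimination model: dC/dt = -k * C
k, C0 = 0.5, 100.0            # illustrative rate constant and initial concentration
t_end, n_steps = 4.0, 400

def rhs(t, C):
    return -k * C

# Forward Euler with a fixed step size
dt = t_end / n_steps
C_euler = C0
for _ in range(n_steps):
    C_euler += dt * rhs(0.0, C_euler)

# General-purpose adaptive solver (the analogue of deSolve's ode in R)
sol = solve_ivp(rhs, (0.0, t_end), [C0], rtol=1e-8, atol=1e-10)
C_solver = sol.y[0, -1]

C_exact = C0 * np.exp(-k * t_end)   # closed-form solution for comparison
print(abs(C_euler - C_exact), abs(C_solver - C_exact))
```

For this linear problem the adaptive solver's error is orders of magnitude smaller than fixed-step Euler's, which is the kind of gap the abstract's comparison probes.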
6. Generalized Species Sampling Priors With Latent Beta Reinforcements
- Author
- Edoardo M. Airoldi, Federico Bassetti, Thiago Costa, Fabrizio Leisen, and Michele Guindani
- Published
- 2014
- Full Text
- View/download PDF
8. Stable isotope sourcing using sampling
- Author
- Edward John Bedrick, Ronald Christensen, Gabriel Huerta, Michele Guindani, and Erik Erhardt
- Subjects
- Physiological ecology--Statistical methods
- Abstract
Stable isotope sourcing is used to estimate proportional contributions of sources to a mixture, such as in the analysis of animal diets, plant nutrient use, geochemistry, pollution, and forensics. We focus on animal ecology because of the particular complexities due to the process of digestion and assimilation. Parameter estimation has been a challenge because there are often many sources and few isotopes leading to an underconstrained linear system for the diet probability vector. This dissertation offers three primary contributions to the mixing model community. (1) We detail and provide an R implementation of a better algorithm (SISUS) for representing possible solutions in the underconstrained case (many sources, few isotopes) when no variance is considered (Phillips and Gregg, 2003). (2) We provide general methods for performing frequentist estimation in the perfectly-constrained case using the delta method and the bootstrap, which extends previous work applying the delta method to two- and three-source problems (Phillips and Gregg, 2001). (3) We propose two Bayesian models, the implicit representation model estimating the population mean diet through the mean mixture isotope ratio, and the explicit representation model estimating the population mean diet through mixture-specific diets given individual isotope ratios. Secondary contributions include (4) estimation using summaries from the literature in lieu of observation-level data, (5) multiple methods for incorporating isotope ratio discrimination (fractionation) in the analysis, (6) the use of measurement error to account for and partition more uncertainty, (7) estimation improvements by pooling multiple estimates, and (8) detailing scenarios when one model is preferred over another. We show that the Bayesian explicit representation model provides more precise diet estimates than other models when measurement error is small and informed by the necessary calibration measurements.
- Published
- 2010
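In the perfectly-constrained two-source, one-isotope case mentioned in the abstract above, the mixing model reduces to a single linear equation for the diet proportion. A minimal sketch with made-up isotope ratios (the delta values are illustrative, not from any study cited here):

```python
# Perfectly-constrained two-source, one-isotope mixing model:
#   delta_mix = p_a * delta_a + (1 - p_a) * delta_b,  solved for p_a.
# The delta values below are illustrative, not real measurements.
delta_a, delta_b = -26.0, -12.0   # source isotope ratios (e.g., per mil d13C)
delta_mix = -20.0                 # observed mixture (consumer tissue) ratio

p_a = (delta_mix - delta_b) / (delta_a - delta_b)
p_b = 1.0 - p_a
print(p_a, p_b)   # proportional contributions of the two sources
```

With more sources than isotopes the system becomes underconstrained, which is what motivates the SISUS algorithm and the Bayesian models in the dissertation.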
9. Contributions to partial least squares regression and supervised principal component analysis modeling
- Author
- Edward John Bedrick, Michele Guindani, Gabriel Huerta, Huining Kang, and Yizho Jiang
- Subjects
- Latent structure analysis
- Abstract
Latent structure techniques have recently found extensive use in regression analysis for high-dimensional data. This thesis examines and expands two such methods, Partial Least Squares (PLS) regression and Supervised Principal Component Analysis (SPCA). We propose several new algorithms, including a quadratic spline PLS, a cubic spline PLS, two fractional polynomial PLS algorithms, and two multivariate SPCA algorithms. These new algorithms were compared to several popular PLS algorithms using real and simulated datasets. Cross-validation was used to assess the goodness-of-fit and prediction accuracy of the various models. Strengths and weaknesses of each method were also discussed with respect to model stability, robustness, and parsimony. The linear PLS and the multivariate SPCA methods were found to be the most robust among the methods considered, and usually produced models with good fit and prediction. Nonlinear PLS methods are generally more powerful in fitting nonlinear data, but they had a tendency to over-fit, especially with small sample sizes. A forward stepwise predictor pre-screening procedure was proposed for multivariate SPCA, and our examples demonstrated its effectiveness in selecting a smaller number of predictors than the standard univariate testing procedure.
- Published
- 2010
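The cross-validation workflow in the abstract above can be sketched with a related latent-structure method, principal-component regression, standing in for PLS/SPCA (the synthetic data and the method are illustrative stand-ins, not the thesis's algorithms):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic factor-structured data: a few latent components drive many predictors
n, p, k = 100, 20, 3
T = rng.normal(size=(n, k))                     # latent components
W = rng.normal(size=(k, p))                     # loadings
X = T @ W + 0.1 * rng.normal(size=(n, p))
y = T @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=n)

def pcr_predict(X_tr, y_tr, X_te, k):
    # Principal-component regression: project onto top-k PCs, then least squares
    mu = X_tr.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_tr - mu, full_matrices=False)
    V = Vt[:k].T
    Z = np.column_stack([np.ones(len(X_tr)), (X_tr - mu) @ V])
    coef, *_ = np.linalg.lstsq(Z, y_tr, rcond=None)
    return coef[0] + ((X_te - mu) @ V) @ coef[1:]

# 5-fold cross-validation of mean squared prediction error
folds = np.array_split(rng.permutation(n), 5)
mse = []
for fold in folds:
    mask = np.ones(n, dtype=bool)
    mask[fold] = False
    pred = pcr_predict(X[mask], y[mask], X[~mask], k)
    mse.append(np.mean((y[~mask] - pred) ** 2))
cv_mse = float(np.mean(mse))
print(cv_mse)
```

Comparing such cross-validated errors across candidate models is how the thesis assesses goodness-of-fit and prediction accuracy.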
10. Airborne particulate contamination effect on high voltage breakdowns during tube conditioning
- Author
- Curtis B. Storlie, Marianna D. LaNoue, Michele Guindani, and Armida J. Carbajal
- Subjects
- Neutron sources
- Abstract
In this research we examine high voltage breakdowns (HVBs) during neutron tube conditioning, a problem that has persisted for decades. There has been much recent debate over whether to procure a real-time airborne monitoring system for the commercial production of neutron tubes in order to determine and quantify the impact of airborne particles. The main problem is that such a monitoring system is costly, and because the exact causes of HVBs are not fully known, the expense must be justified. The goal of this thesis was to analyze the instrumentation used in airborne particle monitoring in order to establish that the instruments were reliable enough to obtain the data needed to make improvements. General reliability studies on the instruments were conducted, followed by a quasi-experiment which found that airborne particulates have a measurable effect on external HVBs. This finding led to an observational study on the production floor examining internal HVBs. An exploratory analysis of the data was conducted, and preliminary results showed that the particles may influence the occurrence of internal HVBs in the tubes. The data justified the need for a real-time airborne monitoring system for further research, and funding for the system was granted.
- Published
- 2010
14. Monte Carlo strategies for calibration in climate models
- Author
- Gabriel Huerta, Edward John Bedrick, Michele Guindani, Joseph Galewsky, and Alejandro Villagran-Hernandez
- Subjects
- Climatology--Statistical methods
- Abstract
Intensive computational methods have been used by Earth scientists in a wide range of problems in data inversion and uncertainty quantification, such as earthquake epicenter location and climate projections. To quantify the uncertainties resulting from a range of plausible model configurations, it is necessary to estimate a multidimensional probability distribution. The computational cost of estimating these distributions for geoscience applications is impractical using traditional methods such as Metropolis/Gibbs algorithms, as simulation costs limit the number of experiments that can reasonably be obtained. Several alternative sampling strategies have been proposed that could improve sampling efficiency, including Multiple Very Fast Simulated Annealing (MVFSA) and Adaptive Metropolis algorithms. In this research, the performance of these proposed sampling strategies is evaluated with a surrogate climate model that is able to approximate the noise and response behavior of a realistic atmospheric general circulation model (AGCM). The surrogate model is fast enough that its evaluation can be embedded in these Monte Carlo algorithms. The goal of this thesis is to show that adaptive methods can be superior to MVFSA in approximating the known posterior distribution with fewer forward evaluations. However, the adaptive methods can also be limited by inadequate sample mixing. The Single Component and Delayed Rejection Adaptive Metropolis algorithms were found to resolve these limitations, although challenges remain in approximating multi-modal distributions. The results show that these advanced methods of statistical inference can provide practical solutions to the climate model calibration problem and to the challenges of quantifying climate projection uncertainties. The computational methods would also be useful for problems outside climate prediction, particularly those where sampling is limited by the availability of computational resources.
- Published
- 2009
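The samplers compared in the abstract above all build on the basic Metropolis accept/reject step. A minimal random-walk Metropolis sketch, targeting a stand-in standard normal log-posterior rather than the surrogate climate model (the target, step size, and iteration count are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(x):
    # Stand-in log-posterior: standard normal. A real calibration problem
    # would evaluate the (surrogate) climate model here.
    return -0.5 * x * x

def metropolis(n_iter=20000, step=1.0, x0=0.0):
    samples = np.empty(n_iter)
    x, lp = x0, log_post(x0)
    for i in range(n_iter):
        prop = x + step * rng.normal()          # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

draws = metropolis()
print(draws.mean(), draws.std())
```

Adaptive Metropolis variants tune `step` (or a full proposal covariance) from the chain's history, which is what buys the efficiency gains the thesis evaluates.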
15. Using control charts for computer-aided diagnosis of brain images
- Author
- Aparna Huzurbazar, Michele Guindani, Rex E. Jung, and Sumner Williams IV
- Subjects
- Brain--Magnetic resonance imaging--Data processing--Statistical methods
- Abstract
We consider a novel application of quality control charts to brain scans performed using magnetic resonance imaging (MRI). Although our primary focus is on volume measures resulting from brain scans, issues related to the MRI scanner are also considered. The project evaluates a population of healthy control subjects using control charts to assess the MRIs obtained for the subjects. The results demonstrate our ability to automatically detect brain volumes that are statistical outliers, and can provide a potential cost savings of 10% for a moderately sized study. More importantly, our applied results will increase the sensitivity of tests comparing subject populations, because certain populations of subjects can be quickly, efficiently, and automatically identified through outlier analysis.
- Published
- 2009
Discovery Service for Jio Institute Digital Library