38,321 results for "MARKOV chain Monte Carlo"
Search Results
152. Pollutant Concentration Prediction by Random Forest to Estimate a Contaminant Source Position
- Author
-
Alaoui, Sidi Mohammed, Djemal, Khalifa, Feiz, Amir Ali, and Ngae, Pierre; Maglogiannis, Ilias, Iliadis, Lazaros, Macintyre, John, Avlonitis, Markos, and Papaleonidas, Antonios, editors
- Published
- 2024
- Full Text
- View/download PDF
153. Advanced ML Methods: Bridging SAR Images and Structural Health Monitoring
- Author
-
Entezami, Alireza, Behkamal, Bahareh, and De Michele, Carlo
- Published
- 2024
- Full Text
- View/download PDF
154. UNDER THE INFERENCE.
- Author
-
Moehrke, Patrick and Yongchao Huang
- Subjects
INFERENCE (Logic), MARKOV chain Monte Carlo, BAYES' theorem, MARGINAL distributions - Abstract
This article explores the use of Bayesian neural networks (BNNs) in mortality modeling, specifically in the context of South Africa. BNNs are gaining popularity due to their ability to incorporate prior assumptions and provide uncertainty quantification. The article outlines the steps involved in training a BNN, including assigning priors, defining likelihood models, and using inference methods such as Markov chain Monte Carlo (MCMC) or variational inference. The results show that the BNN performs well in capturing trends and making predictions compared to traditional models, but further exploration and testing are recommended. The authors acknowledge the expertise of professionals in the field. [Extracted from the article]
- Published
- 2024
155. Left-turn queue spillback identification based on single-section license plate recognition data.
- Author
-
Hao Wu, Jiarong Yao, Yumin Cao, and Keshuang Tang
- Subjects
*AUTOMOBILE license plates, *MARKOV chain Monte Carlo, *INTELLIGENT transportation systems, *VIDEO compression, *IDENTIFICATION - Published
- 2024
- Full Text
- View/download PDF
156. Bayesian Inference of Rock Rheological Constitutive Model with NUTS-MCMC: A Case Study on Baihetan's Slope Engineering.
- Author
-
Shi, Anchi, Lyu, Changhao, Fan, Xuewen, Yang, Sheng, and Xu, Weiya
- Subjects
*MARKOV chain Monte Carlo, *ROCK slopes, *SLOPES (Soil mechanics), *BAYESIAN field theory, *WATER power - Abstract
In evaluating the safety of rock slope engineering, it is imperative to account for rheological effects. These effects can lead to significant deformations that may adversely impact overall structural integrity. Consequently, accurate determination of the rheological mechanical parameters of slope rocks is essential. However, the application of rheological parameters obtained from laboratory tests encounters limitations due to the rock's inherent heterogeneity, scale effects, and inevitable sample dispersion. By contrast, on-site monitoring data serve as critical assets for real-time calibration and risk assessment in the evaluation of rheological parameters and prediction of slope deformation. To integrate on-site monitoring data with rheological mechanical mechanisms, this study introduces a probabilistic inverse model for evaluating rock slope rheological parameters, grounded in Bayesian theory and incorporating a No-U-Turn Sampler (NUTS)-based Markov chain Monte Carlo (MCMC) sampling algorithm. In terms of methodological efficiency, we compared the NUTS method with the traditional Metropolis–Hastings (M-H) approach, demonstrating the superior efficiency of the former. Additionally, sensitivity analysis of the rheological parameters was conducted using the Burgers constitutive model. By combining the NUTS-based MCMC method with this model, the uncertainty of the creep parameters was successfully evaluated. Utilizing these updated posterior parameters, a deformation forecast of up to three years was executed for the slope; the findings demonstrate that the deformation on the left bank slope is slight, indicating a state of safety. This study integrates monitoring data with rheological mechanics to establish a physical-data-driven rheological safety assessment mechanism. It offers a scientifically robust and effective approach for the uncertainty evaluation of rheological parameters and deformation prediction, providing significant support for the safety assessment of the left bank slope of the Baihetan hydropower station, China. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
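Entry 156 benchmarks NUTS against the traditional random-walk Metropolis–Hastings update, which is compact enough to sketch. Below is a minimal M-H sampler on a generic one-dimensional log-posterior (a standard normal stand-in target), intended only to illustrate the baseline algorithm, not the authors' Burgers-model implementation:

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings on a 1-D log-posterior.

    Proposes x' = x + N(0, step^2) and accepts with probability
    min(1, exp(log_post(x') - log_post(x))).
    """
    rng = random.Random(seed)
    x = x0
    lp = log_post(x)
    samples = []
    for _ in range(n_samples):
        x_new = x + rng.gauss(0.0, step)
        lp_new = log_post(x_new)
        if math.log(rng.random()) < lp_new - lp:  # accept/reject step
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Stand-in target: standard normal, log p(x) = -x^2/2 up to a constant.
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
burned = draws[5000:]                       # discard burn-in
mean = sum(burned) / len(burned)
var = sum((d - mean) ** 2 for d in burned) / len(burned)
```

NUTS improves on this baseline by using gradient information to propose distant, low-correlation moves, which is the efficiency gap the abstract reports.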
157. Bayesian analysis for two-part latent variable model with application to fractional data.
- Author
-
Chen, Jinye, Zheng, Linyi, and Xia, Yemao
- Subjects
*MARKOV chain Monte Carlo, *BAYESIAN analysis, *REGRESSION analysis, *PARAMETER estimation, *ECONOMIC surveys - Abstract
Fractional data with a large proportion of values at the boundaries are very common in social and economic surveys. The existing literature usually separates the data into three parts and fits a three-part regression model to them. In this article, we develop an attractive two-part latent variable model for fractional data. The separated three parts are synthesized into two parts to characterize the association among the whole data. Moreover, latent variables are incorporated into the data analysis to interpret extra heterogeneity and item-dependence. We also include a structural equation to explore the interrelationships among the multiple factors. To downweight the influence of distributional deviations and/or outliers, we develop a semiparametric Bayesian analysis procedure. Parameter estimation and model assessment are obtained via a Markov chain Monte Carlo sampling method. A real example pertaining to cocaine use is presented to illustrate the proposed methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
158. Statistical inference of multi-state transition model for longitudinal data with measurement error and heterogeneity.
- Author
-
Qin, Jiajie and Guan, Jing
- Subjects
*MARKOV chain Monte Carlo, *RANDOM matrices, *EXPECTATION-maximization algorithms, *MEASUREMENT errors, *INFERENTIAL statistics, *SOCIAL medicine, *MATRIX effect, *COVARIANCE matrices - Abstract
Multi-state transition models are typically used to analyze longitudinal data in medicine and sociology. Moreover, variables in longitudinal studies are usually error-prone, and random effects are heterogeneous, which results in biased estimates of the parameters of interest. This article aims to estimate the parameters of the multi-state transition model for longitudinal data with measurement error and heterogeneous random effects, and further considers the case where the covariate related to the covariance matrix of the random effects is also error-prone, along with the error-prone covariate in the transition model. We model the covariance matrix of the random effects through the modified Cholesky decomposition and propose a pseudo-likelihood method based on the Monte Carlo expectation-maximization algorithm and a Bayesian method based on Markov chain Monte Carlo to infer and calculate the estimates. Meanwhile, we obtain the asymptotic properties and evaluate the finite-sample performance of the proposed method by simulation; it performs well in terms of bias, RMSE, and coverage rate of confidence intervals. In addition, we apply the proposed method to the MFUS data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
159. Bayesian inference of vorticity in unbounded flow from limited pressure measurements.
- Author
-
Eldredge, Jeff D. and Le Provost, Mathieu
- Subjects
GAUSSIAN mixture models, MACHINE theory, PRESSURE measurement, FLOW measurement, BAYESIAN field theory, MARKOV chain Monte Carlo - Abstract
We study the instantaneous inference of an unbounded planar flow from sparse noisy pressure measurements. The true flow field comprises one or more regularized point vortices of various strengths and sizes. We interpret the true flow's measurements with a vortex estimator, also consisting of regularized vortices, and attempt to infer the positions and strengths of this estimator assuming little prior knowledge. The problem often has several possible solutions, many due to a variety of symmetries. To deal with this ill-posedness and to quantify the uncertainty, we develop the vortex estimator in a Bayesian setting. We use Markov chain Monte Carlo and a Gaussian mixture model to sample and categorize the probable vortex states in the posterior distribution, tailoring the prior to avoid spurious solutions. Through experiments with one or more true vortices, we reveal many aspects of the vortex inference problem. With fewer sensors than states, the estimator infers a manifold of equally possible states. Using one more sensor than states ensures that no cases of rank deficiency arise. Uncertainty grows rapidly with distance when a vortex lies outside of the vicinity of the sensors. Vortex size cannot be reliably inferred, but the position and strength of a larger vortex can be estimated with a much smaller one. In estimates of multiple vortices, their individual signs are discernible because of the nonlinear coupling in the pressure. When the true vortex state is inferred from an estimator of fewer vortices, the estimate approximately aggregates the true vortices where possible. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
160. The VISCACHA Survey – XI. Benchmarking SIESTA: a new synthetic CMD fitting code.
- Author
-
Ferreira, Bernardo P L, Santos Jr., João F C, Dias, Bruno, Maia, Francisco F S, Kerber, Leandro O, Gardin, João Francisco, Oliveira, Raphael A P, Westera, Pieter, Rocha, João Pedro S, Souza, Stefano O, Hernandez-Jimenez, Jose A, Santrich, Orlando Katime, Villegas, Angeles Pérez, Garro, Elisa R, Baume, Gustavo L, Fernández-Trincado, José G, de Bórtoli, Bruno, Parisi, Maria Celeste, and Bica, Eduardo
- Subjects
*MARKOV chain Monte Carlo, *SMALL magellanic cloud, *STELLAR populations, *DISTRIBUTION of stars, *STATISTICAL matching - Abstract
We present a novel code, named SIESTA (Statistical matchIng between rEal and Synthetic sTellar popuLations), designed for performing statistical isochrone fitting to colour–magnitude diagrams (CMDs) of single stellar populations by leveraging comparisons between the observed stellar distribution and predictions from synthetic populations, simulated on top of a grid of isochrones. These synthetic populations encompass determinant factors such as the cluster's initial mass function (IMF), the presence of non-resolved binaries, as well as the expected photometric errors, and observational completeness (or the observed luminosity function). Employing Markov Chain Monte Carlo within a Bayesian framework, SIESTA allows for the determination of a cluster's age, metallicity, distance, colour excess, and binary fraction (with masses exceeding a certain ratio). In this study, we rigorously benchmark the SIESTA code utilizing synthetic populations and evaluate its performance against observations from the VISCACHA Survey in the Small Magellanic Cloud, focusing on five star clusters: Lindsay 114, NGC 152, Lindsay 91, Lindsay 113, and NGC 121. These clusters were chosen for their diverse age range, spanning from 0.04 to 10 Gyr. Our findings demonstrate the capability of the SIESTA code to accurately represent the observed CMDs of these clusters. Furthermore, we compare the results obtained with SIESTA to previous characterizations of these clusters, highlighting the consistency between the derived metallicity and spectroscopic determinations from various sources. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
161. Stationary distribution and density function analysis of stochastic SIQS epidemic model with Markov chain.
- Author
-
Cao, Yusi and Fu, Jing
- Subjects
*STOCHASTIC analysis, *PROBABILITY density function, *MARKOV processes, *FOKKER-Planck equation, *EPIDEMICS, *STOCHASTIC models, *MARKOV chain Monte Carlo - Abstract
In this paper, a stochastic SIQS epidemic model perturbed by both white and telephone noises is investigated. By constructing several suitable Lyapunov functions, we obtain sufficient conditions for the existence of ergodic stationary distribution of the positive solution. Moreover, by solving the Fokker–Planck equation, we obtain the exact expression of probability density function around the quasi-equilibrium of the stochastic model. In addition, sufficient conditions for the extinction are established. Finally, the results of this paper are further verified by numerical simulation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
162. Exploration of the MCMC Wald test with linear regression.
- Author
-
Woller, Michael P. and Enders, Craig K.
- Subjects
*MARKOV chain Monte Carlo, *FALSE positive error, *FREQUENTIST statistics, *STRUCTURAL equation modeling, *STATISTICAL hypothesis testing - Abstract
Recently, Asparouhov and Muthén (Structural Equation Modeling: A Multidisciplinary Journal, 28, 1–14, 2021a, 2021b) proposed a variant of the Wald test that uses Markov chain Monte Carlo machinery to generate a chi-square test statistic for frequentist inference. Because the test's composition does not rely on analytic expressions for sampling variation and covariation, it potentially provides a way to get honest significance tests in cases where the likelihood-based test statistic's assumptions break down (e.g., in small samples). The goal of this study is to use simulation to compare the new MCMC Wald test to its maximum likelihood counterparts with respect to both type I error rate and power. Our simulation examined the test statistics across different levels of sample size, effect size, and degrees of freedom (test complexity). An additional goal was to assess the robustness of the MCMC Wald test with nonnormal data. The simulation results uniformly demonstrated that the MCMC Wald test was superior to the maximum likelihood test statistic, especially with small samples (e.g., sample sizes less than 150) and complex models (e.g., models with five or more predictors). This conclusion held for nonnormal data as well. Lastly, we provide a brief application to a real data example. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
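The statistic in entry 162 is built from posterior draws rather than analytic standard errors. A minimal sketch of the generic construction: compute W = m'V⁻¹m from the sample mean m and sample covariance V of MCMC output for a two-parameter model, and refer W to a chi-square with 2 degrees of freedom. The synthetic draws below stand in for real MCMC output; this is not Asparouhov and Muthén's exact implementation:

```python
import random

def mcmc_wald(draws):
    """Wald-type statistic from MCMC draws of a 2-D parameter vector.

    W = m' V^{-1} m, where m is the posterior mean and V the posterior
    covariance; under H0: theta = 0, W is compared to a chi-square
    distribution with 2 degrees of freedom.
    """
    n = len(draws)
    m0 = sum(d[0] for d in draws) / n
    m1 = sum(d[1] for d in draws) / n
    v00 = sum((d[0] - m0) ** 2 for d in draws) / (n - 1)
    v11 = sum((d[1] - m1) ** 2 for d in draws) / (n - 1)
    v01 = sum((d[0] - m0) * (d[1] - m1) for d in draws) / (n - 1)
    det = v00 * v11 - v01 * v01
    # Quadratic form m' V^{-1} m, with V^{-1} written out for the 2x2 case.
    return (m0 * (v11 * m0 - v01 * m1) + m1 * (-v01 * m0 + v00 * m1)) / det

rng = random.Random(1)
# Fake "posterior draws": mean (1.0, 0.5), unit variances, independent,
# so W should land near 1.0^2 + 0.5^2 = 1.25.
draws = [(1.0 + rng.gauss(0, 1), 0.5 + rng.gauss(0, 1)) for _ in range(4000)]
W = mcmc_wald(draws)
```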
163. Nonresponse in name generators across countries and survey modes.
- Author
-
González, Ricardo, Muñoz, Esteban, and Fuentes, Adolfo
- Subjects
MARKOV chain Monte Carlo, NONRESPONSE (Statistics), SOCIAL isolation, INTERNET surveys, LOGISTIC regression analysis - Abstract
Past research indicates interviewer effects lead to an underestimation of network size and higher nonresponse to the "important matters" name generator. Self-administered surveys offer a potential solution, but evidence is mixed and context-specific. We employ a logistic multilevel regression, estimated using a Bayesian Markov Chain Monte Carlo approach, to analyze nonresponse to this name generator from 33 post-electoral surveys across 21 countries in the Comparative National Election Project. We find higher nonresponse in interviewer-administered surveys compared to self-administered surveys, particularly among specific demographic groups. Finally, we discuss the trade-offs in selecting survey modes for collecting ego-network data using this instrument. • Cross-country analysis of interview modes and nonresponse to the name generator. • Positive association between interviewer-administered surveys and nonresponse. • Consistency with prior studies on survey modes and nonresponse behavior. • Exploration of interplay between interviewer presence and respondent characteristics. • Caution advised in interpreting nonresponse as a sole indicator of social isolation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
164. Modelling scale effects in rating data: a Bayesian approach.
- Author
-
Iannario, Maria, Kateri, Maria, and Tarantola, Claudia
- Subjects
MONTE Carlo method, PROBABILITY measures, BAYESIAN analysis, HETEROSCEDASTICITY, DATA analysis, MARKOV chain Monte Carlo - Abstract
We present a Bayesian approach for the analysis of rating data when a scaling component is taken into account, thus incorporating a specific form of heteroskedasticity. Model-based probability effect measures for comparing distributions of several groups, adjusted for explanatory variables affecting both location and scale components, are proposed. Markov Chain Monte Carlo techniques are implemented to obtain parameter estimates of the fitted model and the associated effect measures. An analysis on students' evaluation of a university curriculum counselling service is carried out to assess the performance of the method and demonstrate its valuable support for the decision-making process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
165. Inference of stress-strength reliability based on adaptive progressive type-II censoring from the Chen distribution with application to carbon fiber data
- Author
-
Essam A. Ahmed and Laila A. Al-Essa
- Subjects
chen distribution, stress-strength reliability, maximum likelihood estimator, delta method, bootstrap, bayes estimator, markov chain monte carlo, adaptive progressive censored, Mathematics, QA1-939 - Abstract
In this paper, we used maximum likelihood estimation (MLE) and Bayes methods to perform estimation procedures for the stress-strength reliability R = P(Y < X) based on independent adaptive progressive censored samples taken from the Chen distribution. An approximate confidence interval for R was constructed using a variety of classical techniques, such as the normal approximation of the MLE, the normal approximation of the log-transformed MLE, and the percentile bootstrap (Boot-p) procedure. Additionally, asymptotic distribution theory and the delta approach were used to generate the approximate confidence interval. Further, the Bayesian estimation of R was obtained based on the balanced loss function, which came in two versions here: the symmetric balanced squared error (BSE) loss function and the asymmetric balanced linear exponential (BLINEX) loss function. When estimating R using the Bayesian approach, all the unknown parameters of the Chen distribution were assumed to be independently distributed and to have informative gamma priors. Additionally, a mixture of the Gibbs sampling algorithm and the Metropolis-Hastings algorithm was used to compute the Bayes estimate of R and the associated highest posterior density credible interval. Finally, a simulation study was used to assess the overall performance of the proposed estimators, and a real dataset was provided to exemplify the theoretical results.
- Published
- 2024
- Full Text
- View/download PDF
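Entry 165 estimates R = P(Y < X) for Chen-distributed stress and strength. Assuming the Chen CDF F(x) = 1 - exp{lam * (1 - exp(x^beta))}, a plain Monte Carlo check of R is straightforward via inverse-CDF sampling; with a common shape beta, R has the closed form lam_y / (lam_x + lam_y), which the sketch verifies. This is an illustration of the quantity being estimated, not the paper's MLE or Bayes procedure:

```python
import math
import random

def rchen(beta, lam, rng):
    """Inverse-CDF draw from the Chen distribution,
    F(x) = 1 - exp(lam * (1 - exp(x**beta)))."""
    u = rng.random()
    return (math.log(1.0 - math.log(1.0 - u) / lam)) ** (1.0 / beta)

def reliability_mc(beta, lam_x, lam_y, n=50000, seed=2):
    """Monte Carlo estimate of R = P(Y < X) for independent
    X ~ Chen(beta, lam_x) and Y ~ Chen(beta, lam_y)."""
    rng = random.Random(seed)
    hits = sum(rchen(beta, lam_y, rng) < rchen(beta, lam_x, rng)
               for _ in range(n))
    return hits / n

# With a common shape beta, the transform exp(x**beta) - 1 maps each margin
# to an exponential, so R = lam_y / (lam_x + lam_y) exactly.
R_hat = reliability_mc(beta=0.7, lam_x=1.0, lam_y=3.0)
R_exact = 3.0 / 4.0
```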
166. Searching for magnetar binaries disrupted by core-collapse supernovae.
- Author
-
Sherman, Myles B, Ravi, Vikram, El-Badry, Kareem, Sharma, Kritti, Ocker, Stella Koch, Kosogorov, Nikita, Connor, Liam, and Faber, Jakob T
- Subjects
*MAGNETARS, *MARKOV chain Monte Carlo, *SUPERNOVAE, *STELLAR populations, *MONTE Carlo method, *SUPERNOVA remnants - Abstract
Core-collapse supernovae (CCSNe) are considered the primary magnetar formation channel, with 15 magnetars associated with supernova remnants (SNRs). A large fraction of these should occur in massive stellar binaries that are disrupted by the explosion, meaning that ~45 per cent of magnetars should be nearby high-velocity stars. Here, we conduct a multiwavelength search for unbound stars, magnetar binaries, and SNR shells using public optical (uvgrizy bands), infrared (J, H, K, and Ks bands), and radio (888 MHz, 1.4 GHz, and 3 GHz) catalogues. We use Monte Carlo analyses of candidates to estimate the probability of association with a given magnetar based on their proximity, distance, proper motion, and magnitude. In addition to recovering a proposed magnetar binary, a proposed unbound binary, and 13 of 15 magnetar SNRs, we identify two new candidate unbound systems: an OB star from the Gaia catalogue we associate with SGR J1822.3−1606, and an X-ray pulsar we associate with 3XMM J185246.6+003317. Using a Markov chain Monte Carlo simulation that assumes all magnetars descend from CCSNe, we constrain the fraction of magnetars with unbound companions to 5 ≲ f_u ≲ 24 per cent, which disagrees with neutron star population synthesis results. Alternate formation channels are unlikely to wholly account for the lack of unbound binaries, as this would require 31 ≲ f_nc ≲ 66 per cent of magnetars to descend from such channels. Our results support a high fraction (48 ≲ f_m ≲ 86 per cent) of pre-CCSN mergers, which can amplify fossil magnetic fields to preferentially form magnetars. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
167. SMART: spectral energy distributions Markov chain analysis with radiative transfer models.
- Author
-
Varnava, Charalambia and Efstathiou, Andreas
- Subjects
*SPECTRAL energy distribution, *RADIATIVE transfer, *MARKOV processes, *MARKOV chain Monte Carlo, *ACTIVE galactic nuclei - Abstract
In this paper we present the publicly available open-source spectral energy distribution (SED) fitting code SMART (Spectral energy distributions Markov chain Analysis with Radiative Transfer models). Implementing a Bayesian Markov chain Monte Carlo (MCMC) method, SMART fits the ultraviolet to millimetre SEDs of galaxies exclusively with radiative transfer models that currently constitute four types of pre-computed libraries, which describe the starburst, active galactic nucleus (AGN) torus, host galaxy, and polar dust components. An important novelty of SMART is that, although it fits SEDs exclusively with radiative transfer models, it takes comparable time to popular energy balance methods to run. Here we describe the key features of SMART and test it by fitting the multiwavelength SEDs of the 42 local ultraluminous infrared galaxies (ULIRGs) that constitute the HERschel Ultraluminous Infrared Galaxy Survey (HERUS) sample. The Spitzer spectroscopy data of the HERUS ULIRGs are included in the fitting at a spectral resolution, which is matched to that of the radiative transfer models. We also present other results that highlight the performance and versatility of SMART. SMART promises to be a useful tool for studying galaxy evolution in the JWST era. SMART is developed in Python and is available at https://github.com/ch-var/SMART.git. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
168. Maximal entropy prior for the simple step‐stress accelerated test.
- Author
-
Moala, Fernando Antonio and Chagas, Karlla Delalibera
- Subjects
*BAYES' estimation, *ACCELERATED life testing, *MONTE Carlo method, *MARKOV chain Monte Carlo, *MARGINAL distributions, *ENTROPY - Abstract
The step-stress procedure is a popular accelerated test used to analyze the lifetime of highly reliable components. This paper considers a simple step-stress accelerated test assuming a cumulative exposure model with uncensored lifetime data following a Weibull distribution. The maximum likelihood approach is often used to analyze accelerated stress test data. Another approach is Bayesian inference, which is useful when there is limited data available. In this paper, the parameters of the model are estimated from the objective Bayesian viewpoint using non-informative priors. Our main aim is to propose the maximal data information prior (MDIP) presented by Zellner (1984) as an alternative to the conventional independent gamma priors for the unknown parameters, in situations where there is little or no a priori knowledge about the parameters. We also obtain the Bayes estimators based on both classes of priors, assuming three different loss functions: the squared error loss function (SELF), the linear-exponential loss function (LINEX), and the generalized entropy loss function (GELF). The proposed MDIP prior is compared with the gamma priors via Monte Carlo simulations by examining their biases and mean square errors under the three loss functions, as well as coverage probability. Additionally, we employ a Markov chain Monte Carlo (MCMC) algorithm to extract characteristics of the marginal posterior distributions, such as the Bayes estimator and credible intervals. Finally, a real lifetime dataset is presented to illustrate the proposed methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
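The loss functions in entry 168 each turn the same posterior draws into a different Bayes estimator: SELF yields the posterior mean, while LINEX yields -(1/a) * log E[exp(-a*theta)]. A small sketch using synthetic normal "posterior" draws, for which the LINEX estimator has the known closed form mean - a*sigma^2/2; this is illustrative only, not the paper's MDIP analysis:

```python
import math
import random

def bayes_self(draws):
    """Bayes estimator under squared-error loss (SELF): the posterior mean."""
    return sum(draws) / len(draws)

def bayes_linex(draws, a):
    """Bayes estimator under LINEX loss L(d, t) = exp(a(d-t)) - a(d-t) - 1:
    d* = -(1/a) * log E[exp(-a * theta)], estimated from draws."""
    n = len(draws)
    return -math.log(sum(math.exp(-a * t) for t in draws) / n) / a

rng = random.Random(3)
# Stand-in posterior: N(2, 0.5^2). For a normal posterior the LINEX
# estimator equals mean - a * variance / 2 = 2 - 1.0 * 0.25 / 2 = 1.875.
draws = [2.0 + 0.5 * rng.gauss(0, 1) for _ in range(20000)]
est_self = bayes_self(draws)
est_linex = bayes_linex(draws, a=1.0)
```

The asymmetry of LINEX shows up directly: with a > 0 overestimation is penalized more heavily, so the estimator sits below the posterior mean.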
169. Uncertainty of microseismic sources identification and probabilistic location in underground excavation.
- Author
-
Liang, Xu
- Subjects
MARKOV chain Monte Carlo, ROCK excavation, BENCHMARK problems (Computer science), CROP allocation - Abstract
Microseismic (MS) source location is an integral component of MS technology and essential to understanding the rock failure mechanism and avoiding potential geological hazards in underground rock excavation. However, accurate location remains challenging owing to complex geological conditions and unknown rock failure mechanisms. In this study, a novel location framework was developed to locate MS source positions and their uncertainties based on probabilistic programming. Probabilistic programming was utilized to determine the coordinates of the MS source and their variation using the Markov Chain Monte Carlo (MCMC) method based on the waveform equation. A classical benchmark problem was utilized to verify and illustrate the developed framework. The developed framework can not only locate the position of the MS source but also determine its variation due to the uncertainty introduced during monitoring and excavation. The located MS sources are in agreement with the actual positions. The results show that the developed framework is a scientific, accurate, reasonable, and promising tool for the location of MS sources. The developed framework was then applied to locate the position of a blast in a practical mine. This further proved that the developed framework can locate MS sources, providing an excellent uncertainty analysis tool for underground rock excavation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
170. Predicting nodal metastasis progression of oral tongue cancer using a hidden Markov model in MRI.
- Author
-
Qiangqiang Gang, Jie Feng, Hans-Ulrich Kauczor, and Ke Zhang
- Subjects
TONGUE cancer, HIDDEN Markov models, MARKOV chain Monte Carlo, ORAL cancer, LYMPHATIC metastasis, METASTASIS - Abstract
Objectives: The presence of occult nodal metastases in patients with oral tongue squamous cell carcinomas (OTSCCs) has implications for treatment. More than 30% of patients will have occult nodal metastases, yet a considerable number of patients undergo unnecessary invasive neck dissection to confirm nodal status. In this work, we propose a probabilistic model for lymphatic metastatic spread that can quantify the risk of microscopic involvement at the lymph node level (LNL) given the location of macroscopic metastases and the tumor stage, using MRI. Materials and methods: A total of 108 patients with OTSCCs were included in the study. A hidden Markov model (HMM) was used to compute the probabilities of transitions between states over time based on MRI. Learning of the transition probabilities was performed via Markov chain Monte Carlo sampling and was based on a dataset of OTSCC patients for whom involvement of individual LNLs was reported. Results: Our model found that the most common involvement was that of level I and level II, corresponding to high probabilities of 0.39 ± 0.05 and 0.53 ± 0.09, respectively. When lymph node level I had metastasis, the probability of metastasis in level II was high (93.79%); when level II had metastasis, the probability of metastasis in level III was small (7.88%). Lymph nodes progress faster in the early stage and slower in the late stage. Conclusion: An HMM can produce an algorithm that is able to predict nodal metastasis evolution in patients with OTSCCs by analyzing the macroscopic metastases observed in the upstream levels and the tumor category. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
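The hidden Markov machinery in entry 170 filters an unobserved involvement state from noisy observations over time. A minimal two-state forward-algorithm sketch, with purely illustrative numbers rather than the paper's fitted transition probabilities:

```python
def forward(pi, A, B, obs):
    """HMM forward algorithm: filtered P(hidden state | observations so far).

    pi: initial state distribution; A[i][j]: transition probabilities;
    B[i][k]: emission probabilities; obs: list of observation indices.
    Returns the normalized state distribution after the last observation.
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    z = sum(alpha)
    return [a / z for a in alpha]

# Toy two-state chain (0 = not involved, 1 = involved); illustrative numbers.
pi = [0.9, 0.1]
A = [[0.8, 0.2], [0.0, 1.0]]    # involvement is absorbing
B = [[0.85, 0.15], [0.3, 0.7]]  # P(imaging finding | state)
posterior = forward(pi, A, B, obs=[1, 1])  # two consecutive positive findings
```

Repeated positive findings push the filtered probability of involvement up, which is the mechanism the abstract's conditional probabilities summarize.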
171. Treed Gaussian processes for animal movement modeling.
- Author
-
Rieber, Camille J., Hefley, Trevor J., and Haukos, David A.
- Subjects
*ANIMAL mechanics, *GAUSSIAN processes, *MARKOV chain Monte Carlo, *ANIMAL radio tracking, *ANIMAL ecology, *PRAIRIES, *STATISTICAL learning - Abstract
Wildlife telemetry data may be used to answer a diverse range of questions relevant to wildlife ecology and management. One challenge to modeling telemetry data is that animal movement often varies greatly in pattern over time, and current continuous‐time modeling approaches to handle such nonstationarity require bespoke and often complex models that may pose barriers to practitioner implementation. We demonstrate a novel application of treed Gaussian process (TGP) modeling, a Bayesian machine learning approach that automatically captures the nonstationarity and abrupt transitions present in animal movement. The machine learning formulation of TGPs enables modeling to be nearly automated, while their Bayesian formulation allows for the derivation of movement descriptors with associated uncertainty measures. We demonstrate the use of an existing R package to implement TGPs using the familiar Markov chain Monte Carlo algorithm. We then use estimated movement trajectories to derive movement descriptors that can be compared across individuals and populations. We applied the TGP model to a case study of lesser prairie‐chickens (Tympanuchus pallidicinctus) to demonstrate the benefits of TGP modeling and compared distance traveled and residence times across lesser prairie‐chicken individuals and populations. For broad usability, we outline all steps necessary for practitioners to specify relevant movement descriptors (e.g., turn angles, speed, contact points) and apply TGP modeling and trajectory comparison to their own telemetry datasets. Combining the predictive power of machine learning and the statistical inference of Bayesian methods to model movement trajectories allows for the estimation of statistically comparable movement descriptors from telemetry studies. Our use of an accessible R package allows practitioners to model trajectories and estimate movement descriptors, facilitating the use of telemetry data to answer applied management questions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
172. Advanced Copula-Based Models for Type II Censored Data: Applications in Industrial and Medical Settings.
- Author
-
Almetwally, Ehab M., Fayomi, Aisha, and Qura, Maha E.
- Subjects
*MAXIMUM likelihood statistics, *RANDOM variables, *COPULA functions, *CENSORING (Statistics), *INFERENTIAL statistics, *DIABETIC nephropathies, *BIVARIATE analysis - Abstract
Copula models are increasingly recognized for their ability to capture complex dependencies among random variables. In this study, we introduce three innovative bivariate models utilizing copula functions: the XLindley (XL) distribution with Frank, Gumbel, and Clayton copulas. The results highlight the fundamental characteristics and effectiveness of these newly introduced bivariate models. Statistical inference for the distribution parameters is conducted using a Type II censored sampling design. This employs maximum likelihood and Bayesian estimation techniques. Asymptotic and credible confidence intervals are calculated, and numerical analysis is performed using the Markov Chain Monte Carlo method. The proposed methodology's applicability is illustrated by analyzing several real-world datasets. The initial dataset examines burr formation occurrences and consists of two observation sets. Additionally, the second and third datasets contain medical information. The second dataset focuses on diabetic nephropathy, while the third dataset explores infection and recurrence time among kidney patients. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
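As a hedged illustration of the copula machinery the abstract relies on (a generic Clayton copula with uniform margins, not the authors' XLindley-margin models), dependent pairs can be simulated by conditional inversion:

```python
import random

def clayton_pair(theta, rng):
    """One (u, v) draw from a Clayton copula via conditional inversion:
    v = (u^-theta * (w^(-theta/(1+theta)) - 1) + 1)^(-1/theta)."""
    u, w = rng.random(), rng.random()
    v = (u ** -theta * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

rng = random.Random(0)
pairs = [clayton_pair(theta=3.0, rng=rng) for _ in range(20000)]
# Lower-tail dependence shows up as P(U < 0.5, V < 0.5) well above the
# independence value 0.25; for theta = 3 the true value is 15^(-1/3), about 0.41.
p_joint = sum(1 for u, v in pairs if u < 0.5 and v < 0.5) / len(pairs)
```

Real applications would transform u and v through the inverse CDFs of the chosen margins; the copula itself only encodes the dependence.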
173. Order selection for heterogeneous semiparametric hidden Markov models.
- Author
-
Zou, Yudan, Song, Xinyuan, and Zhao, Qian
- Subjects
- *
MARKOV processes , *MARKOV chain Monte Carlo , *PANEL analysis , *ALZHEIMER'S disease , *PARAMETER estimation - Abstract
Hidden Markov models (HMMs), which can characterize dynamic heterogeneity, are valuable tools for analyzing longitudinal data. The order of an HMM (i.e., the number of hidden states) is typically assumed to be known or predetermined by some model selection criterion in conventional analysis. Because prior information about the order is frequently lacking, pairwise comparisons under criterion-based methods become computationally expensive as the model space grows. A few studies have conducted order selection and parameter estimation simultaneously, but they only considered homogeneous parametric instances. This study proposes a Bayesian double penalization (BDP) procedure for simultaneous order selection and parameter estimation of heterogeneous semiparametric HMMs. To overcome the difficulties in updating the order, we develop a new Markov chain Monte Carlo algorithm coupled with an effective adjust-bound reversible jump strategy. Simulation results reveal that the proposed BDP procedure performs well in estimation and works noticeably better than conventional criterion-based approaches. Application of the suggested method to the Alzheimer's Disease Neuroimaging Initiative study further supports its usefulness. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
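For context on the criterion-based alternatives the abstract discusses: evaluating any fixed-order HMM requires its likelihood, which the standard scaled forward algorithm computes. A minimal sketch for a generic discrete HMM (not the authors' semiparametric model; all parameter values are hypothetical):

```python
import math

def hmm_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log p(obs) for a discrete HMM with initial
    probabilities pi, transition matrix A, and emission matrix B."""
    K = len(pi)
    alpha = [pi[k] * B[k][obs[0]] for k in range(K)]
    loglik = 0.0
    for t in range(1, len(obs)):
        scale = sum(alpha)          # accumulate log-likelihood via the scales
        loglik += math.log(scale)
        alpha = [a / scale for a in alpha]
        alpha = [sum(alpha[j] * A[j][k] for j in range(K)) * B[k][obs[t]]
                 for k in range(K)]
    return loglik + math.log(sum(alpha))

# Two hidden states, binary observations.
ll = hmm_loglik([0, 1, 1, 0],
                pi=[0.5, 0.5],
                A=[[0.9, 0.1], [0.2, 0.8]],
                B=[[0.7, 0.3], [0.1, 0.9]])
```

Criterion-based order selection would evaluate such likelihoods across candidate orders, which is exactly the repeated cost the BDP procedure avoids.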
174. Generating Independent Replicates Directly from the Posterior Distribution for a Class of Spatial Hierarchical Models.
- Author
-
Bradley, Jonathan R. and Clinch, Madelyn
- Abstract
Markov chain Monte Carlo (MCMC) allows one to generate dependent replicates from a posterior distribution for effectively any Bayesian hierarchical model. However, MCMC can produce a significant computational burden. This motivates us to consider finding expressions of the posterior distribution that are computationally straightforward to obtain independent replicates from directly. We focus on a broad class of Bayesian hierarchical models for spatially dependent data, which are often modeled via a latent Gaussian process (LGP). First, we derive a new class of distributions referred to as the generalized conjugate multivariate (GCM) distribution. The GCM distribution’s theoretical development follows that of the conjugate multivariate (CM) distribution with two main differences: the GCM allows for latent Gaussian process assumptions, and the GCM explicitly accounts for hyperparameters through marginalization. The development of GCM is needed to obtain independent replicates directly from the exact posterior distribution, which has an efficient regression form. Hence, we refer to our method as Exact Posterior Regression (EPR). Simulation studies with weakly stationary spatial processes and spatial basis function expansions are provided. We provide an analysis of poverty incidence from the U.S. Census Bureau, and an analysis of high-dimensional remote sensing data. Supplementary materials for this article are available online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
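The idea of drawing independent replicates directly from an exact posterior, rather than running MCMC, is easiest to see in a simple conjugate case. A toy sketch (a normal mean with known variance, far simpler than the GCM/EPR setting; all numbers are hypothetical):

```python
import random
import statistics

def posterior_draws(data, sigma2, m0, s02, n_draws=10000, seed=42):
    """Independent draws from the exact posterior of a normal mean
    (known variance sigma2, conjugate N(m0, s02) prior)."""
    n, ybar = len(data), sum(data) / len(data)
    post_var = 1.0 / (n / sigma2 + 1.0 / s02)
    post_mean = post_var * (n * ybar / sigma2 + m0 / s02)
    rng = random.Random(seed)
    return [rng.gauss(post_mean, post_var ** 0.5) for _ in range(n_draws)], post_mean

draws, post_mean = posterior_draws([2.9, 3.1, 3.0, 3.2, 2.8],
                                   sigma2=1.0, m0=0.0, s02=100.0)
# No burn-in and no convergence diagnostics: every draw is an exact,
# independent posterior replicate.
```

EPR pursues the same goal for far richer spatial hierarchical models, where such closed forms are normally unavailable.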
175. Impact assessment of self-medication on COVID-19 prevalence in Gauteng, South Africa, using an age-structured disease transmission modelling framework.
- Author
-
Avusuglo, Wisdom S., Han, Qing, Woldegerima, Woldegebriel Assefa, Bragazzi, Nicola, Asgary, Ali, Ahmadi, Ali, Orbinski, James, Wu, Jianhong, Mellado, Bruce, and Kong, Jude Dzevela
- Subjects
- *
INFECTIOUS disease transmission , *SELF medication , *MARKOV chain Monte Carlo , *BASIC reproduction number , *MEDICAL model - Abstract
Objective: To assess the impact of self-medication on the transmission dynamics of COVID-19 across different age groups, examine the interplay of vaccination and self-medication in disease spread, and identify the age group most prone to self-medication. Methods: We developed an age-structured compartmentalized epidemiological model to track the early dynamics of COVID-19. Age-structured data from the Government of Gauteng, encompassing the reported cumulative number of cases and daily confirmed cases, were used to calibrate the model through a Markov Chain Monte Carlo (MCMC) framework. Subsequently, uncertainty and sensitivity analyses were conducted on the model parameters. Results: We found that self-medication is predominant among the age group 15-64 (74.52%), followed by the age group 0-14 (34.02%), and then the age group 65+ (11.41%). The mean values of the basic reproduction number, the size of the first epidemic peak (the highest magnitude of the disease), and the time of the first epidemic peak (when the first highest magnitude occurs) are 4.16499, 241,715 cases, and 190.376 days, respectively. Moreover, we observed that self-medication among individuals aged 15-64 results in the highest spreading rate of COVID-19 at the onset of the outbreak and has the greatest impact on the first epidemic peak and its timing. Conclusion: Studies aiming to understand the dynamics of diseases in areas prone to self-medication should account for this practice. There is a need for a campaign against COVID-19-related self-medication, specifically targeting the active population (ages 15-64). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
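For readers unfamiliar with compartmental models: the calibrated model above is age-structured, but the basic mechanics can be sketched with a drastically simplified, non-age-structured discrete-time SIR model (illustrative only; the parameter values below are hypothetical, not the fitted ones):

```python
def sir_peak(beta, gamma, s0, i0, days):
    """Discrete-time SIR in population proportions; returns the epidemic
    peak size and the day on which it occurs."""
    s, i, r = s0, i0, 1.0 - s0 - i0
    peak, peak_day = i, 0
    for day in range(1, days + 1):
        new_inf = beta * s * i      # new infections this step
        new_rec = gamma * i         # new recoveries this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        if i > peak:
            peak, peak_day = i, day
    return peak, peak_day

# Hypothetical rates giving R0 = beta / gamma = 4.
peak, peak_day = sir_peak(beta=0.4, gamma=0.1, s0=0.999, i0=0.001, days=365)
```

Model calibration of the kind described above amounts to wrapping such a forward simulation in an MCMC loop that compares its output to reported case counts.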
176. Kriging-based surrogate data-enriching artificial neural network prediction of strength and permeability of permeable cement-stabilized base.
- Author
-
Wang, Xiaoming, Xiao, Yuanjie, Li, Wenqi, Wang, Meng, Zhou, Yanbin, Chen, Yuliang, and Li, Zhiyong
- Subjects
ARTIFICIAL neural networks ,KRIGING ,MARKOV chain Monte Carlo ,MONTE Carlo method ,DISTRIBUTION (Probability theory) ,PERMEABILITY - Abstract
Limited test data hinder the accurate prediction of the mechanical strength and permeability of permeable cement-stabilized base materials (PCBM). Here we show a kriging-based surrogate model-assisted artificial neural network (KS-ANN) framework that integrates laboratory testing, mathematical modeling, and machine learning. A statistical distribution model was established from limited test data to enrich the dataset through the combination of Markov chain Monte Carlo simulation and kriging-based surrogate modeling. Subsequently, an artificial neural network (ANN) model was trained using the enriched dataset. The results demonstrate that the well-trained KS-ANN model effectively captures the actual data distribution characteristics. The accurate prediction of the mechanical strength and permeability of PCBM under the constraint of limited data validates the effectiveness of the proposed framework. Compared to traditional ANN models, the KS-ANN model improves the prediction accuracy of PCBM's mechanical strength by 21%. Based on this accurate prediction, an optimization function was developed to determine the optimal cement content and compaction force range of PCBM, enabling it to concurrently satisfy the requirements of mechanical strength and permeability. This study provides a cost-effective and rapid solution for evaluating the performance and optimizing the design of PCBM and similar materials. Limited data hinder accurate predictions of the strength and permeability of permeable cement-stabilized base materials. Here, the authors propose a kriging-based surrogate model-assisted neural network, which improves accuracy by 21% over traditional models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
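The enrichment step can be caricatured in a few lines: fit a distribution to the limited measurements and draw synthetic observations from it. This sketch uses a plain normal fit in place of the paper's MCMC-plus-kriging machinery, with hypothetical strength values:

```python
import random
import statistics

def enrich(samples, n_new, seed=7):
    """Draw synthetic observations from a normal distribution fitted to a
    small sample (a crude stand-in for the paper's MCMC + kriging enrichment)."""
    mu, sd = statistics.mean(samples), statistics.stdev(samples)
    rng = random.Random(seed)
    return [rng.gauss(mu, sd) for _ in range(n_new)]

measured = [4.1, 3.8, 4.4, 4.0, 4.3, 3.9]   # hypothetical strength tests
synthetic = enrich(measured, n_new=5000)     # enriched training set for an ANN
```

The point of the kriging surrogate in the paper is precisely that a single fitted marginal like this ignores the input-output structure; the sketch only shows the enrich-then-train idea.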
177. SNP-slice resolves mixed infections: simultaneously unveiling strain haplotypes and linking them to hosts.
- Author
-
Ju, Nianqiao, Liu, Jiawei, and He, Qixin
- Subjects
- *
MARKOV chain Monte Carlo , *MIXED infections , *MALARIA , *HAPLOTYPES - Abstract
Motivation: Multi-strain infection is a common yet under-investigated phenomenon of many pathogens. Currently, biologists analyzing SNP information sometimes have to discard mixed infection samples as many downstream analyses require monogenomic inputs. Such a protocol impedes our understanding of the underlying genetic diversity, co-infection patterns, and genomic relatedness of pathogens. A scalable tool to learn and resolve the SNP-haplotypes from polygenomic data is an urgent need in molecular epidemiology. Results: We develop a slice sampling Markov chain Monte Carlo algorithm, named SNP-Slice, to learn not only the SNP-haplotypes of all strains in the populations but also which strains infect which hosts. Our method reconstructs SNP-haplotypes and individual heterozygosities accurately without reference panels and outperforms the state-of-the-art methods at estimating the multiplicity of infections and allele frequencies. Thus, SNP-Slice introduces a novel approach to address polygenomic data and opens a new avenue for resolving complex infection patterns in molecular surveillance. We illustrate the performance of SNP-Slice on empirical malaria and HIV datasets and provide recommendations for using our method on empirical datasets. Availability and Implementation: The implementation of the SNP-Slice algorithm, as well as scripts to analyze SNP-Slice outputs, are available at https://github.com/nianqiaoju/snp-slice. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
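SNP-Slice is built on slice sampling. A univariate stepping-out slice sampler (Neal's generic scheme, unrelated to the package's own implementation) looks like this:

```python
import math
import random

def slice_sample(logf, x0, n_iter=4000, w=1.0, seed=3):
    """Univariate slice sampler with stepping-out and shrinkage (Neal 2003)."""
    rng = random.Random(seed)
    x, out = x0, []
    for _ in range(n_iter):
        logy = logf(x) + math.log(rng.random())   # auxiliary height under f
        left = x - w * rng.random()               # randomly placed bracket
        right = left + w
        while logf(left) > logy:                  # step out until outside slice
            left -= w
        while logf(right) > logy:
            right += w
        while True:                               # shrink toward x on rejection
            cand = rng.uniform(left, right)
            if logf(cand) > logy:
                x = cand
                break
            if cand < x:
                left = cand
            else:
                right = cand
        out.append(x)
    return out

draws = slice_sample(lambda x: -0.5 * (x - 2.0) ** 2, x0=0.0)  # N(2, 1) target
```

Slice sampling needs no step-size tuning beyond the initial width w, which is one reason it suits automated pipelines like the one described above.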
178. Epidemiology and evolution of dengue outbreaks in Bangladesh (2020-2023).
- Author
-
Hasan, Mehedi and Islam, Suprova
- Subjects
- *
MARKOV chain Monte Carlo , *DENGUE viruses , *PROTEIN microarrays , *TRANSMEMBRANE domains , *AMINO acid sequence , *DENGUE hemorrhagic fever - Abstract
This article provides an overview of the epidemiology and evolution of dengue outbreaks in Bangladesh from 2020 to 2023. The study found that dengue fever cases peaked earlier than usual in 2023, resulting in the highest annual death toll ever. A novel genotype of the dengue virus was identified, and the virus strains were found to be rapidly evolving. The study suggests that this information can help in developing strategies to control future outbreaks. In 2023, there were a total of 1,705 deaths recorded, with the highest number of deaths occurring in August, September, and October. The Dhaka division had the most severe outbreak, and the majority of cases were male. Amino acid substitutions were observed in the virus, indicating the emergence of a new genotype. The study also found that the evolutionary rate of the virus was higher for one strain compared to another. The incidence rate of dengue has been increasing globally, and Bangladesh experienced a severe outbreak in 2023. The outbreak may be attributed to climate change and increased rainfall. Dengue surpassed COVID-19 as the leading cause of mortality in Bangladesh during this period. The majority of cases occurred in young adults, and there was a male predominance. The study highlights the importance of continuously monitoring the virus and its variants to prepare for future outbreaks. The article acknowledges the limitations of the study and declares no conflict of interest. The data supporting the findings are available upon request. [Extracted from the article]
- Published
- 2024
- Full Text
- View/download PDF
179. Phylogenetic placement of bizarre karschiellid earwigs.
- Author
-
Kočárek, Petr, Horká, Ivona, Bonczek, Vojtěch, and Kirstová, Markéta
- Subjects
- *
EARWIGS , *MARKOV chain Monte Carlo - Abstract
This article discusses the phylogenetic placement of the Karschiellidae family of earwigs, which has been a subject of debate due to a lack of DNA-grade material for analysis. The study successfully sequenced representatives of the Karschiellidae family for the first time and proposed a probable evolutionary scheme and natural classification based on a constructed multigene phylogeny. The study found that the Karschiellidae family is nested within the Protodermaptera infraorder in the Pygidicranidae family. The article provides detailed information on the methodology and results of the study, including phylogenetic trees and morphological support for the inclusion of Karschiellidae in Pygidicranidae. [Extracted from the article]
- Published
- 2024
- Full Text
- View/download PDF
180. Dynamic probability control limits for the adaptive multivariate EWMA chart.
- Author
-
Haq, Abdul
- Subjects
- *
MARKOV chain Monte Carlo , *DISTRIBUTION (Probability theory) , *GEOMETRIC distribution , *ADAPTIVE control systems , *MONTE Carlo method , *QUALITY control charts - Abstract
Effective monitoring with control charts requires the establishment of appropriate control limits. Various methods have been proposed to determine fixed control limits (FCLs) based on a specified in-control average run-length value, such as Markov chains, integral equations, and Monte Carlo simulations. With FCLs, the conditional false alarm rate (CFAR) of a control chart varies over time in an unexpected and undesirable way, where the CFAR refers to the probability of a false alarm at a particular time given no previous false alarm. To address this issue, dynamic probability control limits (DPCLs) can be employed to keep the CFAR constant over time. In this study, we determine the DPCLs using fixed and time-varying sample sizes for the adaptive multivariate EWMA chart when the underlying process is assumed to follow a multivariate normal distribution. The DPCLs are designed to adjust automatically to changes in the probability distribution of the sample size while maintaining a consistent CFAR. This results in an in-control run-length performance that closely matches that of the geometric distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
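The EWMA recursion underlying such charts, here with asymptotic fixed control limits rather than the DPCLs proposed above, can be sketched as follows (univariate and illustrative only; the paper's chart is multivariate and adaptive):

```python
import random

def ewma_signal_time(data, lam=0.2, L=3.0, sigma=1.0):
    """First index where the EWMA statistic z_t = lam*x_t + (1-lam)*z_{t-1}
    leaves its asymptotic fixed control limits; None if it never signals."""
    limit = L * sigma * (lam / (2.0 - lam)) ** 0.5
    z = 0.0
    for t, x in enumerate(data):
        z = lam * x + (1.0 - lam) * z
        if abs(z) > limit:
            return t
    return None

rng = random.Random(11)
in_control = [rng.gauss(0.0, 1.0) for _ in range(100)]
shifted = in_control + [rng.gauss(1.5, 1.0) for _ in range(50)]  # shift at t = 100
t_signal = ewma_signal_time(shifted)
```

With fixed limits like these, the CFAR drifts over the chart's early life; replacing `limit` with a time-indexed sequence is the DPCL idea the abstract develops.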
181. Modeling of Resident Space Object Light Curves with Blender Software.
- Author
-
Kudak, Viktor, Perig, Vasyl, Dzhumelia, Viktor, and Kryoka, Oleksandr
- Subjects
- *
LIGHT curves , *MARKOV chain Monte Carlo , *ARTISTS , *MARKOV processes , *ARTIFICIAL satellites - Abstract
Modeling the behavior and shape of space objects is widely used in modern astrophysical research, for example to determine the shape and model the physical parameters of variable stars and asteroids. Based on the database of photometric observations of resident space objects (RSO) available in the Laboratory of Space Research of Uzhhorod National University, we therefore sought a means of modeling light curves to confirm the shape of objects and determine the parameters of their rotation, by analogy with deep-space objects. We attempted to use Blender software to model RSO synthetic light curves (LCs). While Blender has long been popular open-source software among animators and visual effects artists, in recent years it has also become a tool for researchers: for example, it is used for visualizing astrophysical datasets and generating asteroid light curves. In the modeling process, we took advantage of Blender features such as Python scripting and GPU rendering. We produced synthetic LCs for two objects, TOPEX/Poseidon and COSMOS-2502. A 3D model for TOPEX/Poseidon was available on the NASA website, but after examining official datasheets we found that it required corrections to the dimensions of the RSO body and solar panel. A 3D model of COSMOS-2502 was built from information available on the internet. Manual modeling was performed according to the RSO's well-known self-rotation parameters. We also show the results of LC modeling using the Markov chain Monte Carlo (MCMC) method. All synthetic LCs obtained in the research correlate well with the real observed LCs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
182. Approximate inferences for Bayesian hierarchical generalised linear regression models.
- Author
-
Berman, Brandon, Johnson, Wesley O., and Shen, Weining
- Subjects
- *
REGRESSION analysis , *MARKOV chain Monte Carlo , *BAYESIAN field theory , *RANDOM effects model , *POISSON regression , *BIG data , *LATENT variables - Abstract
Summary: Generalised linear mixed regression models are fundamental in statistics. Modelling random effects that are shared by individuals allows for correlation among those individuals. There are many methods and statistical packages available for analysing data using these models. Most require some form of numerical or analytic approximation because the likelihood function generally involves intractable integrals over the latents. The Bayesian approach avoids this issue by iteratively sampling the full conditional distributions for various blocks of parameters and latent random effects. Depending on the choice of the prior, some full conditionals are recognisable while others are not. In this paper we develop a novel normal approximation for the random effects full conditional, establish its asymptotic correctness and evaluate how well it performs. We make the case for hierarchical binomial and Poisson regression models with canonical link functions, for hierarchical gamma regression models with log link and for other cases. We also develop what we term a sufficient reduction (SR) approach to the Markov Chain Monte Carlo algorithm that allows for making inferences about all model parameters by replacing the full conditional for the latent variables with a considerably reduced dimensional function of the latents. We expect that this approximation could be quite useful in situations where there are a very large number of latent effects, which may be occurring in an increasingly 'Big Data' world. In the sequel, we compare our methods with INLA, which is a particularly popular method and which has been shown to be excellent in terms of speed and accuracy across a variety of settings. Our methods appear to be comparable to theirs in terms of accuracy, while INLA was faster, for the settings we considered. 
In addition, we note that our methods and those of others that involve Gibbs sampling trivially handle parameters that are functions of multiple parameters, while INLA approximations do not. Our primary illustration is for a three‐level hierarchical binomial regression model for data on health outcomes for patients who are clustered within physicians who are clustered within particular hospitals or hospital systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
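Gibbs sampling from recognizable full conditionals, the starting point of the paper, can be shown on a toy two-level normal model with unit variances and a flat prior on the grand mean (an illustration only, not the authors' binomial, Poisson, or gamma settings; the data are made up):

```python
import random
import statistics

def gibbs_two_level(groups, n_iter=3000, seed=5):
    """Gibbs sampler for y_ij ~ N(theta_j, 1), theta_j ~ N(mu, 1), flat prior
    on mu. Both full conditionals are normal, so each update is a direct draw."""
    rng = random.Random(seed)
    J = len(groups)
    thetas, mu = [0.0] * J, 0.0
    mu_draws = []
    for _ in range(n_iter):
        for j, y in enumerate(groups):
            var = 1.0 / (len(y) + 1.0)             # theta_j | mu, y
            thetas[j] = rng.gauss(var * (sum(y) + mu), var ** 0.5)
        mu = rng.gauss(sum(thetas) / J, (1.0 / J) ** 0.5)   # mu | thetas
        mu_draws.append(mu)
    return mu_draws

mu_draws = gibbs_two_level([[4.2, 3.9, 4.1], [5.1, 4.8], [4.5, 4.6, 4.4, 4.7]])
```

In GLMMs with non-conjugate likelihoods the latent-effect full conditional is not recognizable like this, which is exactly where the paper's normal approximation and sufficient-reduction device come in.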
183. A search for short-period Tausworthe generators over Fb with application to Markov chain quasi-Monte Carlo.
- Author
-
Harase, Shin
- Subjects
- *
MARKOV processes , *FINITE fields , *MONTE Carlo method , *SHIFT registers , *MARKOV chain Monte Carlo - Abstract
A one-dimensional sequence $u_0, u_1, u_2, \ldots \in [0, 1)$ is said to be completely uniformly distributed (CUD) if the overlapping s-blocks $(u_i, u_{i+1}, \ldots, u_{i+s-1})$, $i = 0, 1, 2, \ldots$, are uniformly distributed for every dimension $s \geq 1$. This concept naturally arises in Markov chain quasi-Monte Carlo (QMC). However, the definition of CUD sequences is not constructive, and thus there remains the problem of how to implement the Markov chain QMC algorithm in practice. Harase [A table of short-period Tausworthe generators for Markov chain quasi-Monte Carlo. J Comput Appl Math. 2021;384:Paper No. 113136, 12.] focussed on the t-value, a measure of uniformity widely used in the study of QMC, and implemented short-period Tausworthe generators (i.e. linear feedback shift register generators) over the two-element field $\mathbb{F}_2$ that approximate CUD sequences by running for the entire period. In this paper, we generalize the search algorithm over $\mathbb{F}_2$ to arbitrary finite fields $\mathbb{F}_b$ with b elements and conduct a search for Tausworthe generators over $\mathbb{F}_b$ with t-values zero (i.e. optimal) for dimension s = 3 and small for $s \geq 4$, especially in the cases b = 3, 4, and 5. We provide a parameter table of Tausworthe generators over $\mathbb{F}_4$, and report a comparison between our new generators over $\mathbb{F}_4$ and existing generators over $\mathbb{F}_2$ in numerical examples using Markov chain QMC. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
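A Tausworthe generator is driven by a linear feedback shift register. Over the two-element field the core mechanism can be sketched with a primitive degree-4 polynomial, whose output has full period 2^4 - 1 = 15 (a toy far shorter than any generator of practical interest):

```python
def lfsr_bits(taps, state, nbits):
    """Fibonacci LFSR over F2: emit state[0], then shift in the XOR of the
    tapped positions."""
    out = []
    for _ in range(nbits):
        out.append(state[0])
        new = 0
        for t in taps:
            new ^= state[t]
        state = state[1:] + [new]
    return out

# Primitive polynomial x^4 + x + 1 gives the recurrence s_{t+4} = s_{t+1} XOR s_t,
# i.e. taps at positions 0 and 1; the resulting m-sequence has period 15.
bits = lfsr_bits(taps=[0, 1], state=[1, 0, 0, 0], nbits=61)
period = next(p for p in range(1, 31)
              if all(bits[i] == bits[i + p] for i in range(30)))
print(period)  # 15
```

A Tausworthe generator packs successive bits of such a sequence into fixed-point numbers in [0, 1); the paper's search is over the analogous recurrences with coefficients in a general finite field.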
184. Statistical inference for a two-parameter distribution with a bathtub-shaped or increasing hazard rate function based on record values and inter-record times with an application to COVID-19 data.
- Author
-
Khoshkhoo Amiri, Z. and MirMostafaee, S.M.T.K.
- Subjects
- *
HAZARD function (Statistics) , *INFERENTIAL statistics , *PARAMETER estimation , *COVID-19 , *MARKOV chain Monte Carlo - Abstract
In this paper, we study the problem of estimation and prediction for a two-parameter distribution with a bathtub-shaped or increasing failure rate function based on lower records and inter-record times, and based on lower records without considering the inter-record times. The maximum likelihood and Bayesian approaches are employed to estimate the unknown parameters. As it seems that the Bayes estimates cannot be derived in a closed form, the Metropolis-Hastings within Gibbs algorithm is implemented to obtain the approximate Bayes point estimates. Bayesian prediction of a future record value is also discussed. A simulation study is conducted to evaluate the proposed point and interval estimators. A real data set consisting of COVID-19 data from Iran is analyzed to illustrate the application of the theoretical results of the paper. Moreover, a simulated data example is presented. Several concluding remarks end the paper. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
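The data structures used above, lower record values and inter-record times, are easy to extract from a raw observation sequence; a small helper with a made-up numeric example:

```python
def lower_records(seq):
    """Lower record values, their 1-based occurrence indices, and the gaps
    (inter-record times) between successive records."""
    records, idx = [], []
    best = float("inf")
    for i, x in enumerate(seq, start=1):
        if x < best:                 # strictly smaller than all previous values
            best = x
            records.append(x)
            idx.append(i)
    gaps = [j - i for i, j in zip(idx, idx[1:])]
    return records, idx, gaps

recs, idx, gaps = lower_records([5.2, 4.8, 6.1, 4.1, 4.5, 3.9, 7.0])
# recs = [5.2, 4.8, 4.1, 3.9], idx = [1, 2, 4, 6], gaps = [1, 2, 2]
```

The paper's likelihoods are then built from exactly such pairs of record values and inter-record counts.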
185. Genome-wide association study of age at puberty and its (co)variances with fertility and stature in growing and lactating Holstein-Friesian dairy cattle.
- Author
-
Stephen, M.A., Burke, C.R., Steele, N., Pryce, J.E., Meier, S., Amer, P.R., Phyn, C.V.C., and Garrick, D.J.
- Subjects
- *
CATTLE fertility , *HOLSTEIN-Friesian cattle , *GENOME-wide association studies , *LACTATION in cattle , *DAIRY cattle , *FERTILITY , *LACTATION - Abstract
Reproductive performance is a key determinant of cow longevity in a pasture-based, seasonal dairy system. Unfortunately, direct fertility phenotypes such as intercalving interval or pregnancy rate tend to have low heritabilities and occur relatively late in an animal's life. In contrast, age at puberty (AGEP) is a moderately heritable, early-in-life trait that may be estimated using an animal's age at first measured elevation in blood plasma progesterone (AGEP4) concentrations. Understanding the genetic architecture of AGEP4 in addition to genetic relationships between AGEP4 and fertility traits in lactating cows is important, as is its relationship with body size in the growing animal. Thus, the objectives of this research were 3-fold. First, to estimate the genetic and phenotypic (co)variances between AGEP4 and subsequent fertility during first and second lactations. Second, to quantify the associations between AGEP4 and height, length, and BW measured when animals were approximately 11 mo old (standard deviation = 0.5). Third, to identify genomic regions that are likely to be associated with variation in AGEP4. We measured AGEP4, height, length, and BW in approximately 5,000 Holstein-Friesian or Holstein-Friesian × Jersey crossbred yearling heifers across 54 pasture-based herds managed in seasonal calving farm systems. We also obtained calving rate (CR42, success or failure to calve within the first 42 d of the seasonal calving period), breeding rate (PB21, success or failure to be presented for breeding within the first 21 d of the seasonal breeding period) and pregnancy rate (PR42, success or failure to become pregnant within the first 42 d of the seasonal breeding period) phenotypes from their first and second lactations. The animals were genotyped using the Weatherby's Versa 50K SNP array (Illumina, San Diego, CA). 
The estimated heritabilities of AGEP4, height, length, and BW were 0.34 (90% credibility interval [CRI]: 0.30, 0.37), 0.28 (90% CRI: 0.25, 0.31), 0.21 (90% CRI: 0.18, 0.23), and 0.33 (90% CRI: 0.30, 0.36), respectively. In contrast, the heritabilities of CR42, PB21 and PR42 were all <0.05 in both first and second lactations. The genetic correlations between AGEP4 and these fertility traits were generally moderate, ranging from 0.11 to 0.60, whereas genetic correlations between AGEP4 and yearling body-conformation traits ranged from 0.02 to 0.28. Our GWAS highlighted a genomic window on chromosome 5 that was strongly associated with variation in AGEP4. We also identified 4 regions, located on chromosomes 14, 6, 1, and 11 (in order of decreasing importance), that exhibited suggestive associations with AGEP4. Our results show that AGEP4 is a reasonable predictor of estimated breeding values for fertility traits in lactating cows. Although the GWAS provided insights into genetic mechanisms underpinning AGEP4, further work is required to test genomic predictions of fertility that use this information. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
186. Spatial-temporal hurdle model vs. spatial zero-inflated GARCH model: analysis of weekly dengue fever cases.
- Author
-
Chen, Cathy W. S. and Chen, Chun-Shu
- Subjects
- *
ARBOVIRUS diseases , *DENGUE , *DENGUE hemorrhagic fever , *GARCH model , *MARKOV chain Monte Carlo - Abstract
Dengue fever is transmitted to humans through the bite of an infected mosquito and is prevalent in all tropical and subtropical climates worldwide. It is thus essential to model weekly dengue fever counts and other infectious diseases that exhibit spatial-temporal dynamics, overdispersion, spatial dependence, and a high number of zeros. To address these characteristics, this study introduces a spatial hurdle integer-valued GARCH (INGARCH) model and an improved version of the spatial zero-inflated generalized Poisson (ZIGP) INGARCH model, with and without meteorological variables. Two parameters in the distance function govern the spatial weight between two locations: one controls the decay rate, while the other shapes the decay curve. We employ these newly designed models to analyze time-series counts of infectious diseases - specifically, weekly cases of dengue hemorrhagic fever in four northeastern provinces of Thailand. Applying these models allows us to offer inferences, predictions, and model selections within a Bayesian framework through Markov chain Monte Carlo (MCMC) methods. We then compare models based on Bayes factors and the mean squared error of fitting errors. The results for the spatial ZIGP INGARCH models are remarkably good, but the spatial INGARCH model incorporating meteorological variables outperforms the other two. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
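The INGARCH backbone of these models is a Poisson count process whose intensity feeds back on past counts and past intensities. A minimal simulation of a plain Poisson INGARCH(1,1), without the hurdle, zero-inflation, or spatial components of the paper (parameter values are hypothetical):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for moderate lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_ingarch(omega, alpha, beta, n, seed=9):
    """Poisson INGARCH(1,1): lam_t = omega + alpha*y_{t-1} + beta*lam_{t-1}."""
    rng = random.Random(seed)
    lam = omega / (1.0 - alpha - beta)   # start at the stationary mean
    y = poisson(lam, rng)
    series = [y]
    for _ in range(n - 1):
        lam = omega + alpha * y + beta * lam
        y = poisson(lam, rng)
        series.append(y)
    return series

counts = simulate_ingarch(omega=2.0, alpha=0.3, beta=0.4, n=5000)
# Long-run mean is omega / (1 - alpha - beta) = 6.67 for these values.
```

The feedback term is what produces the overdispersion and serial dependence that plain Poisson regression misses in weekly case counts.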
187. Near-Surface Rayleigh Wave Dispersion Curve Inversion Algorithms: A Comprehensive Comparison.
- Author
-
Yang, Xiao-Hui, Zhou, Yuanyuan, Han, Peng, Feng, Xuping, and Chen, Xiaofei
- Subjects
- *
RAYLEIGH model , *RAYLEIGH waves , *MARKOV chain Monte Carlo , *PROCESS capability , *GLOBAL optimization , *ALGORITHMS , *MACHINE learning , *INVERSIONS (Geometry) - Abstract
Rayleigh wave exploration is a powerful method for estimating near-surface shear-wave (S-wave) velocities, providing valuable insights into the stiffness properties of subsurface materials inside the Earth. The dispersion curve inversion of Rayleigh waves corresponds to the optimization process of searching for the optimal earth model parameters based on the measured dispersion curves. At present, diversified inversion algorithms have been introduced into the process of Rayleigh wave inversion. However, limited studies have been conducted to uncover the variations in inversion performance among commonly used inversion algorithms. To obtain a comprehensive understanding of the optimization performance of these inversion algorithms, we systematically investigate and quantitatively assess the inversion performance of two bionic algorithms, two probabilistic algorithms, a gradient-based algorithm, and two neural network algorithms. The evaluation indices include the computational cost, accuracy, stability, generalization ability, noise effects, and field data processing capability. It is found that the bound-constrained limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS-B) algorithm and the broad learning (BL) network have the lowest computational cost among the candidate algorithms. Furthermore, the transitional Markov chain Monte Carlo algorithm, deep learning (DL) network, and BL network outperform the other four algorithms regarding accuracy, stability, resistance to noise effects, and capability to process field data. The DL and BL networks demonstrate the highest level of generalization compared to the other algorithms. The comparison results reveal the variations among the candidate algorithms for the inversion task, providing a clear understanding of their inversion performance. This study can promote S-wave velocity estimation by Rayleigh wave inversion. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
188. Bayesian discrete conditional transformation models.
- Author
-
Carlan, Manuel and Kneib, Thomas
- Subjects
- *
MARKOV chain Monte Carlo , *PROPORTIONAL hazards models , *CONDITIONED response , *FOREST health , *POISSON regression , *PATENT applications - Abstract
We propose a novel Bayesian model framework for discrete ordinal and count data based on conditional transformations of the responses. The conditional transformation function is estimated from the data in conjunction with an a priori chosen reference distribution. For count responses, the resulting transformation model is novel in the sense that it is a Bayesian fully parametric yet distribution-free approach that can additionally account for excess zeros with additive transformation function specifications. For ordinal categoric responses, our cumulative link transformation model allows the inclusion of linear and non-linear covariate effects that can additionally be made category-specific, resulting in (non-)proportional odds or hazards models and more, depending on the choice of the reference distribution. Inference is conducted by a generic modular Markov chain Monte Carlo algorithm where multivariate Gaussian priors enforce specific properties such as smoothness on the functional effects. To illustrate the versatility of Bayesian discrete conditional transformation models, applications are presented to counts of patent citations in the presence of excess zeros and to forest health categories treated in a discrete partial proportional odds model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
189. Estimation in shape mixtures of skew-normal linear regression models via ECM coupled with Gibbs sampling.
- Author
-
Alizadeh Ghajari, Zakaria, Zare, Karim, and Shokri, Soheil
- Subjects
- *
MARKOV chain Monte Carlo , *GIBBS sampling , *REGRESSION analysis , *EXPECTATION-maximization algorithms - Abstract
In this paper, we study linear regression models in which the error term follows a shape mixture of skew-normal distributions. This type of distribution belongs to the skew-normal (SN) class, which can accommodate heavy-tailed and asymmetric data. For the first time in the classical (non-Bayesian) estimation of the parameters of this family, we apply the Markov chain Monte Carlo ECM (MCMC-ECM) algorithm, in which the samples are generated by Gibbs sampling (denoted Gibbs-ECM), and we also extend two other variants of the EM algorithm to the above model. Finally, the proposed method is evaluated through a simulation study and compared with the Numerical Math-ECM algorithm and Monte Carlo ECM (MC-ECM) using a real data set. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
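The Gibbs-ECM scheme above generates the E-step samples by drawing each coordinate from its full conditional distribution. A minimal, generic illustration of that core Gibbs mechanism (a bivariate normal target, not the authors' shape-mixture model; the function name and settings are mine):

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_samples=5000, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional x | y is N(rho * y, 1 - rho**2), so the chain
    simply alternates exact draws from the two conditionals -- the same
    mechanism used to generate samples inside a Gibbs-based E-step.
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    sd = np.sqrt(1.0 - rho**2)
    draws = np.empty((n_samples, 2))
    for i in range(burn_in + n_samples):
        x = rng.normal(rho * y, sd)   # draw x | y
        y = rng.normal(rho * x, sd)   # draw y | x
        if i >= burn_in:
            draws[i - burn_in] = (x, y)
    return draws

samples = gibbs_bivariate_normal(rho=0.8)
print(np.corrcoef(samples.T)[0, 1])  # close to 0.8
```

The appeal in an EM context is that full conditionals are often tractable even when the joint is not, so each sweep needs no tuning and no accept/reject step.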
190. Using the motion of S2 to constrain vector clouds around Sgr A*.
- Author
-
Collaboration, GRAVITY
- Subjects
- *
MARKOV chain Monte Carlo , *SUPERMASSIVE black holes , *MONTE Carlo method , *BLACK holes , *VECTOR fields - Abstract
The dark compact object at the centre of the Milky Way is well established to be a supermassive black hole with mass $M_{\bullet} \sim 4.3 \times 10^6 \, {\rm M}_{\odot}$, but the nature of its environment is still under debate. In this work, we used astrometric and spectroscopic measurements of the motion of the star S2, one of the closest stars to the massive black hole, to determine an upper limit on an extended mass composed of a massive vector field around Sagittarius A*. For a vector with effective mass $10^{-19} \lesssim m_{\rm s} \lesssim 10^{-18} \, \rm eV$, our Markov chain Monte Carlo analysis shows no evidence for such a cloud, placing an upper bound $M_{\rm cloud} \lesssim 0.1\% \, M_{\bullet}$ at the 3σ confidence level. We show that dynamical friction exerted by the medium on the motion of S2 plays no role in the analysis performed in this and previous works, and can thus be neglected. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
191. An efficient Bayesian multi-model framework to analyze reliability of rock structures with limited investigation data.
- Author
-
Kumar, Akshay and Tiwari, Gaurav
- Subjects
- *
MARKOV chain Monte Carlo , *UNCERTAINTY (Information theory) , *ROCK slopes , *RESPONSE surfaces (Statistics) , *BAYESIAN field theory - Abstract
Availability of insufficient data is a frequent issue, resulting in inaccurate probabilistic characterization of properties and, finally, inaccurate reliability estimates of rock structures. This study presents a Bayesian multi-model inference methodology which couples multi-model inference with the traditional Bayesian approach to characterize uncertainties in both (1) the probability models and (2) the model parameters of rock properties arising due to insufficient data, and to estimate the reliability of rock slopes and tunnels considering their effect. Further, this methodology was coupled with Sobol's sensitivity analysis, Metropolis–Hastings Markov chain Monte Carlo sampling and the moving least squares response surface method to improve computational efficiency and applicability for problems with implicit performance functions (PFs). The methodology is demonstrated for a Himalayan rock slope (implicit PF) prone to stress-controlled failure in India. Analysis is also performed using recently developed limited-data reliability methods, i.e., the traditional Bayesian method (which considers uncertainty in model parameters only) and bootstrap-based re-sampling reliability methods (which consider uncertainties in both model types and parameters). The proposed methodology is concluded to be superior to the other methods due to its capability to consider uncertainties in both model types and parameters, and to include prior information in the analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
192. Quantification and visualization of uncertainties in reconstructed penumbral images of implosions at Omega.
- Author
-
Kunimune, J. H., Heuer, P. V., Reichelt, B. L., Johnson, T. M., and Frenje, J. A.
- Subjects
- *
IMAGE reconstruction algorithms , *IMPLOSIONS , *MARKOV chain Monte Carlo , *PLASMA diagnostics , *IMAGE analysis , *IMAGE fusion - Abstract
Penumbral imaging is a technique used in plasma diagnostics in which a radiation source shines through one or more large apertures onto a detector. To interpret a penumbral image, one must reconstruct it to recover the original source. The inferred source always has some error due to noise in the image and uncertainty in the instrument geometry. Interpreting the inferred source thus requires quantification of that inference's uncertainty. Markov chain Monte Carlo algorithms have been used to quantify uncertainty for similar problems but have never been used for the inference of the shape of an image. Because of this, there are no commonly accepted ways of visualizing uncertainty in two-dimensional data. This paper demonstrates the application of the Hamiltonian Monte Carlo algorithm to the reconstruction of penumbral images of fusion implosions and presents ways to visualize the uncertainty in the reconstructed source. This methodology enables more rigorous analysis of penumbral images than has been done in the past. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
193. A Bayesian approach to elastic full-waveform inversion: application to two synthetic near surface models.
- Author
-
BERTI, S., ALEARDI, M., and STUCCHI, E.
- Subjects
- *
MARKOV chain Monte Carlo , *COSINE function , *DISCRETE cosine transforms , *RAYLEIGH waves , *WAVE analysis - Abstract
Imaging of the first metres of the subsurface with seismic methods constitutes a key challenge for several applications. In this context, the analysis of Rayleigh waves can reveal information about the S-wave velocity structure in the first metres of the subsurface. The waves recorded can be inverted using several techniques, of which the most widely used is the multichannel analysis of surface waves, where dispersion curves are picked on the velocity-frequency spectrum. A full-waveform inversion of surface waves has been implemented, offering the possibility to exploit the complete information content of the recorded seismograms. This method has only recently been tested with elastic approximation on synthetic data, as the application in near-surface scenarios is very challenging due to the high nonlinearity of the problem and the considerable computational costs. This paper presents a gradient-based Markov chain Monte Carlo elastic full-waveform inversion method, where posterior sampling is accelerated by compressing data and model spaces through the discrete cosine transform and, also, by defining a proposal that is a local, Gaussian approximation of the target posterior probability density. The applicability of the approach is demonstrated by performing two synthetic inversion tests on two different near-surface models: a two-layered model with lateral velocity variations, and a four-layered model with velocity inversions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
194. Bayesian estimation for heterogeneous spatial autoregressive models with variance modelling.
- Author
-
Tian, Ruiqin, Xu, Dengke, and Du, Jiang
- Subjects
- *
GIBBS sampling , *MARKOV chain Monte Carlo , *AUTOREGRESSIVE models - Abstract
In this paper, we introduce a new class of heterogeneous spatial autoregressive models (heterogeneous SAR models) in which the variance parameters are modeled in terms of covariates. To estimate the model parameters, as well as their corresponding standard errors, we propose a computationally efficient MCMC method which combines the Gibbs sampler with the Metropolis–Hastings algorithm. The proposed estimation method is illustrated through numerous simulations and is applied to the Boston housing data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
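Combining a Gibbs sampler with Metropolis–Hastings, as in the abstract above, is often called Metropolis-within-Gibbs: parameters with tractable full conditionals are drawn exactly, while the rest are updated with an MH step. A minimal sketch on a toy normal model (flat priors; not the SAR model of the paper, and all names are mine):

```python
import numpy as np

def metropolis_within_gibbs(y, n_iter=4000, burn_in=1000, seed=1):
    """Hybrid sampler for (mu, sigma) of a normal model under flat priors.

    mu is drawn exactly from its Gaussian full conditional (Gibbs step);
    sigma, whose conditional is not sampled directly here, is updated by
    a random-walk Metropolis-Hastings step on log(sigma).
    """
    rng = np.random.default_rng(seed)
    n, ybar = len(y), np.mean(y)
    mu, log_s = ybar, np.log(np.std(y))
    keep = []

    def log_post(mu, log_s):
        # log-posterior up to a constant, flat prior on (mu, log sigma)
        return -n * log_s - np.sum((y - mu) ** 2) / (2 * np.exp(2 * log_s))

    for i in range(n_iter):
        # Gibbs step: mu | sigma, y ~ N(ybar, sigma^2 / n)
        mu = rng.normal(ybar, np.exp(log_s) / np.sqrt(n))
        # MH step: propose a new log(sigma), accept with the usual ratio
        prop = log_s + rng.normal(0.0, 0.3)
        if np.log(rng.uniform()) < log_post(mu, prop) - log_post(mu, log_s):
            log_s = prop
        if i >= burn_in:
            keep.append((mu, np.exp(log_s)))
    return np.array(keep)

y = np.random.default_rng(42).normal(5.0, 2.0, size=200)
draws = metropolis_within_gibbs(y)
print(draws.mean(axis=0))  # posterior means near (5, 2)
```

Only the intractable block pays the MH acceptance penalty; the conjugate block mixes at the exact Gibbs rate, which is what makes the hybrid computationally attractive.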
195. Approximate Bayesian estimator for the random-coefficients model.
- Author
-
Wang, Jie and Jiang, Lichun
- Subjects
- *
BAYES' estimation , *MAXIMUM likelihood statistics , *ANALYSIS of variance , *MARKOV chain Monte Carlo - Abstract
This article constructs an approximate Bayesian estimator for the parameter vector consisting of the variance components in a random-coefficients regression (RCR) model with unbalanced data. Its superiority over the analysis of variance estimator (ANOVAE) is strictly proved in terms of the mean squared error matrix (MSEM) criterion. Compared with the usual Bayes estimator computed via the MCMC method, the proposed approximate Bayesian estimator is simple and easy to interpret and use. We also compare it with the maximum likelihood estimator (MLE) and the restricted maximum likelihood estimator (RMLE) of the variance components. Numerical computations show that the approximate Bayesian estimator has good approximation performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
196. Variational inference for the latent shrinkage position model.
- Author
-
Gwee, Xian Yao, Gormley, Isobel Claire, and Fop, Michael
- Subjects
- *
MARKOV chain Monte Carlo , *FRUIT drying - Abstract
The latent position model (LPM) is a popular method used in network data analysis where nodes are assumed to be positioned in a $p$-dimensional latent space. The latent shrinkage position model (LSPM) is an extension of the LPM which automatically determines the number of effective dimensions of the latent space via a Bayesian nonparametric shrinkage prior. However, the LSPM's reliance on Markov chain Monte Carlo for inference, while rigorous, is computationally expensive, making it challenging to scale to networks with large numbers of nodes. We introduce a variational inference approach for the LSPM, aiming to reduce computational demands while retaining the model's ability to intrinsically determine the number of effective latent dimensions. The performance of the variational LSPM is illustrated through simulation studies and its application to real‐world network data. To promote wider adoption and ease of implementation, we also provide open‐source code. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
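Variational inference, as used in the LSPM abstract above, replaces MCMC sampling with optimization: a tractable family q is fit to the posterior by minimizing a KL divergence. A deliberately tiny sketch where the target is itself Gaussian so the KL and its gradients are available in closed form (generic illustration, not the LSPM; names and step sizes are mine):

```python
import numpy as np

def fit_gaussian_vi(mu0, s0, steps=2000, lr=0.05):
    """Fit a variational q = N(m, s^2) to a Gaussian target N(mu0, s0^2)
    by gradient descent on KL(q || p) -- the optimization that stands in
    for sampling in variational inference. For Gaussian q and p,
    KL = log(s0/s) + (s^2 + (m - mu0)^2) / (2 s0^2) - 1/2, so the
    gradients below are exact.
    """
    m, log_s = 0.0, 0.0           # initialize q at N(0, 1)
    for _ in range(steps):
        s = np.exp(log_s)
        grad_m = (m - mu0) / s0**2          # d KL / d m
        grad_log_s = s**2 / s0**2 - 1.0     # d KL / d log s
        m -= lr * grad_m
        log_s -= lr * grad_log_s
    return m, np.exp(log_s)

m, s = fit_gaussian_vi(mu0=1.5, s0=0.7)
print(m, s)  # converges to (1.5, 0.7)
```

Real variational methods target intractable posteriors, so the KL gradient must be estimated (e.g. by Monte Carlo over the evidence lower bound), but the fit-by-optimization structure is the same, which is why VI scales to networks where per-iteration MCMC is too expensive.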
197. Bayesian variational time-lapse full waveform inversion.
- Author
-
Zhang, Xin and Curtis, Andrew
- Subjects
- *
MARKOV chain Monte Carlo , *SEISMIC surveys , *ROBUST optimization , *ESTIMATES - Abstract
Time-lapse seismic full-waveform inversion (FWI) provides estimates of dynamic changes in the Earth's subsurface by performing multiple seismic surveys at different times. Since FWI problems are highly non-linear and non-unique, it is important to quantify uncertainties in such estimates to allow robust decision making based on the results. Markov chain Monte Carlo (McMC) methods have been used for this purpose, but due to their high computational cost, those studies often require a pre-existing accurate baseline model and estimates of the locations of potential velocity changes, and neglect uncertainty in the baseline velocity model. Such detailed and accurate prior information is not always available in practice. In this study we use an efficient optimization method called stochastic Stein variational gradient descent (sSVGD) to solve time-lapse FWI problems without assuming such prior knowledge, and to estimate uncertainty both in the baseline velocity model and in the velocity change over time. We test two Bayesian strategies: separate Bayesian inversions for each seismic survey, and a single joint inversion for the baseline and repeat surveys, and compare both methods with standard linearized double-difference inversion. The results demonstrate that all three methods can produce accurate velocity change estimates in the case of fixed (exactly repeatable) acquisition geometries. However, the two Bayesian methods generate significantly more accurate results when acquisition geometries change between surveys. Furthermore, joint inversion provides the most accurate velocity change and uncertainty estimates in all cases tested. We therefore conclude that Bayesian time-lapse inversion using a joint inversion strategy may be useful to image and monitor subsurface changes, in particular where variations in the results would lead to different consequent decisions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
198. 2-D probabilistic inversion of MT data and uncertainty quantification using the Hamiltonian Monte Carlo method.
- Author
-
Peng, Ronghua, Han, Bo, Hu, Xiangyun, Li, Jianhui, and Liu, Yajun
- Subjects
- *
MONTE Carlo method , *MARKOV chain Monte Carlo , *DISTRIBUTION (Probability theory) , *PREDICATE calculus - Abstract
Bayesian methods provide a valuable framework for rigorously quantifying the model uncertainty arising from the inherent non-uniqueness in the magnetotelluric (MT) inversion. However, widely used Markov chain Monte Carlo (MCMC) sampling approaches usually require a significant number of model samples for accurate uncertainty estimates, making their applications computationally challenging for 2-D or 3-D MT problems. In this study, we explore the applicability of the Hamiltonian Monte Carlo (HMC) method for 2-D probabilistic MT inversion. The HMC provides a mechanism for efficient exploration in high-dimensional model space by making use of gradient information of the posterior probability distribution, resulting in a substantial reduction in the number of samples needed for reliable uncertainty quantification compared to the conventional MCMC methods. Numerical examples with synthetic data demonstrate that the HMC method achieves rapid convergence to the posterior probability distribution of model parameters with a limited number of model samples, indicating the computational advantages of the HMC in high-dimensional model space. Finally, we applied the developed approach to the COPROD2 field data set. The statistical models derived from the HMC approach agree well with previous results obtained by 2-D deterministic inversions. Most importantly, the probabilistic inversion provides valuable quantitative model uncertainty information associated with the resistivity structures derived from the observed data, which facilitates model interpretation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
199. The Role of Risk Factors in System Performance: A Comprehensive Study with Adaptive Progressive Type-II Censoring.
- Author
-
Ahmad, Hanan Haj, Aboshady, Mohamed, and Mansour, Mahmoud
- Subjects
- *
CENSORING (Statistics) , *RELIABILITY in engineering , *MAXIMUM likelihood statistics , *SYSTEM failures , *MARKOV chain Monte Carlo , *INFERENTIAL statistics , *PERFORMANCE theory - Abstract
The quality of performance of many vital systems depends on how long their units keep operating; hence, research has focused on increasing system reliability while taking into consideration that many factors may cause the failure of operating systems. In this study, the combination of a parametric generalized linear failure rate distribution model and an adaptive progressive Type-II censoring scheme is explored for practical purposes. A comprehensive investigation is performed on the risk factors that cause failure, determining which of the factors has the more harmful effect on the units. A lifetime experiment is performed under an adaptive progressive Type-II censoring scheme to obtain observations resulting from the competing factors of failure. The obtained observations are assumed to follow a three-parameter generalized linear failure rate distribution, with the factors competing to cause failure. Two statistical inference methods are employed for estimating this model's parameters: the frequentist maximum likelihood method and the Bayesian approach. Our model's validity is demonstrated through extensive simulations and real data applications in the medical and electrical engineering fields. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
200. Impact of new direct‐acting antiviral therapy on the prevalence and undiagnosed proportion of chronic hepatitis C infection.
- Author
-
Forouzannia, Farinaz, Hamadeh, Abdullah, Passos‐Castilho, Ana Maria, Erman, Aysegul, Yu, Amanda, Feng, Zeny, Janjua, Naveed Z., Sander, Beate, Greenaway, Christina, and Wong, William W. L.
- Subjects
- *
CHRONIC hepatitis C , *HEPATITIS C , *MARKOV chain Monte Carlo , *CANADIAN provinces - Abstract
Background: Patients with chronic hepatitis C (CHC) can be cured with the new highly effective interferon-free combination treatments (direct-acting antivirals, DAAs) that were approved in 2014. However, CHC is a largely silent disease, and many individuals are unaware of their infections until the late stages of the disease. The impact of wider access to effective treatments and improved awareness of the disease on the number of infections and the number of patients who remain undiagnosed is not known in Canada. Such evidence can guide the development of strategies and interventions to reduce the burden of CHC and meet the World Health Organization's (WHO) 2030 elimination targets. The purpose of this study is to use a back-calculation framework informed by provincial population-level health administrative data to estimate the prevalence of CHC and the proportion of cases that remain undiagnosed in the three most populated provinces in Canada: British Columbia (BC), Ontario and Quebec. Methods: We conducted a population-based retrospective analysis of health administrative data for the three provinces to generate the annual incidence of newly diagnosed CHC cases, decompensated cirrhosis (DC), hepatocellular carcinoma (HCC) and HCV treatment initiations. For each province, the data were stratified into three birth cohorts: individuals born prior to 1945, individuals born between 1945 and 1965, and individuals born after 1965. We used a back-calculation modelling approach to estimate the prevalence and the undiagnosed proportion of CHC. The historical prevalence of CHC was inferred through a calibration process based on a Bayesian Markov chain Monte Carlo (MCMC) algorithm. The algorithm constructs the historical prevalence of CHC for each cohort by comparing the model-generated annual incidence of CHC-related health events against the data set of observed diagnosed cases generated in the retrospective analysis.
Results: The results show a decreasing trend in both CHC prevalence and the undiagnosed proportion in BC, Ontario and Quebec. In 2018, CHC prevalence was estimated to be 1.23% (95% CI: 0.96%–1.62%), 0.91% (95% CI: 0.82%–1.04%) and 0.57% (95% CI: 0.51%–0.64%) in BC, Ontario and Quebec respectively. The CHC undiagnosed proportion was assessed to be 35.44% (95% CI: 27.07%–45.83%), 34.28% (95% CI: 26.74%–41.62%) and 46.32% (95% CI: 37.85%–52.80%) in BC, Ontario and Quebec, respectively, in 2018. Also, since the introduction of the new DAA treatment in 2014, CHC prevalence decreased from 1.39% to 1.23%, 0.97% to 0.91% and 0.65% to 0.57% in BC, Ontario and Quebec respectively. Similarly, the CHC undiagnosed proportion decreased from 38.78% to 35.44%, 38.70% to 34.28% and 47.54% to 46.32% in BC, Ontario and Quebec, respectively, from 2014 to 2018. Conclusions: We estimated that CHC prevalence and the undiagnosed proportion have declined in all three provinces since the new DAA treatment was approved in 2014. Yet, our findings show that a significant proportion of HCV cases remain undiagnosed across all provinces, highlighting the need to increase investment in screening. Our findings provide essential evidence to guide decisions about current and future HCV strategies and help achieve the WHO goal of eliminating hepatitis C in Canada by 2030. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF