11,622 results for "Gibbs Sampling"
Search Results
2. Genetic parameters for milk production traits of Simmental cows with random regression test-day model
- Author
- Otwinowska-Mindur, A., Ptak, E., Jagusiak, W., and Zarnecki, A.
- Published
- 2025
- Full Text
- View/download PDF
3. Path integral Monte Carlo in a discrete variable representation with Gibbs sampling: Dipolar planar rotor chain.
- Author
- Zhang, Wenxue, Moeed, Muhammad Shaeer, Bright, Andrew, Serwatka, Tobias, De Oliveira, Estevao, and Roy, Pierre-Nicholas
- Subjects
- DENSITY matrices, PATH integrals, MATRIX multiplications, RENORMALIZATION group, DEGREES of freedom, GIBBS sampling
- Abstract
In this work, we propose a path integral Monte Carlo approach based on discretized continuous degrees of freedom and rejection-free Gibbs sampling. The ground state properties of a chain of planar rotors with dipole–dipole interactions are used to illustrate the approach. Energetic and structural properties are computed and compared to exact diagonalization and numerical matrix multiplication for N ≤ 3 to assess the systematic Trotter factorization error convergence. For larger chains with up to N = 100 rotors, Density Matrix Renormalization Group calculations are used as a benchmark. We show that using Gibbs sampling is advantageous compared to traditional Metropolis–Hastings rejection importance sampling. Indeed, Gibbs sampling leads to lower variance and correlation in the computed observables. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
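As context for the comparison drawn in entry 3 above, the following is a minimal, hypothetical sketch of rejection-free (heat-bath) Gibbs sampling for a chain of discretized angles. The nearest-neighbour cosine coupling, grid size, and temperature are illustrative placeholders, not the paper's dipole–dipole Hamiltonian or its DVR construction.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 10, 32                        # chain length, grid points per angle
angles = 2 * np.pi * np.arange(M) / M
state = rng.integers(0, M, size=N)   # discretized angle index per rotor
beta = 1.0                           # inverse temperature (illustrative)

def pair_energy(a, b):
    # Placeholder nearest-neighbour coupling; the paper's model uses
    # dipole-dipole interactions instead.
    return -np.cos(a - b)

def local_energy(site, k, state):
    # Energy of placing grid value k at `site`, given its neighbours.
    e = 0.0
    if site > 0:
        e += pair_energy(angles[k], angles[state[site - 1]])
    if site < N - 1:
        e += pair_energy(angles[k], angles[state[site + 1]])
    return e

def gibbs_sweep(state):
    # Rejection-free: each site is redrawn exactly from its full
    # conditional, so every move is accepted by construction.
    for site in range(N):
        e = np.array([local_energy(site, k, state) for k in range(M)])
        p = np.exp(-beta * (e - e.min()))   # shift for numerical stability
        state[site] = rng.choice(M, p=p / p.sum())
    return state

for sweep in range(100):
    state = gibbs_sweep(state)
```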
4. Topology of thermodynamic potentials using physical models: Helmholtz, Gibbs, Grand, and Null.
- Author
- Nitzke, Isabel, Stephan, Simon, and Vrabec, Jadran
- Subjects
- THERMODYNAMIC potentials, HELMHOLTZ free energy, ETHANES, GIBBS' free energy, MONTE Carlo method, TOPOLOGY, HELMHOLTZ equation, GIBBS sampling
- Abstract
Thermodynamic potentials play a substantial role in numerous scientific disciplines and serve as basic constructs for describing the behavior of matter. Despite their significance, comprehensive investigations of their topological characteristics and their connections to molecular interactions have eluded exploration due to experimental inaccessibility issues. This study addresses this gap by analyzing the topology of the Helmholtz energy, Gibbs energy, Grand potential, and Null potential that are associated with different isothermal boundary conditions. By employing Monte Carlo simulations in the NVT, NpT, and μVT ensembles and a molecular-based equation of state, methane, ethane, nitrogen, and methanol are investigated over a broad range of thermodynamic conditions. The predictions from the two independent methods are overall in very good agreement. Although distinct quantitative differences among the fluids are observed, the overall topology of the individual thermodynamic potentials remains unaffected by the molecular architecture, which is in line with the corresponding states principle—as expected. Furthermore, a comparative analysis reveals significant differences between the total potentials and their residual contributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Pseudo-marginal approximation to the free energy in a micro–macro Markov chain Monte Carlo method.
- Author
- Vandecasteele, Hannes and Samaey, Giovanni
- Subjects
- MARKOV processes, GIBBS sampling, BOLTZMANN factor, MARKOV chain Monte Carlo, DEGREES of freedom
- Abstract
We introduce a generalized micro–macro Markov chain Monte Carlo (mM-MCMC) method with pseudo-marginal approximation to the free energy that is able to accelerate sampling of the microscopic Gibbs distributions when there is a time-scale separation between the macroscopic dynamics of a reaction coordinate and the remaining microscopic degrees of freedom. The mM-MCMC method attains this efficiency by iterating four steps: (i) propose a new value of the reaction coordinate, (ii) accept or reject the macroscopic sample, (iii) run a biased simulation that creates a microscopic molecular instance that lies close to the newly sampled macroscopic reaction coordinate value, and (iv) accept or reject the new microscopic sample. In the present paper, we eliminate the main computational bottleneck of earlier versions of this method: the necessity of an accurate approximation of the free energy. We show that the introduction of a pseudo-marginal approximation significantly reduces the computational cost of the microscopic accept/reject step while still providing unbiased samples. We illustrate the method's behavior on several molecular systems with low-dimensional reaction coordinates. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
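Entry 5 enumerates the four mM-MCMC steps. As a rough illustration only, here is a toy sketch on a separable 2D Gaussian, where the reaction coordinate is the first coordinate and the free energy is known exactly; the paper's contribution is precisely to replace that exact free energy with a pseudo-marginal estimate, which this sketch does not implement.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: V(x, y) = (x^2 + y^2) / 2, reaction coordinate xi(x, y) = x.
# Then the free energy is A(z) = z^2 / 2 exactly.
beta = 1.0
A = lambda z: 0.5 * z ** 2

def mM_mcmc_step(x, y):
    # (i) propose a new value of the reaction coordinate
    z_new = x + rng.normal(scale=1.0)
    # (ii) macroscopic accept/reject with ratio min(1, exp(-beta * dA))
    if rng.random() >= np.exp(-beta * (A(z_new) - A(x))):
        return x, y
    # (iii) reconstruct a microscopic sample near the new macro value;
    #       here the conditional of y given x is exactly N(0, 1)
    y_new = rng.normal()
    # (iv) microscopic accept/reject; the ratio is 1 in this toy because
    #      the reconstruction is exact for a separable potential
    return z_new, y_new

x, y = 0.0, 0.0
samples = []
for _ in range(5000):
    x, y = mM_mcmc_step(x, y)
    samples.append((x, y))
print(np.var(np.array(samples), axis=0))  # both ~1 for this Gibbs target
```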
6. Microbiome Enterotype Detection via a Latent Variable Allocation Model
- Author
- Giampino, Alice, Ascari, Roberto, Migliorati, Sonia, Pollice, Alessio, editor, and Mariani, Paolo, editor
- Published
- 2025
- Full Text
- View/download PDF
7. Finite Element Model Updating Using Modal Data
- Author
- Kiran, Rajpurohit, Bansal, Sahil, Chaari, Fakher, Series Editor, Gherardini, Francesco, Series Editor, Ivanov, Vitalii, Series Editor, Haddar, Mohamed, Series Editor, Cavas-Martínez, Francisco, Editorial Board Member, di Mare, Francesca, Editorial Board Member, Kwon, Young W., Editorial Board Member, Tolio, Tullio A. M., Editorial Board Member, Trojanowska, Justyna, Editorial Board Member, Schmitt, Robert, Editorial Board Member, Xu, Jinyang, Editorial Board Member, Sidhardh, Sai, editor, Prakash, S. Suriya, editor, Annabattula, Ratna Kumar, editor, and Mylavarapu, Phani, editor
- Published
- 2025
- Full Text
- View/download PDF
8. Outcome-guided Bayesian clustering for disease subtype discovery using high-dimensional transcriptomic data.
- Author
- Meng, Lingsong and Huo, Zhiguang
- Subjects
- GIBBS sampling, FALSE discovery rate, TREATMENT effectiveness, ALZHEIMER'S disease, CLUSTER sampling
- Abstract
Due to the tremendous heterogeneity of disease manifestations, many complex diseases that were once thought to be single diseases are now considered to have disease subtypes. Disease subtyping analysis, that is, the identification of subgroups of patients with similar characteristics, is the first step toward precision medicine. With the advancement of high-throughput technologies, omics data offers unprecedented opportunity to reveal disease subtypes. As a result, unsupervised clustering analysis has been widely used for this purpose. Though promising, the subtypes obtained from traditional quantitative approaches may not always be clinically meaningful (i.e. correlate with clinical outcomes). On the other hand, the collection of rich clinical data in modern epidemiology studies has great potential to facilitate the disease subtyping process via omics data and to discover clinically meaningful disease subtypes. Thus, we developed an outcome-guided Bayesian clustering (GuidedBayesianClustering) method to fully integrate the clinical data and the high-dimensional omics data. A Gaussian mixed model framework was applied to perform sample clustering; a spike-and-slab prior was utilized to perform gene selection; a mixture model prior was employed to incorporate the guidance from a clinical outcome variable; and a decision framework was adopted to infer the false discovery rate of the selected genes. We deployed conjugate priors to facilitate efficient Gibbs sampling. Our proposed full Bayesian method is capable of simultaneously (i) obtaining sample clustering (disease subtype discovery); (ii) performing feature selection (select genes related to the disease subtype); and (iii) utilizing clinical outcome variable to guide the disease subtype discovery. The superior performance of the GuidedBayesianClustering was demonstrated through simulations and applications to breast cancer expression data and Alzheimer's disease data. An R package has been made publicly available on GitHub to improve the applicability of our method. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
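The GuidedBayesianClustering model in entry 8 combines several ingredients; its spike-and-slab gene-selection step can be illustrated in isolation. Below is a minimal, assumed normal-means version with fixed hyperparameters (sigma, tau, pi_incl are invented for the example), not the authors' full sampler.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Illustrative data: most means are zero, a few come from the slab.
sigma, tau, pi_incl = 1.0, 3.0, 0.1   # invented hyperparameters
J = 200
theta_true = np.where(rng.random(J) < pi_incl, rng.normal(0, tau, J), 0.0)
y = theta_true + rng.normal(0, sigma, J)

# One Gibbs sweep for (gamma_j, theta_j); in a full sampler this sits inside
# a loop that also updates sigma, tau, pi_incl and the cluster labels.
# gamma_j | y: compare slab vs spike marginal likelihoods (theta collapsed).
like_slab = pi_incl * norm.pdf(y, 0.0, np.sqrt(sigma**2 + tau**2))
like_spike = (1.0 - pi_incl) * norm.pdf(y, 0.0, sigma)
p_incl = like_slab / (like_slab + like_spike)
gamma = rng.random(J) < p_incl

# theta_j | gamma_j = 1, y: conjugate normal update; zero under the spike.
v = 1.0 / (1.0 / tau**2 + 1.0 / sigma**2)
theta = np.where(gamma, rng.normal(v * y / sigma**2, np.sqrt(v)), 0.0)

print("selected indices:", np.flatnonzero(p_incl > 0.5)[:10])
```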
9. Bayesian Hierarchy model for population pharmacokinetics of amikacin in Japanese clinical population.
- Author
- Zhou, Ziyue, Li, Guodong, Xu, Zhaosi, and Zhu, Liping
- Subjects
- MARKOV chain Monte Carlo, GIBBS sampling, JAPANESE people, MARKOV processes, RESPIRATORY infections
- Abstract
Amikacin is an aminoglycoside with a narrow therapeutic window, a significant dose–response relationship, and substantial interindividual pharmacokinetic (PK) variability, thus requiring an individualized dosing regimen. In this paper, a three-stage Bayesian hierarchical model was developed based on the known PK parameters of amikacin obtained from a nonlinear mixed-effects model established for the Japanese population. Weights were assigned to the prior and posterior parts before two-dimensional Gibbs sampling, and simulations were performed using data from 24 elderly patients with respiratory tract infections in Japan. After analyzing the predicted values, the range between effective trough concentrations ($C_{trough}$) and peak concentrations ($C_{peak}$), and the residual plots, the dose for patients 3, 7, 9, and 16 was increased to 600 mg/day and the dose for patient 20 was decreased to 400 mg/day, while the remaining patients' doses were kept unchanged. The serum concentration at the time of the last administration was then predicted, showing that the Bayesian hierarchical model and the Markov chain Monte Carlo algorithm in this study performed well. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
10. Quantum memory at nonzero temperature in a thermodynamically trivial system.
- Author
- Hong, Yifan, Guo, Jinkang, and Lucas, Andrew
- Subjects
- ISING model, GIBBS sampling, TRANSITION temperature, CRITICAL temperature, PHASE transitions
- Abstract
Passive error correction protects logical information forever (in the thermodynamic limit) by updating the system based only on local information and few-body interactions. A paradigmatic example is the classical two-dimensional Ising model: a Metropolis-style Gibbs sampler retains the sign of the initial magnetization (a logical bit) for thermodynamically long times in the low-temperature phase. Known models of passive quantum error correction similarly exhibit thermodynamic phase transitions to a low-temperature phase wherein logical qubits are protected by thermally stable topological order. Here, in contrast, we show that certain families of constant-rate classical and quantum low-density parity check codes have no thermodynamic phase transitions at nonzero temperature, but nonetheless exhibit ergodicity-breaking dynamical transitions: below a critical nonzero temperature, the mixing time of local Gibbs sampling diverges in the thermodynamic limit. Slow Gibbs sampling of such codes enables fault-tolerant passive quantum error correction using finite-depth circuits. This strategy is well suited to measurement-free quantum error correction, and may present a desirable experimental alternative to conventional quantum error correction based on syndrome measurements and active feedback. It has been commonly assumed that self-correcting quantum memories are only possible in systems with finite-temperature phase transitions to topological order. Here the authors show a complete breakdown of this expectation in quantum low-density parity-check codes. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
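The "paradigmatic example" in entry 10, a Gibbs sampler on the 2D Ising model that retains the sign of the initial magnetization below the critical temperature, is easy to reproduce. A minimal heat-bath sketch; the lattice size, temperature, and sweep count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)

L, beta = 32, 0.6          # beta > beta_c ~ 0.4407, i.e. low-temperature phase
spins = np.ones((L, L), dtype=int)   # logical bit encoded in sign of m

def gibbs_sweep(spins):
    # Heat-bath (Gibbs) dynamics: each spin is redrawn from its exact
    # conditional given its four neighbours (periodic boundaries).
    for i in range(L):
        for j in range(L):
            h = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                 + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            spins[i, j] = 1 if rng.random() < p_up else -1
    return spins

for sweep in range(200):
    spins = gibbs_sweep(spins)
m = spins.mean()
print(f"magnetization after 200 sweeps: {m:+.3f}")  # stays near +1 below T_c
```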
11. Estimating Posterior Sensitivities with Application to Structural Analysis of Bayesian Vector Autoregressions.
- Author
- Jacobi, Liana, Zhu, Dan, and Joshi, Mark
- Subjects
- MARKOV chain Monte Carlo, TIME series analysis, IMPULSE response, GIBBS sampling, AUTOMATIC differentiation
- Abstract
The inherent feature of Bayesian empirical analysis is the dependence of posterior inference on prior parameters, which researchers typically specify. However, quantifying the magnitude of this dependence remains difficult. This article extends Infinitesimal Perturbation Analysis, widely used in classical simulation, to compute asymptotically unbiased and consistent sensitivities of posterior statistics with respect to prior parameters from Markov chain Monte Carlo inference via Gibbs sampling. The method demonstrates the possibility of efficiently computing the complete set of prior sensitivities for a wide range of posterior statistics, alongside the estimation algorithm using Automatic Differentiation. The method's application is exemplified in Bayesian Vector Autoregression analysis of fiscal policy in U.S. macroeconomic time series data. The analysis assesses the sensitivities of posterior estimates, including the Impulse response functions and Forecast error variance decompositions, to prior parameters under common Minnesota shrinkage priors. The findings illuminate the significant and intricate influence of prior specification on the posterior distribution. This effect is particularly notable in crucial posterior statistics, such as the substantial absolute eigenvalue of the companion matrix, ultimately shaping the structural analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
12. The Win Ratio Approach in Bayesian Monitoring for Two‐Arm Phase II Clinical Trial Designs With Multiple Time‐To‐Event Endpoints.
- Author
- Huang, Xinran, Wang, Jian, and Ning, Jing
- Subjects
- TREATMENT effectiveness, GIBBS sampling, LOGNORMAL distribution, CANCER relapse, FRUSTRATION
- Abstract
To assess the preliminary therapeutic impact of a novel treatment, futility monitoring is commonly employed in Phase II clinical trials to facilitate informed decisions regarding the early termination of trials. Given the rapid evolution in cancer treatment development, particularly with new agents like immunotherapeutic agents, the focus has often shifted from objective response to time‐to‐event endpoints. In trials involving multiple time‐to‐event endpoints, existing monitoring designs typically select one as the primary endpoint or employ a composite endpoint as the time to the first occurrence of any event. However, relying on a single efficacy endpoint may not adequately evaluate an experimental treatment. Additionally, the time‐to‐first‐event endpoint treats all events equally, ignoring their differences in clinical priorities. To tackle these issues, we propose a Bayesian futility monitoring design for a two‐arm randomized Phase II trial, which incorporates the win ratio approach to account for the clinical priority of multiple time‐to‐event endpoints. A joint lognormal distribution was assumed to model the time‐to‐event variables for the estimation. We conducted simulation studies to assess the operating characteristics of the proposed monitoring design and compared them to those of conventional methods. The proposed design allows for early termination for futility if the endpoint with higher clinical priority (e.g., death) deteriorates in the treatment arm, compared to the time‐to‐first‐event approach. Meanwhile, it prevents an aggressive early termination if the endpoint with lower clinical priority (e.g., cancer recurrence) shows deterioration in the treatment arm, offering a more tailored approach to decision‐making in clinical trials with multiple time‐to‐event endpoints. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Robust Bayesian Modeling of Counts with Zero Inflation and Outliers: Theoretical Robustness and Efficient Computation.
- Author
- Hamura, Yasuyuki, Irie, Kaoru, and Sugasawa, Shonosuke
- Subjects
- NEGATIVE binomial distribution, MARKOV chain Monte Carlo, GAUSSIAN processes, BETA distribution, POISSON regression, GIBBS sampling
- Abstract
Count data with zero inflation and large outliers are ubiquitous in many scientific applications. However, posterior analysis under a standard statistical model, such as a Poisson or negative binomial distribution, is sensitive to such contamination. This study introduces a novel framework for Bayesian modeling of counts that is robust to both zero inflation and large outliers. In doing so, we introduce a rescaled beta distribution and adopt it to absorb the undesirable effects of zero and outlying counts. The proposed approach has two appealing features: the efficiency of the posterior computation via a custom Gibbs sampling algorithm and a theoretically guaranteed posterior robustness, where extreme outliers are automatically removed from the posterior distribution. We demonstrate the usefulness of the proposed method by applying it to trend filtering and spatial modeling using predictive Gaussian processes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. A sparse latent class model incorporating response times.
- Author
- He, Siqi, Culpepper, Steven Andrew, and Douglas, Jeffrey A.
- Subjects
- PERSONALITY assessment, MONTE Carlo method, GIBBS sampling, STIMULUS & response (Psychology), PARAMETER estimation
- Abstract
Diagnostic models (DM) have been widely used to classify respondents' latent attributes in cognitive and non‐cognitive assessments. The integration of response times (RTs) with DM presents additional evidence to understand respondents' problem‐solving behaviours. While recent research has explored using sparse latent class models (SLCM) to infer the latent structure of items based on item responses, the incorporation of RT data within these models remains underexplored. This study extends the SLCM framework to include RT, relaxing the conditional independence assumption between RT and latent attributes given individual speed. This adaptation provides a more flexible framework for jointly modelling RT and item responses. While the proposed model holds promise for applications in educational assessment, this study applied the model to the Fisher Temperament Inventory, yielding findings that provide a novel perspective on utilizing DM with RT in personality assessments. Additionally, a Gibbs sampling algorithm is proposed for parameter estimation. Results from Monte Carlo simulations demonstrate the algorithm's accuracy and efficiency. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. An Extension of the Unified Skew-Normal Family of Distributions and its Application to Bayesian Binary Regression.
- Author
- Onorati, Paolo and Liseo, Brunero
- Subjects
- LOGISTIC regression analysis, REGRESSION analysis, CUMULANTS, DATA analysis, GIBBS sampling, ALGORITHMS, GAUSSIAN mixture models
- Abstract
We consider the Bayesian binary regression model and introduce a new class of distributions, the Perturbed Unified Skew-Normal (pSUN, henceforth), which generalizes the Unified Skew-Normal (SUN) class. We show that the new class is conjugate to any binary regression model, provided that the link function may be expressed as a scale mixture of Gaussian CDFs. We discuss the popular logit case in detail, and we show that, when a logistic regression model is combined with a Gaussian prior, posterior summaries such as cumulants and normalizing constants can easily be obtained through an importance sampling approach, opening the way to straightforward variable selection procedures. For more general prior distributions, the proposed methodology is based on a simple Gibbs sampler algorithm. We also claim that, in the p > n case, our proposal presents better performance, both in terms of mixing and accuracy, compared to existing methods. We illustrate the performance through several simulation studies and two data analyses. Supplementary Materials for this article, including the R package pSUN, are available online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
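Entry 15 notes that, for a logistic model with a Gaussian prior, normalizing constants can be obtained by importance sampling. Below is a generic sketch of that idea using a Laplace-approximation proposal; it is not the pSUN machinery itself, and the data and prior settings are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import multivariate_normal

rng = np.random.default_rng(4)

# Invented logistic-regression data; N(0, s^2 I) prior on the coefficients.
n, p, s = 200, 3, 2.0
X = rng.normal(size=(n, p))
y = (rng.random(n) < expit(X @ np.array([1.0, -2.0, 0.5]))).astype(float)

def log_post_unnorm(b):
    # log likelihood + log prior (unnormalized posterior density).
    eta = X @ b
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
    logprior = -0.5 * b @ b / s**2 - 0.5 * p * np.log(2 * np.pi * s**2)
    return loglik + logprior

# Gaussian proposal centred at the posterior mode (Laplace approximation).
res = minimize(lambda b: -log_post_unnorm(b), np.zeros(p))  # BFGS by default
mode, cov = res.x, res.hess_inv

# Importance sampling: Z ~= average of pi_unnorm(b) / q(b) under b ~ q.
draws = rng.multivariate_normal(mode, cov, size=5000)
log_q = multivariate_normal(mode, cov).logpdf(draws)
log_w = np.array([log_post_unnorm(b) for b in draws]) - log_q
log_Z = np.logaddexp.reduce(log_w) - np.log(log_w.size)
print("estimated log normalizing constant:", log_Z)
```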
16. A hierarchical random effects state-space model for modeling brain activities from electroencephalogram data.
- Author
- Guo, Xingche, Yang, Bin, Loh, Ji Meng, Wang, Qinxia, and Wang, Yuanjia
- Subjects
- RANDOM effects model, TREATMENT effect heterogeneity, GIBBS sampling, MENTAL depression, RANDOM matrices, ELECTROENCEPHALOGRAPHY
- Abstract
Mental disorders present challenges in diagnosis and treatment due to their complex and heterogeneous nature. Electroencephalogram (EEG) has shown promise as a source of potential biomarkers for these disorders. However, existing methods for analyzing EEG signals have limitations in addressing heterogeneity and capturing complex brain activity patterns between regions. This paper proposes a novel random effects state-space model (RESSM) for analyzing large-scale multi-channel resting-state EEG signals, accounting for the heterogeneity of brain connectivities between groups and individual subjects. We incorporate multi-level random effects for temporal dynamical and spatial mapping matrices and address non-stationarity so that the brain connectivity patterns can vary over time. The model is fitted under a Bayesian hierarchical model framework coupled with a Gibbs sampler. Compared to previous mixed-effects state-space models, we directly model high-dimensional random effects matrices of interest without structural constraints and tackle the challenge of identifiability. Through extensive simulation studies, we demonstrate that our approach yields valid estimation and inference. We apply RESSM to a multi-site clinical trial of major depressive disorder (MDD). Our analysis uncovers significant differences in resting-state brain temporal dynamics among MDD patients compared to healthy individuals. In addition, we show the subject-level EEG features derived from RESSM exhibit a superior predictive value for the heterogeneous treatment effect compared to the EEG frequency band power, suggesting the potential of EEG as a valuable biomarker for MDD. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. A Bayesian joint model for mediation analysis with matrix-valued mediators.
- Author
- Liu, Zijin, Liu, Zhihui (Amy), Hosni, Ali, Kim, John, Jiang, Bei, and Saarela, Olli
- Subjects
- RADIOTHERAPY treatment planning, FEATURE extraction, GIBBS sampling, ANUS, PRINCIPAL components analysis, MEDIATION (Statistics)
- Abstract
Unscheduled treatment interruptions may lead to reduced quality of care in radiation therapy (RT). Identifying the RT prescription dose effects on the outcome of treatment interruptions, mediated through doses distributed into different organs at risk (OARs), can inform future treatment planning. The radiation exposure to OARs can be summarized by a matrix of dose-volume histograms (DVH) for each patient. Although various methods for high-dimensional mediation analysis have been proposed recently, few studies investigated how matrix-valued data can be treated as mediators. In this paper, we propose a novel Bayesian joint mediation model for high-dimensional matrix-valued mediators. In this joint model, latent features are extracted from the matrix-valued data through an adaptation of probabilistic multilinear principal components analysis (MPCA), retaining the inherent matrix structure. We derive and implement a Gibbs sampling algorithm to jointly estimate all model parameters, and introduce a Varimax rotation method to identify active indicators of mediation among the matrix-valued data. Our simulation study finds that the proposed joint model has higher efficiency in estimating causal decomposition effects compared to an alternative two-step method, and demonstrates that the mediation effects can be identified and visualized in the matrix form. We apply the method to study the effect of prescription dose on treatment interruptions in anal canal cancer patients. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. BAYESIAN APPROACH FOR HEAVY-TAILED MODEL FITTING IN TWO LOMAX POPULATIONS.
- Author
- Lingutla, Vijay Kumar and Nadiminti, Nagamani
- Subjects
- BAYESIAN analysis, RELIABILITY in engineering
- Abstract
Heavy-tailed data are commonly encountered in various real-world applications, particularly in finance, insurance, and reliability engineering. This study focuses on the Lomax distribution, a powerful tool for modeling heavy-tailed phenomena. We investigate the estimation of parameters in two Lomax populations characterized by a common shape parameter and distinct scale parameters. Our analysis employs both Maximum Likelihood Estimation (MLE) and Bayesian estimation techniques, recognizing the absence of closed-form solutions for the estimators. We utilize the Newton-Raphson method for numerical evaluation of the MLE and implement Lindley's approximation for Bayesian estimators with different priors under a symmetric loss function. Additionally, we estimate posterior densities using Gibbs sampling and bootstrapping methods to manage uncertainty. A Monte Carlo simulation study is conducted to assess the performance of the proposed estimators, providing insights into their behavior under various scenarios. This paper also discusses the application of these methodologies through a real-life example, demonstrating the practical utility of the proposed estimation techniques for analyzing heavy-tailed data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
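Entry 18 estimates a common shape parameter and distinct scale parameters across two Lomax populations. As a hypothetical illustration of the frequentist piece only, the sketch below maximizes the joint log-likelihood numerically with scipy (standing in for a hand-coded Newton-Raphson; the Bayesian steps are omitted):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lomax

rng = np.random.default_rng(5)

# Two samples sharing shape alpha = 2 with scales 1.0 and 3.0 (synthetic).
x1 = lomax.rvs(c=2.0, scale=1.0, size=300, random_state=rng)
x2 = lomax.rvs(c=2.0, scale=3.0, size=300, random_state=rng)

def neg_loglik(params):
    # Parameters are log-transformed to keep alpha, lam1, lam2 positive.
    alpha, lam1, lam2 = np.exp(params)
    ll = 0.0
    for x, lam in ((x1, lam1), (x2, lam2)):
        # Lomax log-density: log a - log lam - (a + 1) log(1 + x / lam)
        ll += x.size * (np.log(alpha) - np.log(lam))
        ll -= (alpha + 1.0) * np.sum(np.log1p(x / lam))
    return -ll

res = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
print("MLE (alpha, lambda1, lambda2):", np.exp(res.x))
```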
19. Estimating the Lifetime Parameters of the Odd-Generalized-Exponential–Inverse-Weibull Distribution Using Progressive First-Failure Censoring: A Methodology with an Application.
- Author
- Ramadan, Mahmoud M., EL-Sagheer, Rashad M., and Abd-El-Monem, Amel
- Subjects
- FISHER information, GIBBS sampling, BAYES' estimation, GAMMA distributions, HAZARD function (Statistics), GAUSSIAN distribution
- Abstract
This paper investigates statistical methods for estimating unknown lifetime parameters using a progressive first-failure censoring dataset. The failure mode's lifetime distribution is modeled by the odd-generalized-exponential–inverse-Weibull distribution. Maximum-likelihood estimators for the model parameters, including the survival, hazard, and inverse hazard rate functions, are obtained, though they lack closed-form expressions. The Newton–Raphson method is used to compute these estimations. Confidence intervals for the parameters are approximated via the normal distribution of the maximum-likelihood estimation. The Fisher information matrix is derived using the missing information principle, and the delta method is applied to approximate the confidence intervals for the survival, hazard rate, and inverse hazard rate functions. Bayes estimators were calculated with the squared error, linear exponential, and general entropy loss functions, utilizing independent gamma distributions for informative priors. Markov-chain Monte Carlo sampling provides the highest-posterior-density credible intervals and Bayesian point estimates for the parameters and reliability characteristics. This study evaluates these methods through Monte Carlo simulations, comparing Bayes and maximum-likelihood estimates based on mean squared errors for point estimates, average interval widths, and coverage probabilities for interval estimators. A real dataset is also analyzed to illustrate the proposed methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Rapid non-contact measurement of distance between two pins of flexspline in harmonic reducers based on standard/actual parts comparison.
- Author
- Liu, Caitao, Cui, YuGuo, Liang, Dan, Liu, Li, and Lou, JunQiang
- Subjects
- MEASUREMENT errors, ECCENTRICS (Machinery), MEASURING instruments, NOISE measurement, HARMONIC suppression filters, GIBBS sampling
- Abstract
In order to achieve rapid and precise measurement of the distance between the two pins of the flexspline in harmonic reducers, a rapid non-contact measurement strategy based on standard/actual parts comparison is proposed. Firstly, to eliminate the installation eccentricity error of the flexspline fixture, a sine-quadrant eccentricity error elimination method is designed. The sinusoidal curve and quadrant of the measured fixture eccentricity error with respect to the fixture rotation angle are used to calculate the eccentric error components along the x and y axes, which has the advantages of simplicity and rapidity. Secondly, a Gaussian-Harmonic Wavelet Filtering (GHWF) algorithm is proposed to filter out the noise in the measurement process, which can effectively suppress the Gibbs phenomenon in harmonic wavelet transformation and improve the signal-to-noise ratio. Finally, an experimental platform including a baseplate, turntable, flexspline, moving platform, and laser sensor is constructed in order to verify the performance of error elimination, noise filtering, and distance measuring. Experimental results show that the measurement error of the proposed strategy is less than 7 μm, which is consistent with the accuracy obtained by a commercial high-precision gear measuring instrument. The average measurement time is about 29.6 s, much less than the 5 min of the commercial instrument, showing great application potential for the efficient distance measurement of gears and flexsplines. Highlights:
- A rapid non-contact measurement strategy for the distance between two pins of the flexspline in harmonic reducers.
- A sine-quadrant eccentricity error elimination method to address the eccentricity issue in fixture installation.
- A Gaussian-Harmonic Wavelet Filtering algorithm combining the harmonic wavelet with a Gaussian function.
- Various experiments and verifications.
[ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Full Bayesian analysis of triple seasonal autoregressive models.
- Author
- Amin, Ayman A.
- Subjects
- BAYESIAN analysis, TIME series analysis, RANDOM variables, AUTOREGRESSIVE models, PROGRAMMING languages, GIBBS sampling
- Abstract
Summary: Seasonal autoregressive (SAR) time series models have been extended to fit time series exhibiting multiple seasonalities. However, hardly any research in Bayesian literature has been done on modelling multiple seasonalities. In this article, we propose a full Bayesian analysis of triple SAR (TSAR) models for time series with triple seasonality, considering identification, estimation and prediction for these TSAR models. In this Bayesian analysis of TSAR models, we assume the model errors to be normally distributed and the model order to be a random variable with a known maximum value, and we employ the g prior for the model coefficients and variance. Accordingly, we first derive the posterior mass function of the TSAR order in closed form, which then enables us to identify the best order of TSAR model as the order value with the highest posterior probability. In addition, we derive the conditional posteriors to be a multivariate normal for the TSAR coefficients and to be an inverse gamma for the TSAR variance; also, we derive the conditional predictive distribution to be a multivariate normal for future observations. Since these derived conditional distributions are in closed forms, we introduce the Gibbs sampler to present the Bayesian analysis of TSAR models and to easily produce multiple‐step‐ahead predictions. Using Julia programming language, we conduct an extensive simulation study, aiming to evaluate the accuracy of our proposed full Bayesian analysis for TSAR models. In addition, we apply our work on time series to hourly electricity load in some European countries. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
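Entry 21 derives a multivariate normal conditional for the autoregressive coefficients and an inverse gamma conditional for the variance. Those two conditional draws can be shown on a plain AR(2) model; the sketch below is an assumed simplification (ordinary normal priors rather than the paper's g prior, no seasonal structure, no order selection):

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulate an AR(2) series (stand-in for the paper's triple-seasonal model).
T, phi_true = 500, np.array([0.5, -0.3])
y = np.zeros(T)
for t in range(2, T):
    y[t] = phi_true @ y[t - 2:t][::-1] + rng.normal()

p = 2
Y = y[p:]
X = np.column_stack([y[p - k - 1:T - k - 1] for k in range(p)])  # lagged design

# Illustrative priors: phi ~ N(0, g I), sig2 ~ IG(a0, b0).
g, a0, b0 = 100.0, 2.0, 1.0
phi, sig2 = np.zeros(p), 1.0
draws = []
for it in range(3000):
    # phi | sig2, y ~ multivariate normal (conjugate update)
    V = np.linalg.inv(X.T @ X / sig2 + np.eye(p) / g)
    m = V @ (X.T @ Y) / sig2
    phi = rng.multivariate_normal(m, V)
    # sig2 | phi, y ~ inverse gamma, drawn as 1 / Gamma(shape, scale)
    resid = Y - X @ phi
    sig2 = 1.0 / rng.gamma(a0 + Y.size / 2, 1.0 / (b0 + 0.5 * resid @ resid))
    draws.append(phi)
print("posterior mean of AR coefficients:", np.mean(draws[500:], axis=0))
```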
22. A computationally efficient Gibbs sampler based on data-augmentation strategy for estimating the reparameterized DINA model.
- Author
- Zhang, Jiwei, Zhang, Zhaoyuan, and Lu, Jing
- Subjects
- ABILITY grouping (Education), BAYESIAN field theory, REDUCTION potential, ALGORITHMS, GIBBS sampling, PROBABILITY theory
- Abstract
With the increasing demand for precise test feedback, cognitive diagnosis models (CDMs) have attracted more and more attention for fine classification of students with regard to their ability to master given skills. The aim of this paper is to use a highly effective Gibbs algorithm based on auxiliary variables (GAAV) to estimate the deterministic input noisy "and" gate (DINA) model that is widely used for cognitive diagnosis. The applicability of the algorithm to other CDMs is also discussed. Unlike the Metropolis–Hastings algorithm, this new algorithm does not require repeated adjustment of the tuning parameters to achieve an appropriate acceptance probability, and it also overcomes the dependence of the traditional Gibbs sampling algorithm on the conjugate prior distribution. Four simulation studies are conducted, and a detailed analysis of fraction subtraction test data is carried out to further illustrate the proposed methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Ultimate Pólya Gamma Samplers–Efficient MCMC for Possibly Imbalanced Binary and Categorical Data.
- Author
- Zens, Gregor, Frühwirth-Schnatter, Sylvia, and Wagner, Helga
- Subjects
- RANDOM variables, DATA augmentation, LOGISTIC regression analysis, REGRESSION analysis, MARKOV chain Monte Carlo, LATENT variables, GIBBS sampling
- Abstract
Modeling binary and categorical data is one of the most commonly encountered tasks of applied statisticians and econometricians. While Bayesian methods in this context have been available for decades now, they often require a high level of familiarity with Bayesian statistics or suffer from issues such as low sampling efficiency. To contribute to the accessibility of Bayesian models for binary and categorical data, we introduce novel latent variable representations based on Pólya-Gamma random variables for a range of commonly encountered logistic regression models. From these latent variable representations, new Gibbs sampling algorithms for binary, binomial, and multinomial logit models are derived. All models allow for a conditionally Gaussian likelihood representation, rendering extensions to more complex modeling frameworks such as state space models straightforward. However, sampling efficiency may still be an issue in these data augmentation based estimation frameworks. To counteract this, novel marginal data augmentation strategies are developed and discussed in detail. The merits of our approach are illustrated through extensive simulations and real data applications. Supplementary materials for this article are available online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
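The Pólya-Gamma data augmentation in entry 23 has a compact binary-logit special case. The sketch below assumes the third-party polyagamma package for the PG draws and invents the data and prior; the paper's actual contributions (imbalanced data, multinomial models, marginal data augmentation) are not reproduced here.

```python
import numpy as np
from polyagamma import random_polyagamma  # third-party: pip install polyagamma

rng = np.random.default_rng(7)

# Illustrative logistic data with a N(0, 10 I) prior on beta.
n, p = 500, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -1.5, 0.5])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ beta_true))).astype(float)
B_inv = np.eye(p) / 10.0          # prior precision
kappa = y - 0.5

beta = np.zeros(p)
draws = []
for it in range(2000):
    # omega_i | beta ~ PG(1, x_i' beta): the Polya-Gamma augmentation step
    omega = random_polyagamma(1, X @ beta, random_state=rng)
    # beta | omega, y ~ N(m, V): the conditionally Gaussian representation
    V = np.linalg.inv(X.T @ (omega[:, None] * X) + B_inv)
    m = V @ (X.T @ kappa)
    beta = rng.multivariate_normal(m, V)
    draws.append(beta)
print("posterior mean:", np.mean(draws[500:], axis=0))
```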
24. Bayesian regression models in gretl: the BayTool package.
- Author
- Pedini, Luca
- Subjects
- GIBBS sampling, ECONOMETRIC models, INTEGRATED software, REGRESSION analysis, SAMPLING methods
- Abstract
This article presents the gretl package BayTool which integrates the software functionalities, mostly concerned with frequentist approaches, with Bayesian estimation methods of commonly used econometric models. Computational efficiency is achieved by pairing an extensive use of Gibbs sampling for posterior simulation with the possibility of splitting single-threaded experiments into multiple cores or machines by means of parallelization. From the user's perspective, the package requires only basic knowledge of gretl scripting to fully access its functionality, while providing a point-and-click solution in the form of a graphical interface for a less experienced audience. These features, in particular, make BayTool stand out as an excellent teaching device without sacrificing more advanced or complex applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. A Novel Bayesian probabilistic distance clustering algorithm.
- Author
- Tabibi Gilani, Morteza, Zarei, Reza, and Tabibi Gilani, Niloofar
- Abstract
Recently, Tortora et al. (SN Comput Sci 1:65, 2020) introduced two probabilistic d-clustering algorithms based on the multivariate Gaussian and multivariate Student-t distributions, which exhibit superior performance compared to probabilistic d-clustering and k-means algorithms. However, these algorithms may struggle when the variances of individual clusters are heterogeneous. This paper presents a unified Bayesian approach to Gaussian probabilistic distance clustering to address this issue, employing the Gibbs posterior. We derived a closed-form posterior distribution for each unknown parameter using this approach. The effectiveness of the extended method was demonstrated through two numerical examples, including one simulation study and one real data analysis based on three datasets. The proposed method was further compared with conventional methods, demonstrating its superior accuracy. Simulation studies and real data analyses indicate that in many cases, mainly when there is correlation, overlap, or a data variance greater than one, as well as when overlap alone exists, the Bayesian Gaussian probabilistic distance clustering algorithm outperforms the Gaussian probabilistic distance clustering algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Inference of multi‐sample stage life testing model under Weibull distribution.
- Author
- Samanta, Debashis and Kundu, Debasis
- Subjects
- WEIBULL distribution, GIBBS sampling, MAXIMUM likelihood statistics, BAYESIAN field theory, CONFIDENCE intervals
- Abstract
In this article we consider the meta-analysis of stage life testing experiments. We propose a method to combine the data obtained from $s$ independent stage life testing experiments. We have assumed that there are only two stress levels for each stage life testing experiment and that the lifetime of the experimental units follows a Weibull distribution at each stress level. The distributions under the two stress levels are connected through the Khamis–Higgins model assumption. We assume that the shape parameters of the Weibull distribution are the same for all the samples; however, the scale parameters are different. We provide the maximum likelihood estimation and the asymptotic confidence intervals of the model parameters. We also provide the Bayesian inference of the model parameters. The Bayes estimates and the associated credible intervals are obtained using the Gibbs sampling technique, since the explicit forms of the Bayes estimates do not exist. We have performed an extensive simulation study to see the performances of the different estimators, and the analyses of two data sets for illustrative purposes. The results are quite satisfactory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. User Association in User-Centric Hybrid VLC/RF Cell-Free Massive MIMO Systems.
- Author
- Almehdhar, Ahmed, Obeed, Mohanad, Chaaban, Anas, and Zummo, Salam A.
- Subjects
- GIBBS sampling, MIMO systems, OPTICAL communications, TELECOMMUNICATION systems, VISIBLE spectra
- Abstract
A continuous goal in all communication systems is to enhance users' experience and provide them with the highest possible data rates. Recently, the concept of cell-free massive MIMO (CF-mMIMO) systems has been considered to enhance the performance of systems that operate solely with radio-frequency (RF) or visible light communication (VLC) technologies. In this paper, a hybrid VLC/RF cell-free massive MIMO system is proposed where an RF cell-free network and a VLC cell-free network coexist to serve the users. The idea is to utilize the benefits of each network and balance the load with the aim of maximizing the system's sum-rate. The system is evaluated using zero-forcing (ZF) precoding scheme. Two distinct user association algorithms are proposed for assigning users to either the VLC network or the RF network. In addition, two user-centric clustering approaches are proposed and evaluated. Simulation results show that the proposed association algorithms significantly outperform a random network association of users in terms of sum-rate. Results also show great potential for the proposed system compared to standalone cell-free networks. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Ordered probit Bayesian additive regression trees for ordinal data.
- Author
- Lee, Jaeyong and Hwang, Beom Seuk
- Subjects
- GIBBS sampling, SAMPLING (Process), CONFOUNDING variables, PROBIT analysis, REGRESSION trees
- Abstract
Bayesian additive regression trees (BART) is a nonparametric model that is known for its flexibility and strong statistical foundation. To address a robust and flexible approach to analyse ordinal data, we extend BART into an ordered probit regression framework (OPBART). Further, we propose a semiparametric setting for OPBART (semi‐OPBART) to model covariates of interest parametrically and confounding variables nonparametrically. We also provide Gibbs sampling procedures to implement the proposed models. In both simulations and real data studies, the proposed models demonstrate superior performance over other competing ordinal models. We also highlight enhanced interpretability of semi‐OPBART in terms of inference through marginal effects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
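Entry 28's OPBART replaces the linear predictor of an ordered probit with a BART sum of trees. The latent-variable Gibbs step itself can be illustrated with a plain linear ordered probit and fixed, assumed cutpoints (an Albert-Chib-style sketch, not the authors' model):

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(8)

# Illustrative ordinal data with K = 3 categories and known cutpoints;
# OPBART would replace X @ beta below with a sum-of-trees prediction.
n, p = 400, 2
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -1.0])
cuts = np.array([-np.inf, 0.0, 1.0, np.inf])  # fixed cutpoints (assumption)
z_latent = X @ beta_true + rng.normal(size=n)
y = np.searchsorted(cuts, z_latent, side="right") - 1   # categories 0..2

beta = np.zeros(p)
B_inv = np.eye(p) / 10.0
draws = []
for it in range(2000):
    # z_i | beta, y_i ~ N(x_i' beta, 1) truncated to (cuts[y_i], cuts[y_i+1])
    mu = X @ beta
    lo, hi = cuts[y] - mu, cuts[y + 1] - mu
    z = mu + truncnorm.rvs(lo, hi, random_state=rng)
    # beta | z ~ multivariate normal (conjugate update, unit latent variance)
    V = np.linalg.inv(X.T @ X + B_inv)
    m = V @ (X.T @ z)
    beta = rng.multivariate_normal(m, V)
    draws.append(beta)
print("posterior mean:", np.mean(draws[500:], axis=0))
```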
29. Artwork pricing model integrating the popularity and ability of artists.
- Author
- Park, Jinsu, Lee, Yoonjin, Yang, Daewon, Park, Jongho, and Jung, Hohyun
- Abstract
Considerable research has been devoted to understanding the popularity effect on the art market dynamics, meaning that artworks by popular artists tend to have high prices. The hedonic pricing model has employed artists' reputation attributes, such as survey results, to understand the popularity effect, but the reputation attributes are constant and not properly defined at the point of artwork sales. Moreover, the artist's ability has been measured via random effect in the hedonic model, which fails to reflect ability changes. To remedy these problems, we present a method to define the popularity measure using the artwork sales dataset without relying on the artist's reputation attributes. Also, we propose a novel pricing model to appropriately infer the time-dependent artist's abilities using the presented popularity measure. An inference algorithm is presented using the EM algorithm and Gibbs sampling to estimate model parameters and artist abilities. We use the Artnet dataset to investigate the size of the rich-get-richer effect and the variables affecting artwork prices in real-world art market dynamics. We further conduct inferences about artists' abilities under the popularity effect and examine how ability changes over time for various artists with remarkable interpretations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. A clustering approach to integrative analyses of multiomic cancer data.
- Author
- Yan, Dongyan and Guha, Subharup
- Subjects
- DNA probes, OXYGENATORS, GIBBS sampling, LUNG cancer, CANCER invasiveness
- Abstract
Rapid technological advances have allowed for molecular profiling across multiple omics domains for clinical decision-making in many diseases, especially cancer. However, as tumor development and progression are biological processes involving composite genomic aberrations, key challenges are to effectively assimilate information from these domains to identify genomic signatures and druggable biological entities, develop accurate risk prediction profiles for future patients, and identify novel patient subgroups for tailored therapy and monitoring. We propose integrative frameworks for high-dimensional multiple-domain cancer data. These Bayesian mixture model-based approaches coherently incorporate dependence within and between domains to accurately detect tumor subtypes, thus providing a catalog of genomic aberrations associated with cancer taxonomy. The flexible and scalable Bayesian nonparametric strategy performs simultaneous bidirectional clustering of the tumor samples and genomic probes to achieve dimension reduction. We describe an efficient variable selection procedure that can identify relevant genomic aberrations and potentially reveal underlying drivers of disease. Although the work is motivated by lung cancer datasets, the proposed methods are broadly applicable in a variety of contexts involving high-dimensional data. The success of the methodology is demonstrated using artificial data and lung cancer omics profiles publicly available from The Cancer Genome Atlas. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Parameter estimation for stable distributions and their mixture.
- Author
- Hajjaji, Omar, Manou-Abi, Solym Mawaki, and Slaoui, Yousri
- Subjects
- NUMERICAL roots, EXPECTATION-maximization algorithms, GIBBS sampling, CHARACTERISTIC functions, MAXIMUM likelihood statistics
- Abstract
In this paper, we consider estimating the parameters of univariate α-stable distributions and their mixtures. First, using a Gaussian kernel density estimator, we propose an estimation method based on the characteristic function. The optimal bandwidth parameter was selected using a plug-in method. We highlight another estimation procedure in the maximum likelihood framework based on the false position algorithm, which finds a numerical root of the log-likelihood through the score functions. For mixtures of α-stable distributions, the EM algorithm and the Bayesian estimation method have been modified to provide an efficient and valuable tool for parameter estimation. The proposed methods can be generalised to multiple mixtures, although we have limited the mixture study to two components. A simulation study is carried out to evaluate the performance of our methods, which are then applied to real data. Our results appear to accurately estimate mixtures of α-stable distributions. Applications concern the estimation of the number of replicates in the Mayotte COVID-19 dataset and the distribution of the N-acetyltransferase activity in the Bechtel et al. data for a urinary caffeine metabolite implicated in carcinogens. We compare the proposed methods, together with a detailed discussion. We conclude with the limitations of this study, together with other forthcoming work and a future implementation of an R package or Python library for the proposed methods in data modelling. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Flexible Distribution Approaches to Enhance Regression and Deep Topic Modelling Techniques.
- Author
- Koochemeshkian, Pantea and Bouguila, Nizar
- Subjects
- GIBBS sampling, PROGRAMMING languages, MACHINE learning, CORPORA, DATA modeling
- Abstract
This paper presents an extension of the Dirichlet multinomial regression (DMR) and deep Dirichlet multinomial regression (dDMR) topic modelling approaches by incorporating the generalised Dirichlet (GD) and Beta-Liouville (BL) distributions, using collapsed Gibbs sampling for parameter inference. The DMR and dDMR approaches have been shown to be effective in discovering latent topics in text corpora. However, these approaches have limitations when it comes to handling complex data structures and overfitting issues. To address these limitations, we introduce the GD and BL distributions, which have more flexibility in modelling complex data structures and handling sparse data. Additionally, we use collapsed Gibbs sampling to estimate the model parameters, which provides a computationally efficient method for inference. Experimental results on benchmark datasets demonstrate the effectiveness of the proposed approach in improving topic modelling performance, particularly in handling complex data structures and reducing overfitting. The proposed models also exhibit good interpretability of the learned topics, making them suitable for various applications in natural language processing and machine learning. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
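Entry 32 builds on collapsed Gibbs sampling for topic models. For reference, here is a minimal collapsed Gibbs sampler for vanilla Dirichlet-multinomial LDA on a synthetic corpus; the generalised Dirichlet and Beta-Liouville extensions are not implemented here, and all sizes and priors are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# Tiny synthetic corpus: documents are lists of word ids over a vocabulary.
docs = [rng.integers(0, 20, size=rng.integers(30, 60)).tolist() for _ in range(50)]
V, K, alpha, eta = 20, 4, 0.1, 0.01   # vocab size, topics, symmetric priors

D = len(docs)
z = [[int(rng.integers(0, K)) for _ in doc] for doc in docs]  # topic labels
n_dk = np.zeros((D, K)); n_kw = np.zeros((K, V)); n_k = np.zeros(K)
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = z[d][i]
        n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1

for sweep in range(200):
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            # Remove the token, sample its topic from the collapsed
            # conditional, then add it back with the new topic.
            n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
            p = (n_dk[d] + alpha) * (n_kw[:, w] + eta) / (n_k + V * eta)
            k = rng.choice(K, p=p / p.sum())
            z[d][i] = k
            n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1

print("top words per topic:\n", np.argsort(-n_kw, axis=1)[:, :5])
```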
33. Efficient fully Bayesian approach to brain activity mapping with complex-valued fMRI data.
- Author
- Wang, Zhengxin, Rowe, Daniel B., Li, Xinyi, and Andrew Brown, D.
- Subjects
- FUNCTIONAL magnetic resonance imaging, IMAGE segmentation, GIBBS sampling, BRAIN mapping, DATA mapping
- Abstract
Functional magnetic resonance imaging (fMRI) enables indirect detection of brain activity changes via the blood-oxygen-level-dependent (BOLD) signal. Conventional analysis methods mainly rely on the real-valued magnitude of these signals. In contrast, research suggests that analyzing both real and imaginary components of the complex-valued fMRI (cv-fMRI) signal provides a more holistic approach that can increase power to detect neuronal activation. We propose a fully Bayesian model for brain activity mapping with cv-fMRI data that accommodates temporal and spatial dynamics. Additionally, we propose a sampling algorithm that enhances processing speed through image partitioning and parallel computation, making our approach computationally efficient while remaining competitive with state-of-the-art methods. We support these claims with both simulated numerical studies and an application to real cv-fMRI data obtained from a finger-tapping experiment. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. A new LDA formulation with covariates.
- Author
- Shimizu, Gilson Y., Izbicki, Rafael, and Valle, Denis
- Subjects
- GIBBS sampling, GROCERY shopping, REGRESSION analysis, CORONAVIRUSES, STRUCTURAL models
- Abstract
The Latent Dirichlet Allocation (LDA) model is a popular method for creating mixed-membership clusters. Despite having been originally developed for text analysis, LDA has been used for a wide range of other applications. We propose a new formulation for the LDA model which incorporates covariates. In this model, a negative binomial regression is embedded within LDA, enabling straightforward interpretation of the regression coefficients and the analysis of the quantity of cluster-specific elements in each sampling unit (instead of the analysis being focused on modeling the proportion of each cluster, as in Structural Topic Models). We use slice sampling within a Gibbs sampling algorithm to estimate model parameters. We rely on simulations to show how our algorithm is able to successfully retrieve the true parameter values and to make predictions for the abundance matrix using the information given by the covariates. The model is illustrated using real data sets from three different areas: text-mining of Coronavirus articles, analysis of grocery shopping baskets, and ecology of tree species on Barro Colorado Island (Panama). This model allows the identification of mixed-membership clusters in discrete data and provides inference on the relationship between covariates and the abundance of these clusters. Highlights:
- We propose a new formulation for the Latent Dirichlet Allocation (LDA) model which incorporates covariates.
- Our extension enables a straightforward interpretation of the regression coefficients and the analysis of the quantity of cluster-specific elements in each sampling unit, including the prediction of these quantities in a new sample through covariates.
- We illustrate the benefits of this formulation using three data sets: text-mining of Coronavirus articles, analysis of grocery shopping baskets, and ecology of tree species on Barro Colorado Island.
- We provide an R package that enables users to readily apply our model.
[ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Variational Bayesian EM Algorithm for Quantile Regression in Linear Mixed Effects Models.
- Author
- Wang, Weixian and Tian, Maozai
- Subjects
- EXPECTATION-maximization algorithms, FIXED effects model, GIBBS sampling, BAYESIAN analysis, QUANTILE regression, DATA analysis
- Abstract
This paper extends the normal-beta prime (NBP) prior to Bayesian quantile regression in linear mixed effects models and conducts Bayesian variable selection for the fixed effects of the model. The choice of hyperparameters in the NBP prior is crucial, and we employed the Variational Bayesian Expectation–Maximization (VBEM) algorithm for model estimation and variable selection. The Gibbs sampling algorithm is a commonly used Bayesian method, and it can also be combined with the EM algorithm; we denote this combination GBEM. The results from our simulation and real data analysis demonstrate that both the VBEM and GBEM algorithms provide robust estimates for the hyperparameters in the NBP prior, reflecting the sparsity level of the true model. The VBEM and GBEM algorithms exhibit comparable accuracy and can effectively select important explanatory variables. The VBEM algorithm stands out in terms of computational efficiency, significantly reducing the time and resource consumption in the Bayesian analysis of high-dimensional, longitudinal data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Exploring a Bayesian sparse factor model-based strategy for the genetic analysis of thousands of mid-infrared spectra traits for animal breeding.
- Author
- Chen, Yansen, Atashi, Hadi, Qu, Jiayi, Delhez, Pauline, Runcie, Daniel, Soyeurt, Hélène, and Gengler, Nicolas
- Subjects
- MARKOV chain Monte Carlo, ANIMAL development, GIBBS sampling, ANIMAL breeds, GENETIC correlations, LACTATION in cattle
- Abstract
With the rapid development of animal phenomics and deep phenotyping, we can obtain thousands of traditional (but also molecular) phenotypes per individual. However, there is still a lack of exploration regarding how to handle this huge amount of data in the context of animal breeding, presenting a challenge that we are likely to encounter more and more in the future. This study aimed to (1) explore the use of the mega-scale linear mixed model (MegaLMM), a factor model-based approach that is able to simultaneously estimate (co)variance components and genetic parameters in the context of thousands of milk traits, hereafter called thousand-trait (TT) models; (2) compare the phenotype values and genomic breeding value (u) predictions for focal traits (i.e., traits that are targeted for prediction, as opposed to secondary traits that support the evaluation) from single-trait (ST) and TT models, respectively; and (3) propose a new approximate method of GEBV (U) prediction with TT models and MegaLMM. We used a total of 3,421 milk mid-infrared (MIR) spectra wavepoints (called secondary traits) and 3 focal traits (average fat percentage [AFP], average methane production [ACH4], and average SCS [ASCS]) collected on 3,302 first-parity Holstein cows. The 3,421 milk MIR wavepoint traits were composed of 311 wavepoints in 11 classes (months in lactation). Genotyping information of 564,439 SNPs was available for all animals and was used to calculate the genomic relationship matrix. The MegaLMM was implemented in the framework of the Bayesian sparse factor model and solved through Gibbs sampling (Markov chain Monte Carlo). The heritabilities of the studied 3,421 milk MIR wavepoints gradually increased and then decreased in units of 311 wavepoints throughout the lactation. The genetic and phenotypic correlations between the first 311 wavepoints and the other 3,110 wavepoints were low. The accuracies of phenotype predictions from the ST model were lower than those from the TT model for AFP (0.51 vs. 0.93), ACH4 (0.30 vs. 0.86), and ASCS (0.14 vs. 0.33). The same trend was observed for the accuracies of u predictions for AFP (0.59 vs. 0.86), ACH4 (0.47 vs. 0.78), and ASCS (0.39 vs. 0.59). The average correlation between U predicted from the TT model and the new approximate method was 0.90. The new approximate method used for estimating U in MegaLMM will enhance the suitability of MegaLMM for applications in animal breeding. This study conducted an initial investigation into the application of thousands of traits in animal breeding and showed that the TT model is beneficial for the prediction of focal traits (phenotype and breeding values), especially for difficult-to-measure traits (e.g., ACH4). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. Water–ice phase transition in frozen soils: a mesoscopic numerical study based on lattice Boltzmann method.
- Author
-
Li, Xiaoyan, Wang, Qingyu, Hu, Yuyang, and Fang, Chao
- Subjects
- *
PHASE transitions , *FROST heaving , *FROZEN ground , *LATTICE Boltzmann methods , *WATERLOGGING (Soils) , *GIBBS sampling - Abstract
The phenomenon of water–ice phase transition in frozen soils is the key to explaining the mechanism of frost heaving and thawing settlement disasters. However, numerical analysis of water–ice phase transitions in frozen soils at the mesoscale is rarely reported. This study combines a modified Gibbs-Thomson equation with an enthalpy-based lattice Boltzmann model to develop a mesoscale numerical method that simulates the water–ice phase transition in the local pores of saturated frozen soil during freezing and thawing. To simulate this process accurately, a new coefficient η is proposed to modify the Gibbs-Thomson equation; the coefficient is independent of soil type and gradation characteristics (the underlying relation is sketched after this record). According to the particle size distribution curve, a two-dimensional random circle generation method is used to reconstruct the soil pore structure. The freezing and thawing processes are verified against experimental data. A larger non-uniformity coefficient Cu and curvature coefficient Cc of the grain size distribution result in a lower degree of subcooling and lower residual water content of the soil during freezing. The new model provides an effective means to understand the water–ice phase transition in frozen soil at a mesoscopic scale. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
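The relation that record 37 modifies can be sketched numerically as follows. The interfacial energy, latent heat, and the placeholder correction coefficient η are assumed round values, not the paper's calibrated ones.

```python
# Back-of-the-envelope Gibbs-Thomson relation: freezing-point depression in a
# pore of radius r, Delta_T = eta * 2 * sigma_sl * T_m / (rho_i * L_f * r).
# eta stands in for the paper's proposed coefficient; its value here (1.0) and
# the physical constants are illustrative assumptions.

SIGMA_SL = 0.032    # ice-water interfacial energy, J/m^2 (approximate)
T_M = 273.15        # bulk melting temperature, K
RHO_I = 917.0       # ice density, kg/m^3
L_F = 3.34e5        # latent heat of fusion, J/kg
ETA = 1.0           # hypothetical correction coefficient

def freezing_point_depression(r: float) -> float:
    """Gibbs-Thomson depression (K) for pore radius r (m)."""
    return ETA * 2.0 * SIGMA_SL * T_M / (RHO_I * L_F * r)

for r_nm in (5, 10, 50, 100):
    dT = freezing_point_depression(r_nm * 1e-9)
    print(f"r = {r_nm:>4} nm -> pore water freezes near {T_M - dT:.2f} K "
          f"(depression {dT:.2f} K)")
```

With these constants, a 10 nm pore depresses the freezing point by roughly 5 to 6 K, which is why fine-grained soils retain unfrozen water well below 0 °C.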
38. Statistical inference of a series reliability system using shock models with Weibull distribution.
- Author
-
Sarhan, Ammar M., Almetwally, Ehab M., Mustafa, Abdelfattah, and Tolba, Ahlam H.
- Subjects
- *
MARKOV chain Monte Carlo , *MARKOV processes , *WEIBULL distribution , *GAMMA distributions , *INFERENTIAL statistics , *GIBBS sampling - Abstract
In this study, we define a series system with n non-independent and non-identical components using a shock model with n+1 sources of fatal shocks. Here, it is assumed that the shocks happen randomly and independently, following a Weibull distribution with various scale and shape parameters. This process yields a reliability model with 2(n+1) unknown parameters. Drawing statistical inferences about the model parameters is the main objective of this research. We apply the maximum likelihood and Bayes approaches to determine point and interval estimates of the model parameters. We show that the likelihood equations admit no closed-form solutions, so the maximum likelihood estimates cannot be obtained analytically. As a result, we use the R software to approximate the point and interval estimates of the parameters. Additionally, we use the bootstrap-t and bootstrap-p methods to approximate the confidence intervals. For the Bayesian approach, we assume the model parameters are independent, each following a gamma prior distribution with a range of hyperparameter values. The joint posterior distribution does not take a tractable form, so the Bayes estimates cannot be derived in closed form. To address this, we use a Gibbs sampler with Metropolis-Hastings steps, based on the Markov chain Monte Carlo method, to sample from the posterior distribution (a minimal sketch follows this record). To demonstrate the relevance of this research, a real data set application is detailed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
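Here is a hedged, minimal Metropolis-within-Gibbs sketch in the spirit of record 38, for a single Weibull sample rather than the full series system. In the rate parameterization f(x) = a·lam·x^(a-1)·exp(-lam·x^a), the rate lam is conditionally conjugate under a gamma prior, while the shape a needs a Metropolis-Hastings step. Priors, proposal scale, and the synthetic data are assumptions.

```python
# Metropolis-within-Gibbs for Weibull (shape a, rate lam) with gamma priors.
import numpy as np

rng = np.random.default_rng(1)
x = rng.weibull(1.8, size=100) * 2.0          # synthetic shock times (assumption)
a0, b0 = 1.0, 1.0                             # Gamma(a0, b0) priors on both parameters

def log_post_a(a, lam):
    """Log posterior of the shape a given lam and data (up to a constant)."""
    if a <= 0:
        return -np.inf
    loglik = len(x) * np.log(a * lam) + (a - 1) * np.log(x).sum() - lam * (x**a).sum()
    return loglik + (a0 - 1) * np.log(a) - b0 * a   # gamma prior on a

a, lam, draws = 1.0, 1.0, []
for it in range(5000):
    # Gibbs step: lam | a, x ~ Gamma(a0 + n, b0 + sum(x^a))
    lam = rng.gamma(a0 + len(x), 1.0 / (b0 + (x**a).sum()))
    # Metropolis-Hastings step for the shape a (random-walk proposal)
    prop = a + 0.1 * rng.normal()
    if np.log(rng.random()) < log_post_a(prop, lam) - log_post_a(a, lam):
        a = prop
    if it >= 1000:                            # discard burn-in
        draws.append((a, lam))

draws = np.array(draws)
print("posterior means  a: %.3f  lam: %.3f" % tuple(draws.mean(axis=0)))
```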
39. Equivalence of variance components between standard and recursive genetic models using LDL′ transformations.
- Author
-
Varona, Luis, López-Carbonell, David, Srihi, Houssemeddine, Hervás-Rivero, Carlos, González-Recio, Óscar, and Altarriba, Juan
- Subjects
BEEF cattle breeds ,GIBBS sampling ,MISSING data (Statistics) ,GENETIC models ,PHENOTYPES - Abstract
Background: Recursive models are a category of structural equation models that propose a causal relationship between traits. These models are more heavily parameterized than multiple trait models, and they require imposing restrictions on the parameter space to ensure statistical identification. Nevertheless, in certain situations, the likelihoods of recursive models and multiple trait models are equivalent. Consequently, the estimates of variance components derived from the multiple trait mixed model can be converted into estimates under several recursive models through LDL′ or block-LDL′ transformations (illustrated numerically after this record). Results: The procedure was employed on a dataset comprising five traits (birth weight—BW, weight at 90 days—W90, weight at 210 days—W210, cold carcass weight—CCW and conformation—CON) from the Pirenaica beef cattle breed. These phenotypic records were unequally distributed among 149,029 individuals and had a high percentage of missing data. The pedigree used consisted of 343,753 individuals. A Bayesian approach involving a multiple-trait mixed model was applied using a Gibbs sampler. The variance components obtained at each iteration of the Gibbs sampler were subsequently used to estimate the variance components within three distinct recursive models. Conclusions: The LDL′ or block-LDL′ transformations applied to the variance component estimates obtained from a multiple trait mixed model enabled inference across multiple sets of recursive models, with the sole prerequisite of likelihood equivalence. Furthermore, these transformations simplify the handling of missing data when conducting inference within the realm of recursive models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
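As a numerical companion to record 39, here is a hedged sketch of how an LDL′ factorization of a (co)variance matrix yields recursive-model parameters. The 3-trait matrix and the reading of L and D below are illustrative assumptions, not the paper's Pirenaica estimates.

```python
# LDL' factorization of a multiple-trait covariance matrix: G = L @ D @ L.T.
# The strictly lower part of L can be read as recursive (causal) structural
# coefficients among ordered traits, and D holds the recursive-model variances.
import numpy as np
from scipy.linalg import ldl

G = np.array([[4.0, 2.0, 1.0],     # hypothetical (co)variances for 3 ordered traits
              [2.0, 5.0, 2.5],
              [1.0, 2.5, 6.0]])

L, D, perm = ldl(G, lower=True)    # for this positive definite G, perm is trivial
print("structural coefficients (strictly lower part of L):\n", np.tril(L, -1))
print("recursive-model variances (diagonal of D):", np.diag(D))
print("reconstruction check G == L D L':", np.allclose(L @ D @ L.T, G))
```

Applying this transformation to each Gibbs draw of G, as the paper does, propagates the full posterior of the multiple-trait model to the recursive parameterization.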
40. An initialization approach for metaheuristic algorithms by using Gibbs sampling.
- Author
-
Cuevas, Erik, Barba-Toscano, Oscar, Escobar, Héctor, Zaldívar, Daniel, and Rodríguez-Vázquez, Alma
- Subjects
- *
GIBBS sampling , *METAHEURISTIC algorithms , *DIFFERENTIAL evolution , *TEST methods - Abstract
Recently, several new initialization techniques have been proposed. Despite their good results in very low dimensions, their performance deteriorates significantly as the number of dimensions increases. This paper introduces a new method for initializing metaheuristic algorithms based on the Gibbs sampling approach. The proposed method uses Gibbs sampling to draw from a multidimensional Gaussian that covers the entire search space defined by the objective function (see the sketch following this record). In this process, each decision variable is sampled sequentially, so that its value depends on the values of the variables sampled before it. This process generates initial positions with significantly low mutual correlation, an advantageous feature that prevents initial solutions from clustering in specific areas of the search space; this issue is particularly prevalent in optimization problems with a greater number of dimensions. To test its effectiveness, the method was applied in conjunction with a Differential Evolution algorithm. The complete approach has been evaluated using a selection of pertinent and challenging benchmark functions. The outcome of these experiments showed that the algorithm can establish a superior set of initial solutions, which allows consistent determination of the global solution even as the complexity of the problem increases with more dimensions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
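The following is a minimal sketch, under assumed bounds and covariance, of the initialization idea in record 40: Gibbs-sample a search-space-covering Gaussian one coordinate at a time, then check how correlated the resulting initial positions are. It is not the authors' implementation.

```python
# Gibbs sampling of a multivariate Gaussian for population initialization.
# With precision matrix P, the exact conditional of one coordinate is
# x_i | x_-i ~ N(mu_i - (1/P_ii) * sum_{j!=i} P_ij (x_j - mu_j), 1/P_ii).
import numpy as np

rng = np.random.default_rng(7)
lo, hi = -5.0, 5.0                       # search-space bounds (assumption)
dim, pop = 10, 30                        # dimensions, initial population size

mu = np.full(dim, (lo + hi) / 2.0)
sd = (hi - lo) / 4.0                     # spread chosen so the Gaussian covers the box
cov = sd**2 * (0.95 * np.eye(dim) + 0.05 * np.ones((dim, dim)))
P = np.linalg.inv(cov)                   # precision matrix

def gibbs_init(n_points, burn=50, thin=20):
    x, out = mu.copy(), []
    for sweep in range(burn + n_points * thin):
        for i in range(dim):             # sequential single-variable updates
            cond_mean = mu[i] - (P[i] @ (x - mu) - P[i, i] * (x[i] - mu[i])) / P[i, i]
            x[i] = rng.normal(cond_mean, 1.0 / np.sqrt(P[i, i]))
        if sweep >= burn and (sweep - burn) % thin == 0:
            out.append(np.clip(x, lo, hi))   # keep samples inside the search box
    return np.array(out)

population = gibbs_init(pop)
corr = np.corrcoef(population, rowvar=False)
print("max |off-diagonal correlation| among coordinates:",
      np.abs(corr - np.eye(dim)).max())
```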
41. Gibbs sampler for Bayesian prediction of triple seasonal autoregressive processes.
- Author
-
Amin, Ayman A.
- Subjects
- *
GIBBS sampling , *RESEARCH personnel , *PREDICTION models , *FORECASTING , *DENSITY - Abstract
Researchers have extended autoregressive (AR) time-series models to adequately fit and model time series with triple seasonality. These AR extensions can be referred to as triple seasonal AR (TSAR) models. For these TSAR models, only Bayesian estimation and identification have been introduced. Therefore, in this article, we aim to extend the existing work by presenting Bayesian prediction for TSAR models using the Gibbs sampler algorithm (a toy version of the sampler is sketched after this record). In this Bayesian prediction, we first assume the TSAR errors are identically normally distributed, and we employ normal-gamma and g priors for the TSAR parameters. Based on the normally distributed TSAR errors and the specified priors, we are able to derive the conditional predictive and posterior densities. In particular, we show that the conditional posterior densities of the TSAR coefficients and variance are multivariate normal and inverse-gamma, respectively, and that the conditional predictive density of future TSAR observations is multivariate normal. Using these derived full conditional distributions, we propose the Gibbs sampler to efficiently approximate the joint predictive and posterior densities and to easily carry out multiple-step-ahead predictions. We validate the forecasting accuracy of the proposed Gibbs sampler through a simulation study. Moreover, we apply the proposed Gibbs sampler to hourly electricity-load time-series datasets from some European countries, along with a comparison with the well-known long short-term memory (LSTM) recurrent neural network model. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
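Below is a hedged toy version of the conjugate Gibbs scheme that record 41 describes, reduced to a single autoregression with three seasonal lags (for hourly data, e.g., lags 1, 24, and 168). The lag set, priors, and simulated series are assumptions, not the paper's TSAR formulation.

```python
# Conjugate Gibbs for an AR model with seasonal lags: coefficients | variance is
# multivariate normal, variance | coefficients is inverse-gamma, and one-step
# predictive draws come from the normal predictive density.
import numpy as np

rng = np.random.default_rng(3)
lags = [1, 24, 168]                                  # hourly, daily, weekly lags
T = 2000
y = np.zeros(T)
for t in range(max(lags), T):                        # simulate a toy seasonal series
    y[t] = 0.5 * y[t-1] + 0.3 * y[t-24] + 0.15 * y[t-168] + rng.normal()

X = np.column_stack([y[max(lags)-l : T-l] for l in lags])   # lagged regressors
z = y[max(lags):]

V0_inv = np.eye(len(lags))                           # normal prior precision
a0, b0 = 2.0, 1.0                                    # inverse-gamma prior on variance
sig2, coef_draws, pred_draws = 1.0, [], []

for it in range(3000):
    # coefficients | variance, data ~ multivariate normal
    Vn = np.linalg.inv(V0_inv + X.T @ X / sig2)
    phi = rng.multivariate_normal(Vn @ (X.T @ z / sig2), Vn)
    # variance | coefficients, data ~ inverse-gamma
    resid = z - X @ phi
    sig2 = 1.0 / rng.gamma(a0 + len(z) / 2, 1.0 / (b0 + 0.5 * resid @ resid))
    if it >= 500:
        coef_draws.append(phi)
        x_next = np.array([y[T - l] for l in lags])  # one-step-ahead predictive draw
        pred_draws.append(x_next @ phi + rng.normal(0, np.sqrt(sig2)))

print("posterior mean coefficients:", np.mean(coef_draws, axis=0))
print("one-step-ahead predictive mean:", np.mean(pred_draws))
```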
42. Modeling feed efficiency over productive lifetime and integrating a submodel for body reserve management in Nordic dairy cattle.
- Author
-
Stephansen, R.B., Lassen, J., Thorup, V.M., Poulsen, B.G., Jensen, J., Sahana, G., and Christensen, O.F.
- Subjects
- *
SUSTAINABLE development , *JERSEY cattle , *GIBBS sampling , *DAIRY cattle , *GENETIC correlations , *MILK yield - Abstract
The list of standard abbreviations for JDS is available at adsa.org/jds-abbreviations-24. Nonstandard abbreviations are available in the Notes. Genetic enhancement of feed efficiency can improve the economic sustainability and environmental responsibility of dairy farming. Although genetic selection holds promise for improving feed efficiency across the lifespan of dairy cows, comprehensive data spanning whole lactations or even a productive lifetime are currently limited. To address this, we used data from a camera-based feed intake and BW recording system, along with records of production, feed intake, and weight on Holstein cows from a research herd. We aimed to estimate variance components for a multivariate, multiparity model of production, feed intake, and BW data to calculate genetic residual feed intake (gRFI) for each of the Nordic breeds (Holstein, Jersey, and Red Dairy Cattle). Our approach included investigating a new definition of energy balance (EBbody) calculated from changes in body reserves, serving as an energy sink in gRFI. The data in our analysis consisted of 4,751 Holstein cows (7,851 lactations), 2,068 Jersey cows (3,486 lactations), and 3,235 Red Dairy Cattle cows (5,419 lactations). We used Gibbs sampling to estimate posterior means and SD for all model parameters. Our findings revealed moderate lactation-wise heritability of gRFI (0.15–0.38) across all breeds and parities. Moreover, gRFI genetic correlations varied (−0.2 to 0.4) between early- and mid- to late-lactation stages across all breeds, and for lactation-wise gRFI, there were moderately high genetic correlations (0.39–0.59) between primi- and multiparous lactations across the 3 breeds. These results suggest the importance of recording phenotypes in most time periods within and across lactations. Our analysis indicated that improving gRFI by one genetic SD unit corresponded to a 2% to 3% gain in net return profit per cow-year, with no or minimal impact on production and body reserve management. We demonstrated the feasibility of incorporating EBbody into gRFI. gRFI calculated with EBbody and gRFI calculated with changes in BW as the energy sink trait for body reserve management were highly genetically correlated (>0.95). This result shows that the choice of the energy sink trait for body reserve management in gRFI will yield limited reranking among cows and sires when based on BW records only. However, EBbody offers an opportunity to incorporate BCS information without increasing the number of genetic parameters to be estimated, although it relies on parameters estimated in experimental settings. In conclusion, our study demonstrates the feasibility of developing a model for gRFI over most of the productive lifetime of dairy cattle, offering significant economic benefits without compromising productivity or body reserve management. Moving forward, comprehensive recording schemes covering whole lactations and productive lifetimes are advantageous for accurate selection indices of gRFI. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
43. Constrained estimation for the binomial AR(1) model: on Bayesian approach.
- Author
-
Zhang, Rui and Chen, Jin
- Subjects
- *
GIBBS sampling , *NUMERICAL integration , *LATENT variables , *BAYESIAN field theory , *MARKOV chain Monte Carlo - Abstract
In this paper, we consider the constrained estimation problem for the binomial AR(1) model using a Bayesian approach. We show that by using the Gibbs sampling algorithm, the constraints on the parameters and latent variables can be routinely implemented. Having obtained all the full conditional distributions in the unconstrained setting, we only need to generate samples from them and restrict each to an easily described cross-section, which avoids complex numerical integration over the overall constraint set (see the sketch following this record). A simulation study checks the performance of our algorithm. Finally, the method is applied to two real data examples. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
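To make the truncation trick in record 43 concrete, here is a hedged stand-in: two parameters with conjugate beta full conditionals and the constraint beta < alpha, each updated by sampling its unconstrained conditional restricted to the admissible cross-section via the inverse-CDF method. The beta conditionals and counts are illustrative assumptions, not the paper's binomial AR(1) conditionals.

```python
# Constrained Gibbs via truncated full conditionals (inverse-CDF sampling).
import numpy as np
from scipy.stats import beta as beta_dist

rng = np.random.default_rng(11)

def sample_truncated_beta(a, b, lo, hi):
    """Inverse-CDF draw from Beta(a, b) truncated to (lo, hi)."""
    u_lo, u_hi = beta_dist.cdf([lo, hi], a, b)
    return beta_dist.ppf(u_lo + rng.random() * (u_hi - u_lo), a, b)

# Pretend sufficient statistics (success/failure counts) from latent thinnings:
sa, fa = 60, 40          # counts attributed to the alpha-thinning (assumption)
sb, fb = 25, 75          # counts attributed to the beta-thinning (assumption)

alpha, bta, draws = 0.6, 0.2, []
for it in range(5000):
    # alpha | beta, data: Beta(1+sa, 1+fa) truncated to the cross-section (beta, 1)
    alpha = sample_truncated_beta(1 + sa, 1 + fa, bta, 1.0)
    # beta | alpha, data: Beta(1+sb, 1+fb) truncated to the cross-section (0, alpha)
    bta = sample_truncated_beta(1 + sb, 1 + fb, 0.0, alpha)
    if it >= 500:
        draws.append((alpha, bta))

draws = np.array(draws)
print("posterior means (alpha, beta):", draws.mean(axis=0))
print("constraint beta < alpha always held:", bool((draws[:, 1] < draws[:, 0]).all()))
```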
44. Bayesian Estimation of Fixed Effects Models with Large Datasets.
- Author
-
Qian, Hang
- Subjects
GIBBS sampling ,DUMMY variables ,DEPENDENT variables ,MORTGAGES ,HETEROGENEITY ,FIXED effects model - Abstract
In longitudinal models with hierarchical priors, random effects are estimated by the Gibbs sampler. We show that fixed effects can be handled by a similar Gibbs sampler under a diffuse prior on the unobserved heterogeneity. The dummy variable approach to fixed effects is computationally intensive and risks running out of memory, while the Gibbs sampler can reproduce the dummy variable estimator without creating dummy variables, and therefore avoids the memory burden (see the sketch following this record). Compared to alternating projections and other classical approaches, our method simplifies both inference and estimation for limited dependent variable models with fixed effects. The proposed method is applied to a real-world mortgage dataset for classification with three-way fixed effects on banks, regions, and loan purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
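A minimal sketch of the memory argument in record 44, assuming a Gaussian two-way design rather than the paper's three-way limited-dependent-variable application: under a diffuse prior, the full conditional of each fixed-effect level is Gaussian around the mean of its group's current residuals, so a Gibbs sweep needs only group-wise sums and never materializes an n-by-levels dummy matrix.

```python
# Gibbs sampling of fixed effects without dummy variables (two-way toy design).
import numpy as np

rng = np.random.default_rng(5)
n, n_banks, n_regions = 50_000, 300, 50
bank = rng.integers(0, n_banks, n)
region = rng.integers(0, n_regions, n)
y = rng.normal(size=n_banks)[bank] + rng.normal(size=n_regions)[region] \
    + rng.normal(scale=0.5, size=n)

a = np.zeros(n_banks)                    # bank fixed effects
b = np.zeros(n_regions)                  # region fixed effects
sig2 = 1.0

for sweep in range(200):
    # bank effects | rest: N(group mean of residuals, sig2 / group size)
    r = y - b[region]
    cnt = np.bincount(bank, minlength=n_banks)
    a = np.bincount(bank, weights=r, minlength=n_banks) / cnt \
        + rng.normal(size=n_banks) * np.sqrt(sig2 / cnt)
    # region effects | rest: same group-wise update
    r = y - a[bank]
    cnt = np.bincount(region, minlength=n_regions)
    b = np.bincount(region, weights=r, minlength=n_regions) / cnt \
        + rng.normal(size=n_regions) * np.sqrt(sig2 / cnt)
    # residual variance | effects: inverse-gamma under a flat prior
    e = y - a[bank] - b[region]
    sig2 = 1.0 / rng.gamma(n / 2, 2.0 / (e @ e))

print("residual sd estimate: %.3f (truth 0.5)" % np.sqrt(sig2))
```

Only the level-wise sums (`np.bincount`) and counts are stored, so memory grows with the number of levels, not with n times the number of levels as in the dummy-variable design matrix.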
45. Ultrasound‐Based Phenotyping for Genetic Selection of Carcass Traits in Oreochromis niloticus: Integrating Imaging Technology Into Aquaculture Breeding.
- Author
-
Rezende, Cícero Eduardo, Perazza, Caio Augusto, Marçal, Danielle Cristina Pereira, Fernandes, Diana Carla Oliveira, Reis Neto, Rafael Vilhena, Freitas, Rilke Tadeu Fonseca, and Hilsdorf, Alexandre Wagner Silva
- Subjects
- *
FISH farming , *NILE tilapia , *GIBBS sampling , *ULTRASONIC imaging , *GENETIC correlations - Abstract
ABSTRACT Recent years have witnessed a remarkable global surge in fish production, with Nile tilapia (Oreochromis niloticus) emerging as a prominent contributor owing to its high demand as a nutritious food source. However, unlike for terrestrial species, maintaining genealogical control and collecting phenotypic data in fish farming poses significant challenges, necessitating advancements to support genetic improvement programmes. While conventional methods, such as body measurements using rulers and photographs, are prevalent in data collection, the potential of ultrasound—a less invasive and efficient tool for fish measurement—remains underexplored. This study assesses the viability of ultrasonography for genetically selecting carcass characteristics in Nile tilapia. The investigation encompasses data from 897 animals representing 53 full-sib tilapia families maintained in the genetic improvement programme at the Federal University of Lavras. To measure carcass traits, the animals were sedated with benzocaine and ultrasound images were obtained at three distinct points. Subsequently, the animals were euthanised through medullary sectioning for further carcass processing. After evisceration, filleting, and skinning, all weights were meticulously recorded. (Co)variance components and genetic parameters of the measured traits were estimated using a Bayesian approach via Gibbs sampling, implemented in the MTGSAM (Multiple Trait Gibbs Sampling in Animal Models) software. Heritabilities estimated for the studied carcass traits were moderate, ranging from 0.23 to 0.33. Notably, phenotypes derived from ultrasound images demonstrated substantial genetic correlations with fillet yield (0.83–0.92). In conclusion, this study confirms that indirect selection based on ultrasound images is effective and holds promise for integration into tilapia breeding programmes aimed at enhancing carcass yield. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Impact of Heat Stress on Milk Yield, Milk Fat-to-Protein Ratio, and Conception Rate in Thai–Holstein Dairy Cattle: A Phenotypic and Genetic Perspective.
- Author
-
Boonkum, Wuttigrai, Teawyoneyong, Watcharapong, Chankitisakul, Vibuntita, Duangjinda, Monchai, and Buaban, Sayan
- Subjects
- *
DAIRY cattle reproduction , *DAIRY cattle , *CATTLE genetics , *ENVIRONMENTAL indicators , *GIBBS sampling , *HERITABILITY , *GENETIC correlations , *PERCENTILES - Abstract
Simple Summary: Environmental indices are commonly used for detecting heat stress in dairy cattle; however, most studies have focused only on the productivity traits of dairy cattle in response to heat stress and have not simultaneously considered their health and reproductive characteristics. In this study, we aimed to determine the effects of heat stress on the production and reproduction performances of Thai–Holstein dairy cattle and the impact of the genetics of dairy cattle on their heat tolerance. We observed that heat stress significantly reduced milk yield and negatively affected the milk fat-to-protein ratio. Additionally, conception rates declined under heat stress, highlighting the challenge of finding an effective genetic approach for hot and humid regions, including Thailand. Genetic analysis revealed differences in heat tolerance among cows, indicating that the genetic improvement approach used in this study is suitable for planning future genetic improvements in dairy cattle. Heat stress severely affects dairy cattle production and reproduction performances in tropical regions. Genetic selection to maintain adequate yield and reproductive performance while enhancing the ability to withstand heat is essential for improving the genetics of dairy cows. Therefore, in this study, we aimed to estimate genetic parameters affecting production and reproduction performances under heat stress in dairy cattle and to investigate the heat stress threshold point for milk yield (MY), milk fat-to-protein ratio (FPR), and conception rate (CR) in Thai–Holstein dairy cattle. The data included 168,124 records related to MY and milk FPR and 21,278 records of CR in Thai–Holstein dairy cattle, covering the period from 1990 to 2007. A multiple-trait threshold-linear random regression model based on a Bayesian approach via Gibbs sampling was used to estimate variance components, genetic parameters (heritability values and genetic correlations), and decline rates for each studied trait. The threshold point of heat stress was identified as a temperature and humidity index (THI) of 76 (a small THI helper is sketched after this record). At THI 76, a decline was observed in the MY, milk FPR, and CR of Thai dairy cattle. The heritability estimates for MY, milk FPR, and CR were 0.347 ± 0.032, 0.293 ± 0.021, and 0.032 ± 0.001, respectively. The genetic correlations between MY and milk FPR and between MY and CR were −0.24 and −0.53, respectively, whereas those between milk FPR and heat tolerance and between CR and heat tolerance were −0.48 and −0.49, respectively. In addition, the decline rates in MY, milk FPR, and CR were found to be associated with a high percentage of Holstein genetics. In conclusion, the results of this study reveal that the simultaneous consideration of the MY, milk FPR, CR, and heat tolerance traits of Thai–Holstein dairy cattle is possible. In addition, developing a genetic model that incorporates THI is essential for sustainably addressing heat stress problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
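For orientation, here is a small helper around the THI = 76 break point reported in record 46. The specific NRC-style formula below (temperature in Celsius, relative humidity in percent) is one common THI variant and an assumption here; the paper may use a different formulation.

```python
# Temperature-humidity index and heat-stress flag at the THI = 76 threshold.

def thi(temp_c: float, rh_pct: float) -> float:
    """One common THI formulation from air temperature (C) and relative humidity (%)."""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rh_pct) * (1.8 * temp_c - 26)

THRESHOLD = 76  # declines in MY, milk FPR, and CR reported beyond this point

for t, rh in [(25, 50), (30, 60), (32, 80), (35, 70)]:
    v = thi(t, rh)
    flag = "heat stress" if v >= THRESHOLD else "ok"
    print(f"T={t} C, RH={rh}% -> THI={v:.1f} ({flag})")
```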
47. Hyper Markov law in undirected graphical models with its applications.
- Author
-
Kang, Xiong and Yi Sun, Brian
- Subjects
- *
GIBBS sampling , *UNDIRECTED graphs , *LIKELIHOOD ratio tests , *MARKOV processes , *GENERALIZATION - Abstract
By exploring the prime decomposition of undirected graphs, this work investigates the hyper Markov property within the framework of arbitrary undirected graphs, which can be seen as a generalization of that for decomposable graphical models proposed by Dawid and Lauritzen. The hyper Markov properties proposed in this article can be used to characterize the conditional independence of a distribution or a statistical quantity, and they help simplify the likelihood ratio functions in statistical tests for two different graphs obtained by removing or adding one edge. As an application of these properties, the G-Wishart law is introduced as a prior law for graphical Gaussian models for Bayesian posterior updating, and a hypothesis test for the precision matrix is designed to determine the model structure. Our simulation experiments are implemented using the Gibbs sampler algorithm, and the results show faster convergence. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. A composite Bayesian approach for quantile curve fitting with non-crossing constraints.
- Author
-
Wang, Qiao and Cai, Zhongheng
- Subjects
- *
CURVE fitting , *EXPECTATION-maximization algorithms , *BAYESIAN analysis , *SMOOTHNESS of functions , *QUANTILES , *GIBBS sampling - Abstract
To fit a set of quantile curves, Bayesian simultaneous quantile curve fitting methods face some challenges in properly specifying a feasible formulation and efficiently accommodating the non-crossing constraints. In this article, we propose a new minimization problem and develop its corresponding Bayesian analysis. The new minimization problem imposes two penalties to control not only the smoothness of the fitted quantile curves but also the differences between quantile curves. This enables direct inference on differences between quantile curves and facilitates improved information sharing among quantiles. After adopting a B-spline approximation for the positive smoothing functions in the minimization problem, we specify the pseudo composite asymmetric Laplace likelihood and derive its priors. The computational algorithms, including partially collapsed Gibbs sampling for the model parameters and a Monte Carlo Expectation-Maximization algorithm for the penalty parameters, are provided to carry out the proposed approach. Extensive simulation studies show that, compared with other candidate methods, the proposed approach yields more robust estimation. More advantages of the proposed approach are observed for extreme quantiles, heavy-tailed random errors, and inference on the differences of quantiles. We also demonstrate the relative performance of the proposed approach and other competing methods through two real data analyses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. All-to-all reconfigurability with sparse and higher-order Ising machines.
- Author
-
Nikhar, Srijan, Kannan, Sidharth, Aadit, Navid Anjum, Chowdhury, Shuvro, and Camsari, Kerem Y.
- Subjects
GATE array circuits ,GIBBS sampling ,GREEDY algorithms ,PARALLEL algorithms ,TEMPERING ,GRAPHICS processing units - Abstract
Domain-specific hardware to solve computationally hard optimization problems has generated tremendous excitement. Here, we evaluate probabilistic bit (p-bit) based Ising Machines (IM) on the 3-Regular 3-Exclusive OR Satisfiability (3R3X) problem, a representative hard optimization problem. We first introduce a multiplexed architecture that emulates all-to-all network functionality while maintaining highly parallelized chromatic Gibbs sampling (illustrated classically after this record). We implement this architecture in a single Field-Programmable Gate Array (FPGA) and show that running the adaptive parallel tempering algorithm yields competitive algorithmic and prefactor advantages over alternative IMs by D-Wave, Toshiba, and Fujitsu. We also implement higher-order interactions that lead to better prefactors without changing algorithmic scaling for the XORSAT problem. Even though FPGA implementations of p-bits are still not quite as fast as the best possible greedy algorithms accelerated on Graphics Processing Units (GPUs), scaled magnetic versions of p-bit IMs could lead to orders of magnitude improvements over the state of the art for generic optimization. Specialized hardware for hard optimization is gaining traction. Here, the authors introduce a sparse, multiplexed, and reconfigurable p-bit Ising Machine on Field-Programmable Gate Arrays, using adaptive parallel tempering and higher-order interactions to achieve competitive performance on the 3-Regular 3-XORSAT problem. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
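The chromatic Gibbs sampling that record 49 parallelizes in hardware can be imitated classically: on a bipartite (checkerboard) 2D lattice, all same-color spins are conditionally independent given the other color, so each half-sweep updates an entire color class at once. Lattice size, coupling, and temperature below are illustrative assumptions.

```python
# Chromatic (graph-colored) Gibbs sampling for a 2D Ising model.
import numpy as np

rng = np.random.default_rng(42)
L_SIDE, J, beta = 64, 1.0, 0.4407          # near the 2D Ising critical temperature
s = rng.choice([-1, 1], size=(L_SIDE, L_SIDE))

# Checkerboard color masks: (i + j) even vs. odd.
ij = np.add.outer(np.arange(L_SIDE), np.arange(L_SIDE))
colors = [(ij % 2) == c for c in (0, 1)]

def local_field(s):
    """Sum of the four nearest neighbors with periodic boundaries."""
    return (np.roll(s, 1, 0) + np.roll(s, -1, 0) +
            np.roll(s, 1, 1) + np.roll(s, -1, 1))

for sweep in range(500):
    for mask in colors:                     # one fully parallel update per color class
        h = J * local_field(s)
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))   # P(s_i = +1 | neighbors)
        flips = rng.random(s.shape) < p_up
        s[mask] = np.where(flips, 1, -1)[mask]

print("magnetization per spin:", s.mean())
print("energy per spin:", (-J * s * local_field(s)).sum() / (2 * s.size))
```

Each color-class update is rejection-free, which is the same property the p-bit hardware exploits; the hardware additionally multiplexes the color classes to emulate denser connectivity.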
50. Gibbs state sampling via cluster expansions.
- Author
-
Eassa, Norhan M., Moustafa, Mahmoud M., Banerjee, Arnab, and Cohn, Jeffrey
- Subjects
GIBBS sampling ,BOLTZMANN machine ,SEMIDEFINITE programming ,TENSOR products ,SPECIFIC heat - Abstract
Gibbs states (i.e., thermal states) can be used for several applications such as quantum simulation, quantum machine learning, quantum optimization, and the study of open quantum systems. Moreover, semi-definite programming, combinatorial optimization problems, and training quantum Boltzmann machines can all be addressed by sampling from well-prepared Gibbs states. However, preparing and sampling from Gibbs states on a quantum computer are notoriously difficult tasks: they can require large overhead in resources and/or calibration even in the simplest cases, and the implementation might be limited to a specific set of systems. We propose a method based on sampling from a quasi-distribution consisting of tensor products of mixed states on local clusters, i.e., expanding the full Gibbs state into a sum of products of local "Gibbs-cumulant" type states that are easier to implement and sample from on quantum hardware. We begin by presenting results for 4-spin linear chains with XY spin interactions, for which we obtain the ZZ dynamical spin-spin correlation functions and the dynamical structure factor. We also present results for measuring the specific heat of the 8-spin chain Gibbs state ρ_8 (a brute-force classical reference calculation for the small-chain case is sketched after this record). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
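As a point of reference for record 50, the following brute-force sketch builds the exact Gibbs state of a small XY chain and reads off the kind of observables the abstract mentions. It is a classical exact-diagonalization reference under assumed couplings and temperature, not the paper's cluster-expansion method.

```python
# Exact Gibbs state rho = exp(-beta*H)/Z of a 4-spin XY chain, with a static
# ZZ correlation and the specific heat from energy fluctuations.
import numpy as np
from functools import reduce

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op(site_ops, n):
    """Tensor product placing the given single-site operators on an n-spin chain."""
    mats = [site_ops.get(i, I2) for i in range(n)]
    return reduce(np.kron, mats)

n, Jxy, beta = 4, 1.0, 1.0                        # chain length, coupling, inverse T
H = sum(Jxy * (op({i: X, i+1: X}, n) + op({i: Y, i+1: Y}, n)) for i in range(n - 1))

evals, evecs = np.linalg.eigh(H)
w = np.exp(-beta * evals)
Zp = w.sum()                                      # partition function
rho = (evecs * w) @ evecs.conj().T / Zp           # Gibbs state exp(-beta H)/Z

zz = op({0: Z, 1: Z}, n)
print("<Z_0 Z_1> =", np.trace(rho @ zz).real)

E = (w @ evals) / Zp                              # thermal mean energy
E2 = (w @ evals**2) / Zp
print("specific heat C =", beta**2 * (E2 - E**2))  # C = beta^2 * Var(E)
```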