21,611 results for "Maximum likelihood estimation"
Search Results
2. Revisiting the Briggs Ancient DNA Damage Model: A Fast Maximum Likelihood Method to Estimate Post‐Mortem Damage.
- Author
Zhao, Lei, Henriksen, Rasmus Amund, Ramsøe, Abigail, Nielsen, Rasmus, and Korneliussen, Thorfinn Sand
- Abstract
One essential initial step in the analysis of ancient DNA is to authenticate that the DNA sequencing reads are actually from ancient DNA. This is done by assessing whether the reads exhibit typical characteristics of post‐mortem damage (PMD), including cytosine deamination and nicks. We present a novel statistical method, implemented in a fast multithreaded programme, ngsBriggs, that enables rapid quantification of PMD by estimation of the Briggs ancient damage model parameters (Briggs parameters). Using a multinomial model with a maximum likelihood fit, ngsBriggs accurately estimates the parameters of the Briggs model, quantifying the PMD signal from single- and double‐stranded DNA regions. We extend the original Briggs model to capture PMD signals for contemporary sequencing platforms and show that ngsBriggs accurately estimates the Briggs parameters across a variety of contamination levels. Classification of reads into ancient or modern, for the purpose of decontamination, is significantly more accurate using ngsBriggs than using other available methods. Furthermore, ngsBriggs is substantially faster than other state‐of‐the‐art methods. ngsBriggs offers a practical and accurate method for researchers seeking to authenticate ancient DNA and improve the quality of their data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
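As a toy illustration of fitting damage-decay parameters by maximum likelihood (a deliberately simplified stand-in, not the actual Briggs parameterisation used by ngsBriggs), one can posit that the C-to-T mismatch probability at distance i from the read end decays as d·λ^i and maximise a binomial log-likelihood over a parameter grid; every name and value below is hypothetical:

```python
import numpy as np

# Hypothetical toy: mismatch probability at distance i from the read end
# decays geometrically, p_i = d * lam**i (NOT the Briggs model itself).
def neg_log_lik(d, lam, mism, total):
    i = np.arange(len(mism))
    p = np.clip(d * lam ** i, 1e-12, 1 - 1e-12)
    return -np.sum(mism * np.log(p) + (total - mism) * np.log(1 - p))

rng = np.random.default_rng(0)
true_d, true_lam = 0.3, 0.5
positions = 10
total = np.full(positions, 5000)
p_true = true_d * true_lam ** np.arange(positions)
mism = rng.binomial(total, p_true)      # simulated mismatch counts

# Crude grid-search MLE over (d, lam)
ds = np.linspace(0.01, 0.9, 90)
lams = np.linspace(0.01, 0.99, 99)
best = min((neg_log_lik(d, l, mism, total), d, l) for d in ds for l in lams)
_, d_hat, lam_hat = best
print(d_hat, lam_hat)
```

A real implementation would use a proper optimiser and the full Briggs likelihood; the grid search just keeps the sketch dependency-free.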
3. Inference on process capability index Spmk for a new lifetime distribution.
- Author
Karakaya, Kadir
- Abstract
In various applied disciplines, the modeling of continuous data often requires the use of flexible continuous distributions. Meeting this demand calls for the introduction of new continuous distributions that possess desirable characteristics. This paper introduces a new continuous distribution. Several estimators for estimating the unknown parameters of the new distribution are discussed and their efficiency is assessed through Monte Carlo simulations. Furthermore, the process capability index Spmk is examined when the underlying distribution is the proposed distribution. The maximum likelihood estimation of the Spmk is also studied. The asymptotic confidence interval is also constructed for Spmk. The simulation results indicate that estimators for both the unknown parameters of the new distribution and the Spmk provide reasonable results. Some practical analyses are also performed on both the new distribution and the Spmk. The results of the conducted data analysis indicate that the new distribution yields effective outcomes in modeling lifetime data in the literature. Similarly, the data analyses performed for Spmk illustrate that the new distribution can be utilized for process capability indices by quality controllers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Reliability estimation and statistical inference under joint progressively Type-II right-censored sampling for certain lifetime distributions.
- Author
Lin, Chien-Tai, Chen, Yen-Chou, Yeh, Tzu-Chi, and Ng, Hon Keung Tony
- Abstract
In this article, the parameter estimation of several commonly used two-parameter lifetime distributions, including the Weibull, inverse Gaussian, and Birnbaum–Saunders distributions, based on joint progressively Type-II right-censored samples is studied. Different numerical methods and algorithms are used to compute the maximum likelihood estimates of the unknown model parameters. These methods include the Newton–Raphson method, the stochastic expectation–maximization (SEM) algorithm, and the dual annealing (DA) algorithm. These estimation methods are compared in terms of accuracy (e.g., the bias and mean squared error), computational time and effort (e.g., the required number of iterations), the ability to obtain the largest value of the likelihood, and convergence issues by means of a Monte Carlo simulation study. Recommendations are made based on the simulated results. A real data set is analyzed for illustrative purposes. These methods are implemented in Python, and the computer programs are available from the authors upon request. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
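The Newton–Raphson route mentioned in the abstract can be sketched for the Weibull case, where the two-parameter MLE reduces to one-dimensional root-finding on the profile score for the shape parameter (synthetic data and starting value; illustrative only, not the authors' code):

```python
import numpy as np

def weibull_mle(x, tol=1e-10, max_iter=100):
    """Newton-Raphson on the Weibull profile score for the shape k;
    the scale parameter then follows in closed form."""
    lx = np.log(x)
    k = 1.2 / lx.std()                 # moment-based starting value
    for _ in range(max_iter):
        xk = x ** k
        A, B, C = xk.sum(), (xk * lx).sum(), (xk * lx ** 2).sum()
        g = B / A - 1.0 / k - lx.mean()                 # profile score
        dg = (C * A - B * B) / A ** 2 + 1.0 / k ** 2    # its derivative
        step = g / dg
        k -= step
        if abs(step) < tol:
            break
    scale = (x ** k).mean() ** (1.0 / k)
    return k, scale

rng = np.random.default_rng(1)
data = rng.weibull(2.5, size=20000) * 3.0   # shape 2.5, scale 3.0
k_hat, s_hat = weibull_mle(data)
print(k_hat, s_hat)
```

The derivative term is always positive (Cauchy–Schwarz), so the profile score is monotone and the iteration is well behaved from a moment-based start.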
5. Zero-Inflated Binary Classification Model with Elastic Net Regularization.
- Author
Xin, Hua, Lio, Yuhlong, Chen, Hsien-Ching, and Tsai, Tzong-Ru
- Subjects
MACHINE learning; MAXIMUM likelihood statistics; EXPECTATION-maximization algorithms; OPEN-ended questions; DIABETES
- Abstract
Zero inflation and overfitting can reduce the accuracy rate of using machine learning models for characterizing binary data sets. A zero-inflated Bernoulli (ZIBer) model can be the right model to characterize zero-inflated binary data sets. When the ZIBer model is used to characterize zero-inflated binary data sets, overcoming the overfitting problem is still an open question. To improve the overfitting problem for using the ZIBer model, the negative log-likelihood function of the ZIBer model with the elastic net regularization rule as an overfitting penalty is proposed as the loss function. An estimation procedure to minimize the loss function is developed in this study using the gradient descent method (GDM) with a momentum term in the learning rate. The proposed estimation method has two advantages. First, it is a general method that simultaneously uses L1- and L2-norm penalty terms and includes the ridge and least absolute shrinkage and selection operator methods as special cases. Second, the momentum learning rate can accelerate the convergence of the GDM and enhance the computational efficiency of the proposed estimation procedure. The parameter selection strategy is studied, and the performance of the proposed method is evaluated using Monte Carlo simulations. A diabetes example is used as an illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
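A minimal sketch of gradient descent with a momentum term on an elastic-net-penalised likelihood, using a plain logistic model as a stand-in for the ZIBer mixture (all parameter values below are assumptions for illustration, not the paper's settings):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_enet(X, y, lam=0.005, alpha=0.5, lr=0.1, beta=0.9, iters=2000):
    """Momentum gradient descent on logistic loss plus an elastic-net
    penalty lam*(alpha*||w||_1 + (1-alpha)/2*||w||_2^2) -- a stand-in
    for the ZIBer mixture likelihood of the paper."""
    n, p = X.shape
    w, v = np.zeros(p), np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (sigmoid(X @ w) - y) / n
        grad += lam * (alpha * np.sign(w) + (1 - alpha) * w)
        v = beta * v + lr * grad      # momentum accumulates past gradients
        w -= v
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 3))
w_true = np.array([1.5, -2.0, 0.0])
y = rng.binomial(1, sigmoid(X @ w_true))
w_hat = fit_logistic_enet(X, y)
print(np.round(w_hat, 2))
```

With alpha=1 this reduces to a LASSO-type penalty and with alpha=0 to ridge, mirroring the special cases noted in the abstract.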
6. Tensile Properties of Cattail Fibres at Various Phenological Development Stages.
- Author
Hossain, Mohammed Shahadat, Rahman, Mashiur, and Cicek, Nazim
- Subjects
MAXIMUM likelihood statistics; CALCIUM oxalate; WEIBULL distribution; INDUSTRIAL capacity; GROWING season
- Abstract
Cattails (Typha latifolia L.) are naturally occurring aquatic macrophytes with significant industrial potential because of their abundance, high-quality fibers, and high fiber yields. This study is the first attempt to investigate how phenological development and plant maturity impact the quality of cattail fibers as they relate to composite applications. It was observed that fibers from all five growth stages exhibited a Weibull shape parameter greater than 1.0, with a goodness-of-fit exceeding 0.8. These calculations were performed using both the Least Squares Regression (LSR) and Maximum Likelihood Estimation (MLE) methods. Among the estimators, the MLE method provided the most conservative estimation of the Weibull parameters. Based on the Weibull parameters obtained with all estimators, cattail fibers from all five growth stages appear suitable for composite applications. The consistency of the shape parameters across all five growth stages can be attributed to the morphological and molecular development of cattail fiber during the vegetative period. These developments were confirmed through the presence of calcium oxalate (CaOx) plates, the elemental composition, and specific infrared peaks: a peak at 2360 cm−1 contributing to the strength, and cellulose peaks at 1635 cm−1, 2920 cm−1, and 3430 cm−1. In conclusion, it was found that the mechanical properties of cattail fiber remain similar when harvested multiple times in a single growing season. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
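The least-squares Weibull fit compared against MLE in the abstract can be sketched as a probability-plot regression: with median-rank plotting positions, ln(-ln(1-F)) is linear in ln(x) with slope equal to the shape parameter. The fibre-strength data below are hypothetical:

```python
import numpy as np

def weibull_lsr(x):
    """Least-squares (probability-plot) Weibull estimate: regress
    ln(-ln(1-F_i)) on ln(x_(i)) using median-rank plotting positions."""
    x = np.sort(x)
    n = len(x)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's approximation
    yv = np.log(-np.log(1.0 - F))
    xv = np.log(x)
    shape, intercept = np.polyfit(xv, yv, 1)
    scale = np.exp(-intercept / shape)            # since intercept = -k*ln(scale)
    return shape, scale

rng = np.random.default_rng(3)
fibre_strength = rng.weibull(1.8, size=5000) * 400.0   # hypothetical MPa data
k_hat, s_hat = weibull_lsr(fibre_strength)
print(k_hat, s_hat)
```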
7. Reliability analysis of two Gompertz populations under joint progressive type-ii censoring scheme based on binomial removal.
- Author
Abo-Kasem, O.E., Almetwally, Ehab M., and Abu El Azm, Wael S.
- Subjects
MONTE Carlo method; CENSORING (Statistics); BAYES' estimation; DISTRIBUTION (Probability theory); MAXIMUM likelihood statistics; MARKOV chain Monte Carlo
- Abstract
The analysis of joint censoring schemes has received considerable attention in the last few years. In this paper, maximum likelihood and Bayes methods of estimation are used to estimate the unknown parameters of two Gompertz populations under a joint progressive Type-II censoring scheme. Bayesian estimates of the unknown parameters are obtained based on squared error loss functions under the assumption of independent gamma priors. We propose to apply the Markov chain Monte Carlo technique to carry out the Bayes estimation procedure. The approximate, bootstrap, and credible confidence intervals for the unknown parameters are also obtained. The reliability and hazard rate functions of the two Gompertz populations under the joint progressive Type-II censoring scheme are also obtained, together with the corresponding approximate confidence intervals. Finally, all the theoretical results obtained are assessed and compared using two real-world data sets and Monte Carlo simulation studies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Concentration inequalities of MLE and robust MLE.
- Author
Yang, Xiaowei, Liu, Xinqiao, and Wei, Haoyu
- Subjects
MAXIMUM likelihood statistics; MACHINE learning; STATISTICS
- Abstract
The Maximum Likelihood Estimator (MLE) plays an important role in statistics and machine learning. In this article, for i.i.d. variables, we obtain constant-specified and sharp concentration inequalities and oracle inequalities for the MLE only under exponential moment conditions. Furthermore, in a robust setting, sub-Gaussian-type oracle inequalities for the log-truncated maximum likelihood estimator are derived under a second-moment condition. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. The exponentiated-Weibull proportional hazard regression model with application to censored survival data.
- Author
Ishag, Mohamed A.S., Wanjoya, Anthony, Adem, Aggrey, Alsultan, Rehab, Alghamdi, Abdulaziz S., and Afify, Ahmed Z.
- Subjects
PROPORTIONAL hazards models; MONTE Carlo method; REGRESSION analysis; CENSORING (Statistics); MAXIMUM likelihood statistics
- Abstract
Proportional hazard regression models are widely used statistical tools for analyzing survival data and estimating the effects of covariates on survival times. It is assumed that the effects of the covariates are constant over time. In this paper, we propose a novel extension of the proportional hazard model by incorporating an exponentiated-Weibull distribution to model the baseline hazard function. The proposed model offers more flexibility in capturing various shapes of failure rates and accommodates both monotonic and non-monotonic hazard shapes. The performance of the proposed model, and comparisons with other commonly used survival models, including the generalized log–logistic, Weibull, Gompertz, and exponentiated exponential PH regression models, are explored using simulation results. The results demonstrate the ability of the introduced model to capture the baseline hazard shapes and to estimate the effects of covariates on the hazard function accurately. Furthermore, two real medical survival data sets are analyzed to illustrate the practical importance of the proposed model in providing accurate predictions of survival outcomes for individual patients. Finally, the survival data analysis reveals that the model is a powerful tool for analyzing complex survival data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Repair alert model when the lifetimes are discretely distributed.
- Author
Atlehkhani, Mohammad and Doostparast, Mahdi
- Abstract
This paper deals with repair alert models, which are used for analyzing lifetime data coming from engineering devices under maintenance management. Repair alert models have been proposed and investigated for continuous component lifetimes, and existing studies are concerned with the lifetimes of items described by continuous distributions. However, discrete lifetimes are also frequently encountered in practice. Examples include operating a piece of equipment in cycles, field failures that are reported weekly, and the number of pages a device prints before failure. Here, the repair alert models are developed for the case where device lifetimes are discrete. A wide class of discrete distributions, called the telescopic family, is considered for the component lifetimes, and the proposed repair alert model is explained in detail. Furthermore, the problem of estimating parameters is investigated and illustrated by analyzing a real data set. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Quasi shrinkage estimation of a block-structured covariance matrix.
- Author
Markiewicz, A., Mokrzycka, M., and Mrowińska, M.
- Subjects
LEAST squares; MAXIMUM likelihood statistics; COVARIANCE matrices; ORTHOGRAPHIC projection
- Abstract
In this paper, we study the estimation of a block covariance matrix with linearly structured off-diagonal blocks. We consider estimation based on the least squares method, which has some drawbacks: the resulting estimates are not always well conditioned and may not even be positive definite. We propose a new estimation procedure providing a structured, positive definite, and well-conditioned estimator with good statistical properties. The least squares estimator is improved with the use of a shrinkage method and an additional algebraic approach. The resulting so-called quasi shrinkage estimator is compared with the structured maximum likelihood estimator. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. An alternative bounded distribution: regression model and applications.
- Author
Sağlam, Şule and Karakaya, Kadir
- Subjects
PROCESS capability; MONTE Carlo method; LEAST squares; MAXIMUM likelihood statistics; REGRESSION analysis
- Abstract
In this paper, a new bounded distribution is introduced and some distributional properties of the new distribution are discussed. Moreover, the new distribution is applied in the field of engineering to the Cpc process capability index. Three unknown parameters of the distribution are estimated with several estimators, and the performances of the estimators are evaluated with a Monte Carlo simulation. A new regression model is introduced based on this new distribution as an alternative to the beta and Kumaraswamy models. Furthermore, this is one of the first studies in which the regression model parameters are estimated using least squares, weighted least squares, Cramér–von Mises, and maximum product of spacings estimators in addition to maximum likelihood. The efficiency of the estimators for the parameters of the regression model is further assessed through a simulation. Real datasets are analyzed to demonstrate the applicability of the new distribution and regression model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. A modified uncertain maximum likelihood estimation with applications in uncertain statistics.
- Author
Liu, Yang and Liu, Baoding
- Subjects
MAXIMUM likelihood statistics; TIME series analysis; REGRESSION analysis; STATISTICAL models; STATISTICS; DIFFERENTIAL equations
- Abstract
In uncertain statistics, uncertain maximum likelihood estimation is a method of estimating the values of the unknown parameters of an uncertain statistical model that make the observed data most likely. However, the observed data obtained in practice usually contain outliers. In order to eliminate the influence of outliers when estimating unknown parameters, this article modifies the uncertain maximum likelihood estimation. Following that, the modified uncertain maximum likelihood estimation is applied to uncertain regression analysis, uncertain time series analysis, and uncertain differential equations. Finally, some real-world examples are provided to illustrate the modified uncertain maximum likelihood estimation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. Estimation of the constant-stress model with bathtub-shaped failure rates under progressive type-I interval censoring scheme.
- Author
Sief, Mohamed, Liu, Xinsheng, Alsadat, Najwan, and Abd El-Raheem, Abd El-Raheem M.
- Subjects
MAXIMUM likelihood statistics; ACCELERATED life testing; CONFIDENCE intervals; PARAMETER estimation; MARKOV chain Monte Carlo; DATA analysis
- Abstract
This paper investigates constant-stress accelerated life tests under a progressive type-I interval censoring regime. We provide a model based on the Chen distribution with a constant shape parameter and a log-linear connection between the scale parameter and the stress loading. Inferential methods, whether classical or Bayesian, are employed to address model parameters and reliability attributes. Classical methods involve the estimation of model parameters through maximum likelihood and midpoint techniques. Bayesian approximations are achieved via the Metropolis–Hastings algorithm, the Tierney–Kadane procedure, and importance sampling methods. Furthermore, we discuss the estimation of confidence intervals, covering both asymptotic confidence intervals and credible intervals. To conclude, we furnish a simulation study, a corresponding discussion, and supplement these with an analysis of real data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
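A hedged sketch of the random-walk Metropolis–Hastings idea named in the abstract, applied to a deliberately simple conjugate toy (exponential lifetimes with a gamma prior on the rate, not the paper's Chen model), so the sampler's answer can be checked against the exact posterior mean:

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.exponential(scale=1.0 / 2.0, size=500)   # true rate 2.0

# Exponential likelihood with a Gamma(a0, b0) prior on the rate; the
# posterior is Gamma(a0 + n, b0 + sum(x)), so the exact answer is known.
a0, b0 = 1.0, 1.0
def log_post(lam):
    if lam <= 0:
        return -np.inf
    return (a0 - 1 + len(data)) * np.log(lam) - lam * (b0 + data.sum())

# Random-walk Metropolis-Hastings
chain, lam = [], 1.0
lp = log_post(lam)
for _ in range(20000):
    prop = lam + rng.normal(0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        lam, lp = prop, lp_prop
    chain.append(lam)

post_mean = np.mean(chain[5000:])                   # discard burn-in
exact = (a0 + len(data)) / (b0 + data.sum())        # conjugate posterior mean
print(post_mean, exact)
```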
15. Measurement and Evaluation of the Development Level of Health and Wellness Tourism from the Perspective of High-Quality Development.
- Author
Pan, Huali, Mi, Huanhuan, Chen, Yanhua, Chen, Ziyan, and Zhou, Weizhong
- Abstract
In recent years, with the dramatic surge in the demand for health and elderly care services, the emergence of the health dividend has presented good development opportunities for health and wellness tourism. However, as a sector of the economy, health and wellness tourism still faces numerous challenges in achieving high-quality development. Therefore, this paper focuses on 31 provinces in China and constructs a multidimensional evaluation index system for the high-quality development of health and wellness tourism. The global entropy-weighted TOPSIS method and cluster analysis are used to conduct in-depth measurements, regional comparisons, and classification evaluations of the high-quality development of health and wellness tourism in each province. The research results indicate that: (1) From a quality perspective, the level of health and wellness tourism development in 11 provinces in China has exceeded the national average, while the remaining 20 provinces are below the national average. (2) From a regional perspective, the current level of high-quality development in health and wellness tourism decreases sequentially from the eastern to the central to the western regions, with significant regional differences. (3) Overall, the development in the 31 provinces can be categorized into five types: the High-Quality Benchmark Type, the High-Quality Stable Type, the High-Quality Progressive Type, the General-Quality Potential Type, and the General-Quality Lagging Type. (4) From a single-dimension analysis perspective, there are significant differences in the rankings of each province across different dimensions. Finally, this paper enriches and expands the theoretical foundation for the high-quality development of health and wellness tourism and puts forward targeted countermeasures and suggestions to help promote its comprehensive enhancement. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Tobit models for count time series.
- Author
Weiß, Christian H. and Zhu, Fukang
- Subjects
MAXIMUM likelihood statistics; DISTRIBUTION (Probability theory); MOVING average process; TIME series analysis; PARAMETER estimation
- Abstract
Several models for count time series have been developed during the last decades, often inspired by traditional autoregressive moving average (ARMA) models for real‐valued time series, including integer‐valued ARMA (INARMA) and integer‐valued generalized autoregressive conditional heteroscedasticity (INGARCH) models. Both INARMA and INGARCH models exhibit an ARMA‐like autocorrelation function (ACF). To achieve negative ACF values within the class of INGARCH models, log and softplus link functions are suggested in the literature, where the softplus approach leads to conditional linearity in good approximation. However, the softplus approach is limited to the INGARCH family for unbounded counts, that is, it can neither be used for bounded counts, nor for count processes from the INARMA family. In this paper, we present an alternative solution, named the Tobit approach, for achieving approximate linearity together with negative ACF values, which is more generally applicable than the softplus approach. A Skellam–Tobit INGARCH model for unbounded counts is studied in detail, including stationarity, approximate computation of moments, maximum likelihood and censored least absolute deviations estimation for unknown parameters and corresponding simulations. Extensions of the Tobit approach to other situations are also discussed, including underlying discrete distributions, INAR models, and bounded counts. Three real‐data examples are considered to illustrate the usefulness of the new approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Properties, estimation, and applications of the extended log-logistic distribution.
- Author
Kariuki, Veronica, Wanjoya, Anthony, Ngesa, Oscar, Alharthi, Amirah Saeed, Aljohani, Hassan M., and Afify, Ahmed Z.
- Subjects
ESTIMATION theory; MAXIMUM likelihood statistics; ORDER statistics; DATA modeling; SIMPLICITY
- Abstract
This paper presents the exponentiated alpha-power log-logistic (EAPLL) distribution, which extends the log-logistic distribution. The EAPLL distribution is well suited to survival data modeling, providing analytical simplicity and accommodating both monotone and non-monotone failure rates. We derive some of its mathematical properties and test eight estimation methods using an extensive simulation study. To determine the best estimation approach, we rank mean estimates, mean square errors, and average absolute biases using partial and overall rankings. Furthermore, we use the EAPLL distribution to examine three real-life survival data sets, demonstrating its superior performance over competing log-logistic distributions. This study adds vital insights to survival analysis methodology and provides a solid framework for modeling various survival data scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Different estimation techniques and data analysis for constant-partially accelerated life tests for power half-logistic distribution.
- Author
Alomani, Ghadah A., Hassan, Amal S., Al-Omari, Amer I., and Almetwally, Ehab M.
- Subjects
ACCELERATED life testing; CONFIDENCE intervals; DATA analysis; LEAST squares; MAXIMUM likelihood statistics
- Abstract
Partial accelerated life tests (PALTs) are employed when the results of accelerated life testing cannot be extended to usage conditions. This work discusses the challenge of different estimation strategies in constant PALT with complete data. The lifetime distribution of the test item is assumed to follow the power half-logistic distribution. Several classical and Bayesian estimation techniques are presented to estimate the distribution parameters and the acceleration factor of the power half-logistic distribution. These techniques include Anderson–Darling, maximum likelihood, Cramér–von Mises, ordinary least squares, weighted least squares, maximum product of spacings, and Bayesian estimation. Additionally, the Bayesian credible intervals and approximate confidence intervals are constructed. A simulation study is provided to compare the outcomes of the various estimation methods based on mean squared error, average absolute bias, length of intervals, and coverage probabilities. This study shows that maximum product of spacings estimation is, in most circumstances, the most effective strategy among the options in terms of minimum MSE and average bias. In the majority of situations, the Bayesian method outperforms the other methods when both MSE and average bias values are taken into account. When approximate confidence intervals are compared with Bayesian credible intervals, the latter have a higher coverage probability and a smaller average length. Two authentic data sets are examined for illustrative purposes; examining them shows that the methods are workable and applicable to certain engineering-related problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
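The maximum product of spacings method highlighted in the abstract can be sketched on a simple exponential toy: choose the parameter maximising the sum of log-spacings of the fitted CDF evaluated at the ordered sample (illustrative only, not the power half-logistic model of the paper):

```python
import numpy as np

def mps_exponential_rate(x, grid):
    """Maximum product of spacings: pick the rate maximising the sum of
    log-differences of the fitted CDF at the ordered sample (with F=0
    and F=1 appended at the ends)."""
    xs = np.sort(x)
    best_rate, best_val = None, -np.inf
    for rate in grid:
        F = 1.0 - np.exp(-rate * xs)
        spacings = np.diff(np.concatenate(([0.0], F, [1.0])))
        val = np.sum(np.log(np.clip(spacings, 1e-300, None)))
        if val > best_val:
            best_rate, best_val = rate, val
    return best_rate

rng = np.random.default_rng(5)
x = rng.exponential(scale=1.0 / 1.5, size=2000)   # true rate 1.5
rate_hat = mps_exponential_rate(x, np.linspace(0.5, 3.0, 251))
print(rate_hat)
```

MPS is asymptotically equivalent to maximum likelihood here but remains well defined in cases where the likelihood is unbounded, which is its usual selling point.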
19. Statistical inference on multicomponent stress–strength reliability with non-identical component strengths using progressively censored data from Kumaraswamy distribution.
- Author
Saini, Shubham, Patel, Jyoti, and Garg, Renu
- Abstract
In this article, we draw inferences on stress–strength reliability in a multicomponent system with non-identical strength components based on progressively censored data from the Kumaraswamy distribution (KuD). When one shape parameter of the KuD is known, the uniformly minimum variance unbiased estimator is derived. To evaluate the reliability of such systems when all the parameters are unknown, the maximum likelihood and Bayes estimators are developed. The asymptotic confidence and highest posterior density (HPD) credible intervals are also obtained, along with their coverage probabilities. Tierney–Kadane's approximation and Markov chain Monte Carlo methods are used for the Bayesian computations. To compare the performance of the estimators, a Monte Carlo simulation study is performed. Also, one real-life example is analyzed for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
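The quantity being estimated, multicomponent stress-strength reliability R(s,k) = P(at least s of k strengths exceed the common stress), can be evaluated by plain Monte Carlo as a sanity check. The Kumaraswamy parameters below are arbitrary illustrative choices (not from the paper), and sampling uses the closed-form inverse CDF:

```python
import numpy as np

rng = np.random.default_rng(7)

def kumaraswamy(a, b, size):
    """Inverse-CDF sampling: F(x) = 1 - (1 - x**a)**b on (0, 1)."""
    u = rng.uniform(size=size)
    return (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)

s_req, k = 2, 4          # system works if at least 2 of 4 components survive
n = 200000
strengths = kumaraswamy(2.0, 3.0, (n, k))   # component strengths
stress = kumaraswamy(2.0, 5.0, (n, 1))      # one common stress per replicate
R_hat = np.mean((strengths > stress).sum(axis=1) >= s_req)
print(R_hat)
```

Note that the k components share a single stress draw in each replicate, so their survival events are dependent; a binomial shortcut would be wrong here.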
20. Unit-Power Half-Normal Distribution Including Quantile Regression with Applications to Medical Data.
- Author
Santoro, Karol I., Gómez, Yolanda M., Soto, Darlin, and Barranco-Chamorro, Inmaculada
- Subjects
MAXIMUM likelihood statistics; REGRESSION analysis; DATA analysis; DATA modeling; QUANTILE regression
- Abstract
In this paper, we present the unit-power half-normal distribution, derived from the power half-normal distribution, for the analysis of data on the open unit interval. The statistical properties of the unit-power half-normal model are described in detail. Simulation studies are carried out to evaluate the performance of the parameter estimators. Additionally, we implement quantile regression for this model, which is applied to two real healthcare data sets. Our findings suggest that the unit-power half-normal distribution provides a robust and flexible alternative to existing models for proportion data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Estimation of the Reliability Function of the Generalized Rayleigh Distribution under Progressive First-Failure Censoring Model.
- Author
Gong, Qin, Chen, Rui, Ren, Haiping, and Zhang, Fan
- Subjects
MONTE Carlo method; MAXIMUM likelihood statistics; RAYLEIGH model; HAZARD function (Statistics); INFERENTIAL statistics
- Abstract
This study investigates statistical inference for the parameters, reliability function, and hazard function of the generalized Rayleigh distribution under progressive first-failure censored samples, considering factors such as long product lifetimes and challenging experimental conditions. First, the progressive first-failure model is introduced, and maximum likelihood estimation of the parameters, reliability function, and hazard function under this model is discussed. For interval estimation, confidence intervals are constructed for the parameters, reliability function, and hazard function using the bootstrap method. Next, in the Bayesian setting, considering both informative and non-informative priors, Bayesian estimates of the parameters, reliability function, and hazard function under symmetric and asymmetric loss functions are obtained using the MCMC method. Finally, a Monte Carlo simulation is conducted to compare mean square errors and to evaluate the relative merits of maximum likelihood estimation and Bayesian estimation under different loss functions. The performance of the estimation methods used in the study is illustrated through examples. The results indicate that Bayesian estimation outperforms maximum likelihood estimation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
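A percentile-bootstrap confidence interval for a reliability function, of the kind used in the abstract above, can be sketched on a simple exponential toy (not the generalized Rayleigh model of the paper):

```python
import numpy as np

rng = np.random.default_rng(6)
t0 = 1.0
data = rng.exponential(scale=2.0, size=300)   # mean life 2.0

def reliability_hat(sample, t):
    # MLE of R(t) = exp(-t/theta) for exponential lifetimes
    return np.exp(-t / sample.mean())

# Percentile bootstrap: re-estimate R(t0) on resampled data sets and
# take the empirical 2.5% and 97.5% quantiles as the interval.
boot = np.array([
    reliability_hat(rng.choice(data, size=len(data), replace=True), t0)
    for _ in range(2000)
])
lo, hi = np.quantile(boot, [0.025, 0.975])
print(reliability_hat(data, t0), lo, hi)
```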
22. Algorithm for Option Number Selection in Stochastic Paired Comparison Models.
- Author
Gyarmati, László, Mihálykó, Csaba, and Orbán-Mihálykó, Éva
- Subjects
MAXIMUM likelihood statistics; STOCHASTIC models; DECISION making; COMPUTER simulation; ALGORITHMS
- Abstract
In this paper, paired comparison models with a stochastic background are investigated and compared from the perspective of the number of options allowed. As two-option and three-option models are the most frequently used, we mainly focus on the relationships between two-option and four-option models and between three-option and five-option models, and then we turn to the general s- and (s+2)-option models. We compare them from both theoretical and practical perspectives; the latter are based on computer simulations. We examine when it is possible, mandatory, or advisable to convert four-, five-, and (s+2)-option models into two-, three-, and s-option models, respectively. The problem also exists in reverse: when is it advisable to use four-, five-, and (s+2)-option models instead of two-, three-, and s-option models? As a result of these investigations, we set up an algorithm to perform the decision process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
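For the two-option case, the classical stochastic paired-comparison fit is a Bradley-Terry-style maximum likelihood; a minimal MM (minorise-maximise) sketch on simulated comparisons is shown below. The paper's Thurstone-type threshold models differ, so this illustrates only the two-option building block, with invented strengths and game counts:

```python
import numpy as np

def bradley_terry_mle(wins, iters=500):
    """MM iterations for Bradley-Terry strengths.
    wins[i, j] = number of times item i beat item j."""
    n = wins.shape[0]
    games = wins + wins.T           # total comparisons per pair
    w = np.ones(n)
    for _ in range(iters):
        for i in range(n):
            denom = np.sum(games[i] / (w[i] + w))   # games[i, i] = 0
            w[i] = wins[i].sum() / denom
        w = w / w.sum()             # strengths identified up to scale
    return w

rng = np.random.default_rng(8)
truth = np.array([0.5, 0.3, 0.2])
n_items, n_games = 3, 2000
wins = np.zeros((n_items, n_items))
for i in range(n_items):
    for j in range(i + 1, n_items):
        p = truth[i] / (truth[i] + truth[j])
        w_ij = rng.binomial(n_games, p)
        wins[i, j], wins[j, i] = w_ij, n_games - w_ij

w_hat = bradley_terry_mle(wins)
print(np.round(w_hat, 2))
```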
23. On a Randomly Censoring Scheme for Generalized Logistic Distribution with Applications.
- Author
Hasaballah, Mustafa M., Balogun, Oluwafemi Samson, and Bakr, Mahmoud E.
- Subjects
MARKOV chain Monte Carlo; FISHER information; CENSORING (Statistics); MAXIMUM likelihood statistics; CONFIDENCE intervals; BAYES' estimation
- Abstract
In this paper, we investigate the inferential procedures within both classical and Bayesian frameworks for the generalized logistic distribution under a random censoring model. For randomly censored data, our main goals were to develop maximum likelihood estimators and construct confidence intervals using the Fisher information matrix for the unknown parameters. Additionally, we developed Bayes estimators with gamma priors, addressing both squared error and general entropy loss functions. We also calculated Bayesian credible intervals for the parameters. These methods were applied to two real datasets with random censoring to provide valuable insights. Finally, we conducted a simulation analysis to assess the effectiveness of the estimated values. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. A Novel Discrete Linear-Exponential Distribution for Modeling Physical and Medical Data.
- Author
Al-Harbi, Khlood, Fayomi, Aisha, Baaqeel, Hanan, and Alsuraihi, Amany
- Subjects
- *
MEAN square algorithms , *CHARACTERISTIC functions , *ERROR functions , *PHYSICAL distribution of goods , *EXPONENTIAL functions , *BAYES' estimation - Abstract
Count data are of particular significance across many fields. In this article, a new form of the one-parameter discrete linear-exponential distribution is derived using the survival function as a discretization technique. An extensive study of this distribution is conducted under its new form, including its characteristic function and statistical properties. It is shown that this distribution is appropriate for modeling over-dispersed count data. Moreover, its probability mass function is right-skewed with various shapes. The unknown model parameter is estimated using the maximum likelihood method, with particular attention given to Bayesian estimation. The Bayesian estimator is computed under three loss functions: the squared error loss function, the linear exponential loss function, and the generalized entropy loss function. A simulation study is implemented to examine the distribution's behavior and to compare the classical and Bayesian estimation methods; it indicates that the Bayesian method under the generalized entropy loss function with positive weight is best for all sample sizes, attaining the minimum mean squared errors. Finally, the discrete linear-exponential distribution proves its efficiency in fitting real-life discrete physical and medical lifetime count data against other related distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
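The survival-function discretization technique mentioned in this abstract sets P(X = k) = S(k) − S(k + 1) for k = 0, 1, 2, …. The sketch below applies it to a linear-exponential survival function S(x) = exp(−(ax + bx²/2)) (hazard a + bx); the functional form and parameter values are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

a, b = 0.3, 0.1   # illustrative linear-exponential hazard parameters

def S(x):
    # survival function of a (continuous) linear-exponential lifetime
    return np.exp(-(a * x + b * x**2 / 2.0))

# discretization: mass at integer k is the survival drop across [k, k+1)
k = np.arange(0, 50)
pmf = S(k) - S(k + 1)
```

Because S is decreasing with S(0) = 1, the masses are non-negative and, together with the remaining tail S(50), sum to one.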
25. Parameter estimation of Chen distribution under improved adaptive type-II progressive censoring.
- Author
-
Zhang, Li and Yan, Rongfang
- Subjects
- *
MONTE Carlo method , *MAXIMUM likelihood statistics , *ASYMPTOTIC normality , *HAZARD function (Statistics) , *PARAMETER estimation , *BAYES' estimation - Abstract
This study focuses on the estimation of the two unknown parameters of Chen distribution, characterized by a bathtub-shaped hazard rate function, within an improved adaptive type-II progressive censored data framework. Maximum likelihood estimation is proposed for the two parameters, and the establishment of approximate confidence intervals is based on asymptotic normality. Bayesian estimation is also conducted under both symmetric and asymmetric loss functions, utilizing the proposed importance sampling and Metropolis–Hastings algorithm. Lastly, the performance of various estimation methods is evaluated through Monte Carlo simulation experiments, and the proposed estimation approach is illustrated using a real dataset. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
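A random-walk Metropolis-Hastings step of the kind used for the Bayesian estimates above can be sketched on a simpler one-parameter problem (exponential likelihood with a Gamma(2, 1) prior); the Chen posterior itself is two-dimensional, so everything below, including data, prior, and step size, is an illustrative stand-in.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=1 / 1.5, size=200)   # simulated data, rate 1.5

def log_post(lam):
    # log-posterior up to a constant: exponential likelihood + Gamma(2,1) prior
    if lam <= 0:
        return -np.inf
    return len(data) * np.log(lam) - lam * data.sum() + np.log(lam) - lam

chain, lam = [], 1.0
for _ in range(5000):
    prop = lam + 0.1 * rng.standard_normal()       # random-walk proposal
    # accept with probability min(1, posterior ratio)
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop
    chain.append(lam)

post_mean = np.mean(chain[1000:])                  # discard burn-in
```

The posterior here is conjugate (Gamma), so the chain can be checked against the exact posterior mean; for the Chen distribution the same accept/reject loop runs on the joint log-posterior of both parameters.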
26. Fitting Cross-Lagged Panel Models with the Residual Structural Equations Approach.
- Author
-
Tseng, Ming-Chi
- Subjects
- *
MAXIMUM likelihood statistics , *STRUCTURAL equation modeling , *MOVING average process , *RESEARCH personnel , *DYNAMIC models - Abstract
This study simplifies the seven different cross-lagged panel models (CLPMs) by using the RSEM model for both the inter-individual and intra-individual structures. In addition, it incorporates the newly developed dynamic panel model (DPM), the general cross-lagged model (GCLM), and the random intercept auto-regressive moving average (RI-ARMA) model. Then, using a longitudinal study of self-esteem and depression, ten different CLPMs are analyzed using robust maximum likelihood estimation, and the Mplus syntax is provided as a reference for researchers. This study aims to enhance empirical researchers' understanding and exploration of different CLPMs by providing simplified explanations and analyses of the ten models, thereby promoting the development of empirical constructs and topics in the context of cross-lagged panel modeling. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Assessing COVID-19 Prevalence in Austria with Infection Surveys and Case Count Data as Auxiliary Information.
- Author
-
Guerrier, Stéphane, Kuzmics, Christoph, and Victoria-Feser, Maria-Pia
- Subjects
- *
COVID-19 pandemic , *MAXIMUM likelihood statistics , *COMMUNICABLE diseases , *MEASUREMENT errors , *MOMENTS method (Statistics) - Abstract
Countries officially record the number of COVID-19 cases based on medical tests of a subset of the population. These case count data obviously suffer from participation bias, and for prevalence estimation they are typically discarded in favor of infection surveys, possibly complemented with auxiliary information. One exception is the series of infection surveys recorded by the Statistics Austria Federal Institute to study the prevalence of COVID-19 in Austria in April, May, and November 2020. In these infection surveys, participants were additionally asked whether they were simultaneously recorded as COVID-19 positive in the case count data. In this article, we analyze the benefits of properly combining the outcomes of the infection survey with the case count data to estimate the prevalence of COVID-19 in Austria in 2020, from which the case ascertainment rate can be deduced. The results show that our approach leads to a significant efficiency gain: considerably smaller infection survey samples suffice to obtain the same level of estimation accuracy. Our estimation method can also handle measurement errors due to the sensitivity and specificity of medical testing devices and to the nonrandom sample weighting scheme of the infection survey. The proposed estimators and associated confidence intervals are implemented in the companion open source R package pempi, available on the Comprehensive R Archive Network (CRAN). Supplementary materials for this article are available online, including a standardized description of the materials available for reproducing the work. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. A new quantile regression model with application to human development index.
- Author
-
Cordeiro, Gauss M., Rodrigues, Gabriela M., Prataviera, Fábio, and Ortega, Edwin M. M.
- Subjects
- *
MONTE Carlo method , *HUMAN Development Index , *MAXIMUM likelihood statistics , *REGRESSION analysis , *CITIES & towns , *QUANTILE regression - Abstract
A new odd log-logistic unit omega distribution is defined and studied, and some of its structural properties are obtained. A quantile regression model based on the new re-parameterized distribution is constructed, and the estimation is conducted by the maximum likelihood method. Monte Carlo simulations are used to assess the accuracy of the estimators. The flexibility, practical relevance and applicability of the proposed regression are proved by means of Human Development Index data from the cities of the state of São Paulo (Brazil). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
29. Second term improvement to generalized linear mixed model asymptotics.
- Author
-
Maestrini, Luca, Bhaskaran, Aishwarya, and Wand, Matt P
- Subjects
- *
STATISTICAL accuracy , *MAXIMUM likelihood statistics , *INFERENTIAL statistics , *SAMPLE size (Statistics) , *DATA analysis - Abstract
A recent article by Jiang et al. (2022) on generalized linear mixed model asymptotics derived the rates of convergence for the asymptotic variances of maximum likelihood estimators. If m denotes the number of groups and n is the average within-group sample size, then the asymptotic variances have orders m^(-1) and (mn)^(-1), depending on the parameter. We extend this theory to provide explicit forms of the (mn)^(-1) second terms of the asymptotically harder-to-estimate parameters. Improved accuracy of statistical inference and planning are consequences of our theory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Some asymptotic inferential aspects of the Kumaraswamy distribution.
- Author
-
Carneiro, Hérica P. A., Sandoval, Mônica C., Botter, Denise A., and Magalhães, Tiago M.
- Subjects
- *
MONTE Carlo method , *LIKELIHOOD ratio tests , *MAXIMUM likelihood statistics , *CORRECTION factors , *SAMPLE size (Statistics) - Abstract
The Kumaraswamy distribution is doubly limited, continuous, very flexible, and widely applied in hydrology and related areas. Recently, several families of distributions based on this distribution have emerged. As a contribution to the asymptotic aspects of its inferential analysis, we derive an analytic expression of order n^(-1/2), where n is the sample size, for the skewness coefficient of the distribution of the maximum likelihood estimators of the parameters of the Kumaraswamy distribution. A simulation study and an application are presented to illustrate that, when the sample size is small, likelihood inferences may not be reliable. We also obtain Bartlett correction factors for the likelihood ratio statistic as well as the results of the bootstrap likelihood ratio test and bootstrap Bartlett correction, and present a Monte Carlo simulation study to compare the rejection rates of the tests in question. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. A Novel Family of Distribution with Application in Engineering Problems: A Simulation Study.
- Author
-
Modi, Kanak and Singh, Yudhveer
- Subjects
- *
PHYSICAL distribution of goods , *UNCERTAINTY (Information theory) , *MONTE Carlo method , *ENGINEERING simulations , *DISTRIBUTION (Probability theory) , *MAXIMUM likelihood statistics , *ORDER statistics - Abstract
We establish a novel family of Kumaraswamy-X probability distributions and discuss the Kumaraswamy-Exponential univariate probability distribution in detail. The new three-parameter distribution possesses a density function that can be unimodal or reverse J-shaped and a bathtub-shaped hazard rate function. We study its statistical properties and derive expressions for its density function, distribution function, survival and hazard rate functions, probability weighted moments, lth moment, moment generating function, quantile function, and Shannon entropy. Order statistics of the derived distribution are also discussed. The parameters are estimated using the maximum likelihood estimation approach, and the performance of the estimators is evaluated using Monte Carlo simulation. Through extensive Monte Carlo simulations and comparative analyses, we assess the performance of the Kumaraswamy-X distribution against other probability distributions commonly used in engineering contexts. When applied to real data sets, it offers a more suitable fit than other existing distributions. We explore the characteristics and potential applications of the Kumaraswamy-X distribution in engineering problems through a comprehensive simulation-based investigation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
32. Estimation with extended sequential order statistics: A link function approach.
- Author
-
Pesch, Tim, Cramer, Erhard, Polpo, Adriano, and Cripps, Edward
- Abstract
The model of extended sequential order statistics (ESOS) has two valuable characteristics that make it powerful for modelling multi-component systems. First, components can be assumed to be heterogeneous; second, component lifetime distributions can change upon failure of other components. This degree of flexibility comes at the cost of a large number of parameters: the exact number depends on the system size and the observation depth and can quickly exceed the number of available observations. Consequently, the model benefits from a reduction in the dimension of the parameter space to make it more readily applicable to real-world problems. In this article, we introduce link functions to the ESOS model to reduce the dimension of the parameter space while retaining the flexibility of the model. These functions model the relation between the model parameters of a component across levels. By construction, the proposed 'link estimates' conveniently yield ordered model estimates. We demonstrate how those ordered estimates lead to better results than their unordered counterparts, particularly when sample sizes are small. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. On Modified Weighted Exponential Rayleigh Distribution.
- Author
-
Hussein, Lamyaa Khalid, Hussein, Iden Hasan, and Rasheed, Huda Abdullah
- Subjects
PROBABILITY density function ,MAXIMUM likelihood statistics ,ENTROPY - Abstract
This study develops classical methods for estimating the unknown parameters and reliability function of the Modified Weighted Exponential Rayleigh (MWER) distribution. These classical methods are chosen precisely because all of them maximize the likelihood function. The Newton-Raphson technique is used to derive the estimates. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
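The Newton-Raphson technique cited in this abstract solves the score equation dlogL/dθ = 0 by iterating θ ← θ − score(θ)/hessian(θ). The sketch below runs it on the Rayleigh distribution, chosen (as an illustration, not the MWER likelihood) because its closed-form MLE lets the iteration be verified.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.rayleigh(scale=2.0, size=1000)   # simulated Rayleigh(sigma=2) sample
n = len(x)
s2 = (x**2).sum()

def score(sig):
    # d logL / d sigma for Rayleigh(sigma): logL = const - 2n log(sig) - s2/(2 sig^2)
    return -2 * n / sig + s2 / sig**3

def hessian(sig):
    # d^2 logL / d sigma^2
    return 2 * n / sig**2 - 3 * s2 / sig**4

sig = 1.0                                 # deliberately poor starting value
for _ in range(50):
    step = score(sig) / hessian(sig)      # Newton-Raphson update on the score
    sig -= step
    if abs(step) < 1e-10:
        break

analytic = np.sqrt(s2 / (2 * n))          # known closed-form Rayleigh MLE
```

For the MWER distribution no closed form exists, so the iteration terminates only on the step-size criterion; the loop structure is otherwise identical.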
34. Robust maximum correntropy criterion based square-root rotating lattice Kalman filter.
- Author
-
Liu, Sanshan, Wang, Shiyuan, Lin, Dongyuan, Zheng, Yunfei, Guo, Zhongyuan, and Kuang, Zhijian
- Abstract
Lattice Kalman filter (LKF) is a nonlinear Kalman filter that utilizes a deterministic sampling method with the advantages of optional sampling points and a flexible balance between computational burden and estimation accuracy. However, the fixed angle of sampling points in LKF can limit the optimality of the selected points. To this end, this paper proposes a novel maximum correntropy square-root rotating lattice Kalman filter (MCSRLKF) to improve the performance of LKF by adjusting the angle of sampling points. In MCSRLKF, a rotation matrix is first constructed to enhance the estimation accuracy of LKF and the optimal rotation angle of sampling points is selected to generate rotating lattice Kalman filter (RLKF). Then, the square-root RLKF (SRLKF) is proposed to enhance the stability and estimation accuracy of RLKF. Due to the utilization of the minimum mean square error criterion in SRLKF, there is a potential for significant performance degradation in non-Gaussian noises. Thus, to enhance the robustness against non-Gaussian noises, the maximum correntropy criterion is applied to SRLKF, generating MCSRLKF. Moreover, the Cramér-Rao lower bound (CRLB) serves as an indicator for assessing the performance of MCSRLKF. Finally, simulations on the nonlinear function model and reentry vehicle tracking model are used to demonstrate that MCSRLKF exhibits excellent filtering accuracy and robustness when dealing with non-Gaussian noises. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. A novel extended inverse‐exponential distribution and its application to COVID‐19 data.
- Author
-
Kargbo, Moses, Gichuhi, Anthony Waititu, and Wanjoya, Anthony Kibira
- Subjects
MONTE Carlo method ,DISTRIBUTION (Probability theory) ,MAXIMUM likelihood statistics ,COVID-19 ,RENYI'S entropy - Abstract
The aim of this article is to define a new flexible statistical model to examine COVID-19 data sets that cannot be modeled by the inverse exponential distribution. A novel extended distribution with one scale and three shape parameters is proposed using the generalized alpha power family of distributions, yielding the generalized alpha power exponentiated inverse exponential (GAPEIEx) distribution. Some important statistical properties of the new distribution, such as the survival function, hazard function, quantile function, rth moment, Rényi entropy, and order statistics, are derived. The method of maximum likelihood estimation is used to estimate the parameters of the new distribution. The performance of the estimators is assessed through Monte Carlo simulation, which shows that the maximum likelihood method works well in estimating the parameters. The GAPEIEx distribution is applied to COVID-19 data sets in order to assess its flexibility and adaptability, and it performs better than its submodels and other well-known distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Parametric Estimation in Fractional Stochastic Differential Equation.
- Author
-
Pramanik, Paramahansa, Boone, Edward L., and Ghanam, Ryad A.
- Subjects
STOCHASTIC differential equations ,FRACTIONAL differential equations ,GLOBAL Financial Crisis, 2008-2009 ,MAXIMUM likelihood statistics ,ESTIMATION bias - Abstract
Fractional stochastic differential equations are becoming more popular in the literature as they can model phenomena in financial data that typical stochastic differential equation models cannot. In the formulation considered here, the Hurst parameter, H, controls the fraction of differentiation and needs to be estimated from the data. Fortunately, the covariance structure among observations in time is easily expressed in terms of the Hurst parameter, which means that a likelihood is easily defined. This work derives the maximum likelihood estimator for H and shows that it is biased and not consistent. Simulated data are used to understand the bias of the estimator and to create an empirical bias-correction function, and a bias-corrected estimator is proposed and studied. Via simulation, the bias-corrected estimator is shown to be minimally biased, and its simulation-based standard error is used to create a 95% confidence interval for H. A simulation study shows that the 95% confidence intervals have decent coverage probabilities for large n. The method is then applied to S&P500 and VIX data before and after the 2008 financial crisis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
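The likelihood for H that this abstract refers to follows directly from the covariance structure of fractional Gaussian noise, γ(k) = ½(|k+1|^(2H) − 2|k|^(2H) + |k−1|^(2H)). The sketch below evaluates and maximizes that exact Gaussian likelihood; the unit-variance assumption and the H = 0.5 check (where fGn reduces to i.i.d. noise) are simplifications for illustration, not the paper's full setup.

```python
import numpy as np
from scipy.linalg import toeplitz, cho_factor, cho_solve
from scipy.optimize import minimize_scalar

def neg_loglik(H, x):
    """Negative Gaussian log-likelihood of unit-variance fGn with Hurst H."""
    n = len(x)
    k = np.arange(n, dtype=float)
    gamma = 0.5 * (np.abs(k + 1)**(2 * H)
                   - 2 * np.abs(k)**(2 * H)
                   + np.abs(k - 1)**(2 * H))
    Sigma = toeplitz(gamma)                  # covariance matrix of (x_1..x_n)
    c, low = cho_factor(Sigma)               # Cholesky factorization
    quad = x @ cho_solve((c, low), x)        # x' Sigma^{-1} x
    logdet = 2.0 * np.log(np.diag(c)).sum()  # log|Sigma| from the factor
    return 0.5 * (logdet + quad)

rng = np.random.default_rng(3)
x = rng.standard_normal(200)                 # fGn with H = 0.5 is i.i.d. N(0,1)
res = minimize_scalar(lambda H: neg_loglik(H, x),
                      bounds=(0.1, 0.9), method="bounded")
H_hat = res.x                                # should land near 0.5
```

The bias the paper studies appears when simulated fGn with H ≠ 0.5 is fed through this same estimator across many replications.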
37. Modeling the time to dropout under phase-wise variable stress fixed cohort setup.
- Author
-
Biswas, Aniket, Chakraborty, Subrata, and Nandi, Anupama
- Subjects
- *
MAXIMUM likelihood statistics , *SCHOOL dropouts , *OVERPRESSURE (Education) , *ACADEMIC programs , *CONFIDENCE intervals - Abstract
The event of a student dropping out of an academic program depends on several factors, namely course content, change in interest, and financial problems, among many others. These factors vary interdependently across the different phases of the academic program. We assume that the factors put different amounts of academic stress on a student in different phases of the program. We formulate and analyze such an accumulated-stress model under the assumption that the attrition time at each phase follows the Kumaraswamy distribution. A hazard-rate-based approach is used to model the stress accumulated through the different phases, with stress levels varying across phases. Accordingly, we estimate the model parameters with a frequentist approach. Extensive simulation experiments indicate satisfactory performance of the estimators. A synthetic dataset related to student dropout is analyzed for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Chen-Burr XII Model as a Competing Risks Model with Applications to Real-Life Data Sets.
- Author
-
Kalantan, Zakiah I., Binhimd, Sulafah M. S., Salem, Heba N., AL-Dayian, Gannat R., EL-Helbawy, Abeer A., and Elaal, Mervat K. Abd
- Subjects
- *
PROBABILITY density function , *MAXIMUM likelihood statistics , *HAZARD function (Statistics) , *COMPETING risks , *PARAMETER estimation - Abstract
In this paper, the Chen-Burr XII distribution is constructed, and a graphical description of the probability density function, hazard rate, and reversed hazard rate functions of the proposed model is provided. Some statistical characteristics of the Chen-Burr XII distribution are discussed, and some new models are introduced as sub-models of the Chen-Burr XII distribution. Moreover, maximum likelihood estimation of the parameters, reliability, hazard rate, and reversed hazard rate functions of the Chen-Burr XII distribution is considered. The asymptotic confidence intervals of the distribution parameters, reliability, hazard rate, and reversed hazard rate functions are also presented. Finally, three real-life data sets are analyzed to show how the Chen-Burr XII distribution can be applied in practice and to confirm its superiority over some existing distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Toward Sustainable Development Goals (SDGs) with Statistical Modeling: Recursive Bivariate Binary Probit.
- Author
-
Ratnasari, Vita, Utama, Syirrul Hadi, and Rian Dani, Andrea Tri
- Subjects
- *
POOR people , *MAXIMUM likelihood statistics , *INDEPENDENT variables , *NEWTON-Raphson method , *MARITAL status , *PROBIT analysis - Abstract
Poverty remains a global problem that Sustainable Development Goal (SDG) 1 seeks to eradicate: ending poverty anywhere and in any form. In 2021, West Papua province had the second-largest percentage of poor people after Papua province, with 21.84% of the population classified as poor. Poor households in West Papua province are dominated by families whose Head of Household (KRT) works in the agricultural sector, at 65.10%. In this research, joint modelling was carried out for the level of household welfare and the employment sector of the head of the household in West Papua province. These two variables are suspected to have an endogeneity problem, where one response variable becomes a predictor variable in the other equation, so a recursive bivariate binary probit regression model is used. The parameters of the recursive bivariate binary probit regression are estimated with Maximum Likelihood Estimation (MLE), but the results are not closed form, so estimation continues with the Newton-Raphson iteration method. The results of hypothesis testing show that the variables that significantly influence the level of household welfare include marital status, formal/informal employment of the KRT, health complaints, asset ownership status, migration status, number of household members, classification of area of residence (village/city), age of the head of household, and employment sector of the head of household. Meanwhile, the variables that significantly influence the choice of working in the agricultural sector include the education of the head of household, classification of area of residence (village/city), and age of the head of household. [ABSTRACT FROM AUTHOR]
- Published
- 2024
40. Estimation of Multicomponent Stress-Strength Reliability with Exponentiated Generalized Inverse Rayleigh Distribution.
- Author
-
Youssef Temraz, Neama Salah
- Abstract
In this paper, estimation of the multicomponent stress-strength reliability is considered under the exponentiated generalized inverse Rayleigh distribution. Different methods of estimation are developed for the multicomponent stress-strength reliability, and a simulation method illustrates the steps involved in finding the estimates. Asymptotic and bootstrap confidence intervals are proposed as interval estimates for the multicomponent stress-strength reliability, and a Bayesian estimation method is also introduced. A simulation study obtains the estimates of the multicomponent stress-strength reliability under the different methods of estimation. A real data application shows how the exponentiated generalized inverse Rayleigh distribution can be used to fit real data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
41. Parameter estimation for grouped data using EM and MCEM algorithms.
- Author
-
AghahosseinaliShirazi, Zahra, da Silva, João Pedro A. R., and de Souza, Camila P. E.
- Subjects
- *
EXPECTATION-maximization algorithms , *MAXIMUM likelihood statistics , *GAUSSIAN distribution , *INTERVAL measurement , *PARAMETER estimation - Abstract
Nowadays, the confidentiality of data and information is of great importance for many companies and organizations. For this reason, they may prefer not to release exact data, but instead to grant researchers access to approximate data. For example, rather than providing the exact measurements of their clients, they may only provide researchers with grouped data, that is, the number of clients falling in each of a set of non-overlapping measurement intervals. The challenge is to estimate the mean and variance structure of the hidden ungrouped data based on the observed grouped data. To tackle this problem, this work considers the exact observed data likelihood and applies the Expectation-Maximization (EM) and Monte Carlo EM (MCEM) algorithms for cases where the hidden data follow a univariate, bivariate, or multivariate normal distribution. Simulation studies are conducted to evaluate the performance of the proposed EM and MCEM algorithms. The well-known Galton data set is considered as an application example. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
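The univariate-normal case of the grouped-data EM described in this abstract can be sketched concisely: the E-step needs only the truncated-normal first and second moments within each interval, and the M-step plugs those conditional moments into the usual normal MLEs. The interval edges, sample size, and starting values below are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
hidden = rng.normal(10.0, 2.0, size=2000)        # ungrouped data, never observed
edges = np.array([-np.inf, 6, 8, 10, 12, 14, np.inf])
counts = np.histogram(hidden, bins=edges)[0]     # all the analyst sees
a, b = edges[:-1], edges[1:]
N = counts.sum()

mu, sig = 9.0, 3.0                               # starting values near the bin range
for _ in range(200):
    # E-step: truncated-normal moments of X within each interval under (mu, sig)
    al, be = (a - mu) / sig, (b - mu) / sig
    Z = norm.cdf(be) - norm.cdf(al)
    r1 = (norm.pdf(al) - norm.pdf(be)) / Z
    apa = np.zeros_like(al); bpb = np.zeros_like(be)
    fin_a, fin_b = np.isfinite(al), np.isfinite(be)
    apa[fin_a] = al[fin_a] * norm.pdf(al[fin_a])  # alpha*phi(alpha), 0 at -inf
    bpb[fin_b] = be[fin_b] * norm.pdf(be[fin_b])  # beta*phi(beta),  0 at +inf
    r2 = 1.0 + (apa - bpb) / Z
    m1 = mu + sig * r1                            # E[X   | interval]
    m2 = mu**2 + 2 * mu * sig * r1 + sig**2 * r2  # E[X^2 | interval]
    # M-step: complete-data normal MLEs, with moments weighted by bin counts
    mu = (counts * m1).sum() / N
    sig = np.sqrt((counts * m2).sum() / N - mu**2)
```

The bivariate and multivariate cases in the paper replace these scalar truncated moments with their multivariate-normal counterparts, which is where the Monte Carlo E-step (MCEM) becomes necessary.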
42. A New Generalization of the Uniform Distribution: Properties and Applications to Lifetime Data.
- Author
-
González-Hernández, Isidro Jesús, Méndez-González, Luis Carlos, Granillo-Macías, Rafael, Rodríguez-Muñoz, José Luis, and Pacheco-Cedeño, José Sergio
- Subjects
- *
PROBABILITY density function , *MAXIMUM likelihood statistics , *RELIABILITY in engineering , *RANDOM variables , *DISTRIBUTION (Probability theory) - Abstract
In this paper, we propose two new generalized statistical distributions to improve the ability to model failure rates with non-monotonic, monotonic, and especially bathtub-curve behaviors. We call these distributions the Generalized Powered Uniform distribution and the MOE-Powered Uniform distribution. The proposed approach is based on incorporating a parameter k into the power of the values of the random variables, which is associated with the probability density function and includes an operator called the Powered Mean. Various statistical and mathematical features focused on reliability analysis are presented and discussed, to make the models attractive to specialists in reliability engineering or medicine. We employ the maximum likelihood estimation method to estimate the model parameters and analyze its performance through a Monte Carlo simulation study. To demonstrate the flexibility of the proposed approach, a comparative analysis was carried out on four case studies with the proposed MOE-Powered Uniform distribution, which can model failure times with a bathtub-curve behavior. The results show that this new model is more flexible and useful for performing reliability analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. Alpha–beta-power family of distributions with applications to exponential distribution.
- Author
-
Semary, H.E., Hussain, Zawar, Hamdi, Walaa A., Aldahlan, Maha A., Elbatal, Ibrahim, and Nagarjuna, Vasili B.V.
- Subjects
DISTRIBUTION (Probability theory) ,LEAST squares ,QUANTILE regression ,MAXIMUM likelihood statistics - Abstract
This article proposes a new way to increase the flexibility of a family of statistical distributions by adding two additional parameters. The newly proposed family is called the alpha-beta power transformation family of distributions. A specific model, the alpha-beta power exponential distribution, is thoroughly investigated. The hazard rates of the proposed distribution can be decreasing, bathtub-shaped, or unimodal. The derived structural features of the proposed model include explicit formulations for the quantiles, the moments, the moment-generating function, and the incomplete and conditional moments. In addition, maximum likelihood, least squares, weighted least squares, and minimum distance (Cramér–von Mises, Anderson–Darling, and right-tail Anderson–Darling) estimates are obtained for the unknown parameters. Two real data sets are examined to demonstrate the usefulness of the proposed approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. New Arctan-generator family of distributions with an example of Frechet distribution: Simulation and analysis to strength of glass and carbon fiber data.
- Author
-
Ahmad, Aijaz, Alghamdi, Fatimah M., Ahmad, Afaq, Albalawi, Olayan, Zaagan, Abdullah A., Zakarya, Mohammed, Almetwally, Ehab M., and Mekiso, Getachew Tekle
- Subjects
GLASS analysis ,CARBON fibers ,TRIGONOMETRIC functions ,GLASS fibers ,MAXIMUM likelihood statistics - Abstract
Since standard distributions do not provide an acceptable fit to all types of data sets, it is necessary to construct extensions of standard distributions to increase their capability in data modeling. To address this shortcoming, we propose a novel generator based on the trigonometric arctangent function. We select the Frechet distribution as the baseline to demonstrate the generator's applicability; this generator produces the "new Arctan Frechet distribution" (NATFD). The fundamental properties of the proposed distribution are derived, and its parameters are estimated using the maximum likelihood method. A simulation study is carried out to evaluate the merits of the proposed distribution, and two real data sets are analyzed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. Unit compound Rayleigh model: Statistical characteristics, estimation and application.
- Author
-
Qin Gong, Laijun Luo, and Haiping Ren
- Subjects
MONTE Carlo method ,UNCERTAINTY (Information theory) ,DISTRIBUTION (Probability theory) ,PROBABILITY theory ,MOMENTS method (Statistics) ,RAYLEIGH model - Abstract
In this paper, we propose a novel probability distribution known as the unit compound Rayleigh distribution, whose support is the bounded interval (0,1). Through an in-depth investigation of this distribution, we analyze various statistical and structural characteristics, including the reliability function, risk function, quantile function, moments, order statistics, and entropy measures. To estimate the unknown parameters of the proposed model, we employ maximum likelihood (ML) estimation and Bayesian estimation. Furthermore, we derive several entropy measures based on ML estimation under the unit compound Rayleigh distribution. To evaluate these entropies comprehensively, we use the Monte Carlo simulation method to calculate the average entropy estimate, average entropy bias, corresponding mean square error, and mean relative estimate. Finally, to validate the model's potential for practical applications, two sets of real data are selected for empirical analysis; fitting and parameter estimation demonstrate the advantages of the unit compound Rayleigh distribution in describing and predicting actual data. This study introduces a new distribution model within the framework of probability theory and statistics and provides researchers and practitioners in related fields with a powerful analytical tool. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. The exponentiated-Weibull proportional hazard regression model with application to censored survival data
- Author
-
Mohamed A.S. Ishag, Anthony Wanjoya, Aggrey Adem, Rehab Alsultan, Abdulaziz S. Alghamdi, and Ahmed Z. Afify
- Subjects
Survival analysis ,Censored data ,Proportional hazard regression model ,Exponentiated-Weibull distribution ,Maximum likelihood estimation ,Monte Carlo simulation ,Engineering (General). Civil engineering (General) ,TA1-2040 - Abstract
Proportional hazard regression models are widely used statistical tools for analyzing survival data and estimating the effects of covariates on survival times, under the assumption that the effects of the covariates are constant across time. In this paper, we propose a novel extension of the proportional hazard model that uses an exponentiated-Weibull distribution to model the baseline hazard function. The proposed model offers more flexibility in capturing various shapes of failure rates and accommodates both monotonic and non-monotonic hazard shapes. The performance of the proposed model is evaluated and compared, via simulation, with other commonly used survival models, including the generalized log-logistic, Weibull, Gompertz, and exponentiated exponential PH regression models. The results demonstrate the ability of the introduced model to capture baseline hazard shapes and to estimate the effects of covariates on the hazard function accurately. Furthermore, two real survival medical data sets are analyzed to illustrate the practical value of the proposed model in providing accurate predictions of survival outcomes for individual patients. The survival data analysis reveals that the model is a powerful tool for analyzing complex survival data.
- Published
- 2024
- Full Text
- View/download PDF
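The censored PH likelihood with an exponentiated-Weibull (EW) baseline can be sketched directly from the abstract's description. This is a minimal illustration, not the authors' implementation: the parametrisation (shape `a`, shape `k`, scale `lam`, single covariate effect `beta`) and the simulation setup are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# EW baseline CDF: F0(t) = [1 - exp(-(t/lam)^k)]^a
# PH assumption:   S(t|x) = S0(t)^exp(x*beta)
# Censored log-likelihood: sum delta*log h(t|x) + log S(t|x)
def neg_log_lik(params, t, x, delta):
    la, lk, llam, beta = params            # log-parametrised for positivity
    a, k, lam = np.exp([la, lk, llam])
    z = (t / lam) ** k
    w = -np.expm1(-z)                      # 1 - exp(-z), numerically stable
    S0 = 1.0 - w ** a
    logf0 = (np.log(a) + np.log(k) - np.log(lam)
             + (k - 1) * (np.log(t) - np.log(lam)) - z + (a - 1) * np.log(w))
    theta = np.exp(x * beta)               # relative risk exp(x*beta)
    loglik = delta * (logf0 - np.log(S0) + x * beta) + theta * np.log(S0)
    return -np.sum(loglik)

rng = np.random.default_rng(1)
n = 1500
x = rng.normal(size=n)
a, k, lam, beta = 1.5, 1.2, 2.0, 0.7
u = rng.uniform(size=n)
theta = np.exp(x * beta)
# Invert S(t|x) = u to simulate event times from the EW-PH model
t_event = lam * (-np.log1p(-(1 - u ** (1 / theta)) ** (1 / a))) ** (1 / k)
c = rng.exponential(scale=5.0, size=n)     # independent censoring times
t = np.minimum(t_event, c)
delta = (t_event <= c).astype(float)

res = minimize(neg_log_lik, x0=np.zeros(4), args=(t, x, delta),
               method="Nelder-Mead", options={"maxiter": 5000})
print(res.x[3])  # estimate of beta, roughly 0.7
```

Because the EW family nests the Weibull (`a = 1`), the same code recovers the ordinary Weibull PH model as a special case.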
47. Properties, estimation, and applications of the extended log-logistic distribution
- Author
-
Veronica Kariuki, Anthony Wanjoya, Oscar Ngesa, Amirah Saeed Alharthi, Hassan M. Aljohani, and Ahmed Z. Afify
- Subjects
Log-logistic distribution ,Alpha-power family ,Survival data ,Maximum likelihood estimation ,Order statistics ,Medicine ,Science - Abstract
This paper presents the exponentiated alpha-power log-logistic (EAPLL) distribution, which extends the log-logistic distribution. The EAPLL distribution is well suited to survival data modeling, offering analytical simplicity and accommodating both monotone and non-monotone failure rates. We derive some of its mathematical properties and compare eight estimation methods in an extensive simulation study, ranking them by mean estimates, mean square errors, and average absolute biases, both partially and overall, to determine the best estimation approach. Furthermore, we use the EAPLL distribution to examine three real-life survival data sets, demonstrating its superior performance over competing log-logistic distributions. This study adds vital insights to survival analysis methodology and provides a solid framework for modeling various survival data scenarios.
- Published
- 2024
- Full Text
- View/download PDF
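The paper's comparison of eight estimators can be illustrated on a much smaller scale. As a hedged sketch (the EAPLL density is not given in the abstract), the snippet below compares two of the usual candidates, maximum likelihood and ordinary least squares on the empirical CDF, for the plain log-logistic baseline with CDF F(x) = 1/(1 + (x/a)^-b):

```python
import numpy as np
from scipy.optimize import minimize

def ll_cdf(x, a, b):
    # log-logistic CDF
    return 1.0 / (1.0 + (x / a) ** (-b))

def neg_log_lik(p, x):
    a, b = np.exp(p)                        # log-parametrised for positivity
    z = (x / a) ** b
    # pdf: f(x) = (b/a) * (x/a)^(b-1) / (1 + (x/a)^b)^2
    return -np.sum(np.log(b / a) + (b - 1) * np.log(x / a) - 2 * np.log1p(z))

def ols_cdf_loss(p, xs, pp):
    a, b = np.exp(p)
    return np.sum((ll_cdf(xs, a, b) - pp) ** 2)

rng = np.random.default_rng(2)
a_true, b_true = 1.5, 2.5
u = rng.uniform(size=1000)
x = a_true * (u / (1 - u)) ** (1 / b_true)  # inverse-CDF sampling

mle = minimize(neg_log_lik, [0.0, 0.0], args=(x,), method="Nelder-Mead")
xs = np.sort(x)
pp = (np.arange(1, len(xs) + 1) - 0.5) / len(xs)  # plotting positions
ols = minimize(ols_cdf_loss, [0.0, 0.0], args=(xs, pp), method="Nelder-Mead")

print(np.exp(mle.x), np.exp(ols.x))  # both near (1.5, 2.5)
```

Repeating this over many simulated samples and tabulating bias and MSE per method is exactly the ranking exercise the paper performs, just for two methods instead of eight.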
48. ESTIMATING COMMON PARAMETERS OF DIFFERENT CONTINUOUS DISTRIBUTIONS
- Author
-
Abbarapu Ashok and Nadiminti Nagamani
- Subjects
maximum likelihood estimation ,confidence interval ,gamma distribution ,weibull distribution ,rayleigh distribution ,lomax distribution ,Engineering (General). Civil engineering (General) ,TA1-2040 - Abstract
Estimating a parameter common to several probability distributions is an essential and interesting problem. This article addresses it through maximum likelihood estimation (MLE). Numerical estimates of the common parameter are obtained for several distributions, including the Lomax, Gamma, Rayleigh, and Weibull distributions. Where the MLE has no closed-form solution, it is computed with the Newton-Raphson technique. Furthermore, asymptotic confidence intervals are constructed from the Fisher information matrix of each distribution. The estimators are evaluated in terms of bias and mean squared error, and compared numerically by Monte Carlo simulation. Finally, these techniques are applied to real rainfall data to obtain parameter estimates for each distribution.
- Published
- 2024
- Full Text
- View/download PDF
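The Newton-Raphson step for an MLE without a closed form, plus the Fisher-information confidence interval, can be shown concretely. The sketch below estimates the Gamma shape parameter with the scale treated as known, a simplified single-distribution illustration rather than the article's common-parameter setup:

```python
import numpy as np
from scipy.special import digamma, polygamma

rng = np.random.default_rng(3)
theta = 2.0                       # scale, treated as known in this sketch
k_true = 3.0
x = rng.gamma(shape=k_true, scale=theta, size=2000)
n = len(x)

# Score and its derivative for the shape k (scale known):
#   U(k)  = sum(log x) - n*log(theta) - n*digamma(k)
#   U'(k) = -n*polygamma(1, k)
s = np.sum(np.log(x)) - n * np.log(theta)
k = 1.0                           # starting value
for _ in range(50):               # Newton-Raphson iterations
    U = s - n * digamma(k)
    k_new = k + U / (n * polygamma(1, k))
    if abs(k_new - k) < 1e-10:
        k = k_new
        break
    k = k_new

# Asymptotic 95% CI from the Fisher information I(k) = n*polygamma(1, k)
se = 1.0 / np.sqrt(n * polygamma(1, k))
ci = (k - 1.96 * se, k + 1.96 * se)
print(k, ci)  # k_hat near 3.0
```

The same score/information pattern carries over to the Lomax and Weibull cases the article treats; only the two functions change.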
49. Estimation of the constant-stress model with bathtub-shaped failure rates under progressive type-I interval censoring scheme
- Author
-
Mohamed Sief, Xinsheng Liu, Najwan Alsadat, and Abd El-Raheem M. Abd El-Raheem
- Subjects
Chen distribution ,Constant-stress accelerated life test ,Progressive type-I interval censoring ,Maximum likelihood estimation ,Bayesian estimation ,MCMC ,Engineering (General). Civil engineering (General) ,TA1-2040 - Abstract
This paper investigates constant-stress accelerated life tests under a progressive type-I interval censoring scheme. We propose a model based on the Chen distribution with a constant shape parameter and a log-linear relationship between the scale parameter and the stress loading. Both classical and Bayesian inferential methods are employed for the model parameters and reliability attributes. Classical estimation uses maximum likelihood and midpoint techniques; Bayesian approximations are obtained via the Metropolis–Hastings algorithm, the Tierney–Kadane procedure, and importance sampling. We also discuss interval estimation, covering both asymptotic confidence intervals and credible intervals. We conclude with a simulation study, a discussion of its results, and an analysis of real data.
- Published
- 2024
- Full Text
- View/download PDF
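A minimal random-walk Metropolis-Hastings sampler for the Chen distribution's two parameters gives a feel for the Bayesian machinery the abstract mentions. This is a deliberate simplification: complete (uncensored) data and Exp(1) priors are assumptions, whereas the paper works under progressive type-I interval censoring and a stress model.

```python
import numpy as np

# Chen pdf: f(t) = lam*beta*t^(beta-1)*exp(t^beta)*exp(lam*(1-exp(t^beta)))
def log_post(lam, beta, t):
    if lam <= 0 or beta <= 0:
        return -np.inf                     # prior support constraint
    tb = t ** beta
    loglik = np.sum(np.log(lam) + np.log(beta) + (beta - 1) * np.log(t)
                    + tb + lam * (1.0 - np.exp(tb)))
    return loglik - lam - beta             # Exp(1) priors on both parameters

rng = np.random.default_rng(4)
lam_true, beta_true = 0.8, 1.5
u = rng.uniform(size=300)
# Inverse-CDF sampling: t = (log(1 - log(1-u)/lam))^(1/beta)
t = (np.log(1.0 - np.log(1.0 - u) / lam_true)) ** (1.0 / beta_true)

chain = np.empty((4000, 2))
cur = np.array([1.0, 1.0])
cur_lp = log_post(*cur, t)
for i in range(len(chain)):
    prop = cur + rng.normal(scale=0.1, size=2)    # random-walk proposal
    prop_lp = log_post(*prop, t)
    if np.log(rng.uniform()) < prop_lp - cur_lp:  # accept/reject step
        cur, cur_lp = prop, prop_lp
    chain[i] = cur

lam_hat, beta_hat = chain[1000:].mean(axis=0)     # discard burn-in
print(lam_hat, beta_hat)  # posterior means near (0.8, 1.5)
```

Swapping the complete-data likelihood for the interval-censored one (products of CDF differences over inspection intervals) recovers the paper's setting without changing the sampler.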
50. Alpha–beta-power family of distributions with applications to exponential distribution
- Author
-
H.E. Semary, Zawar Hussain, Walaa A. Hamdi, Maha A. Aldahlan, Ibrahim Elbatal, and Vasili B.V. Nagarjuna
- Subjects
Exponential distribution ,Bathtub shape ,Moments ,Maximum likelihood estimation ,Engineering (General). Civil engineering (General) ,TA1-2040 - Abstract
This article proposes a new way to increase the flexibility of a family of statistical distributions by adding two extra parameters. The proposed family is called the alpha–beta power transformation family of distributions. A specific model, the alpha–beta power exponential distribution, is investigated in detail; its hazard rate can be decreasing, bathtub-shaped, or unimodal. Derived structural features include explicit formulations for the quantiles, moments, moment-generating function, and incomplete and conditional moments. The unknown parameters are estimated by maximum likelihood, least squares, weighted least squares, and minimum-distance methods based on the Cramér–von Mises, Anderson–Darling, and right-tail Anderson–Darling statistics. Two real data sets are examined to demonstrate the usefulness of the proposed approach.
- Published
- 2024
- Full Text
- View/download PDF
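The abstract does not give the alpha–beta power CDF explicitly, so the sketch below illustrates only the underlying one-parameter alpha-power transform (APT) applied to an exponential baseline, G(x) = (alpha^F(x) - 1)/(alpha - 1) for alpha > 0, alpha != 1, which the proposed family presumably extends with a second (beta) parameter:

```python
import numpy as np

def apt_exponential_cdf(x, alpha, rate):
    # Alpha-power transform of the exponential baseline.
    # Note: the paper's two-parameter alpha-beta extension is not
    # reproduced here; this is the classic one-parameter APT.
    F = -np.expm1(-rate * x)            # exponential baseline CDF, stable
    return (alpha ** F - 1.0) / (alpha - 1.0)

# Sanity checks of the CDF properties on a grid
x = np.linspace(0.0, 10.0, 201)
G = apt_exponential_cdf(x, alpha=3.0, rate=0.5)
print(G[0], G[-1])             # 0.0 and close to 1.0
print(bool(np.all(np.diff(G) > 0)))  # monotone increasing: True
```

Since alpha^F is monotone in F, the transform always yields a valid CDF, which is what makes this a general-purpose flexibility-adding device for any baseline distribution.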