25,688 results for "Maximum Likelihood Estimation"
Search Results
2. Parameter estimation procedures for exponential-family random graph models on count-valued networks: A comparative simulation study
- Author
- Huang, Peng and Butts, Carter T
- Subjects
- Anthropology, Sociology, Human Society, Bioengineering, Generic health relevance, Contrastive divergence, Exponential-family random graph model, Markov chain Monte Carlo, Maximum likelihood estimation, Pseudo likelihood, Valued, Weighted networks
- Published
- 2024
3. Revisiting the Briggs Ancient DNA Damage Model: A Fast Maximum Likelihood Method to Estimate Post‐Mortem Damage.
- Author
- Zhao, Lei, Henriksen, Rasmus Amund, Ramsøe, Abigail, Nielsen, Rasmus, and Korneliussen, Thorfinn Sand
- Subjects
- FOSSIL DNA, MAXIMUM likelihood statistics, DAMAGE models, DNA analysis, DNA sequencing
- Abstract
One essential initial step in the analysis of ancient DNA is to authenticate that the DNA sequencing reads are actually from ancient DNA. This is done by assessing whether the reads exhibit typical characteristics of post-mortem damage (PMD), including cytosine deamination and nicks. We present a novel statistical method, implemented in a fast multithreaded programme, ngsBriggs, that enables rapid quantification of PMD by estimating the parameters of the Briggs ancient damage model (Briggs parameters). Using a multinomial model with a maximum likelihood fit, ngsBriggs accurately estimates the parameters of the Briggs model, quantifying the PMD signal from single- and double-stranded DNA regions. We extend the original Briggs model to capture PMD signals from contemporary sequencing platforms and show that ngsBriggs accurately estimates the Briggs parameters across a variety of contamination levels. Classification of reads into ancient or modern, for the purpose of decontamination, is significantly more accurate with ngsBriggs than with other available methods. Furthermore, ngsBriggs is substantially faster than other state-of-the-art methods. It offers a practical and accurate way for researchers to authenticate ancient DNA and improve the quality of their data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
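The likelihood idea behind ngsBriggs can be illustrated with a deliberately simplified sketch (not the actual Briggs model, which is position-dependent): treat C→T mismatches as binomial, estimate the damage rate by its closed-form MLE, and classify a read by comparing log-likelihoods under assumed "ancient" and "modern" damage rates. The rates `p_ancient` and `p_modern` are made-up illustrative values, not estimates from the paper.

```python
import math

def mle_deamination_rate(ct_mismatches, c_sites):
    """Closed-form binomial MLE of the C->T deamination probability."""
    return ct_mismatches / c_sites

def loglik(p, k, n):
    """Binomial log-likelihood, up to the constant binomial coefficient."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

def classify_read(k, n, p_ancient=0.15, p_modern=0.005):
    """Label a read by comparing likelihoods under damage vs. no-damage rates."""
    return "ancient" if loglik(p_ancient, k, n) > loglik(p_modern, k, n) else "modern"

p_hat = mle_deamination_rate(30, 200)   # closed-form MLE: 30/200 = 0.15
label = classify_read(12, 60)           # heavy damage signal
```

A real implementation would model the decay of damage along the read and mix single- and double-stranded contributions, which is what the Briggs parameters capture.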
4. Detection and Estimation of Diffuse Signal Components Using the Periodogram.
- Author
- Selva, Jesus
- Abstract
One basic limitation of using the periodogram as a frequency estimator is that any of its significant peaks may result from a diffuse (or spread) frequency component rather than a pure one. Diffuse components are common in applications such as channel estimation, in which a given periodogram peak reveals the presence of a complex multipath distribution (unresolvable propagation paths or diffuse scattering, for example). We present a method to detect the presence of a diffuse component in a given peak based on analyzing the projection of the data vector onto the span of the signature's derivatives up to a given order. Fundamentally, a diffuse component is detected if the energy in the derivatives' subspace is too high at the peak's frequency, and its spread is estimated as the ratio between this last energy and the peak's energy. The method is based on exploiting the signature's Vandermonde structure through the properties of discrete Chebyshev polynomials. We also present an efficient numerical procedure for computing the data component in the derivatives' span based on barycentric interpolation. The paper contains a numerical assessment of the proposed estimator and detector. [ABSTRACT FROM AUTHOR]
- Published
- 2024
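The baseline estimator this entry builds on is the periodogram peak: a minimal sketch, using a naive DFT on a pure tone (the paper's derivative-subspace detector for diffuse components is not reproduced here).

```python
import cmath, math

def periodogram(x):
    """|DFT|^2 / N at the Fourier frequencies k/N, computed naively."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n)]

# pure tone at frequency 4/32: the peak should land exactly on bin 4
n = 32
x = [math.cos(2 * math.pi * 4 * t / n) for t in range(n)]
pgram = periodogram(x)
peak_bin = max(range(n // 2 + 1), key=lambda k: pgram[k])
```

A diffuse component would spread energy into the bins (and, in the paper's framework, the signature-derivative subspace) around `peak_bin` rather than concentrating it in one bin.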
5. An Analysis of Type-I Generalized Progressive Hybrid Censoring for the One Parameter Logistic-Geometry Lifetime Distribution with Applications.
- Author
- Nagy, Magdy, Mosilhy, Mohamed Ahmed, Mansi, Ahmed Hamdi, and Abu-Moussa, Mahmoud Hamed
- Abstract
Based on Type-I generalized progressive hybrid censored samples (GPHCSs), parameter estimation for the unit-half logistic-geometry (UHLG) distribution is investigated in this work. Using maximum likelihood estimation (MLE) and Bayesian estimation, the parameters, reliability, and hazard functions of the UHLG distribution under GPHCSs are assessed. Likewise, the computation is carried out for the asymptotic confidence intervals (ACIs), and two bootstrap CIs, bootstrap-p and bootstrap-t, are considered. Bayesian approximations are derived for symmetric loss functions, like squared error loss (SEL), and asymmetric loss functions, such as linear exponential loss (LL) and general entropy loss (GEL). The Metropolis–Hastings sampling methodology is used to construct the credible intervals (CRIs). Finally, a real data set recording the mortality of a group of male mice with reticulum cell sarcoma is analyzed as an application of the methods given. [ABSTRACT FROM AUTHOR]
- Published
- 2024
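The Metropolis–Hastings machinery used for the CRIs and SEL estimates can be sketched in a toy setting. This is a hedged stand-in: an exponential lifetime model with a gamma prior replaces the UHLG model and censoring scheme, and all tuning constants are illustrative.

```python
import math, random

random.seed(1)

def log_post(lam, data, a=1.0, b=1.0):
    """Log posterior for an exponential rate with a Gamma(a, b) prior."""
    if lam <= 0:
        return -math.inf
    return (a - 1 + len(data)) * math.log(lam) - lam * (b + sum(data))

def metropolis_hastings(data, n_iter=20000, step=0.3):
    """Random-walk Metropolis-Hastings chain for the rate parameter."""
    lam, chain = 1.0, []
    for _ in range(n_iter):
        prop = lam + random.uniform(-step, step)   # symmetric proposal
        if math.log(random.random()) < log_post(prop, data) - log_post(lam, data):
            lam = prop
        chain.append(lam)
    return chain[n_iter // 2:]                     # discard burn-in

data = [random.expovariate(2.0) for _ in range(200)]   # true rate = 2
chain = metropolis_hastings(data)
posterior_mean = sum(chain) / len(chain)   # Bayes estimate under squared error loss
```

A credible interval follows from the empirical quantiles of `chain`; the SEL point estimate is the posterior mean computed above.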
6. Bivariate Length-Biased Exponential Distribution under Progressive Type-II Censoring: Incorporating Random Removal and Applications to Industrial and Computer Science Data.
- Author
- Fayomi, Aisha, Almetwally, Ehab M., and Qura, Maha E.
- Abstract
In this paper, we address the analysis of bivariate lifetime data from a length-biased exponential distribution observed under Type II progressive censoring with random removals, where the number of units removed at each failure time follows a binomial distribution. We derive the likelihood function for the progressive Type II censoring scheme with random removals and apply it to the bivariate length-biased exponential distribution. The parameters of the proposed model are estimated using both likelihood and Bayesian methods for point and interval estimators, including asymptotic confidence intervals and bootstrap confidence intervals. We also employ different loss functions to construct Bayesian estimators. Additionally, a simulation study is conducted to compare the performance of censoring schemes. The effectiveness of the proposed methodology is demonstrated through the analysis of two real datasets from the industrial and computer science domains, providing valuable insights for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
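The binomial random-removal scheme described above is easy to simulate. A minimal sketch of the standard construction, in which R_j ~ Bin(n − m − R_1 − … − R_{j−1}, p) and the last removal withdraws every unit still on test (the lifetime model itself is omitted):

```python
import random

random.seed(7)

def binom(n, p):
    """Binomial draw as a sum of Bernoulli trials (stdlib only)."""
    return sum(random.random() < p for _ in range(n))

def binomial_removals(n, m, p):
    """Removal pattern R_1..R_m for progressive Type-II censoring with
    binomial random removals: at each of the first m-1 failures, some of the
    surplus units are withdrawn; after failure m, all remaining units leave."""
    removals, slack = [], n - m      # n - m units will never be observed to fail
    for _ in range(m - 1):
        r = binom(slack, p)
        removals.append(r)
        slack -= r
    removals.append(slack)           # R_m: whatever surplus is left
    return removals

R = binomial_removals(n=30, m=10, p=0.3)
```

By construction the pattern always satisfies sum(R) = n − m, so exactly m failures are observed.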
7. Parameter Estimation of Uncertain Differential Equations Driven by Threshold Ornstein–Uhlenbeck Process with Application to U.S. Treasury Rate Analysis.
- Author
- Li, Anshui, Wang, Jiajia, and Zhou, Lianlian
- Abstract
Uncertain differential equations, as an alternative to stochastic differential equations, have proved to be extremely powerful across various fields, especially in finance theory. Parameter estimation for uncertain differential equations is the key step in mathematical modeling and simulation, and it is very difficult, especially when the corresponding terms are driven by complicated uncertain processes. In this paper, we propose the uncertainty counterpart of the threshold Ornstein–Uhlenbeck process in probability, named the uncertain threshold Ornstein–Uhlenbeck process, filling the corresponding gap in uncertainty theory. We then explore the parameter estimation problem under different scenarios, including cases where certain parameters are known in advance while others remain unknown. Numerical examples are provided to illustrate our proposed method. We also apply the method to study the term structure of U.S. Treasury rates over a specific period, which can be modeled by the uncertain threshold Ornstein–Uhlenbeck process introduced in this paper. The paper concludes with brief remarks and possible future directions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
8. Diagnostic analytics for a GARCH model under skew-normal distributions.
- Author
- Liu, Yonghui, Wang, Jing, Yao, Zhao, Liu, Conan, and Liu, Shuangzhe
- Abstract
In this paper, a generalized autoregressive conditional heteroskedasticity model under skew-normal distributions is studied. A maximum likelihood approach is taken, and the parameters of the model are estimated with the expectation-maximization algorithm. Statistical diagnostics are carried out using the local influence technique, with the normal curvature and diagnostic results established for the model under four perturbation schemes to identify possible influential observations. A simulation study is conducted to evaluate the performance of the proposed method, and a real-world application is presented as an illustrative example. [ABSTRACT FROM AUTHOR]
- Published
- 2024
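The likelihood being maximized in such models can be sketched for the plain Gaussian GARCH(1,1) case (a simplification: the paper uses skew-normal innovations and an EM algorithm; the parameter values below are illustrative).

```python
import math

def garch11_nll(params, returns):
    """Negative log-likelihood of a Gaussian GARCH(1,1):
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return math.inf                      # reject non-stationary parameters
    sigma2 = omega / (1 - alpha - beta)      # start at the stationary variance
    nll = 0.0
    for r in returns:
        nll += 0.5 * (math.log(2 * math.pi * sigma2) + r * r / sigma2)
        sigma2 = omega + alpha * r * r + beta * sigma2
    return nll

returns = [0.01, -0.02, 0.015, -0.005, 0.03]
nll = garch11_nll((1e-4, 0.05, 0.90), returns)
```

MLE then amounts to minimizing `garch11_nll` over (omega, alpha, beta); the skew-normal version replaces the Gaussian density term inside the loop.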
9. Reliability evaluation for Weibull distribution with heavily Type II censored data.
- Author
- Liu, Mengyu, Zheng, Huiling, and Yang, Jun
- Subjects
- MAXIMUM likelihood statistics, LEAST squares, WEIBULL distribution, ESTIMATION bias, CENSORSHIP
- Abstract
The lifetime data collected from the field are usually heavily censored, in which case getting an accurate reliability evaluation is challenging. For heavily Type-II censored data, the parameter estimation bias of traditional methods (i.e., maximum likelihood estimation (MLE) and least squares estimation (LSE)) is still large, and for Bayesian methods it is hard to specify the priors in practice. Therefore, considering the known range of the shape parameter of the Weibull distribution, this study proposes two novel parameter estimation methods: the three-step MLE method and the hybrid estimation method. For the three-step MLE method, initial estimates of the shape and scale parameters are first derived using MLE and then updated by the single-parameter MLE method with the range constraint on the shape parameter. For the hybrid estimation method, the shape parameter is estimated by the LSE method with the range constraint on the shape parameter, and the scale parameter estimate is then obtained by MLE. On this basis, two numerical examples are presented to demonstrate the consistency and effectiveness of the proposed methods. Finally, a case study on turbine engines is given to verify their effectiveness and applicability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
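The hybrid scheme described above can be sketched for Type-II censored Weibull data: shape by least squares on the Weibull probability plot (clipped to an assumed plausible range), then the closed-form conditional MLE of the scale given that shape. The range bounds and data below are illustrative, not taken from the paper.

```python
import math

def hybrid_weibull_fit(failures, n, k_range=(0.5, 8.0)):
    """Hybrid fit for Type-II censored Weibull data (r failures out of n):
    1) shape k by median-rank regression, constrained to k_range,
    2) scale by the closed-form MLE given k."""
    r = len(failures)                          # failures must be sorted ascending
    # probability plot: ln(-ln(1 - F_i)) = k * ln t_i - k * ln(scale)
    xs = [math.log(t) for t in failures]
    ys = [math.log(-math.log(1 - (i + 1 - 0.3) / (n + 0.4))) for i in range(r)]
    xbar, ybar = sum(xs) / r, sum(ys) / r
    k = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    k = min(max(k, k_range[0]), k_range[1])    # shape-range constraint
    # conditional MLE of the scale: the n - r censored units survive past t_(r)
    tt = sum(t ** k for t in failures) + (n - r) * failures[-1] ** k
    scale = (tt / r) ** (1 / k)
    return k, scale

# 5 smallest failure times out of n = 20 units (heavy Type-II censoring)
failures = [0.8, 1.1, 1.3, 1.6, 1.9]
k_hat, scale_hat = hybrid_weibull_fit(failures, n=20)
```

Because 15 of 20 units survive beyond the last failure, the fitted scale necessarily exceeds the largest observed failure time.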
10. A SCALE PARAMETERS AND MODIFIED RELIABILITY ESTIMATION FOR THE INVERSE EXPONENTIAL RAYLEIGH DISTRIBUTION.
- Author
- AL-Sultany, Shurooq A. K.
- Subjects
- RAYLEIGH model, MAXIMUM likelihood statistics, SAMPLE size (Statistics)
- Abstract
This paper presents methods for estimating the scale parameter and modified reliability for the Inverse Exponential Rayleigh Distribution, including maximum likelihood, ranked set sampling, and Cramér–von Mises estimation. In all the mentioned estimation methods, the Newton–Raphson iterative numerical method is used. A simulation is then conducted to compare the three methods over six cases and different sample sizes. Comparisons between scale parameter estimates are based on Mean Square Error values, while those between estimates of the modified reliability function are based on Integrated Mean Square Error values. The results show that the Cramér–von Mises (MCV) estimator is the best of the three methods for estimating the modified reliability function. [ABSTRACT FROM AUTHOR]
- Published
- 2024
11. Inference on process capability index Spmk for a new lifetime distribution.
- Author
- Karakaya, Kadir
- Subjects
- MONTE Carlo method, PROCESS capability, MAXIMUM likelihood statistics, CONTINUOUS distributions, CONFIDENCE intervals
- Abstract
In various applied disciplines, the modeling of continuous data often requires flexible continuous distributions. Meeting this demand calls for the introduction of new continuous distributions that possess desirable characteristics. This paper introduces a new continuous distribution. Several estimators for the unknown parameters of the new distribution are discussed, and their efficiency is assessed through Monte Carlo simulations. Furthermore, the process capability index Spmk is examined when the underlying distribution is the proposed one. The maximum likelihood estimation of Spmk is also studied, and an asymptotic confidence interval is constructed for it. The simulation results indicate that the estimators of both the unknown parameters of the new distribution and Spmk provide reasonable results. Some practical analyses are also performed on both the new distribution and Spmk. The results of the conducted data analysis indicate that the new distribution yields effective outcomes in modeling lifetime data in the literature. Similarly, the data analyses performed for Spmk illustrate that the new distribution can be utilized for process capability indices by quality controllers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
12. Reliability estimation and statistical inference under joint progressively Type-II right-censored sampling for certain lifetime distributions.
- Author
- Lin, Chien-Tai, Chen, Yen-Chou, Yeh, Tzu-Chi, and Ng, Hon Keung Tony
- Abstract
In this article, the parameter estimation of several commonly used two-parameter lifetime distributions, including the Weibull, inverse Gaussian, and Birnbaum–Saunders distributions, based on joint progressively Type-II right-censored samples is studied. Different numerical methods and algorithms are used to compute the maximum likelihood estimates of the unknown model parameters: the Newton–Raphson method, the stochastic expectation–maximization (SEM) algorithm, and the dual annealing (DA) algorithm. These estimation methods are compared in terms of accuracy (e.g., bias and mean squared error), computational time and effort (e.g., the required number of iterations), the ability to attain the largest value of the likelihood, and convergence issues, by means of a Monte Carlo simulation study. Recommendations are made based on the simulated results, and a real data set is analyzed for illustrative purposes. The methods are implemented in Python, and the computer programs are available from the authors upon request. [ABSTRACT FROM AUTHOR]
- Published
- 2024
13. Zero-Inflated Binary Classification Model with Elastic Net Regularization.
- Author
- Xin, Hua, Lio, Yuhlong, Chen, Hsien-Ching, and Tsai, Tzong-Ru
- Subjects
- MACHINE learning, MAXIMUM likelihood statistics, EXPECTATION-maximization algorithms, OPEN-ended questions, DIABETES
- Abstract
Zero inflation and overfitting can reduce the accuracy of machine learning models for characterizing binary data sets. A zero-inflated Bernoulli (ZIBer) model can be the right model for zero-inflated binary data sets, but overcoming the overfitting problem when using it remains an open question. To mitigate overfitting in the ZIBer model, the minus log-likelihood function with an elastic net regularization penalty is proposed as the loss function. An estimation procedure that minimizes this loss is developed using the gradient descent method (GDM) with a momentum term in the learning rate. The proposed estimation method has two advantages. First, it is a general method that simultaneously uses L1- and L2-norm penalty terms and includes the ridge and least absolute shrinkage and selection operator methods as special cases. Second, the momentum learning rate accelerates the convergence of the GDM and enhances the computational efficiency of the estimation procedure. The parameter selection strategy is studied, and the performance of the proposed method is evaluated using Monte Carlo simulations. A diabetes example is used as an illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
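The optimization recipe (gradient descent with a momentum term on a likelihood loss plus an elastic net penalty) can be sketched for plain logistic regression; this is a simplified stand-in for the ZIBer mixture likelihood, with all hyperparameters chosen for illustration.

```python
import math, random

random.seed(3)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic_elastic_net(X, y, lam=0.1, alpha=0.5, lr=0.1,
                             momentum=0.9, n_iter=500):
    """Gradient descent with momentum on the logistic loss plus the elastic
    net penalty lam * (alpha * ||w||_1 + (1 - alpha)/2 * ||w||_2^2)."""
    p, n = len(X[0]), len(X)
    w, v = [0.0] * p, [0.0] * p            # weights and velocity buffer
    for _ in range(n_iter):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) - yi
            for j in range(p):
                grad[j] += err * xi[j] / n
        for j in range(p):
            sub = 1.0 if w[j] > 0 else (-1.0 if w[j] < 0 else 0.0)
            grad[j] += lam * (alpha * sub + (1 - alpha) * w[j])
            v[j] = momentum * v[j] - lr * grad[j]   # momentum update
            w[j] += v[j]
    return w

# toy data: the label depends on feature 0 only; the penalty shrinks w[1]
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(200)]
y = [1 if xi[0] > 0 else 0 for xi in X]
w = fit_logistic_elastic_net(X, y)
```

Setting alpha to 1 or 0 recovers the lasso and ridge special cases mentioned in the abstract.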
14. Tensile Properties of Cattail Fibres at Various Phenological Development Stages.
- Author
- Hossain, Mohammed Shahadat, Rahman, Mashiur, and Cicek, Nazim
- Subjects
- MAXIMUM likelihood statistics, CALCIUM oxalate, WEIBULL distribution, INDUSTRIAL capacity, GROWING season
- Abstract
Cattails (Typha latifolia L.) are naturally occurring aquatic macrophytes with significant industrial potential because of their abundance, high-quality fibers, and high fiber yields. This study is the first attempt to investigate how phenological development and plant maturity impact the quality of cattail fibers as they relate to composite applications. Fibers from all five growth stages exhibited a Weibull shape parameter greater than 1.0, with a goodness-of-fit exceeding 0.8; these values were obtained using both the Least Square Regression (LSR) and Maximum Likelihood Estimation (MLE) methods, with MLE providing the most conservative estimates of the Weibull parameters. Based on the Weibull parameters obtained with all estimators, cattail fibers from all five growth stages appear suitable for composite applications. The consistency of the shape parameter across the five growth stages can be attributed to the morphological and molecular development of cattail fiber during the vegetative period. These developments were confirmed through the presence of calcium oxalate (CaOx) plates, the elemental composition, and specific infrared peaks: a peak at 2360 cm−1 contributing to strength, and cellulose peaks at 1635 cm−1, 2920 cm−1, and 3430 cm−1. In conclusion, the mechanical properties of cattail fiber remain similar when the plant is harvested multiple times in a single growing season. [ABSTRACT FROM AUTHOR]
- Published
- 2024
15. Reliability analysis of two Gompertz populations under joint progressive type-ii censoring scheme based on binomial removal.
- Author
- Abo-Kasem, O.E., Almetwally, Ehab M., and Abu El Azm, Wael S.
- Subjects
- MONTE Carlo method, CENSORING (Statistics), BAYES' estimation, DISTRIBUTION (Probability theory), MAXIMUM likelihood statistics, MARKOV chain Monte Carlo
- Abstract
Analysis of joint censoring schemes has received considerable attention in the last few years. In this paper, maximum likelihood and Bayes methods of estimation are used to estimate the unknown parameters of two Gompertz populations under a joint progressive Type-II censoring scheme. Bayesian estimates of the unknown parameters are obtained under squared error loss with independent gamma priors, and the Markov chain Monte Carlo technique is applied to carry out the Bayes estimation procedure. The approximate, bootstrap, and credible confidence intervals for the unknown parameters are also obtained. The reliability and hazard rate functions of the two Gompertz populations under the joint progressive Type-II censoring scheme are derived as well, together with the corresponding approximate confidence intervals. Finally, all the theoretical results are assessed and compared using two real-world data sets and Monte Carlo simulation studies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
16. Concentration inequalities of MLE and robust MLE.
- Author
- Yang, Xiaowei, Liu, Xinqiao, and Wei, Haoyu
- Subjects
- MAXIMUM likelihood statistics, MACHINE learning, STATISTICS
- Abstract
The Maximum Likelihood Estimator (MLE) serves an important role in statistics and machine learning. In this article, for i.i.d. variables, we obtain constant-specified and sharp concentration inequalities and oracle inequalities for the MLE only under exponential moment conditions. Furthermore, in a robust setting, the sub-Gaussian type oracle inequalities of the log-truncated maximum likelihood estimator are derived under the second-moment condition. [ABSTRACT FROM AUTHOR]
- Published
- 2024
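As a worked instance of MLE concentration in the simplest possible setting (a textbook example, not the article's general result): for i.i.d. observations from $\mathcal{N}(\mu, \sigma^2)$ with $\sigma^2$ known, the MLE of $\mu$ is the sample mean, and the standard Gaussian tail bound gives a sub-Gaussian concentration inequality of exactly the type studied:

```latex
\hat{\mu}_{\mathrm{MLE}} = \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i
\sim \mathcal{N}\!\left(\mu, \tfrac{\sigma^2}{n}\right),
\qquad
\mathbb{P}\!\left(\,\lvert \hat{\mu}_{\mathrm{MLE}} - \mu \rvert \ge t\,\right)
\le 2\exp\!\left(-\frac{n t^2}{2\sigma^2}\right), \quad t \ge 0.
```

The article's contribution is bounds of this flavor for general MLEs under only exponential-moment (and, in the robust case, second-moment) conditions, with explicit constants.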
17. The exponentiated-Weibull proportional hazard regression model with application to censored survival data.
- Author
- Ishag, Mohamed A.S., Wanjoya, Anthony, Adem, Aggrey, Alsultan, Rehab, Alghamdi, Abdulaziz S., and Afify, Ahmed Z.
- Subjects
- PROPORTIONAL hazards models, MONTE Carlo method, REGRESSION analysis, CENSORING (Statistics), MAXIMUM likelihood statistics
- Abstract
The proportional hazard regression models are widely used statistical tools for analyzing survival data and estimating the effects of covariates on survival times, under the assumption that the covariate effects are constant across time. In this paper, we propose a novel extension of the proportional hazard model that uses an exponentiated-Weibull distribution for the baseline hazard function. The proposed model offers more flexibility in capturing various shapes of failure rates and accommodates both monotonic and non-monotonic hazard shapes. Its performance is evaluated and compared with other commonly used survival models, including the generalized log-logistic, Weibull, Gompertz, and exponentiated exponential PH regression models, through simulations. The results demonstrate the ability of the introduced model to capture the baseline hazard shapes and to estimate the effects of covariates on the hazard function accurately. Furthermore, two real survival medical data sets are analyzed to illustrate the practical importance of the proposed model in providing accurate predictions of survival outcomes for individual patients. The survival data analysis reveals that the model is a powerful tool for analyzing complex survival data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
18. Repair alert model when the lifetimes are discretely distributed.
- Author
- Atlehkhani, Mohammad and Doostparast, Mahdi
- Abstract
This paper deals with repair alert models, which are used for analyzing lifetime data from engineering devices under maintenance management. Repair alert models have been proposed and investigated for continuous component lifetimes; existing studies are concerned with items whose lifetimes are described by continuous distributions. However, discrete lifetimes are also frequently encountered in practice. Examples include operating a piece of equipment in cycles, field failures that are reported weekly, and the number of pages a printing device completes before failure. Here, the repair alert models are developed for devices with discrete lifetimes. A wide class of discrete distributions, called the telescopic family, is considered for the component lifetimes, and the proposed repair alert model is explained in detail. Furthermore, the problem of estimating parameters is investigated and illustrated by analyzing a real data set. [ABSTRACT FROM AUTHOR]
- Published
- 2024
19. Quasi shrinkage estimation of a block-structured covariance matrix.
- Author
- Markiewicz, A., Mokrzycka, M., and Mrowińska, M.
- Subjects
- LEAST squares, MAXIMUM likelihood statistics, COVARIANCE matrices, ORTHOGRAPHIC projection
- Abstract
In this paper, we study the estimation of a block covariance matrix with linearly structured off-diagonal blocks. We consider estimation based on the least squares method, which has some drawbacks. These estimates are not always well conditioned and may not even be definite. We propose a new estimation procedure providing a structured positive definite and well-conditioned estimator with good statistical properties. The least squares estimator is improved with the use of a shrinkage method and an additional algebraic approach. The resulting so-called quasi shrinkage estimator is compared with the structured maximum likelihood estimator. [ABSTRACT FROM AUTHOR]
- Published
- 2024
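The basic shrinkage repair for an ill-conditioned or indefinite estimate can be sketched with the simplest linear rule, shrinking toward a scaled identity (a generic illustration; the paper's quasi shrinkage estimator additionally enforces the linear block structure, which is not reproduced here).

```python
def shrink_covariance(S, rho):
    """Linear shrinkage S* = (1 - rho) * S + rho * (trace(S)/p) * I, which is
    positive definite for any rho in (0, 1] when S is PSD with positive trace."""
    p = len(S)
    mu = sum(S[i][i] for i in range(p)) / p       # average-eigenvalue target
    return [[(1 - rho) * S[i][j] + (rho * mu if i == j else 0.0)
             for j in range(p)] for i in range(p)]

# a singular (rank-1) sample covariance: shrinkage makes it invertible
S = [[1.0, 1.0], [1.0, 1.0]]
S_star = shrink_covariance(S, rho=0.2)
det = S_star[0][0] * S_star[1][1] - S_star[0][1] * S_star[1][0]
```

The original S has determinant 0; after shrinkage the determinant is strictly positive, so the estimator is well conditioned.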
20. An alternative bounded distribution: regression model and applications.
- Author
- Sağlam, Şule and Karakaya, Kadir
- Subjects
- PROCESS capability, MONTE Carlo method, LEAST squares, MAXIMUM likelihood statistics, REGRESSION analysis
- Abstract
In this paper, a new bounded distribution is introduced and some of its distributional properties are discussed. Moreover, the new distribution is applied in the field of engineering through the Cpc process capability index. The three unknown parameters of the distribution are estimated with several estimators, and the performances of the estimators are evaluated with a Monte Carlo simulation. A new regression model based on this distribution is introduced as an alternative to the beta and Kumaraswamy models. Furthermore, this is one of the first studies in which regression model parameters are estimated using least squares, weighted least squares, Cramér–von Mises, and maximum product of spacing estimators in addition to maximum likelihood. The efficiency of the estimators of the regression model parameters is further assessed through a simulation. Real data sets are analyzed to demonstrate the applicability of the new distribution and regression model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
21. A modified uncertain maximum likelihood estimation with applications in uncertain statistics.
- Author
- Liu, Yang and Liu, Baoding
- Subjects
- MAXIMUM likelihood statistics, TIME series analysis, REGRESSION analysis, STATISTICAL models, STATISTICS, DIFFERENTIAL equations
- Abstract
In uncertain statistics, the uncertain maximum likelihood estimation is a method of estimating the values of unknown parameters of an uncertain statistical model that make the observed data most likely. However, the observed data obtained in practice usually contain outliers. In order to eliminate the influence of outliers when estimating unknown parameters, this article modifies the uncertain maximum likelihood estimation. Following that, the modified uncertain maximum likelihood estimation is applied to uncertain regression analysis, uncertain time series analysis, and uncertain differential equation. Finally, some real-world examples are provided to illustrate the modified uncertain maximum likelihood estimation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
22. Estimation of the constant-stress model with bathtub-shaped failure rates under progressive type-I interval censoring scheme.
- Author
- Sief, Mohamed, Liu, Xinsheng, Alsadat, Najwan, and Abd El-Raheem, Abd El-Raheem M.
- Subjects
- MAXIMUM likelihood statistics, ACCELERATED life testing, CONFIDENCE intervals, PARAMETER estimation, MARKOV chain Monte Carlo, DATA analysis
- Abstract
This paper investigates constant-stress accelerated life tests interrupted by a progressive type-I interval censoring regime. We provide a model based on the Chen distribution with a constant shape parameter and a log-linear connection between the scale parameter and stress loading. Inferential methods, whether classical or Bayesian, are employed to address model parameters and reliability attributes. Classical methods involve the estimation of model parameters through maximum likelihood and midpoint techniques. Bayesian approximations are achieved via the utilization of the Metropolis–Hastings algorithm, Tierney-Kadane procedure, and importance sampling methods. Furthermore, we engage in a discourse on the estimation of confidence intervals, making references to both asymptotic confidence intervals and credible intervals. To conclude, we furnish a simulation study, a corresponding discussion, and supplement these with an analysis of real data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
23. Measurement and Evaluation of the Development Level of Health and Wellness Tourism from the Perspective of High-Quality Development.
- Author
- Pan, Huali, Mi, Huanhuan, Chen, Yanhua, Chen, Ziyan, and Zhou, Weizhong
- Abstract
In recent years, with the dramatic surge in demand for health and elderly care services, the emergence of the health dividend has presented good development opportunities for health and wellness tourism. However, as a sector of the economy, health and wellness tourism still faces numerous challenges in achieving high-quality development. This paper therefore focuses on 31 provinces in China and constructs a multidimensional evaluation index system for the high-quality development of health and wellness tourism. The global entropy-weighted TOPSIS method and cluster analysis are used to conduct in-depth measurements, regional comparisons, and classification evaluations of the high-quality development of health and wellness tourism in each province. The research results indicate that: (1) From a quality perspective, the level of health and wellness tourism development in 11 provinces in China has exceeded the national average, while the remaining 20 provinces are below the national average. (2) From a regional perspective, the current level of high-quality development decreases sequentially from the eastern to the central to the western regions, with significant regional differences. (3) Overall, the development of the 31 provinces can be categorized into five types: the High-Quality Benchmark Type, the High-Quality Stable Type, the High-Quality Progressive Type, the General-Quality Potential Type, and the General-Quality Lagging Type. (4) From a single-dimension perspective, there are significant differences in the rankings of the provinces across dimensions. Finally, this paper enriches and expands the theoretical foundation of the high-quality development of health and wellness tourism, and puts forward targeted countermeasures and suggestions to help promote its comprehensive enhancement. [ABSTRACT FROM AUTHOR]
- Published
- 2024
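The entropy-weighted TOPSIS procedure used above can be sketched end to end: entropy weights reward criteria whose values are more dispersed (more informative), and TOPSIS ranks alternatives by closeness to the ideal solution. The three-province, three-criterion matrix is toy data, and all criteria are assumed to be benefit-type.

```python
import math

def entropy_weights(X):
    """Entropy weights: criteria with more dispersed values get more weight."""
    n, m = len(X), len(X[0])
    d = []
    for j in range(m):
        col = [X[i][j] for i in range(n)]
        s = sum(col)
        e = -sum((v / s) * math.log(v / s) for v in col if v > 0) / math.log(n)
        d.append(1 - e)                      # divergence degree of criterion j
    total = sum(d)
    return [v / total for v in d]

def topsis(X, w):
    """Closeness of each alternative to the ideal solution (benefit criteria)."""
    n, m = len(X), len(X[0])
    norms = [math.sqrt(sum(X[i][j] ** 2 for i in range(n))) for j in range(m)]
    V = [[w[j] * X[i][j] / norms[j] for j in range(m)] for i in range(n)]
    best = [max(V[i][j] for i in range(n)) for j in range(m)]
    worst = [min(V[i][j] for i in range(n)) for j in range(m)]
    scores = []
    for i in range(n):
        dp = math.sqrt(sum((V[i][j] - best[j]) ** 2 for j in range(m)))
        dm = math.sqrt(sum((V[i][j] - worst[j]) ** 2 for j in range(m)))
        scores.append(dm / (dp + dm))
    return scores

X = [[9.0, 7.0, 8.0],   # province A (hypothetical indicator values)
     [5.0, 6.0, 4.0],   # province B
     [2.0, 6.5, 3.0]]   # province C
scores = topsis(X, entropy_weights(X))
```

The "global" variant in the paper pools all provinces and years into one matrix before normalizing, so scores stay comparable over time; the cluster analysis then groups provinces by these scores.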
24. Tobit models for count time series.
- Author
- Weiß, Christian H. and Zhu, Fukang
- Subjects
- MAXIMUM likelihood statistics, DISTRIBUTION (Probability theory), MOVING average process, TIME series analysis, PARAMETER estimation
- Abstract
Several models for count time series have been developed during the last decades, often inspired by traditional autoregressive moving average (ARMA) models for real‐valued time series, including integer‐valued ARMA (INARMA) and integer‐valued generalized autoregressive conditional heteroscedasticity (INGARCH) models. Both INARMA and INGARCH models exhibit an ARMA‐like autocorrelation function (ACF). To achieve negative ACF values within the class of INGARCH models, log and softplus link functions are suggested in the literature, where the softplus approach leads to conditional linearity in good approximation. However, the softplus approach is limited to the INGARCH family for unbounded counts, that is, it can neither be used for bounded counts, nor for count processes from the INARMA family. In this paper, we present an alternative solution, named the Tobit approach, for achieving approximate linearity together with negative ACF values, which is more generally applicable than the softplus approach. A Skellam–Tobit INGARCH model for unbounded counts is studied in detail, including stationarity, approximate computation of moments, maximum likelihood and censored least absolute deviations estimation for unknown parameters and corresponding simulations. Extensions of the Tobit approach to other situations are also discussed, including underlying discrete distributions, INAR models, and bounded counts. Three real‐data examples are considered to illustrate the usefulness of the new approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
25. Properties, estimation, and applications of the extended log-logistic distribution.
- Author
-
Kariuki, Veronica, Wanjoya, Anthony, Ngesa, Oscar, Alharthi, Amirah Saeed, Aljohani, Hassan M., and Afify, Ahmed Z.
- Subjects
- *
ESTIMATION theory , *MAXIMUM likelihood statistics , *ORDER statistics , *DATA modeling , *SIMPLICITY - Abstract
This paper presents the exponentiated alpha-power log-logistic (EAPLL) distribution, which extends the log-logistic distribution. The EAPLL distribution emphasizes its suitability for survival data modeling by providing analytical simplicity and accommodating both monotone and non-monotone failure rates. We derive some of its mathematical properties and test eight estimation methods using an extensive simulation study. To determine the best estimation approach, we rank mean estimates, mean square errors, and average absolute biases on a partial and overall ranking. Furthermore, we use the EAPLL distribution to examine three real-life survival data sets, demonstrating its superior performance over competing log-logistic distributions. This study adds vital insights to survival analysis methodology and provides a solid framework for modeling various survival data scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
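The EAPLL distribution itself is not available in standard libraries; as a hedged illustration of the maximum likelihood workflow the abstract describes, one can fit its log-logistic parent, which SciPy exposes as `fisk`:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulate from a log-logistic (Fisk) law with known shape and scale,
# then recover the parameters by maximum likelihood.
shape_true, scale_true = 2.5, 1.8
data = stats.fisk.rvs(c=shape_true, scale=scale_true, size=4000,
                      random_state=rng)

# Fix the location at zero so only shape and scale are estimated.
shape_hat, loc_hat, scale_hat = stats.fisk.fit(data, floc=0)
```

The same simulate-estimate-compare loop, repeated over many replications and several estimation methods, is the backbone of the ranking study the abstract reports.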
26. Different estimation techniques and data analysis for constant-partially accelerated life tests for power half-logistic distribution.
- Author
-
Alomani, Ghadah A., Hassan, Amal S., Al-Omari, Amer I., and Almetwally, Ehab M.
- Subjects
- *
ACCELERATED life testing , *CONFIDENCE intervals , *DATA analysis , *LEAST squares , *MAXIMUM likelihood statistics - Abstract
Partial accelerated life tests (PALTs) are employed when the results of accelerated life testing cannot be extended to usage circumstances. This work discusses the challenge of different estimating strategies in constant PALT with complete data. The lifetime distribution of the test item is assumed to follow the power half-logistic distribution. Several classical and Bayesian estimation techniques are presented to estimate the distribution parameters and the acceleration factor of the power half-logistic distribution. These techniques include Anderson-Darling, maximum likelihood, Cramér-von Mises, ordinary least squares, weighted least squares, maximum product of spacing and Bayesian. Additionally, the Bayesian credible intervals and approximate confidence intervals are constructed. A simulation study is provided to compare the outcomes of the various estimation methods based on mean squared error, average absolute bias, length of intervals, and coverage probabilities. This study shows that the maximum product of spacing estimation is the most effective strategy among the options in most circumstances, attaining the minimum values of MSE and average bias. In the majority of situations, the Bayesian method outperforms the other methods when taking into account both MSE and average bias values. When comparing approximate confidence intervals to Bayesian credible intervals, the latter have a higher coverage probability and smaller average length. Two authentic data sets are examined for illustrative purposes; examining them shows that the proposed methods are workable and applicable to certain engineering-related problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
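The maximum product of spacings (MPS) criterion highlighted in this abstract maximizes the product of CDF increments between consecutive order statistics. A minimal sketch, using the exponential distribution as a stand-in for the power half-logistic model:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
data = np.sort(rng.exponential(scale=2.0, size=1000))  # true rate = 0.5

def neg_log_spacing(rate):
    """Negative mean log-spacing for an Exp(rate) model: the MPS
    criterion maximizes the product of CDF increments between
    consecutive order statistics, with 0 and 1 as end points."""
    cdf = 1.0 - np.exp(-rate * data)
    spacings = np.diff(np.concatenate(([0.0], cdf, [1.0])))
    spacings = np.clip(spacings, 1e-300, None)  # guard against ties
    return -np.mean(np.log(spacings))

res = minimize_scalar(neg_log_spacing, bounds=(1e-3, 10.0), method="bounded")
rate_mps = res.x
```

Swapping in a different CDF changes only the `cdf` line, which is why MPS is easy to apply across the candidate models compared in the abstract.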
27. A TWO-PARAMETER ARADHANA DISTRIBUTION WITH APPLICATIONS TO RELIABILITY ENGINEERING.
- Author
-
Shanker, Ravi, Soni, Nitesh Kumar, Shanker, Rama, Ray, Mousumi, and Prodhani, Hosenur Rahman
- Subjects
- *
DISTRIBUTION (Probability theory) , *RELIABILITY in engineering - Abstract
The search for a statistical distribution for modelling reliability data from reliability engineering is challenging, the main causes being the stochastic nature of the data and the presence of skewness, kurtosis and over-dispersion. During recent decades several one- and two-parameter statistical distributions have been proposed in the statistics literature, but these distributions were unable to capture the nature of such data. In the present paper, a two-parameter Aradhana distribution, which includes the one-parameter Aradhana distribution as a particular case, has been proposed using the convex combination approach of deriving a new statistical distribution. Various interesting and useful statistical properties including the survival function, hazard function, reverse hazard function, mean residual life function, stochastic ordering, deviations from the mean and median, stress-strength reliability, and the Bonferroni and Lorenz curves and their indices have been discussed. The raw moments, central moments and descriptive measures based on moments of the proposed distribution have been obtained. The estimation of parameters using the maximum likelihood method has been explained. A simulation study has been presented to assess the consistency of the maximum likelihood estimators as the sample size increases. The goodness of fit of the proposed distribution has been tested using the Akaike Information Criterion and the Kolmogorov-Smirnov statistic.
Finally, two examples of real lifetime datasets from reliability engineering have been presented to demonstrate its applications and goodness of fit; the proposed distribution shows a better fit than the two-parameter generalized Aradhana distribution, quasi Aradhana distribution, new quasi Aradhana distribution, power Aradhana distribution, weighted Aradhana distribution, gamma distribution and Weibull distribution. The flexibility, tractability and usefulness of the proposed distribution show that it is very useful for modelling reliability data from reliability engineering. As this is a new distribution with wide applications, it should draw the attention of researchers in reliability engineering and biomedical sciences to many more applications in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2024
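The convex combination construction mentioned in the abstract can be made concrete for the one-parameter Aradhana distribution, which is a mixture of exponential(θ), gamma(2, θ) and gamma(3, θ) components with weights proportional to θ², 2θ and 2. A short check that the mixture density integrates to one (θ = 1.5 is an arbitrary choice):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def aradhana_pdf(x, theta):
    """One-parameter Aradhana density written as a convex combination of
    exponential(theta), gamma(2, theta) and gamma(3, theta) components;
    this equals theta^3 (1+x)^2 exp(-theta x) / (theta^2 + 2 theta + 2)."""
    denom = theta**2 + 2*theta + 2
    w = np.array([theta**2, 2*theta, 2.0]) / denom   # mixing weights
    comps = [stats.gamma.pdf(x, a=k, scale=1/theta) for k in (1, 2, 3)]
    return sum(wk * ck for wk, ck in zip(w, comps))

total, _ = quad(aradhana_pdf, 0, np.inf, args=(1.5,))
```

Extending the construction with a second parameter (e.g. in the mixing weights) is the kind of generalization the abstract's two-parameter version performs.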
28. A MODIFIED AILAMUJIA DISTRIBUTION: PROPERTIES AND APPLICATION.
- Author
-
John, DAVID Ikwuoche, Nkiru, OKEKE Evelyn, and Lilian, FRANKLIN
- Subjects
- *
DISTRIBUTION (Probability theory) , *GENERATING functions - Abstract
This study introduces a modified one-parameter Ailamujia distribution, called the Entropy Transformed Ailamujia distribution (ETAD), to handle both symmetric and asymmetric lifetime data sets. ETAD properties such as order and reliability statistics, entropy, the moment and moment generating functions, the quantile function, and variability measures were derived. The maximum likelihood estimation (MLE) method was used to estimate the parameter of the ETAD, and through simulation at different sample sizes, the MLE was found to be consistent, efficient, and unbiased. The flexibility of the ETAD was shown by fitting it to six different real lifetime data sets and comparing it with seven competing one-parameter distributions. The goodness-of-fit (GOF) results from the Akaike information criterion, Bayesian information criterion, corrected Akaike information criterion, and Hannan-Quinn information criterion show that the ETAD was the best fit among all seven competing distributions across all six data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
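The four goodness-of-fit criteria used in this abstract (AIC, BIC, corrected AIC, and Hannan-Quinn) are all simple functions of the maximized log-likelihood; a small helper, with illustrative input values:

```python
import numpy as np

def info_criteria(loglik, k, n):
    """AIC, BIC, corrected AIC, and Hannan-Quinn information criteria
    from a maximized log-likelihood `loglik`, number of estimated
    parameters `k`, and sample size `n` (lower is better)."""
    aic = -2 * loglik + 2 * k
    bic = -2 * loglik + k * np.log(n)
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    hqic = -2 * loglik + 2 * k * np.log(np.log(n))
    return {"AIC": aic, "BIC": bic, "AICc": aicc, "HQIC": hqic}

# Illustrative values: a one-parameter fit to 50 observations.
ic = info_criteria(loglik=-120.0, k=1, n=50)
```

Computing these for each candidate model and picking the minimum is exactly the comparison the abstract runs across its seven competing distributions.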
29. A NOVEL HYBRID DISTRIBUTED INNOVATION EGARCH MODEL FOR INVESTIGATING THE VOLATILITY OF THE STOCK MARKET.
- Author
-
M. T., MUBARAK, O. D., ADUBISI, and U. F., ABBAS
- Subjects
- *
DISTRIBUTION (Probability theory) , *GARCH model , *MARKET volatility - Abstract
When calculating risk and making decisions, investors and financial institutions heavily rely on the modeling of asset return volatility. For the exponential generalized autoregressive conditional heteroscedasticity (EGARCH) model, we created a new innovation distribution in this study called the type-II-Topp-Leone-exponentiated-Gumbel (TIITLEGU) distribution. The key mathematical characteristics of the distribution were determined, and Monte Carlo experiments were used to estimate the parameters of the novel distribution using the maximum likelihood estimation (MLE) procedure. The performance of the EGARCH(1,1) model with TIITLEGU-distributed innovation density relative to other innovation densities in terms of volatility modeling is examined through applications to two Nigerian stock return series. The results of the diagnostic tests indicated that, with the exception of the EGARCH(1,1)-Johnson (SU) reparametrized (JSU) innovation density, the fitted models are sufficiently specified. The parameters of the EGARCH(1,1) model with different innovation densities are significant at various levels. Furthermore, in out-of-sample prediction, the fitted EGARCH(1,1)-TIITLEGU innovation density performed better than the EGARCH(1,1) model with existing innovation densities. As a result, it is concluded that the EGARCH-TIITLEGU model is the most effective for analyzing Nigerian stock market volatility. [ABSTRACT FROM AUTHOR]
- Published
- 2024
30. INVERTED DAGUM DISTRIBUTION: PROPERTIES AND APPLICATION TO LIFETIME DATASET.
- Author
-
OSI, ABDULHAMEED A., SABO, SHAMSUDDEEN A., and MUSA, IBRAHIM Z.
- Subjects
- *
DISTRIBUTION (Probability theory) , *HAZARD function (Statistics) - Abstract
This article presents the introduction of a novel univariate probability distribution termed the inverted Dagum distribution. Extensive analysis of the statistical properties of this distribution, including the hazard function, survival function, Renyi's entropy, quantile function, and the distribution of the order statistics, was conducted. Parameter estimation of the model was performed utilizing the maximum likelihood method, with the consistency of the estimates validated through Monte Carlo simulation. Furthermore, the applicability of the proposed distribution was demonstrated through the analysis of two real datasets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
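The Monte Carlo consistency check described in this abstract can be sketched generically: simulate repeatedly, compute the MLE, and verify that its error shrinks as the sample size grows. The exponential rate MLE stands in here for the inverted Dagum estimators, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(123)
true_rate = 1.3

def mc_rmse(n, reps=500):
    """Root mean squared error of the exponential-rate MLE (1 / sample
    mean) over `reps` Monte Carlo replications of sample size `n`."""
    samples = rng.exponential(scale=1 / true_rate, size=(reps, n))
    mle = 1.0 / samples.mean(axis=1)
    return np.sqrt(np.mean((mle - true_rate) ** 2))

# Consistency shows up as RMSE decreasing with the sample size.
rmse_small, rmse_large = mc_rmse(20), mc_rmse(500)
```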
31. ANALYSIS OF TWO NON-IDENTICAL UNIT SYSTEM HAVING SAFE AND UNSAFE FAILURES WITH REBOOTING AND PARAMETRIC ESTIMATION IN CLASSICAL AND BAYESIAN PARADIGMS.
- Author
-
SHARMA, POONAM and KUMAR, PAWAN
- Subjects
- *
PARAMETER estimation , *BAYESIAN analysis - Abstract
The present paper studies a model of two non-identical units having safe and unsafe failures and rebooting. The focus centers on the analysis with respect to important reliability measures and the estimation of parameters in the Classical and Bayesian paradigms. At first one of the units is operational whereas the other is confined to standby mode. Any unit may suffer a safe or unsafe failure. A safe failure is immediately taken up for remedial action by a repairman available with the system at all times, while an unsafe failure cannot be dealt with directly: rebooting is first performed to convert the unsafe failure to safe failure mode so that repair can start normally. A switching device is used to make the repaired and standby units operational. The lifetimes of both units and the switching device are taken to be exponentially distributed random variables, whereas the distributions of repair times are assumed to be general. The regenerative point technique is employed to derive the associated measures of effectiveness. To make the study more elaborate and visually attractive, some of the derived characteristics have been studied graphically too. A simulation study has also been undertaken to exhibit the behaviour of the obtained characteristics in the Classical and Bayesian setups. Valuable inferences about MLE and Bayes estimates have been drawn from the tables and graphs for varying values of the failure and repair parameters. [ABSTRACT FROM AUTHOR]
- Published
- 2024
32. A PROBABILITY MODEL FOR SURVIVAL ANALYSIS OF CANCER PATIENTS.
- Author
-
Ray, Mousumi and Shanker, Rama
- Subjects
- *
CANCER patients , *PROBABILITY theory , *SURVIVAL analysis (Biometry) - Abstract
It has been observed by statisticians that finding a suitable model for the survival analysis of cancer patients is really challenging, the main reason being the highly positively skewed nature of the datasets. During recent decades statisticians have proposed one-parameter to five-parameter probability models, but from either a theoretical or an applied point of view the goodness of fit provided by these distributions is not very satisfactory. In this paper a compound probability model called the gamma-Sujatha distribution, a compound of the gamma and Sujatha distributions, is proposed for modeling the survival times of cancer patients. Many important properties of the suggested distribution, including its shape, (negative) moments, hazard function, reversed hazard function and quantile function, have been discussed. The method of maximum likelihood has been used to estimate its parameters. A simulation study has been conducted to assess the consistency of the maximum likelihood estimators. Two real datasets, one relating to acute bone cancer and the other to head and neck cancer, have been considered to examine the applicability, suitability and flexibility of the proposed distribution, whose goodness of fit is quite satisfactory compared with the other distributions considered. [ABSTRACT FROM AUTHOR]
- Published
- 2024
33. Statistical inference on multicomponent stress–strength reliability with non-identical component strengths using progressively censored data from Kumaraswamy distribution.
- Author
-
Saini, Shubham, Patel, Jyoti, and Garg, Renu
- Subjects
- *
MARKOV chain Monte Carlo , *MONTE Carlo method , *MINIMUM variance estimation , *CENSORING (Statistics) , *MAXIMUM likelihood statistics - Abstract
In this article, we draw inferences on stress–strength reliability in a multicomponent system with non-identical strength components based on the progressively censored data from the Kumaraswamy distribution (KuD). When one shape parameter of KuD is known, the uniformly minimum variance unbiased estimator is produced. To evaluate the reliability of such systems when all the parameters are unknown, the maximum likelihood and Bayes estimators are developed. Along with coverage probabilities, the asymptotic confidence and highest posterior credible (HPD) intervals are also obtained. Tierney–Kadane's approximation and Markov chain Monte Carlo methods are used for Bayesian computations. To compare the performance of estimators, a Monte Carlo simulation study is performed. Also, one real-life example is analyzed for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
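Stress-strength reliability R = P(X > Y) for Kumaraswamy variates can be approximated by simulation, using the closed-form quantile function of the Kumaraswamy distribution. A sketch with arbitrary shape parameters; identical stress and strength laws give R ≈ 0.5 as a sanity check:

```python
import numpy as np

rng = np.random.default_rng(2024)

def kumaraswamy_rvs(a, b, size):
    """Draw Kumaraswamy(a, b) variates by inverting the CDF
    F(x) = 1 - (1 - x^a)^b on (0, 1)."""
    u = rng.uniform(size=size)
    return (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)

def reliability_mc(a_x, b_x, a_y, b_y, n=200_000):
    """Monte Carlo estimate of R = P(strength X > stress Y)."""
    x = kumaraswamy_rvs(a_x, b_x, n)
    y = kumaraswamy_rvs(a_y, b_y, n)
    return np.mean(x > y)

# Identical strength and stress distributions: R should be about 0.5.
r_equal = reliability_mc(2.0, 3.0, 2.0, 3.0)
```

Such a simulation is a useful cross-check on the maximum likelihood and Bayes estimators of R developed in the article.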
34. Unit-Power Half-Normal Distribution Including Quantile Regression with Applications to Medical Data.
- Author
-
Santoro, Karol I., Gómez, Yolanda M., Soto, Darlin, and Barranco-Chamorro, Inmaculada
- Subjects
- *
MAXIMUM likelihood statistics , *REGRESSION analysis , *DATA analysis , *DATA modeling , *QUANTILE regression - Abstract
In this paper, we present the unit-power half-normal distribution, derived from the power half-normal distribution, for data analysis in the open unit interval. The statistical properties of the unit-power half-normal model are described in detail. Simulation studies are carried out to evaluate the performance of the parameter estimators. Additionally, we implement quantile regression for this model, which is applied to two real healthcare data sets. Our findings suggest that the unit-power half-normal distribution provides a robust and flexible alternative to existing models for proportion data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Estimation of the Reliability Function of the Generalized Rayleigh Distribution under Progressive First-Failure Censoring Model.
- Author
-
Gong, Qin, Chen, Rui, Ren, Haiping, and Zhang, Fan
- Subjects
- *
MONTE Carlo method , *MAXIMUM likelihood statistics , *RAYLEIGH model , *HAZARD function (Statistics) , *INFERENTIAL statistics - Abstract
This study investigates the statistical inference of the parameters, reliability function, and hazard function of the generalized Rayleigh distribution under progressive first-failure censoring samples, considering factors such as long product lifetime and challenging experimental conditions. Firstly, the progressive first-failure model is introduced, and the maximum likelihood estimation for the parameters, reliability function, and hazard function under this model are discussed. For interval estimation, confidence intervals have been constructed for the parameters, reliability function, and hazard function using the bootstrap method. Next, in Bayesian estimation, considering informative priors and non-information priors, the Bayesian estimation of the parameters, reliability function, and hazard function under symmetric and asymmetric loss functions is obtained using the MCMC method. Finally, Monte Carlo simulation is conducted to compare mean square errors, evaluating the superiority of the maximum likelihood estimation and Bayesian estimation under different loss functions. The performance of the estimation methods used in the study is illustrated through illustrative examples. The results indicate that Bayesian estimation outperforms maximum likelihood estimation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
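The bootstrap confidence intervals for a reliability function mentioned in this abstract can be sketched with a percentile bootstrap; the exponential plug-in estimator R(t) = exp(-t/mean) stands in for the generalized Rayleigh model, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)
data = rng.exponential(scale=3.0, size=80)   # observed lifetimes
t0 = 2.0                                     # time at which R(t) is wanted

def reliability_hat(sample, t):
    """Exponential plug-in reliability estimate R(t) = exp(-t / mean)."""
    return np.exp(-t / sample.mean())

# Percentile bootstrap: resample with replacement, re-estimate R(t0),
# and take the empirical 2.5% and 97.5% quantiles as the interval.
boot = np.array([
    reliability_hat(rng.choice(data, size=data.size, replace=True), t0)
    for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
point = reliability_hat(data, t0)
```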
36. Algorithm for Option Number Selection in Stochastic Paired Comparison Models.
- Author
-
Gyarmati, László, Mihálykó, Csaba, and Orbán-Mihálykó, Éva
- Subjects
- *
MAXIMUM likelihood statistics , *STOCHASTIC models , *DECISION making , *COMPUTER simulation , *ALGORITHMS - Abstract
In this paper, paired comparison models with a stochastic background are investigated and compared from the perspective of the option numbers allowed. As two-option and three-option models are the ones most frequently used, we mainly focus on the relationships between two-option and four-option models and three-option and five-option models, and then we turn to the general s- and (s + 2)-option models. We compare them from both theoretical and practical perspectives; the latter are based on computer simulations. We examine when it is possible, mandatory, or advisable to convert four-, five-, and (s + 2)-option models into two-, three-, and s-option models, respectively. The problem also exists in reverse: when is it advisable to use four-, five-, and (s + 2)-option models instead of two-, three-, and s-option models? As a result of these investigations, we set up an algorithm to perform the decision process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. On a Randomly Censoring Scheme for Generalized Logistic Distribution with Applications.
- Author
-
Hasaballah, Mustafa M., Balogun, Oluwafemi Samson, and Bakr, Mahmoud E.
- Subjects
- *
MARKOV chain Monte Carlo , *FISHER information , *CENSORING (Statistics) , *MAXIMUM likelihood statistics , *CONFIDENCE intervals , *BAYES' estimation - Abstract
In this paper, we investigate the inferential procedures within both classical and Bayesian frameworks for the generalized logistic distribution under a random censoring model. For randomly censored data, our main goals were to develop maximum likelihood estimators and construct confidence intervals using the Fisher information matrix for the unknown parameters. Additionally, we developed Bayes estimators with gamma priors, addressing both squared error and general entropy loss functions. We also calculated Bayesian credible intervals for the parameters. These methods were applied to two real datasets with random censoring to provide valuable insights. Finally, we conducted a simulation analysis to assess the effectiveness of the estimated values. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
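Under random censoring, the likelihood contribution is f(t) for observed failures and S(t) for censored times. For the exponential stand-in below (not the paper's generalized logistic model), maximizing this likelihood gives the closed-form MLE "observed failures over total time on test":

```python
import numpy as np

rng = np.random.default_rng(5)
n, rate = 2000, 0.8

# Random censoring: observe t = min(lifetime, censoring time) together
# with the indicator delta = 1{lifetime <= censoring time}.
life = rng.exponential(scale=1 / rate, size=n)
censor = rng.exponential(scale=2.0, size=n)
t = np.minimum(life, censor)
delta = (life <= censor).astype(int)

# Maximizing sum(delta*log f(t) + (1-delta)*log S(t)) for the
# exponential model yields the closed-form estimator below.
rate_hat = delta.sum() / t.sum()
```

For models without a closed form, the same censored log-likelihood is handed to a numerical optimizer, which is the route the abstract's generalized logistic analysis takes.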
38. A Novel Discrete Linear-Exponential Distribution for Modeling Physical and Medical Data.
- Author
-
Al-Harbi, Khlood, Fayomi, Aisha, Baaqeel, Hanan, and Alsuraihi, Amany
- Subjects
- *
MEAN square algorithms , *CHARACTERISTIC functions , *ERROR functions , *PHYSICAL distribution of goods , *EXPONENTIAL functions , *BAYES' estimation - Abstract
Count data are of particular significance in many real-life fields. In this article, a new form of the one-parameter discrete linear-exponential distribution is derived based on the survival function as a discretization technique. An extensive study of this distribution is conducted under its new form, including characteristic functions and statistical properties. It is shown that this distribution is appropriate for modeling over-dispersed count data. Moreover, its probability mass function is right-skewed with different shapes. The unknown model parameter is estimated using the maximum likelihood method, with more attention given to Bayesian estimation methods. The Bayesian estimator is computed based on three different loss functions: a squared error loss function, a linear exponential loss function, and a generalized entropy loss function. A simulation study is implemented to examine the distribution's behavior and compare the classical and Bayesian estimation methods; it indicated that the Bayesian method under the generalized entropy loss function with positive weight is the best for all sample sizes, with the minimum mean squared errors. Finally, the discrete linear-exponential distribution proves its efficiency in fitting real-life discrete physical and medical lifetime count data against other related distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
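The survival-function discretization technique used in this abstract assigns P(X = k) = S(k) - S(k + 1). A minimal sketch; applied to the exponential survival function (rather than the paper's linear-exponential one) it recovers the geometric distribution:

```python
import numpy as np

def discretize_by_survival(surv, k_max):
    """Discrete analogue of a continuous lifetime law via its survival
    function: P(X = k) = S(k) - S(k + 1) for k = 0, 1, ..., k_max."""
    k = np.arange(k_max + 1)
    return surv(k) - surv(k + 1)

# Discretizing S(x) = exp(-rate * x) yields a geometric distribution
# with success probability 1 - exp(-rate).
rate = 0.7
pmf = discretize_by_survival(lambda x: np.exp(-rate * x), k_max=60)
```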
39. Parameter estimation of Chen distribution under improved adaptive type-II progressive censoring.
- Author
-
Zhang, Li and Yan, Rongfang
- Subjects
- *
MONTE Carlo method , *MAXIMUM likelihood statistics , *ASYMPTOTIC normality , *HAZARD function (Statistics) , *PARAMETER estimation , *BAYES' estimation - Abstract
This study focuses on the estimation of the two unknown parameters of Chen distribution, characterized by a bathtub-shaped hazard rate function, within an improved adaptive type-II progressive censored data framework. Maximum likelihood estimation is proposed for the two parameters, and the establishment of approximate confidence intervals is based on asymptotic normality. Bayesian estimation is also conducted under both symmetric and asymmetric loss functions, utilizing the proposed importance sampling and Metropolis–Hastings algorithm. Lastly, the performance of various estimation methods is evaluated through Monte Carlo simulation experiments, and the proposed estimation approach is illustrated using a real dataset. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. Fitting Cross-Lagged Panel Models with the Residual Structural Equations Approach.
- Author
-
Tseng, Ming-Chi
- Subjects
- *
MAXIMUM likelihood statistics , *STRUCTURAL equation modeling , *MOVING average process , *RESEARCH personnel , *DYNAMIC models - Abstract
This study simplifies the seven different cross-lagged panel models (CLPMs) by using the RSEM model for both inter-individual and intra-individual structures. In addition, the study incorporates the newly developed dynamic panel model (DPM), general cross-lagged model (GCLM) and the random intercept auto-regressive moving average (RI-ARMA) model. Then, using a longitudinal study of self-esteem and depression, ten different CLPMs are analyzed using robust maximum likelihood estimation. In addition, the Mplus syntax is provided as a reference for researchers. This study aims to enhance empirical researchers' understanding and exploration of different CLPMs by providing simplified explanations and analyses of the ten different CLPMs, thereby promoting the development of empirical constructs and topics in the context of various cross-lagged panel models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Assessing COVID-19 Prevalence in Austria with Infection Surveys and Case Count Data as Auxiliary Information.
- Author
-
Guerrier, Stéphane, Kuzmics, Christoph, and Victoria-Feser, Maria-Pia
- Subjects
- *
COVID-19 pandemic , *MAXIMUM likelihood statistics , *COMMUNICABLE diseases , *MEASUREMENT errors , *MOMENTS method (Statistics) - Abstract
Countries officially record the number of COVID-19 cases based on medical tests of a subset of the population. These case count data obviously suffer from participation bias, and for prevalence estimation, these data are typically discarded in favor of infection surveys, or possibly also completed with auxiliary information. One exception is the series of infection surveys recorded by the Statistics Austria Federal Institute to study the prevalence of COVID-19 in Austria in April, May, and November 2020. In these infection surveys, participants were additionally asked if they were simultaneously recorded as COVID-19 positive in the case count data. In this article, we analyze the benefits of properly combining the outcomes from the infection survey with the case count data, to analyze the prevalence of COVID-19 in Austria in 2020, from which the case ascertainment rate can be deduced. The results show that our approach leads to a significant efficiency gain. Indeed, considerably smaller infection survey samples suffice to obtain the same level of estimation accuracy. Our estimation method can also handle measurement errors due to the sensitivity and specificity of medical testing devices and to the nonrandom sample weighting scheme of the infection survey. The proposed estimators and associated confidence intervals are implemented in the companion open source R package pempi available on the Comprehensive R Archive Network (CRAN). Supplementary materials for this article are available online, including a standardized description of the materials available for reproducing the work. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
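Correcting an apparent prevalence for test sensitivity and specificity, one ingredient of the estimation problem this abstract describes, is classically done with the Rogan-Gladen formula; the numbers below are illustrative, and this is not the authors' survey-plus-case-count estimator:

```python
def rogan_gladen(p_obs, sensitivity, specificity):
    """Classical correction of an apparent prevalence for imperfect test
    sensitivity and specificity (Rogan-Gladen estimator); the result is
    clipped to [0, 1] since sampling noise can push it outside."""
    pi = (p_obs + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(pi, 0.0), 1.0)

# Apparent prevalence of 3% under a test with 85% sensitivity and
# 99% specificity (hypothetical values).
corrected = rogan_gladen(0.03, 0.85, 0.99)
```

With a perfect test (sensitivity = specificity = 1) the formula returns the apparent prevalence unchanged, which is a handy sanity check.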
42. A new quantile regression model with application to human development index.
- Author
-
Cordeiro, Gauss M., Rodrigues, Gabriela M., Prataviera, Fábio, and Ortega, Edwin M. M.
- Subjects
- *
MONTE Carlo method , *HUMAN Development Index , *MAXIMUM likelihood statistics , *REGRESSION analysis , *CITIES & towns , *QUANTILE regression - Abstract
A new odd log-logistic unit omega distribution is defined and studied, and some of its structural properties are obtained. A quantile regression model based on the new re-parameterized distribution is constructed, and the estimation is conducted by the maximum likelihood method. Monte Carlo simulations are used to assess the accuracy of the estimators. The flexibility, practical relevance and applicability of the proposed regression are proved by means of Human Development Index data from the cities of the state of São Paulo (Brazil). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. Second term improvement to generalized linear mixed model asymptotics.
- Author
-
Maestrini, Luca, Bhaskaran, Aishwarya, and Wand, Matt P
- Subjects
- *
STATISTICAL accuracy , *MAXIMUM likelihood statistics , *INFERENTIAL statistics , *SAMPLE size (Statistics) , *DATA analysis - Abstract
A recent article by Jiang et al. (2022) on generalized linear mixed model asymptotics derived the rates of convergence for the asymptotic variances of maximum likelihood estimators. If m denotes the number of groups and n is the average within-group sample size, then the asymptotic variances have orders m^(-1) and (mn)^(-1), depending on the parameter. We extend this theory to provide explicit forms of the (mn)^(-1) second terms of the asymptotically harder-to-estimate parameters. Improved accuracy of statistical inference and planning are consequences of our theory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. Some asymptotic inferential aspects of the Kumaraswamy distribution.
- Author
-
Carneiro, Hérica P. A., Sandoval, Mônica C., Botter, Denise A., and Magalhães, Tiago M.
- Subjects
- *
MONTE Carlo method , *LIKELIHOOD ratio tests , *MAXIMUM likelihood statistics , *CORRECTION factors , *SAMPLE size (Statistics) - Abstract
The Kumaraswamy distribution is doubly limited, continuous, very flexible, and is widely applied in hydrology and related areas. Recently, several families of distributions based on this distribution have emerged. To make a contribution regarding some asymptotic aspects related to the inferential analysis, we derived an analytic expression of order n^(-1/2), where n is the sample size, for the skewness coefficient of the distribution of the maximum likelihood estimators of the parameters of the Kumaraswamy distribution. A simulation study and an application are presented to illustrate that, when the sample size is small, the likelihood inferences may not be reliable. We also obtain Bartlett correction factors for the likelihood ratio statistic as well as the results of the bootstrap likelihood ratio test and bootstrap Bartlett correction and present a Monte Carlo simulation study to compare the rejection rates of the tests in question. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. A Novel Family of Distribution with Application in Engineering Problems: A Simulation Study.
- Author
-
Modi, Kanak and Singh, Yudhveer
- Subjects
- *
PHYSICAL distribution of goods , *UNCERTAINTY (Information theory) , *MONTE Carlo method , *ENGINEERING simulations , *DISTRIBUTION (Probability theory) , *MAXIMUM likelihood statistics , *ORDER statistics - Abstract
We establish a novel family of Kumaraswamy-X probability distributions in the present investigation and discuss the Kumaraswamy-Exponential univariate probability distribution. The new three-parameter distribution possesses a density function that can be unimodal or reverse-J shaped and a bathtub-shaped hazard rate function. We study its various statistical properties and derive expressions for its density function, distribution function, survival and hazard rate functions, probability weighted moments, lth moment, moment generating function, quantile function and Shannon entropy. Order statistics for the derived distribution are also discussed. The parameters are estimated using the maximum likelihood estimation approach, and the performance of the estimators is evaluated using Monte Carlo simulation. Through extensive Monte Carlo simulations and comparative analyses, we assess the performance of the Kumaraswamy-X distribution against other common probability distributions used in engineering contexts. When applied to real datasets, it offers a more suitable fit than other existing distributions. We explore the characteristics and potential applications of the Kumaraswamy-X distribution in the context of engineering problems through a comprehensive simulation-based investigation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
46. A General Framework for Generating Three-Components Heavy-Tailed Distributions with Application.
- Author
-
Osatohanmwen, Patrick, Oyegue, Francis O., Ogbonmwan, Sunday M., and Muhwava, William
- Subjects
DISTRIBUTION (Probability theory) ,EXTREME value theory ,VALUE distribution theory ,DATA distribution ,PARAMETER estimation - Abstract
The estimation of a certain threshold beyond which an extreme value distribution can be fitted to the tail of a data distribution remains one of the main issues in the theory of statistics of extremes. While standard Peak over Threshold (PoT) approaches determine this threshold graphically, we introduce in this paper a general framework which makes it possible to determine this threshold algorithmically by estimating it as a free parameter within a composite distribution. To see how this threshold point arises, we propose a general framework for generating three-component hybrid distributions which meets the needs of data sets with a heavy right tail. The approach involves the combination of a distribution which can efficiently model the bulk of the data around the mean with a heavy-tailed distribution meant to model the data observations in the tail, while using another distribution as a link to connect the two. Some special examples of distributions resulting from the general framework are generated and studied. An estimation algorithm based on the maximum likelihood method is proposed for the estimation of the free parameters of the hybrid distributions. An application of the hybrid distributions to the S&P 500 index financial data set is also carried out. [ABSTRACT FROM AUTHOR]
- Published
- 2024
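The paper's key idea is to treat the tail threshold as a free parameter of a composite distribution and estimate it by maximum likelihood rather than graphically. A toy version of that idea, assuming a simple exponential bulk plus a Pareto tail with a profile likelihood over a threshold grid (the paper's actual three-component construction and link distribution are not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
# Synthetic data: exponential bulk plus a Pareto-like heavy right tail
x = np.concatenate([rng.exponential(1.0, 4000),
                    3.0 * (1 + rng.pareto(2.5, 400))])

def profile_loglik(u, x):
    bulk, tail = x[x <= u], x[x > u]
    if len(bulk) < 10 or len(tail) < 10:
        return -np.inf
    n0, n1 = len(bulk), len(tail)
    p = n0 / (n0 + n1)                  # mixing weight from the split
    # Bulk: exponential truncated to [0, u]; rate fitted numerically
    def bulk_nll(lam):
        if lam <= 0:
            return np.inf
        return -(n0 * np.log(lam) - lam * bulk.sum()
                 - n0 * np.log1p(-np.exp(-lam * u)))
    lam_hat = minimize_scalar(bulk_nll, bounds=(1e-6, 50),
                              method="bounded").x
    # Tail: Pareto with scale u; closed-form MLE for the shape
    alpha_hat = n1 / np.log(tail / u).sum()
    tail_ll = (n1 * np.log(alpha_hat) + n1 * alpha_hat * np.log(u)
               - (alpha_hat + 1) * np.log(tail).sum())
    return (n0 * np.log(p) - bulk_nll(lam_hat)
            + n1 * np.log(1 - p) + tail_ll)

grid = np.quantile(x, np.linspace(0.70, 0.98, 15))
u_hat = max(grid, key=lambda u: profile_loglik(u, x))
```

In the paper's framework the threshold is optimized jointly with all other parameters inside one composite likelihood; the grid-and-profile version above is just the simplest way to see the threshold emerge as an estimable quantity.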
47. Estimation with extended sequential order statistics: A link function approach.
- Author
- Pesch, Tim, Cramer, Erhard, Polpo, Adriano, and Cripps, Edward
- Subjects
ORDER statistics, MAXIMUM likelihood statistics, NUMBER systems, SAMPLE size (Statistics) - Abstract
The model of extended sequential order statistics (ESOS) has two valuable characteristics that make it powerful for modelling multi-component systems: components can be assumed heterogeneous, and component lifetime distributions can change upon the failure of other components. This flexibility comes at the cost of a large number of parameters; the exact number depends on the system size and the observation depth and can quickly exceed the number of available observations. Consequently, the model benefits from a reduction in the dimension of the parameter space to make it more readily applicable to real-world problems. In this article, we introduce link functions to the ESOS model that reduce the dimension of the parameter space while retaining the model's flexibility. These functions model the relation between the parameters of a component across levels. By construction, the proposed 'link estimates' conveniently yield ordered model estimates. We demonstrate how those ordered estimates lead to better results than their unordered counterparts, particularly when sample sizes are small. [ABSTRACT FROM AUTHOR]
- Published
- 2024
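The link-function idea is to tie the per-level parameters together through a low-dimensional function. Purely as a hypothetical illustration (the paper's actual link functions and likelihood are not given in the abstract), here is a toy version for sequential order statistics with exponential components, where the level-r rate is linked as theta_r = theta * r**gamma, collapsing the per-level rates to two free parameters:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n, k, m = 5, 5, 2000          # system size, observed failures, systems
theta_true, gamma_true = 1.0, 0.5

# For exponential components, the r-th inter-failure spacing of a
# sequential-order-statistics system is Exp((n - r + 1) * theta_r);
# the assumed link theta_r = theta * r**gamma gives ordered rates.
r = np.arange(1, k + 1)
rates = (n - r + 1) * theta_true * r ** gamma_true
d = rng.exponential(1.0 / rates, size=(m, k))   # spacings, m systems

def nll(params):
    # Joint negative log-likelihood of all spacings under the link
    theta, gamma = params
    if theta <= 0:
        return np.inf
    lam = (n - r + 1) * theta * r ** gamma
    return -np.sum(np.log(lam) - lam * d)

fit = minimize(nll, x0=[0.5, 0.0], method="Nelder-Mead")
theta_hat, gamma_hat = fit.x
```

With a monotone link such as this one, the implied level estimates theta_hat * r**gamma_hat are automatically ordered, which mirrors the 'link estimates' property described in the abstract.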
48. On Modified Weighted Exponential Rayleigh Distribution.
- Author
- Hussein, Lamyaa Khalid, Hussein, Iden Hasan, and Rasheed, Huda Abdullah
- Subjects
PROBABILITY density function, MAXIMUM likelihood statistics, ENTROPY - Abstract
This study develops classical estimators for the unknown parameters and reliability function of the Modified Weighted Exponential Rayleigh (MWER) distribution. These classical methods are chosen because they maximize the likelihood function, and the Newton-Raphson technique is used to solve the resulting likelihood equations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
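The abstract mentions solving the likelihood equations with the Newton-Raphson technique. A minimal sketch of that approach for the much simpler one-parameter Rayleigh baseline (illustrative only; the MWER score and Hessian are not reproduced here), where the iteration can be checked against the known closed-form MLE:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_true = 2.0
x = rng.rayleigh(sigma_true, 10_000)
n = len(x)
S = np.sum(x ** 2)

def newton_raphson_rayleigh(sigma, tol=1e-10, max_iter=100):
    # Iterate sigma <- sigma - l'(sigma) / l''(sigma) on the Rayleigh
    # log-likelihood, whose score is l'(s) = -2n/s + S/s^3 and whose
    # second derivative is l''(s) = 2n/s^2 - 3S/s^4.
    for _ in range(max_iter):
        score = -2 * n / sigma + S / sigma ** 3
        hess = 2 * n / sigma ** 2 - 3 * S / sigma ** 4
        step = score / hess
        sigma -= step
        if abs(step) < tol:
            break
    return sigma

sigma_hat = newton_raphson_rayleigh(1.0)
sigma_closed = np.sqrt(S / (2 * n))   # closed-form MLE for comparison
```

For the MWER distribution no closed form exists, which is exactly why the authors resort to the iterative scheme; the structure of the iteration is the same, only with the MWER score and Hessian substituted.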
49. Robust maximum correntropy criterion based square-root rotating lattice Kalman filter.
- Author
- Liu, Sanshan, Wang, Shiyuan, Lin, Dongyuan, Zheng, Yunfei, Guo, Zhongyuan, and Kuang, Zhijian
- Abstract
Lattice Kalman filter (LKF) is a nonlinear Kalman filter that utilizes a deterministic sampling method with the advantages of optional sampling points and a flexible balance between computational burden and estimation accuracy. However, the fixed angle of sampling points in LKF can limit the optimality of the selected points. To this end, this paper proposes a novel maximum correntropy square-root rotating lattice Kalman filter (MCSRLKF) to improve the performance of LKF by adjusting the angle of sampling points. In MCSRLKF, a rotation matrix is first constructed to enhance the estimation accuracy of LKF and the optimal rotation angle of sampling points is selected to generate rotating lattice Kalman filter (RLKF). Then, the square-root RLKF (SRLKF) is proposed to enhance the stability and estimation accuracy of RLKF. Due to the utilization of the minimum mean square error criterion in SRLKF, there is a potential for significant performance degradation in non-Gaussian noises. Thus, to enhance the robustness against non-Gaussian noises, the maximum correntropy criterion is applied to SRLKF, generating MCSRLKF. Moreover, the Cramér-Rao lower bound (CRLB) serves as an indicator for assessing the performance of MCSRLKF. Finally, simulations on the nonlinear function model and reentry vehicle tracking model are used to demonstrate that MCSRLKF exhibits excellent filtering accuracy and robustness when dealing with non-Gaussian noises. [ABSTRACT FROM AUTHOR]
- Published
- 2024
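The core device in MCSRLKF is the maximum correntropy criterion: innovations are weighted by a Gaussian kernel so that large, non-Gaussian errors contribute little to the update. A deliberately simplified scalar sketch of that reweighting follows; it is not the paper's square-root rotating lattice filter, and the noise levels and kernel bandwidth are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(4)
true_state = 5.0
z = true_state + rng.normal(0, 0.5, 200)
z[::25] += 20.0          # inject heavy-tailed outliers

def mcc_kf_1d(z, q=1e-4, r=0.25, kernel_sigma=2.0):
    # Scalar Kalman filter whose measurement update is reweighted by a
    # Gaussian correntropy kernel: outliers receive small weights,
    # which inflates the effective measurement noise R / w.
    x, p = 0.0, 10.0
    for zk in z:
        p += q                          # predict (random-walk model)
        innov = zk - x
        w = np.exp(-innov ** 2 / (2 * kernel_sigma ** 2))
        r_eff = r / max(w, 1e-12)
        k = p / (p + r_eff)             # correntropy-weighted gain
        x += k * innov
        p *= (1 - k)
    return x

x_robust = mcc_kf_1d(z)
```

A plain minimum-mean-square-error update (w fixed at 1) would be dragged toward the outliers, which is the degradation in non-Gaussian noise that motivates the correntropy criterion in the abstract.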
50. A novel extended inverse‐exponential distribution and its application to COVID‐19 data.
- Author
- Kargbo, Moses, Gichuhi, Anthony Waititu, and Wanjoya, Anthony Kibira
- Subjects
MONTE Carlo method, DISTRIBUTION (Probability theory), MAXIMUM likelihood statistics, COVID-19, RENYI'S entropy - Abstract
The aim of this article is to define a new flexible statistical model for COVID-19 data sets that cannot be modeled by the inverse exponential distribution. A novel extended distribution with one scale and three shape parameters is proposed using the generalized alpha power family of distributions, yielding the generalized alpha power exponentiated inverse exponential (GAPEIEx) distribution. Important statistical properties of the new distribution, such as the survival function, hazard function, quantile function, rth moment, Rényi entropy, and order statistics, are derived. The method of maximum likelihood is used to estimate the parameters of the new distribution. The performance of the estimators is assessed through Monte Carlo simulation, which shows that the maximum likelihood method works well in estimating the parameters. The GAPEIEx distribution was applied to COVID-19 data sets in order to assess its flexibility and adaptability, and it performs better than its submodels and other well-known distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
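The GAPEIEx model extends the inverse exponential baseline, whose own maximum likelihood estimator is available in closed form. A small sketch of that baseline fit, under the common parameterization f(x) = (lam / x^2) exp(-lam / x) (this is the submodel, not the four-parameter GAPEIEx likelihood itself):

```python
import numpy as np

rng = np.random.default_rng(5)
lam_true = 2.0
# If Y ~ Exp(rate = lam) then X = 1/Y follows the inverse exponential
# distribution with density f(x) = (lam / x**2) * exp(-lam / x).
x = 1.0 / rng.exponential(1.0 / lam_true, 20_000)

# The log-likelihood n*log(lam) - lam*sum(1/x) - 2*sum(log x) is
# maximized in closed form at lam_hat = n / sum(1/x).
lam_hat = len(x) / np.sum(1.0 / x)
```

The extra shape parameters of the GAPEIEx distribution destroy this closed form, so its parameters must be obtained numerically, with Monte Carlo simulation used to check the estimators, as the abstract describes.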