Search Results (51 results)
2. Estimation of multicomponent stress–strength reliability for exponentiated Gumbel distribution.
- Author: Chacko, Manoj and Elizabeth Koshy, Ashly
- Subjects: BAYES' estimation; MARKOV chain Monte Carlo; MAXIMUM likelihood statistics; ERROR functions
- Abstract
In this paper, the stress–strength reliability $R_{s,k}$ of a multicomponent s-out-of-k system for the exponentiated Gumbel distribution is considered. An s-out-of-k system is a system with k components in total that survives only when at least s of the components function properly. The ability of the system to overcome the stress it experiences with its strength is termed its stress–strength reliability. The maximum likelihood estimator and Bayes estimator for $R_{s,k}$ are obtained. The Bayes estimators are obtained using the Markov chain Monte Carlo (MCMC) method under both symmetric and asymmetric loss functions; the loss functions considered are the squared error loss function, the LINEX loss function, and the entropy loss function. The asymptotic, bootstrap, and highest posterior density (HPD) confidence intervals for $R_{s,k}$ are also obtained. A simulation study is conducted to evaluate the efficiency of the estimators derived in this paper. Real data sets are also considered for illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
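The s-out-of-k structure in the abstract above has a simple distribution-free core: if, conditional on the applied stress, each of the k components independently withstands it with probability p, the system reliability is a binomial tail sum. A minimal sketch (the function name and the fixed p are illustrative assumptions; the paper itself estimates this quantity under the exponentiated Gumbel model):

```python
from math import comb

def r_s_k(p, s, k):
    """Reliability of an s-out-of-k system whose k components each
    independently withstand the applied stress with probability p."""
    return sum(comb(k, i) * p**i * (1 - p)**(k - i) for i in range(s, k + 1))

# A 2-out-of-3 system with component reliability 0.9:
print(r_s_k(0.9, 2, 3))  # ≈ 0.972
```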
3. Inference for dependent competing risks from Marshall–Olkin bivariate Kies distribution under generalized progressive hybrid censoring.
- Author: Chandra, Prakash, Mandal, Hemanta Kumar, Tripathi, Yogesh Mani, and Wang, Liang
- Subjects: MAXIMUM likelihood statistics; COMPETING risks; FISHER information; CONFIDENCE intervals; MARKOV chain Monte Carlo
- Abstract
This paper explores inference for a competing risks model with dependent causes of failure. When the lifetimes of competing risks are modelled by a Marshall–Olkin bivariate Kies distribution, classical and Bayesian estimation are studied under generalized progressive hybrid censoring. Existence and uniqueness results for the maximum likelihood estimators of the unknown parameters are established, and approximate confidence intervals are constructed using the observed Fisher information matrix. In addition, Bayes estimates are explored based on a flexible Gamma–Dirichlet prior. Furthermore, when prior order information on the competing risk parameters is available, classical likelihood and Bayesian estimates are also established under the restricted parameter case. The behavior of the proposed estimators is evaluated through extensive simulation studies, and a real data study is presented for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Optimal Subsampling for Functional Quasi-Mode Regression with Big Data.
- Author: Wang, Tao
- Subjects: MONTE Carlo method; ASYMPTOTIC normality; ASYMPTOTIC distribution; MAXIMUM likelihood statistics; SMOOTHNESS of functions
- Abstract
We propose investigating optimal subsampling for functional regression with massive datasets based on the mode value, referred to as functional quasi-mode regression, to reduce data volume and alleviate computational burden. Utilizing data-adaptive weights derived from regression residuals, the suggested regression offers enhanced robustness against non-normal errors compared to traditional least squares or maximum likelihood estimation methods. To estimate the model, we employ B-spline basis functions to approximate the functional coefficient and include a penalty term in the objective function to enforce smoothness in the resulting estimator. We adopt a computationally efficient mode-expectation-maximization algorithm, augmented by a Gaussian kernel, for numerical estimation. Under mild regularity conditions, we derive the asymptotic distributions of both the full-data and subsample quasi-mode estimators. The optimal subsampling probabilities are identified by minimizing the asymptotic variance-covariance matrix under A- and L-optimality criteria. These optimal probabilities rely on the full-data estimate, prompting the development of a two-step algorithm to approximate the optimal subsampling procedure. The resulting algorithm is computationally efficient and can significantly reduce computation time compared to the full-data approach. We also establish the asymptotic normality of the quasi-mode estimator obtained through this two-step algorithm. To assess finite-sample performance, we conduct Monte Carlo simulations and analyze air quality data, showcasing the effectiveness of the developed estimator. Supplemental materials for this paper are available online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. A backfitting maximum likelihood estimator for hierarchical and geographically weighted regression modelling, with a case study of house prices in Beijing.
- Author: Hu, Yigong, Harris, Richard, Timmerman, Richard, and Lu, Binbin
- Subjects: MAXIMUM likelihood statistics; HOME prices; REGRESSION analysis; BIG data; HETEROGENEITY
- Abstract
Geographically weighted regression (GWR) and its extensions are important local modelling techniques for exploring spatial heterogeneity in regression relationships. However, when dealing with spatial data of overlapping samples – for example, when precise locational information is aggregated to a shared neighbourhood to avoid revealing the addresses of individual survey respondents – GWR-based models can encounter several problems, including obtaining reliable bandwidths. Because data with this characteristic exhibit spatial hierarchical structures, we propose combining hierarchical linear modelling (HLM) with GWR to give a hierarchical and geographically weighted regression (HGWR) model that divides coefficients into sample-level fixed effects, group-level fixed effects, sample-level random effects, and group-level spatially weighted effects. This paper presents a back-fitting likelihood estimator to fit the model, a simulation experiment suggesting that HGWR captures these effects, and the spatial heterogeneity within them, better than traditional HLM or GWR models, and a case study of predictors of housing price in Beijing, China. The ability of HGWR to tackle both spatial and group-level heterogeneity simultaneously suggests its potential as a promising tool for handling spatio-temporal big data with spatially hierarchical structures. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Almost unbiased ridge estimator in Bell regression model: theory and application to plastic polywood data.
- Author: Tanış, Caner and Asar, Yasin
- Subjects: MONTE Carlo method; MAXIMUM likelihood statistics; REGRESSION analysis; MODEL theory; MULTICOLLINEARITY
- Abstract
In this paper, a new regression estimator, called the almost unbiased ridge estimator, is proposed as an alternative to the ridge estimator in the presence of multicollinearity in the Bell regression model. We provide the theoretical properties of the new estimator, along with theorems establishing the conditions under which it is superior to its competitors. A comprehensive simulation study demonstrates the superiority of the almost unbiased ridge estimator over the usual Bell ridge estimator and the maximum likelihood estimator. The usefulness and superiority of the introduced estimator are also shown via a real-world data example. According to the results of the simulation study and the real-world data example, the new almost unbiased ridge regression estimator is superior to its competitors in terms of the mean square error criterion. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
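The almost unbiased ridge idea is easiest to see in the plain linear model, where the classical correction removes the leading-order ridge bias: beta_AURE = (I − k²(X′X + kI)⁻²) beta_OLS. A hypothetical NumPy sketch of that linear-model analogue (the paper works in the Bell regression model, so this illustrates the correction, not the authors' estimator; the function name is an assumption):

```python
import numpy as np

def aure(X, y, k):
    """Almost unbiased ridge estimator for a linear model:
    beta_AURE = (I - k^2 (X'X + k I)^{-2}) beta_OLS.
    At k = 0 it reduces to ordinary least squares."""
    p = X.shape[1]
    Sk_inv = np.linalg.inv(X.T @ X + k * np.eye(p))
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    return (np.eye(p) - k**2 * Sk_inv @ Sk_inv) @ beta_ols
```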
7. Liu-type shrinkage strategies in zero-inflated negative binomial models with application to Expenditure and Default Data.
- Author: Zandi, Zahra, Arabi Belaghi, Reza, and Bevrani, Hossein
- Subjects: MONTE Carlo method; INDEPENDENT variables; MAXIMUM likelihood statistics; REGRESSION analysis; MULTICOLLINEARITY; DEFAULT (Finance)
- Abstract
The zero-inflated negative binomial (ZINB) regression model is useful for modeling count data with overdispersion and excess zeros. In a regression model, the multicollinearity problem arises when some predictor variables are highly correlated; under multicollinearity, the maximum likelihood estimator is no longer efficient. The ridge and Liu-type estimators have been proposed to combat multicollinearity, with the Liu-type estimator performing better. In this paper, we propose Liu-type shrinkage estimators, namely the linear shrinkage, preliminary test, shrinkage preliminary test, Stein-type, and positive Stein-type Liu estimators, to estimate the count parameters in the ZINB model when some predictor variables have no significant effect on the response, so that a sub-model may be sufficient. The asymptotic distributional biases and variances of the proposed estimators are derived. We also compare the performance of the Liu-type shrinkage estimators with the Liu-type unrestricted estimator through an extensive Monte Carlo simulation study. The results show that the proposed estimators are superior to the Liu-type unrestricted estimator. We also apply the proposed estimation methods to Expenditure and Default Data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Efficient estimation for accelerated failure time model with interval-censored data in the presence of a cured subgroup.
- Author: Zhao, Bo, Wang, Shuying, and Wang, Chunjie
- Subjects: REGRESSION analysis; MAXIMUM likelihood statistics; BERNSTEIN polynomials; LOGISTIC regression analysis; DATA analysis; MEASUREMENT errors
- Abstract
As an alternative to the Cox model, the accelerated failure time (AFT) model, which regresses the logarithm of the survival time on the covariates, is commonly used in the analysis of interval-censored data. In this paper, we propose a novel two-component mixture-cure model for interval-censored failure time data in the presence of a cure fraction. Specifically, the first component is a logistic regression model that describes the cure rate, and the second component is a semiparametric accelerated failure time model that describes the failure time of interest for the uncured subjects. An efficient semiparametric procedure is developed to estimate the parameters of the considered model. We propose a penalized sieve maximum likelihood estimation approach with Bernstein polynomials to estimate the regression parameters quickly and accurately; the procedure does not rely on a distributional assumption for the measurement error. The asymptotic properties of the resulting estimators are established. Extensive simulation studies indicate that the proposed procedure works well in practical situations. In addition, an analysis of AIDS data is provided to illustrate the proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. A communication-efficient method for generalized linear regression with ℓ0 regularization.
- Author: Wang, Kunpeng, Li, Xuerui, Liu, Yanyan, and Kang, Lican
- Subjects: MAXIMUM likelihood statistics; DATA modeling; MACHINERY; ALGORITHMS
- Abstract
In this paper, based on the Karush–Kuhn–Tucker (KKT) conditions of ℓ0 regularization, we propose a communication-efficient distributed learning approach for high-dimensional and sparse generalized linear models with massive data sets stored across different machines. The proposed method is a support detection and root finding method for generalized linear models in distributed form. In each round, the support set is first determined from the primal and dual information aggregated on the master machine; the reduced maximum likelihood estimator is then obtained by gradient descent, for which it suffices to calculate the gradient vectors on each machine and communicate them instead of the data. We give the optimal ℓ∞-norm error bound for the sequences generated by the proposed algorithm and show that this bound decays exponentially to the optimal order. Moreover, we show that the oracle estimator can be recovered if the target signal is not less than the detectable level. In addition, an adaptive version of the proposed algorithm is developed to estimate the sparsity level. Simulation studies illustrate the superior performance of the proposed methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
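The communication pattern described in the abstract above — ship gradients, not data — can be sketched for logistic regression (function names and the equal-per-observation weighting are illustrative assumptions; the paper's method adds support detection and ℓ0-specific root finding on top of this):

```python
import numpy as np

def local_gradient(beta, X, y):
    """Per-machine gradient of the average logistic negative log-likelihood."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return X.T @ (p - y) / len(y)

def aggregated_gradient(beta, shards):
    """Master combines per-machine gradients, weighted by shard size,
    so only p-dimensional vectors cross the network."""
    n = sum(len(y) for _, y in shards)
    return sum(len(y) * local_gradient(beta, X, y) for X, y in shards) / n
```

Because the weighted average of per-shard gradients equals the full-data gradient, the master can run gradient descent exactly as if it held all the data.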
10. Generalized q-logistic distribution.
- Author: Nair, Seema S and Jayakumar, K
- Subjects: PATTERN recognition systems; LEAST squares; MAXIMUM likelihood statistics; PARAMETER estimation; GENERALIZATION
- Abstract
Several generalizations of the logistic distribution and related models have been proposed for modeling random phenomena encountered in data engineering, pattern recognition, and reliability assessment studies. In this paper, we study the generalized q-logistic (GqL) distribution, whose additional parameters increase its flexibility for modeling purposes. Since parameter estimation for the GqL model has not yet been explored, we propose different methods for estimating the GqL parameters. In particular, we make a comprehensive comparison, through a simulation study, of the performance of maximum likelihood estimation (both complete and censored), maximum product spacing estimation, and ordinary least squares estimation. Some characterization theorems are obtained and some properties of the GqL distribution are established. Finally, we analyze real data to illustrate the potential of the model and present some graphical illustrations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Change point detection in length-biased lognormal distribution.
- Author: Li, Mei, Ratnasingam, Suthakaran, Tian, Yubin, and Ning, Wei
- Subjects: DISTRIBUTION (Probability theory); CHI-square distribution; ASYMPTOTIC distribution; MAXIMUM likelihood statistics; FIX-point estimation; CHANGE-point problems; LOGNORMAL distribution
- Abstract
In this paper, we develop two procedures, based on the likelihood ratio and a modified information criterion, for identifying changes in the parameters of the length-biased lognormal distribution. These methods test for the existence of a change point and provide the maximum likelihood estimate of its location when one exists. The asymptotic distribution of the likelihood-ratio test statistic is derived as an extreme value distribution, while that of the modified-information-criterion statistic is derived as a chi-square distribution, and the consistency of the parameter estimates is proved. Simulations are conducted to study the performance of the proposed methods in terms of power, coverage probabilities, and average sizes of confidence sets. The proposed methods are applied to real data to illustrate the detection procedures. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
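Since the log of lognormal data is Gaussian, the likelihood-ratio scan for a single change can be illustrated in its simplest Gaussian-mean form (unit variance is assumed here; the paper's statistic for the length-biased lognormal is more involved, so treat this as a schematic of the scan only):

```python
import numpy as np

def lr_changepoint(x):
    """Scan all split points; return (max 2*log-LR, split index) for a
    single mean change in unit-variance Gaussian data."""
    n = len(x)
    m = x.mean()
    best, khat = -np.inf, None
    for k in range(1, n):
        m1, m2 = x[:k].mean(), x[k:].mean()
        # 2*(loglik with separate means - loglik with one common mean)
        stat = k * (m1 - m) ** 2 + (n - k) * (m2 - m) ** 2
        if stat > best:
            best, khat = stat, k
    return best, khat
```

The change point is accepted only if the maximum statistic exceeds a critical value, which is where the extreme-value asymptotics in the abstract come in.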
12. A new class of zero-truncated counting models and its application.
- Author: Tang, Xian-Ping, Tian, Yu-Zhu, Wu, Chun-Ho, Wang, Yue, and Mian, Zhi-Bao
- Subjects: STANDARD deviations; LIKELIHOOD ratio tests; FACTORY accidents; MAXIMUM likelihood statistics; POISSON distribution
- Abstract
Count data arise from the number of times an event occurs per unit of time, and zero-truncated count data refer to count data without zeros, which appear in various fields. In this paper, a new zero-truncated Bell (ZTBell) distribution is proposed on the basis of the Bell distribution. We study its statistical properties and explore maximum likelihood estimation (MLE), the expectation–maximization (EM) algorithm, and the minimization–maximization (MM) algorithm for parameter estimation, as well as likelihood ratio tests. In addition, we use the bootstrap method to calculate the standard errors and confidence intervals of the parameters. Simulation results show that the MLE, MM, and EM algorithms are all effective and that, as the sample size increases, the parameter estimates move closer to the true values and the root mean square error decreases. Finally, applying the model to a set of factory accident data, we find that the ZTBell distribution fits better than the other models considered and is close to the fit of the zero-truncated generalized Poisson distribution, while having only one parameter and thus being simpler. Therefore, the ZTBell distribution is a good alternative to other zero-truncated distributions, providing more options for statistical analysis in this domain. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
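The Bell pmf is P(X = x) = θ^x e^{−(e^θ−1)} B_x / x!, where B_x are the Bell numbers, and the zero-truncated version simply renormalises it to x ≥ 1. A sketch under that definition (the Bell-triangle recurrence for B_x is standard; function names are illustrative):

```python
from math import exp, factorial

def bell_numbers(n):
    """Bell numbers B_0..B_n via the Bell triangle."""
    row, bells = [1], [1]
    for _ in range(n):
        new = [row[-1]]          # new row starts with last entry of previous row
        for v in row:
            new.append(new[-1] + v)
        row = new
        bells.append(row[0])
    return bells

def ztbell_pmf(x, theta, bells):
    """Zero-truncated Bell pmf: the Bell pmf renormalised to x >= 1."""
    def bell_pmf(j):
        return theta**j * exp(-(exp(theta) - 1.0)) * bells[j] / factorial(j)
    return bell_pmf(x) / (1.0 - bell_pmf(0))
```

Because the exponential generating function of the Bell numbers is e^{e^θ−1}, the untruncated pmf sums to one, so the truncated pmf sums to one over x ≥ 1.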
13. Estimating mixed-effects state-space models via particle filters and the EM algorithm.
- Author: Hamdi, Fayçal and Lellou, Chahrazed
- Subjects: EXPECTATION-maximization algorithms; KALMAN filtering; MONTE Carlo method; GOODNESS-of-fit tests; DYNAMICAL systems; MAXIMUM likelihood statistics
- Abstract
In this paper, we focus on studying the Mixed-Effects State-Space (MESS) models previously introduced by Liu et al. [Liu D, Lu T, Niu X-F, et al. Mixed-effects state-space models for analysis of longitudinal dynamic systems. Biometrics. 2011;67(2):476–485]. We propose an estimation method by combining the auxiliary particle learning and smoothing approach with the Expectation Maximization (EM) algorithm. First, we describe the technical details of the algorithm steps. Then, we evaluate their effectiveness and goodness of fit through a simulation study. Our method requires expressing the posterior distribution for the random effects using a sufficient statistic that can be updated recursively, thus enabling its application to various model formulations including non-Gaussian and nonlinear cases. Finally, we demonstrate the usefulness of our method and its capability to handle the missing data problem through an application to a real dataset. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. The optimal multi-stress–strength reliability technique for the progressive first failure in the length-biased exponential model using Bayesian and non-Bayesian methods.
- Author: Alotaibi, Refah, Almetwally, Ehab M., Ghosh, Indranil, and Rezk, Hoda
- Subjects: DISTRIBUTION (Probability theory); ACCELERATED life testing; MAXIMUM likelihood statistics; MARKOV chain Monte Carlo; FIX-point estimation; RELIABILITY in engineering
- Abstract
In many real-world situations, systems fail under demanding operational settings, yet little attention is paid to the fact that systems typically fail to execute their intended activities when they reach extreme operating conditions. In this paper, we develop and study inference for the reliability of a system with multiple components based on a progressive first-failure censoring scheme, assuming a unit length-biased exponential distribution. For interval estimation, asymptotic, boot-p, and boot-t approaches are adopted, while for point estimation the maximum likelihood method is considered. The MCMC method is used to obtain the Bayes estimate of the reliability parameter under both symmetric and asymmetric loss functions. The associated confidence intervals are also reported as appropriate. The effectiveness of the adopted estimation strategies is evaluated and compared using Monte Carlo simulation studies and examples from real-world applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. A new EWMA chart for simultaneously monitoring the parameters of a shifted exponential distribution.
- Author: Baranwal, Amita, Kumar, Nirpeksh, Chatterjee, Kashinath, and Koukouvinos, Christos
- Subjects: DISTRIBUTION (Probability theory); CUSUM technique; MAXIMUM likelihood statistics; MONTE Carlo method; MOVING average process; STANDARD deviations
- Abstract
In various scenarios where products and services are accompanied by warranties to ensure their reliability over a specified time, the two-parameter (shifted) exponential distribution serves as a fundamental model for time-to-event data. In modern production processes, products often come with warranties, and their quality can be manifested by changes in the scale and origin parameters of a shifted exponential (SE) distribution. This paper introduces the Max-EWMA chart, employing maximum likelihood estimators and exponentially weighted moving average (EWMA) statistics, to jointly monitor the SE distribution parameters. Additionally, we extend two further charts, the Max-DEWMA and Max-TEWMA charts, to enhance early-stage shift detection. Performance evaluations under zero-state and steady-state conditions compare these charts with the existing Max-CUSUM chart in terms of the expected value and standard deviation of the run length (RL) distribution. Our findings reveal that among the Max-EWMA schemes, the Max-EWMA SE chart outperforms the others in steady-state performance, while the Max-TEWMA chart surpasses the Max-EWMA and Max-DEWMA SE charts with respect to zero-state performance. Moreover, the proposed Max-EWMA schemes demonstrate advantages over Max-CUSUM, especially for small to moderate smoothing constants. We also provide an illustrative example to demonstrate the implementation of the proposed schemes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
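All of the charts compared above share the same underlying recursion, Z_t = λX_t + (1 − λ)Z_{t−1}, with the double (DEWMA) variant feeding the EWMA sequence through the recursion a second time. A minimal sketch of that building block (function names are illustrative; the Max charts additionally combine two standardized statistics via a maximum):

```python
def ewma(xs, lam, z0=0.0):
    """EWMA recursion Z_t = lam * X_t + (1 - lam) * Z_{t-1}."""
    z, out = z0, []
    for x in xs:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

def dewma(xs, lam, z0=0.0):
    """Double EWMA: smooth the EWMA sequence a second time."""
    return ewma(ewma(xs, lam, z0), lam, z0)
```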
16. Statistical inference for a two-parameter Rayleigh distribution under generalized progressive hybrid censoring scheme.
- Author: Xin, Ying, Zhou, Bingchang, Tang, Yaning, and Zhang, You
- Subjects: RAYLEIGH model; INFERENTIAL statistics; MARKOV chain Monte Carlo; CENSORING (Statistics); ASYMPTOTIC distribution; MAXIMUM likelihood statistics; FISHER information
- Abstract
The two-parameter Rayleigh distribution, an extension of the Rayleigh distribution, has been widely applied in reliability analysis; with the introduction of a location parameter, it becomes more flexible in fitting real data. In this paper, classical and Bayesian inferences are discussed based on a generalized progressively hybrid censored (GPHC) sample from the two-parameter Rayleigh distribution. The Newton–Raphson (NR) and expectation–maximization (EM) algorithms are used to compute the maximum likelihood estimates (MLEs), and asymptotic confidence interval (ACI) estimation is obtained through the asymptotic distribution theory of maximum likelihood estimation and computation of the observed Fisher information matrix. In the Bayesian framework, estimation of the unknown parameters and prediction of future observables are considered. Because exact Bayesian estimates are challenging to compute, and for the purpose of comparison, Lindley's approximation, the Tierney–Kadane (TK) approximation, and the Markov chain Monte Carlo (MCMC) method are employed to obtain Bayes estimates. Then, combining the MCMC algorithm described in the article, one- and two-sample Bayesian predictions are obtained. Finally, simulation results are provided and a real-life data set is used for illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
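With the location parameter μ treated as known, the score equation for the scale of the two-parameter Rayleigh density f(x) = 2λ(x−μ)e^{−λ(x−μ)²} is n/λ − Σ(x_i−μ)² = 0, which Newton–Raphson solves in a few steps. A simplified illustration of the NR step only (the paper iterates over both parameters under censoring, which has no closed form; names are illustrative):

```python
def rayleigh_scale_mle(xs, mu, lam0=None, tol=1e-12, max_iter=100):
    """Newton-Raphson for lambda in f(x) = 2*lam*(x-mu)*exp(-lam*(x-mu)^2).
    Score g(lam) = n/lam - S with S = sum((x-mu)^2); g'(lam) = -n/lam^2.
    Converges for 0 < lam0 < 2n/S; the default start n/(2S) is always safe."""
    n = len(xs)
    S = sum((x - mu) ** 2 for x in xs)
    lam = n / (2 * S) if lam0 is None else lam0
    for _ in range(max_iter):
        new = lam + (n / lam - S) * lam**2 / n  # lam - g(lam)/g'(lam)
        if abs(new - lam) < tol:
            return new
        lam = new
    return lam
```

Here the iteration can be checked against the closed-form solution λ̂ = n/S, which exists precisely because μ is held fixed.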
17. A Technique for Efficient Estimation of Dynamic Structural Equation Models: A Case Study.
- Author: Sakalauskas, Leonidas, Dulskis, Vytautas, and Plikynas, Darius
- Subjects: STRUCTURAL equation modeling; TIME series analysis; MAXIMUM likelihood statistics; LATENT structure analysis; MATRIX decomposition; PARAMETER estimation
- Abstract
Dynamic structural equation models (DSEM) are designed for time series analysis of latent structures. Inherent to the application of DSEM is model parameter estimation, which has to be addressed in many applications by a single time series. In this context, however, the methods currently available either lack estimation quality or are computationally inefficient. Given the era of big data, the necessity for a trade-off between these properties may be detrimental to the applicability of DSEM. The paper is aimed at tackling this trade-off by proposing a novel estimator recursioning technique (ER technique) that facilitates the development of computationally efficient raw-data maximum likelihood estimation algorithms through data transformation, covariance matrix block decomposition, likelihood function reduction, and estimator recursioning steps. The ER technique is introduced by applying it to a special case of the general dynamic structural equation model that encompasses a noisy Wiener-process-type structure with input from the factor-analytic model. The resulting algorithm has been verified through a number of numerical experiments as well as implemented in a brand new R package EMLI, which is available on CRAN. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Record-based transmuted generalized linear exponential distribution with increasing, decreasing and bathtub shaped failure rates.
- Author: Arshad, Mohd, Khetan, Mukti, Kumar, Vijay, and Pathak, Ashok Kumar
- Subjects: DISTRIBUTION (Probability theory); MONTE Carlo method; PROBABILITY density function; LEAST squares; MAXIMUM likelihood statistics; BIAS correction (Topology); EXPONENTIAL functions; BAYES' estimation
- Abstract
The linear exponential distribution is a generalization of the exponential and Rayleigh distributions and is one of the best models for fitting data with increasing failure rate (IFR). However, it does not provide a reasonable fit for data with decreasing failure rate (DFR) or bathtub-shaped failure rate (BTFR). To overcome this drawback, we propose a new record-based transmuted generalized linear exponential (RTGLE) distribution using the technique of Balakrishnan and He. The family of RTGLE distributions is more flexible for fitting data sets with IFR, DFR, and BTFR, and also generalizes several well-known models as well as some new record-based transmuted models. This paper studies the statistical properties of the RTGLE distribution, such as the shape of the probability density function and hazard function, the quantile function and its applications, moments and the moment generating function, order and record statistics, and Rényi entropy. The maximum likelihood, least squares and weighted least squares, Anderson-Darling, and Cramér-von Mises estimators of the unknown parameters are constructed, and their biases and mean squared errors are reported via a Monte Carlo simulation study. Finally, real data sets illustrate the goodness of fit and applicability of the proposed distribution, and suitable recommendations are made. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. On inference in a class of exponential distribution under imperfect maintenance.
- Author: Kamranfar, Hoda, Ahmadi, Kambiz, and Fouladirad, Mitra
- Subjects: DISTRIBUTION (Probability theory); MONTE Carlo method; BAYES' estimation; MAXIMUM likelihood statistics; INFERENTIAL statistics; CONFIDENCE intervals
- Abstract
This paper deals with statistical inference for lifetime data in the presence of imperfect maintenance. The Sheu and Griffith model is considered for maintenance, and the lifetime distribution belongs to the exponential distribution class. The maximum likelihood estimation procedure for the model parameters is discussed, and confidence intervals are provided using asymptotic likelihood theory and a bootstrap approach. Based on conjugate and discrete priors, Bayesian estimators of the model parameters are developed under symmetric and asymmetric loss functions. The proposed methodologies are applied to simulated data, and sensitivity analyses with respect to different parameters and data characteristics are carried out. The effect of model misspecification is also assessed within this class of distributions through a Monte Carlo simulation study. Finally, two datasets are analyzed for demonstrative aims. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Ridge estimator in a mixed Poisson regression model.
- Author: Tharshan, Ramajeyam and Wijekoon, Pushpakanthie
- Subjects: MULTICOLLINEARITY; POISSON regression; REGRESSION analysis; MONTE Carlo method; MAXIMUM likelihood statistics
- Abstract
The generalized linear model approach of the mixed Poisson regression models (MPRM) is suitable for over-dispersed count data, and the maximum likelihood estimator (MLE) is adopted to estimate their regression coefficients. However, the variance of the MLE becomes high when the covariates are collinear. The Poisson-Modification of Quasi Lindley (PMQL) regression model was recently introduced as an alternative MPRM; the variance of its MLE is likewise high in the presence of multicollinearity. This paper adopts the ridge regression method for the PMQL regression model to combat this issue, and we use several notable methods to estimate the ridge parameter. A Monte Carlo simulation study was designed to evaluate the performance of the MLE and the different PMQL ridge regression estimators using their scalar mean squared error (SMSE) values. Further, we analyze simulated data and real-life applications to show the consistency of the simulation results. The simulation and application results indicate that the PMQL ridge regression estimators dominate the MLE when multicollinearity exists. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Fitting data to a multiple structural measurement errors model.
- Author: Al Dibi’i, Ro’ya, Abdul Rahman, Rosmanjawati, and Al-Nasser, Amjad
- Subjects: ERRORS-in-variables models; STANDARD deviations; MOMENTS method (Statistics); GROSS national product; MAXIMUM likelihood statistics
- Abstract
This paper proposes two new estimation methods for fitting a multiple structural measurement error model when all variables are subject to error. The new methods are extensions of the Wald estimation method: one is a weighted grouping method, and the other is an iterative method. A Monte Carlo experiment is performed to investigate the performance of the new estimators against the classical estimation methods, the maximum likelihood estimator and the method of moments, in terms of root mean square error and bias. The simulation outcomes demonstrate that the suggested estimators are more effective than the conventional estimators. In addition, a real data analysis examines the relationship between the national gross domestic product, unemployment rate, and human development index using the two proposed estimation methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
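The classical Wald grouping estimator that both proposals extend is short enough to state in code: order the observations by the error-prone regressor, split them into lower and upper halves, and take the slope of the line through the two group means. A sketch (the paper's weighted and iterative variants refine this; the function name is illustrative):

```python
def wald_slope(x, y):
    """Wald's grouping estimator of the slope: split the data into
    lower/upper halves by x and connect the two group means."""
    pairs = sorted(zip(x, y))
    h = len(pairs) // 2
    lo, hi = pairs[:h], pairs[-h:]
    mean = lambda vals: sum(vals) / len(vals)
    x_lo, y_lo = mean([p[0] for p in lo]), mean([p[1] for p in lo])
    x_hi, y_hi = mean([p[0] for p in hi]), mean([p[1] for p in hi])
    return (y_hi - y_lo) / (x_hi - x_lo)
```

Grouping yields a consistent slope when the group assignment is independent of the measurement errors, which is the condition the weighted and iterative extensions are designed to exploit more efficiently.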
22. Time series regression models for zero-inflated proportions.
- Author: Axalan, A., Ghahramani, M., and Slonowsky, D.
- Subjects: TIME series analysis; REGRESSION analysis; JENSEN'S inequality; BETA distribution; MAXIMUM likelihood statistics; POISSON regression
- Abstract
Time series of proportions are often encountered in applications such as ecology, environmental science and public health. Strategies for such data include linear regression after logistic transformation. Though easy to fit, the transformation approach renders covariate effects uninterpretable on the scale on which they were observed owing to Jensen's inequality. An alternative to the transformation approach has been to directly model the response via the beta distribution. In this paper, we extend zero-inflated beta regression models for independent proportions to time series data that is bounded over the unit interval and that may take on zero values. Estimation is within the partial-likelihood framework and is computationally feasible to implement. We outline the asymptotic theory of our maximum partial likelihood estimators under mild regularity conditions and investigate their bias and variability using simulation studies. The utility of our method is illustrated using two real data examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Estimation of complier causal treatment effects under the additive hazards model with interval-censored data.
- Author
-
Ma, Yuqing, Wang, Peijie, Li, Shuwei, and Sun, Jianguo
- Subjects
- *
TREATMENT effectiveness , *MAXIMUM likelihood statistics , *HAZARDS , *CENSORING (Statistics) , *DATA modeling , *EARLY detection of cancer , *CONFOUNDING variables - Abstract
Estimation of causal treatment effects has attracted a great deal of interest in many areas including social, biological and health science, and for this, instrumental variable (IV) has become a commonly used tool in the presence of unmeasured confounding. In particular, many IV methods have been developed for right-censored time-to-event outcomes. In this paper, we consider a much more complicated situation where one faces interval-censored time-to-event outcomes, which are ubiquitously present in studies with, for example, intermittent follow-up but are challenging to handle in terms of both theory and computation. A sieve maximum likelihood estimation procedure is proposed for estimating complier causal treatment effects under the additive hazards model, and the resulting estimators are shown to be consistent and asymptotically normal. A simulation study is conducted to evaluate the finite sample performance of the proposed approach and suggests that it works well in practice. It is applied to a breast cancer screening study. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Identifying the time of step change in process parameter for Maxwell distribution.
- Author
-
Kapase, Rupali A. and Ghute, Vikas B.
- Subjects
- *
MAXWELL-Boltzmann distribution law , *QUALITY control charts , *MAXIMUM likelihood statistics , *SKEWNESS (Probability theory) , *TIME perception - Abstract
Quick identification of the root causes of an out-of-control process requires estimating the exact time of the process change, which is helpful for process improvement. In contrast to the typical normality assumption, this study recognizes that a process may follow a skewed Maxwell distribution. In this paper, using maximum likelihood estimation, a considerably efficient change point model is proposed for the $V$ control chart, whose underlying distribution is the Maxwell distribution. The required chart statistics are calculated along with their distributional properties. The proposed method is applied when the $V$ control chart signals a change in a process parameter. Used with the $V$ control chart, the proposed method would help process engineers both control and identify a permanent step change in the process scale parameter. An illustrative example is given to demonstrate the use of the proposed method. [ABSTRACT FROM AUTHOR]
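The step-change-point idea in this abstract can be sketched as a profile-likelihood scan over candidate change times. For simplicity, the sketch below uses a normal scale change rather than the Maxwell case, so it is an assumed illustration of the scan, not the paper's exact estimator:

```python
import numpy as np

# Hypothetical illustration of the step-change-point idea: scan
# candidate change times t and keep the one maximising the profile
# log-likelihood of a step change in the scale parameter. A normal
# model stands in for the Maxwell case here.

def estimate_change_point(x):
    n = len(x)
    best_t, best_ll = None, -np.inf
    for t in range(2, n - 1):
        s1 = np.var(x[:t])            # scale MLE before the candidate change
        s2 = np.var(x[t:])            # scale MLE after the candidate change
        ll = -0.5 * (t * np.log(s1) + (n - t) * np.log(s2))
        if ll > best_ll:
            best_t, best_ll = t, ll
    return best_t

# Simulated process: in-control scale up to time 60, then a step change.
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0, 1, 60), rng.normal(0, 3, 40)])
tau_hat = estimate_change_point(x)
```

In a control-chart setting, the scan would be triggered only after the chart signals, and the scan window would end at the signal time.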
- Published
- 2024
- Full Text
- View/download PDF
25. Correcting for Sampling Error in between-Cluster Effects: An Empirical Bayes Cluster-Mean Approach with Finite Population Corrections.
- Author
-
Lai, Mark H. C., Zhang, Yichi, and Ji, Feng
- Subjects
- *
SAMPLING errors , *MONTE Carlo method , *CLUSTER sampling , *STRUCTURAL equation modeling , *MAXIMUM likelihood statistics , *SCHOOL employees , *CONFIDENCE intervals - Abstract
With clustered data, such as where students are nested within schools or employees are nested within organizations, it is often of interest to estimate and compare associations among variables separately for each level. While researchers routinely estimate between-cluster effects using the sample cluster means of a predictor, previous research has shown that such practice leads to biased estimates of coefficients at the between level, and recent research has recommended the use of latent cluster means with the multilevel structural equation modeling framework. However, the latent cluster mean approach may not always be the best choice as it (a) relies on the assumption that the population cluster sizes are close to infinite, (b) requires a relatively large number of clusters, and (c) is currently only implemented in specialized software such as Mplus. In this paper, we show how using empirical Bayes estimates of the cluster means (EBM) can also lead to consistent estimates of between-level coefficients, and illustrate how the empirical Bayes estimate can incorporate finite population corrections when information on population cluster sizes is available. Through a series of Monte Carlo simulation studies, we show that the empirical Bayes cluster-mean approach performs similarly to the latent cluster mean approach for estimating the between-cluster coefficients in most conditions when the infinite-population assumption holds, and applying the finite population correction provides reasonable point and interval estimates when the population is finite. The performance of the EBM approach can be further improved with restricted maximum likelihood estimation and likelihood-based confidence intervals. We also provide an R function that implements the empirical Bayes cluster-mean approach, and illustrate it using data from the classic High School and Beyond Study. [ABSTRACT FROM AUTHOR]
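The empirical Bayes cluster-mean shrinkage discussed above can be sketched in a few lines (an illustrative toy, not the authors' R function; the variance components `tau2` and `sigma2` are treated as known here, whereas in practice they are estimated from the data):

```python
import numpy as np

# Toy sketch of empirical Bayes (EB) cluster-mean shrinkage.
# The reliability weight lambda_j = tau2 / (tau2 + sigma2 / n_j)
# pulls small clusters harder toward the grand mean. Function name
# and inputs are illustrative, not the authors' R code.

def eb_cluster_means(cluster_means, n, tau2, sigma2):
    lam = tau2 / (tau2 + sigma2 / n)                 # per-cluster reliability
    grand = np.average(cluster_means, weights=n)     # pooled grand mean
    return lam * cluster_means + (1 - lam) * grand

means = np.array([10.0, 12.0, 20.0])   # observed cluster means
n = np.array([5, 50, 5])               # cluster sizes
eb = eb_cluster_means(means, n, tau2=4.0, sigma2=16.0)
# The small clusters (n = 5) are shrunk toward the grand mean (12.5)
# much more than the large cluster (n = 50).
```

Each EB mean lands strictly between the observed cluster mean and the grand mean, with the shrinkage weight growing toward 1 as the cluster size increases; a finite population correction would further adjust `lam` using the known population cluster sizes.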
- Published
- 2024
- Full Text
- View/download PDF
26. Bayesian estimation for geometric process with the Weibull distribution.
- Author
-
Usta, Ilhan
- Subjects
- *
WEIBULL distribution , *MARKOV chain Monte Carlo , *BAYES' estimation , *ASYMPTOTIC distribution , *MAXIMUM likelihood statistics , *MOMENTUM transfer - Abstract
In this paper, we focus on Bayesian estimation of the parameters in the geometric process (GP) in which the first occurrence time of an event is assumed to have Weibull distribution. The Bayesian estimators are derived based on both symmetric (Squared Error) and asymmetric (General Entropy, LINEX) loss functions. Since the Bayesian estimators of unknown parameters cannot be obtained analytically, Lindley's approximation and the Markov Chain Monte Carlo (MCMC) methods are applied to compute the Bayesian estimates. Furthermore, by using the MCMC methods, credible intervals of the parameters are constructed. Maximum likelihood (ML) estimators are also derived for unknown parameters. The confidence intervals of the parameters are obtained based on an asymptotic distribution of ML estimators. Moreover, the performances of the proposed Bayesian estimators are compared with the corresponding ML, modified moment and modified maximum likelihood estimators through an extensive simulation study. Finally, analyses of two different real data sets are presented for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Unit-bimodal Birnbaum-Saunders distribution with applications.
- Author
-
Martínez-Flórez, Guillermo, Olmos, Neveka M., and Venegas, Osvaldo
- Subjects
- *
CENSORING (Statistics) , *RANDOM variables , *REGRESSION analysis , *PARAMETER estimation , *CUMULATIVE distribution function , *MAXIMUM likelihood statistics - Abstract
In this paper, we consider a transformation of a random variable which follows a bimodal Birnbaum-Saunders distribution. We propose the unit-bimodal Birnbaum-Saunders (UBBS) distribution and investigate some of its important properties, such as the cumulative distribution function, moments, survival function and risk function. We apply the UBBS distribution to censored data inflated at zero and one. We use the maximum likelihood approach for parameter estimation and model comparison. Given the flexibility of the UBBS distribution modes, our proposal performs best compared with beta regression models with zero and/or one excess. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Estimation of structural parameters in balanced Bühlmann credibility model with correlation risk.
- Author
-
Yang, Yang and Wang, Lichun
- Subjects
- *
BAYES' estimation , *MAXIMUM likelihood statistics , *PARAMETER estimation , *PANEL analysis - Abstract
In this paper, the longitudinal data analysis is used to interpret the balanced Bühlmann credibility model with correlation risk, and the homogeneous credibility estimator is derived. We obtain the restricted maximum likelihood estimators (RMLE) for the structural parameters involved in the credibility factor and show that they are unbiased. In addition, the linear Bayes method is employed to estimate the structural parameters, and the proposed linear Bayes estimators (LBE) appear to outperform RMLE in terms of the mean squared error matrix (MSEM) criterion. Simulation studies show that the proposed LBE performs well. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
29. An investigation of periodic degradation of axle box vibration spectrum for a high-speed rail vehicle based on Bayesian method.
- Author
-
Yang, Ningrui, Wu, Xingwen, Cai, Wubin, Liang, Shulin, and Chi, Maoru
- Subjects
- *
VIBRATIONAL spectra , *HIGH speed trains , *MAXIMUM likelihood statistics , *WIENER processes , *SPECTRAL energy distribution , *SERVICE life - Abstract
Axle box vibration serves as the main source of excitation for rail vehicles. Due to wheel/rail contact wear and the re-profiling procedure, the axle box vibration usually degrades periodically with increasing mileage in service. This can significantly impact the estimation of vibration fatigue when a component is subjected to the axle box vibration. This paper develops a method to describe the periodic evolution of the axle box vibration spectrum in order to better characterise it. In this study, a Wiener process incorporating four random parameters was employed to model the non-linearity of the degradation process. The maximum likelihood estimation (MLE) algorithm is used to estimate the initial values of the random parameters, and a Bayesian approach is employed to update the parameters based on newly obtained data. Finally, the proposed methodology is tested using long-term field test data from a high-speed train, and the results demonstrate that it accurately estimates the evolution of the axle box acceleration spectral density (ASD) spectrum. This could aid in predicting the residual service life of structures subjected to axle box vibration and further contribute to the development of maintenance strategies and top-down design of the structure. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Estimation of the trajectory and attitude of railway vehicles using inertial sensors with application to track geometry measurement.
- Author
-
González-Carbajal, J., Urda, Pedro, Muñoz, Sergio, and Escalona, José L.
- Subjects
- *
RAILROAD trains , *TRACKING algorithms , *MAXIMUM likelihood statistics , *POSITION sensors , *DETECTORS , *KALMAN filtering , *MOTION - Abstract
This paper describes a novel method for the estimation of the trajectory and orientation of a rigid body moving along a railway track. Compared to other recent developments in the literature, the presented approach has the significant advantage of using inertial sensors only, excluding global position and orientation sensors. The excluded sensors are compensated with an odometry system and previous knowledge of the design track geometry. The procedure is based on a kinematic model of the relative motion of the body with respect to the track, together with a Kalman filter algorithm. Two different approaches are used and compared for the estimation of the noise covariance matrices in the Kalman filter. One is based on the use of experimental results with a known output. The other one relies upon constrained maximum likelihood estimation. The calculated trajectory and orientation are applied in this research to the problem of track geometry measurement. A scale track is used for experimental validation, showing that results are sufficiently accurate for this application. The obtained results also reveal that the constrained maximum likelihood estimation performs similarly to the known-output method. This is very convenient because it allows a straightforward application of the algorithm in different scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. The E-Bayesian and hierarchical Bayesian estimations for the reliability analysis of Kumaraswamy generalized distribution based on upper record values.
- Author
-
Shi, Weihua, Ye, Tianrui, and Gui, Wenhao
- Subjects
- *
MEAN square algorithms , *MAXIMUM likelihood statistics , *ULTRASONIC testing , *GAMMA distributions , *BAYES' estimation , *FATIGUE testing machines - Abstract
This paper investigates the E-Bayesian and hierarchical Bayesian estimations of the shape parameter and reliability function of the Kumaraswamy generalized distribution based on upper record values. The classical estimation method is utilized to deduce the maximum likelihood estimates of the unknown parameter and reliability function. Bayesian estimates are derived by using conjugate Gamma prior distributions under quadratic and general entropy loss functions. Furthermore, assuming that the hyper-hyperparameters obey three prior distributions, the E-Bayesian estimates of the unknown parameters and reliability functions are obtained. The hierarchical Bayesian estimates are obtained by using hierarchical prior distributions. We also explore some characteristics and size relationships of the E-Bayesian and hierarchical Bayesian estimations. The performance of the E-Bayesian, hierarchical Bayesian, Bayesian, and maximum likelihood estimations is compared based on the minimum mean square error criterion. Finally, the proposed estimation methods are applied to evaluate the reliability of the specimen under ultrasonic fatigue testing, and the results align with their structures and profiles. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Estimation of the stress-strength parameter under two-sample balanced progressive censoring scheme.
- Author
-
Sultana, Farha, Çetinkaya, Çagatay, and Kundu, Debasis
- Subjects
- *
CENSORING (Statistics) , *PARAMETER estimation , *MAXIMUM likelihood statistics , *GIBBS sampling , *WEIBULL distribution , *SEARCH algorithms - Abstract
In this paper, we obtain the stress-strength reliability estimation under balanced joint Type-II progressive censoring scheme for independent samples from two different populations. We simultaneously place two independent samples where the experimental units follow Weibull distributions with common shape parameter β and different scale parameters α, λ, respectively. The maximum likelihood estimators of the unknown parameters are derived. Further, the Bayesian inference is considered using Lindley's approximation and Gibbs sampling method. Extensive simulations are performed to see the effectiveness of the proposed estimation methods. Further, we derive the optimal censoring scheme in the Bayesian framework by using the variable neighbourhood search method proposed by [Bhattacharya et al. On optimum life-testing plans under type-ii progressive censoring scheme using variable neighbourhood search algorithm. Test. 2016;25(2):309–330]. Further, some simulation schemes are provided to compare the performances of the estimations under the jointly censored samples versus two separate censored samples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. A new RCAR(1) model based on explanatory variables and observations.
- Author
-
Sheng, Danshu, Wang, Dehui, and Kang, Yao
- Subjects
- *
QUANTILE regression , *ASYMPTOTIC normality , *RANDOM variables , *TIME series analysis , *MAXIMUM likelihood statistics , *ASYMPTOTIC distribution - Abstract
Random coefficient autoregressive (RCAR) processes are very useful for modelling time series in applications. The random autoregressive coefficient is commonly assumed to be an independent and identically distributed (i.i.d.) sequence of random variables. To make the RCAR model more practical, this paper considers a new RCAR(1) model driven by explanatory variables and observations. We use the conditional least squares, quantile regression and conditional maximum likelihood methods to estimate the model parameters. The consistency and asymptotic normality of the proposed estimates are established. Simulation studies are conducted to evaluate the developed approaches, and two applications to real-data examples are provided. The results show that the proposed procedures perform well in both the simulations and the applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. The Pareto type I joint frailty-copula model for clustered bivariate survival data.
- Author
-
Lin, Yuan-Hsin, Sun, Li-Hsien, Tseng, Yi-Ju, and Emura, Takeshi
- Subjects
- *
SURVIVAL analysis (Biometry) , *MAXIMUM likelihood statistics , *SURVIVAL rate , *HAZARD function (Statistics) , *INFERENTIAL statistics , *BIVARIATE analysis , *SPLINES , *COMPETING risks - Abstract
Clustered bivariate survival data arise in various fields, such as biology and medicine, when individuals in a dataset are clustered and exhibit two survival outcomes. Recently, the joint frailty-copula model was proposed to analyze clustered bivariate survival outcomes by accommodating the between-cluster heterogeneity via a shared frailty term. In this model, researchers fitted the baseline hazard functions via the nonparametric model, the spline model, or the Weibull model. However, when a population has extremely large survival times, the baseline hazard functions are better modeled by a heavy-tailed distribution. In this paper, we adopt the Pareto type I distribution, one of the most popular heavy-tailed distributions, for the joint frailty-copula model. We show that the moments of the Pareto type I joint frailty-copula model diverge to infinity owing to the heavy right tail. We develop statistical inference methods based on three types of censoring schemes: (i) bivariate random censoring, (ii) semi-competing risks, and (iii) competing risks. We develop maximum likelihood estimation procedures, and make our computational tools available for users. Simulations are performed to check the accuracy of the proposed method. We finally analyze a real dataset for illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Weighted Lindley regression model with varying precision: estimation, modeling and its diagnostics.
- Author
-
Mota, Alex L., Santos-Neto, Manoel, Neto, Milton Miranda, Leão, Jeremias, Tomazella, Vera L. D., and Louzada, Francisco
- Subjects
- *
REGRESSION analysis , *MONTE Carlo method , *MAXIMUM likelihood statistics , *FISHER information - Abstract
The two-parameter weighted Lindley distribution has become very popular due to its simplicity, attractive properties, and flexibility to fit data when compared with similar generalizations of the exponential model, such as the gamma and Weibull, among others. In this paper, we introduce a regression model based on a weighted Lindley distribution, which is reparameterized in terms of mean and precision parameters. In this model, both the mean and precision parameters vary with the explanatory variable values, and general link functions are used in order to account for these relationships. We developed and implemented local influence diagnostics to identify potential influential observations. The Hessian and Fisher information matrices, as well as their inverses, are computed in closed form. Classical inference based on the maximum likelihood method is presented. Extensive Monte Carlo simulation studies are carried out for a special case of the regression model in order to verify the asymptotic properties of the maximum likelihood estimators. Finally, the usefulness of the proposed model is illustrated through an empirical analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Enhancing mortgage rate prediction: a comprehensive evaluation of computational statistical approaches.
- Author
-
Zhu, Danlei, Khaliq, Yousaf, Wang, Haoyuan, Sun, Tingting, and Wang, Donglin
- Subjects
- *
MORTGAGE rates , *STANDARD deviations , *GIBBS sampling , *MAXIMUM likelihood statistics , *MORTGAGE banks , *FORECASTING - Abstract
The noterate is a tool for predicting home mortgage rates; it is often skewed and has missing information. The noterate could be affected by incomplete or inaccurate data, thus leading to inaccurate predictions. Financial organizations including mortgage companies and banks need to consider the risk of uncertainty carefully and make more accurate predictions based on suitable models. To deal with this situation, in this paper we compared six computational statistical methods, including the ordinary least squares model, maximum likelihood estimation, maximum a posteriori estimation, bootstrapping, Metropolis-Hastings, and the Gibbs sampling method, on a mortgage dataset. Based on the k-fold cross-validation technique and four metrics, mean absolute error (MAE), mean squared error (MSE), root mean squared error (RMSE), and mean absolute percentage error (MAPE), the bootstrapping method outperforms the other methods. In practice, this method is recommended for predicting the noterate. [ABSTRACT FROM AUTHOR]
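One of the six approaches compared above, bootstrapping, can be sketched for a generic regression fit (synthetic data; the paper's mortgage dataset and exact model are not reproduced here):

```python
import numpy as np

# Sketch of the pairs bootstrap on synthetic regression data.
# Resampling rows with replacement and refitting gives an empirical
# distribution of the coefficient vector, from which point estimates
# and standard errors follow.

def bootstrap_coefs(X, y, n_boot=200, seed=1):
    """Refit OLS on row-resampled data n_boot times."""
    rng = np.random.default_rng(seed)
    n = len(y)
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)   # resample rows with replacement
        b, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        coefs.append(b)
    return np.array(coefs)

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(80), rng.normal(size=80)])
y = X @ np.array([2.0, 0.5]) + 0.3 * rng.normal(size=80)

coefs = bootstrap_coefs(X, y)
b_mean = coefs.mean(axis=0)        # bagged point estimate
b_se = coefs.std(axis=0)           # bootstrap standard errors
```

Averaging the resampled predictions, rather than relying on a single fit, is what makes the bootstrap robust to the skewness and data imperfections the abstract describes.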
- Published
- 2024
- Full Text
- View/download PDF
37. Stein estimators for the drift of the mixing of two fractional Brownian motions.
- Author
-
Djerfi, Kouider, Djellouli, Ghaouti, and Madani, Fethi
- Subjects
- *
BROWNIAN motion , *MAXIMUM likelihood statistics , *PARAMETER estimation - Abstract
In this paper, we consider the problem of efficient estimation of the drift parameter $ \theta \in \mathbb{R}^d $ in the linear model $ Z_t := \theta t + \sigma_1 B^{H_1}(t) + \sigma_2 B^{H_2}(t) $, $ t \in [0, T] $, where $ B^{H_1} $ and $ B^{H_2} $ are two independent d-dimensional fractional Brownian motions with Hurst indices $ H_1 $ and $ H_2 $ such that $ 1/2 \leq H_1 < H_2 < 1 $. The main goal is firstly to define the maximum likelihood estimator (MLE) of the drift $ \theta $, and secondly to provide a sufficient condition under which James-Stein type estimators dominate the MLE under the usual quadratic risk. [ABSTRACT FROM AUTHOR]
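The James-Stein dominance discussed above can be illustrated with the classical normal-means shrinkage factor (a minimal sketch for the white-noise case, not the mixed fractional Brownian motion setting of the paper):

```python
import numpy as np

# Minimal sketch of James-Stein-type shrinkage for a d-dimensional
# normal-means estimate, used only to illustrate the dominance idea:
# for d >= 3, shrinking the MLE toward 0 by 1 - (d-2)*sigma2/||z||^2
# strictly lowers the quadratic risk.

def james_stein(z, sigma2):
    """Positive-part James-Stein estimator shrinking z toward 0."""
    d = len(z)
    shrink = 1.0 - (d - 2) * sigma2 / np.dot(z, z)
    return max(shrink, 0.0) * z

z = np.array([1.0, 2.0, 2.0, 4.0])      # observed means, d = 4
theta_js = james_stein(z, sigma2=1.0)   # shrink factor = 1 - 2/25 = 0.92
```

The positive-part variant clips the shrinkage factor at zero, which dominates the plain James-Stein estimator; the paper's contribution is a sufficient condition of this kind adapted to drift estimation under long-range-dependent noise.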
- Published
- 2024
- Full Text
- View/download PDF
38. The sparse estimation of the semiparametric linear transformation model with dependent current status data.
- Author
-
Luo, Lin, Yu, Jinzhao, and Zhao, Hui
- Subjects
- *
NONPARAMETRIC estimation , *MAXIMUM likelihood statistics , *BERNSTEIN polynomials , *ALZHEIMER'S disease , *CENSORING (Statistics) - Abstract
In this paper, we study the sparse estimation under the semiparametric linear transformation models for the current status data, also called type I interval-censored data. For the problem, the failure time of interest may be dependent on the censoring time and the association parameter between them is left unspecified. To address this, we employ the copula model to describe the dependence between them and a two-stage estimation procedure to estimate both the association parameter and the regression parameter. In addition, we propose a penalized maximum likelihood estimation procedure based on the broken adaptive ridge regression, and Bernstein polynomials are used to approximate the nonparametric functions involved. The oracle property of the proposed method is established and the numerical studies suggest that the method works well for practical situations. Finally, the method is applied to an Alzheimer's disease study that motivated this investigation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Simple methods for comparing two predictive values with incomplete data.
- Author
-
Wu, Yougui
- Subjects
- *
MISSING data (Statistics) , *MAXIMUM likelihood statistics , *SENSITIVITY & specificity (Statistics) - Abstract
Statistical methods have been well developed for comparing the predictive values of two binary diagnostic tests under a paired design. However, existing methods do not make allowance for incomplete data. Although a maximum likelihood based method can be used to deal with incomplete data, it requires an iterative algorithm for implementation. A simple and easily implemented statistical method is therefore needed. Simple methods exist for comparing two sensitivities or specificities with incomplete data, but no such simple methods are available for comparing two predictive values with incomplete data. In this paper, we propose two simple methods for comparing two predictive values with incomplete data. The test statistics derived by these two methods are simple to compute, involving only minor modifications of the existing weighted generalized score statistics for complete data. Simulation results demonstrate that the proposed methods are more efficient than the ad-hoc method that only uses the subjects with complete data. As an illustration, the proposed methods are applied to an observational study comparing two non-invasive methods for detecting endometriosis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. Bayes estimates of variance components in mixed linear model.
- Author
-
Jiang, Jie, He, Tian, and Wang, Lichun
- Subjects
- *
BAYES' estimation , *MAXIMUM likelihood statistics , *ANALYSIS of variance - Abstract
This paper proves that, in the mixed linear model, the analysis of variance estimation (ANOVAE), the minimum norm quadratic unbiased estimation (MINQUE), the spectral decomposition estimation (SDE) and the restricted maximum likelihood estimation (RMLE) of variance components are the same under some conditions. Based on this result, we construct a linear Bayes estimation (LBE) for the parameter vector consisting of variance components and establish its superiorities. Numerical computations and an illustration show that the LBE is comparable to Lindley's approximation, Tierney and Kadane's approximation and the usual Bayes estimation (UBE) obtained by the MCMC method, and is easy to use as well. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Theoretical results and modeling under the discrete Birnbaum-Saunders distribution.
- Author
-
Vilca, Filidor, Vila, Roberto, Saulo, Helton, Sánchez, Luis, and Leão, Jeremias
- Subjects
- *
STATISTICAL reliability , *MONTE Carlo method , *MAXIMUM likelihood statistics , *REGRESSION analysis , *ORDER statistics - Abstract
In this paper, we discuss some theoretical results and properties of a discrete version of the Birnbaum-Saunders distribution. We present a proof of the unimodality of this model. Moreover, results on moments, quantile function, reliability and order statistics are also presented. In addition, we propose a regression model based on the discrete Birnbaum-Saunders distribution. The model parameters are estimated by the maximum likelihood method and a Monte Carlo study is performed to evaluate the performance of the estimators. Finally, we illustrate the proposed methodology with the use of real data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. Inference on the lifetime performance index of gamma distribution: point and interval estimation.
- Author
-
Shaabani, J. and Jafari, A. A.
- Subjects
- *
GAMMA distributions , *FIX-point estimation , *MONTE Carlo method , *ACCEPTANCE sampling , *MAXIMUM likelihood statistics , *CONFIDENCE intervals - Abstract
Performance capability indices are valuable measures to evaluate the quality of a product. In this paper, we consider inference on a lifetime performance index when the product's lifetime follows a gamma distribution with unknown parameters. The bias and mean square error of the maximum likelihood estimator and other proposed estimators are compared. Also, an asymptotic confidence interval using the maximum likelihood estimator, thirteen bootstrap confidence intervals, and four generalized pivotal quantities are derived for the performance capability index. A Monte Carlo simulation is provided to investigate the accuracy of expected lengths and coverage probabilities of the confidence intervals. An actual data set is used to illustrate the estimators and confidence intervals. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. Poisson-Modification of Quasi Lindley regression model for over-dispersed count responses.
- Author
-
Tharshan, Ramajeyam and Wijekoon, Pushpakanthie
- Subjects
- *
REGRESSION analysis , *MAXIMUM likelihood statistics , *POISSON regression - Abstract
This paper introduces an alternative linear regression model for over-dispersed count responses with appropriate covariates. It extends the univariate Poisson-Modification of the Quasi Lindley (PMQL) distribution via the generalized linear model approach. A re-parametrized PMQL distribution is considered to demonstrate the flexible properties of the distribution in its regression model. Further, the performance of its maximum likelihood estimation method is examined by a simulation study based on the asymptotic theory. The maximum likelihood estimator is used to estimate the parameters of the regression model. Finally, three simulated data sets and a real-world data set are used to show the applicability of the PMQL regression model against the Poisson, negative binomial (NB), Poisson-Quasi Lindley (PQL), and Generalized Poisson-Lindley (GPL) regression models. The results of the applications show that the newly introduced model provides a better fit for over-dispersed count responses with covariates than the Poisson, NB, PQL, and GPL regression models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. Statistical inference for Gompertz distribution under adaptive type-II progressive hybrid censoring.
- Author
-
Lv, Qi, Tian, Yajie, and Gui, Wenhao
- Subjects
- *
DISTRIBUTION (Probability theory) , *INFERENTIAL statistics , *MAXIMUM likelihood statistics , *MONTE Carlo method , *EXPECTATION-maximization algorithms , *CENSORING (Statistics) - Abstract
The Gompertz distribution is a significant and commonly used lifetime distribution, which plays an important role in reliability engineering. In this paper, we study the statistical inference of the Gompertz distribution based on adaptive Type-II progressive hybrid censoring schemes. From the frequentist perspective, we derive point estimates through the method of maximum likelihood estimation (MLE), and the existence of the MLE is proved. Besides the MLE, we propose a stochastic EM algorithm to reduce complexity and simplify computation. We also apply bootstrap methods (Bootstrap-p and Bootstrap-t) to construct confidence intervals. From the Bayesian aspect, the Bayes estimates of the unknown parameters are evaluated by applying the MCMC method, and the average length and coverage rate of the credible intervals are also computed. The Bayes inference is based on the squared error loss function and the LINEX loss function. Furthermore, a numerical simulation is conducted to assess the performance of the proposed methods. Finally, a real-life example is considered to illustrate the application and development of the inference methods. In summary, the Bayesian method seems to perform the best among all approaches, while the other approaches also present different advantages. [ABSTRACT FROM AUTHOR]
- Published
- 2024
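A minimal sketch of the frequentist side of the abstract above, for complete (uncensored) samples only; the paper's adaptive Type-II progressive hybrid censoring is not reproduced here, and the parametrization f(x) = b e^{ax} exp(-(b/a)(e^{ax} - 1)) and parameter values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

a_true, b_true = 0.5, 0.2   # shape a and rate b in one common Gompertz form
n = 1000
u = rng.uniform(size=n)
# Inverse-CDF sampling from S(x) = exp(-(b/a)(e^{a x} - 1)).
t = np.log(1.0 - (a_true / b_true) * np.log(u)) / a_true

def negloglik(theta):
    # Optimize on the log scale to keep a, b > 0.
    a, b = np.exp(theta)
    return -(np.log(b) + a * t - (b / a) * (np.exp(a * t) - 1.0)).sum()

fit = minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
a_hat, b_hat = np.exp(fit.x)
```

Under censoring the likelihood gains survival-function factors for the censored units, which is where the existence proof and the stochastic EM algorithm of the paper come in.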
45. Alternative classification rules for two inverse Gaussian populations with a common mean and order restricted scale-like parameters.
- Author
-
Kumar, Pushkal, Tripathy, Manas Ranjan, and Kumar, Somesh
- Subjects
- *
MAXIMUM likelihood statistics , *CLASSIFICATION , *GAUSSIAN processes - Abstract
The problem of classification into two inverse Gaussian populations with a common mean and ordered scale-like parameters is considered. Surprisingly, the maximum likelihood estimators (MLEs) of the associated model parameters have not been utilized for classification purposes. Note that the MLEs of the model parameters, including the MLE of the common mean, do not have closed-form expressions. In this paper, several classification rules are proposed that use the MLEs and some plug-in type estimators under order restricted scale-like parameters. In the sequel, the risk values of all the proposed estimators are compared numerically, which shows that the proposed plug-in type restricted MLE performs better than others, including the Graybill-Deal type estimator of the common mean. Further, the proposed classification rules are compared in terms of the expected probability of correct classification (EPC) numerically. It is seen that some of our proposed rules have better performance than the existing ones in most of the parameter space. Two real-life examples are considered for application purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
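The plug-in idea in the abstract above can be sketched for the simplest case: common mean treated as known, MLEs of the scale-like parameters computed per population, the order restriction enforced by pooling when violated, and a new observation assigned to the population with the larger plug-in log-density. All numerical values, and the known-mean simplification (the paper estimates the common mean), are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def ig_logpdf(x, mu, lam):
    # Inverse Gaussian density: sqrt(lam/(2 pi x^3)) exp(-lam (x-mu)^2 / (2 mu^2 x))
    return 0.5 * np.log(lam / (2 * np.pi * x**3)) - lam * (x - mu)**2 / (2 * mu**2 * x)

def rig(mu, lam, size, rng):
    # Michael-Schucany-Haas inverse Gaussian sampler.
    nu = rng.normal(size=size)**2
    x = mu + mu**2 * nu / (2 * lam) - mu / (2 * lam) * np.sqrt(4 * mu * lam * nu + mu**2 * nu**2)
    u = rng.uniform(size=size)
    return np.where(u <= mu / (mu + x), x, mu**2 / x)

mu = 2.0                       # common mean, treated as known here
lam1, lam2 = 1.0, 4.0          # ordered scale-like parameters: lam1 <= lam2
x1 = rig(mu, lam1, 300, rng)
x2 = rig(mu, lam2, 300, rng)

def lam_mle(x):
    # MLE of lambda with known mean: n / sum((x-mu)^2 / (mu^2 x)).
    return len(x) / np.sum((x - mu)**2 / (mu**2 * x))

l1, l2 = lam_mle(x1), lam_mle(x2)
if l1 > l2:                    # enforce the order restriction by pooling
    l1 = l2 = (len(x1) + len(x2)) / (np.sum((x1 - mu)**2 / (mu**2 * x1))
                                     + np.sum((x2 - mu)**2 / (mu**2 * x2)))

def classify(x_new):
    # Assign to the population with the larger plug-in log-density.
    return 1 if ig_logpdf(x_new, mu, l1) > ig_logpdf(x_new, mu, l2) else 2
```

Near the common mean the more concentrated population (larger lambda) wins, while far in the tail the more dispersed one does, which is the behavior a likelihood-ratio rule should exhibit.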
46. Non-Bayesian and Bayesian estimation of stress-strength reliability from Topp-Leone distribution under progressive first-failure censoring.
- Author
-
Saini, Shubham and Garg, Renu
- Subjects
- *
BAYES' estimation , *CENSORING (Statistics) , *MAXIMUM likelihood statistics , *GIBBS sampling , *CENSORSHIP - Abstract
In this paper, Bayesian and non-Bayesian estimation of ψ = P(X > Y) based on progressively first-failure censored data is considered, where the strength X and the stress Y are random variables following Topp-Leone distributions. The maximum likelihood and Bayes estimators of ψ are derived. The Bayes estimators under the generalized entropy loss function are computed using Lindley's approximation and Gibbs sampling methods. Different interval estimates of ψ, namely asymptotic, bootstrap, Bayesian credible, and highest posterior density credible intervals, are constructed. Furthermore, a Monte Carlo numerical study is conducted to check the performance of the various estimators developed. Finally, an application to real data is considered for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
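For the one-parameter Topp-Leone law on (0,1) with CDF F(x; a) = (x(2-x))^a, the stress-strength probability has the closed form ψ = α/(α+β), because both CDFs are powers of the same base G(x) = x(2-x). A complete-sample plug-in MLE of ψ can therefore be sketched as follows (censoring, which the paper handles, is omitted, and the parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

def rtl(a, size, rng):
    # Inverse-CDF sampling: solve x(2-x) = U^{1/a}  ->  x = 1 - sqrt(1 - U^{1/a}).
    u = rng.uniform(size=size)
    return 1.0 - np.sqrt(1.0 - u ** (1.0 / a))

alpha, beta = 3.0, 1.5
x = rtl(alpha, 500, rng)    # strength sample
y = rtl(beta, 500, rng)     # stress sample

# Complete-sample MLE of the shape: a_hat = -n / sum(log(x(2-x))).
a_hat = -len(x) / np.log(x * (2 - x)).sum()
b_hat = -len(y) / np.log(y * (2 - y)).sum()

psi_hat = a_hat / (a_hat + b_hat)   # plug-in MLE of psi = P(X > Y)
psi_true = alpha / (alpha + beta)   # 2/3 for these values
```

The identity ψ = α/(α+β) follows from ∫ G^β d(G^α) = α/(α+β), a property shared by all proportional-reversed-hazard families with a common baseline.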
47. A new bivariate lifetime distribution: properties, estimations and its extension.
- Author
-
Sarhan, Ammar M., Apaloo, Joseph, and Kundu, Debasis
- Subjects
- *
MARGINAL distributions , *BAYES' estimation , *MAXIMUM likelihood statistics , *BIVARIATE analysis , *DISTRIBUTION (Probability theory) , *COMPETING risks - Abstract
In this paper a new bivariate lifetime distribution is introduced. Its marginal distribution functions follow the two-parameter Chen distribution, which has a bathtub-shaped or increasing hazard rate function. The proposed distribution, which we call the bivariate Chen distribution (BCD), is of Marshall-Olkin type and is a singular distribution. Several properties of the proposed distribution are discussed. The BCD has four unknown parameters. The maximum likelihood (ML) method and Bayes techniques are used to estimate the unknown parameters; neither the maximum likelihood estimators nor the Bayes estimators can be obtained in closed form, so numerical methods are used in both cases. A real data set is analyzed using the proposed distribution for illustrative and comparison purposes. An application to dependent competing risks data is discussed, and finally the BCD is extended to the multivariate case. [ABSTRACT FROM AUTHOR]
- Published
- 2024
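The Marshall-Olkin construction behind the abstract above can be sketched by simulation: three independent Chen shocks with a common shape, where the shared shock produces the singular component P(X = Y) > 0. The Chen survival function S(x) = exp(λ(1 - e^{x^β})) used here, and all rate values, are assumptions consistent with the two-parameter Chen family the abstract names:

```python
import numpy as np

rng = np.random.default_rng(4)

def rchen(beta, lam, size, rng):
    # Chen survival S(x) = exp(lam (1 - e^{x^beta})); invert S(x) = U.
    u = rng.uniform(size=size)
    return (np.log(1.0 - np.log(u) / lam)) ** (1.0 / beta)

# Marshall-Olkin construction: X = min(U1, U3), Y = min(U2, U3),
# with U_i ~ Chen(beta, lam_i) independent (illustrative values).
beta, lam1, lam2, lam3 = 1.2, 0.5, 0.7, 0.4
n = 10000
u1 = rchen(beta, lam1, n, rng)
u2 = rchen(beta, lam2, n, rng)
u3 = rchen(beta, lam3, n, rng)
x = np.minimum(u1, u3)
y = np.minimum(u2, u3)

# The shared shock u3 creates the singular part: ties occur with
# probability lam3 / (lam1 + lam2 + lam3) = 0.25 here, since with a
# common shape the comparison reduces to competing exponentials.
tie_rate = np.mean(x == y)
```

A joint density with both an absolutely continuous and a singular part is exactly what makes the ML and Bayes estimators in the paper analytically intractable.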
48. A simulation-based study of ZIP regression with various zero-inflated submodels.
- Author
-
Ali, Essoham
- Subjects
- *
MAXIMUM likelihood statistics , *POISSON regression , *REGRESSION analysis - Abstract
In this paper, we are interested in the robustness of estimation in the Zero-Inflated Poisson regression model when varying the class-membership model of the underlying mixture. We propose an estimation procedure based on the maximum likelihood estimator, and simulations are used to examine its performance. The results suggest that maximum likelihood allows for accurate inference. Using simulated datasets, we show that the proposed alternative link functions are quite flexible and outperform the standard link function. An application to a real dataset is also provided. [ABSTRACT FROM AUTHOR]
- Published
- 2024
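A minimal ZIP maximum-likelihood fit with the standard links (log for the Poisson mean, logit for the zero-inflation probability) can be sketched as below; the paper's point is precisely to swap the logit for alternative link functions, which would replace `expit` here. The covariate design and coefficients are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

rng = np.random.default_rng(5)

n = 1000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
mu = np.exp(0.5 + 0.6 * x)          # Poisson mean, log link
pi = expit(-1.0 + 0.8 * x)          # zero-inflation probability, logit link
zero = rng.uniform(size=n) < pi
y = np.where(zero, 0, rng.poisson(mu))

def zip_negloglik(theta):
    b, g = theta[:2], theta[2:]
    m = np.exp(X @ b)
    p = expit(X @ g)
    # log-pmf: log(p + (1-p)e^{-m}) at zero; log(1-p) + Poisson log-pmf otherwise.
    ll0 = np.log(p + (1 - p) * np.exp(-m))
    llp = np.log1p(-p) - m + y * np.log(m) - gammaln(y + 1)
    return -np.where(y == 0, ll0, llp).sum()

fit = minimize(zip_negloglik, x0=np.zeros(4), method="BFGS")
b_hat, g_hat = fit.x[:2], fit.x[2:]
```

Misspecifying the class-membership link while keeping this likelihood is the robustness question the simulations in the paper address.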
49. Estimation for multivariate normal rapidly decreasing tempered stable distributions.
- Author
-
Bianchi, Michele Leonardo and Tassinari, Gian Luca
- Subjects
- *
RANDOM variables , *PARAMETER estimation , *STOCK price indexes , *GAUSSIAN distribution , *MAXIMUM likelihood statistics - Abstract
In this paper we describe a methodology for parameter estimation of multivariate distributions defined as normal mean-variance mixtures in which the mixing random variable follows a rapidly decreasing tempered stable distribution. We address some numerical issues arising from the use of the characteristic function for density approximation. We focus on the practical implementation of numerical methods involving these multivariate distributions in finance, and we empirically assess the proposed algorithm through an analysis of a five-dimensional series of stock-index log-returns. [ABSTRACT FROM AUTHOR]
- Published
- 2024
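The normal mean-variance mixture structure underlying the abstract above is X = μ + γW + √W Z with Z multivariate normal and W a positive mixing variable. Sampling W from a rapidly decreasing tempered stable law requires specialized methods, so the sketch below substitutes a Gamma mixing law (yielding a variance-gamma distribution, not the paper's model); the mixture mechanics and the moment identity E[X] = μ + γE[W] are identical. Dimensions and parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)

d, n = 5, 100000
mu = np.zeros(d)
gamma = np.full(d, 0.1)             # skewness direction
A = rng.normal(size=(d, d)) / np.sqrt(d)
Sigma = A @ A.T + np.eye(d)         # positive definite covariance

# Mixing variable W with E[W] = 1 (Gamma stand-in for the RDTS law).
W = rng.gamma(shape=2.0, scale=0.5, size=n)
Z = rng.multivariate_normal(np.zeros(d), Sigma, size=n)

# Normal mean-variance mixture: X = mu + gamma * W + sqrt(W) * Z.
Xs = mu + gamma * W[:, None] + np.sqrt(W)[:, None] * Z

emp_mean = Xs.mean(axis=0)          # should approach mu + gamma * E[W] = 0.1
```

In the paper the density of X is recovered from the characteristic function of W rather than by simulation, which is where the numerical issues they discuss arise.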
50. Exact likelihood inference for Laplace distribution based on generalized hybrid censored samples.
- Author
-
Zhu, Xiaojun and Balakrishnan, Narayanaswamy
- Subjects
- *
LAPLACE distribution , *MAXIMUM likelihood statistics , *GENERATING functions , *CENSORING (Statistics) - Abstract
In this paper, we first develop exact likelihood inference for the Laplace distribution based on a generalized Type-I hybrid censored sample (Type-I HCS). We derive explicit expressions for the maximum likelihood estimators (MLEs) of the location and scale parameters. We then derive the joint moment generating function (MGF) of the MLEs and use it to obtain their exact distributions and moments. Using an analogous approach, we then extend the results to a generalized Type-II hybrid censored sample (Type-II HCS). Finally, we present a numerical example to illustrate all the results established here. [ABSTRACT FROM AUTHOR]
- Published
- 2024
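For a complete Laplace sample the MLEs are available in closed form: the location MLE is the sample median and the scale MLE is the mean absolute deviation about it. The paper derives analogous explicit expressions under generalized hybrid censoring; the sketch below covers only the complete-sample case, with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(7)

theta, b = 1.0, 2.0                       # true location and scale
t = rng.laplace(loc=theta, scale=b, size=2000)

theta_hat = np.median(t)                  # location MLE: sample median
b_hat = np.mean(np.abs(t - theta_hat))    # scale MLE: mean |deviation| from it
```

The median arises because the Laplace log-likelihood in the location is minus a sum of absolute deviations, which the sample median minimizes.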