25,784 results for "Maximum likelihood estimation"
Search Results
2. Some inferences on a mixture of exponential and Rayleigh distributions based on fuzzy data
- Author
-
Mathai, Ashlyn Maria and Kumar, Mahesh
- Published
- 2024
- Full Text
- View/download PDF
3. Parameter estimation procedures for exponential-family random graph models on count-valued networks: A comparative simulation study
- Author
-
Huang, Peng and Butts, Carter T
- Subjects
- Anthropology, Sociology, Human Society, Bioengineering, Generic health relevance, Contrastive divergence, Exponential-family random graph model, Markov chain Monte Carlo, Maximum likelihood estimation, Pseudo likelihood, Valued, Weighted networks
- Published
- 2024
4. On some algorithms for estimation in Gaussian graphical models.
- Author
-
Højsgaard, S and Lauritzen, S
- Abstract
In Gaussian graphical models, the likelihood equations must typically be solved iteratively. This paper investigates two algorithms: a version of iterative proportional scaling, which avoids inversion of large matrices, and an algorithm based on convex duality and operating on the covariance matrix by neighbourhood coordinate descent, which corresponds to the graphical lasso with zero penalty. For large, sparse graphs, the iterative proportional scaling algorithm appears feasible and has simple convergence properties. The algorithm based on neighbourhood coordinate descent is extremely fast and less dependent on sparsity, but needs a positive-definite starting value to converge. We provide an algorithm for finding such a starting value for graphs with low colouring number. As a consequence, we also obtain a simplified proof of existence of the maximum likelihood estimator in such cases. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
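The iterative proportional scaling step described in result 4 is standard enough to sketch. Below is a minimal illustration under assumptions of ours (a toy graph and simulated data), not the authors' implementation; for clarity it inverts the full concentration matrix at every step, which is exactly the large-matrix inversion a production implementation of IPS would avoid.

```python
import numpy as np

def ips_ggm(S, cliques, n_iter=100, tol=1e-8):
    """Iterative proportional scaling for a Gaussian graphical model.

    S       : sample covariance matrix (p x p)
    cliques : list of index arrays, one per clique of the graph
    Returns the fitted concentration matrix K (inverse covariance).
    """
    p = S.shape[0]
    K = np.eye(p)                          # any positive-definite start works here
    for _ in range(n_iter):
        K_old = K.copy()
        for c in cliques:
            Sigma = np.linalg.inv(K)       # current fitted covariance (full inversion, for clarity only)
            # classic IPS update: force the fitted covariance to match S on the clique
            K[np.ix_(c, c)] += np.linalg.inv(S[np.ix_(c, c)]) - np.linalg.inv(Sigma[np.ix_(c, c)])
        if np.max(np.abs(K - K_old)) < tol:
            break
    return K

# toy example: 4 variables, graph with cliques {0, 1, 2} and {2, 3}
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(4), np.eye(4), size=500)
S = np.cov(X, rowvar=False)
K_hat = ips_ggm(S, [np.array([0, 1, 2]), np.array([2, 3])])
print(np.round(K_hat, 3))                 # entries outside every clique stay exactly zero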
5. Letter to the editor.
- Author
-
Nadarajah, Saralees
- Abstract
Mohsin, Abbas and Khan [Stochastic Environmental Research and Risk Assessment, 2024, doi: 10.1007/s00477-024-02787-z] used a generalized Pareto-exponential distribution to model extreme events involving exponentially decaying variables. In this letter, we show that the maximum likelihood procedure used in simulation studies as well as the data application is not correct. We derive the corrected version. We also derive simpler expressions for mathematical properties of the generalized Pareto-exponential distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Maximum likelihood estimation for a stochastic SEIR system with a COVID-19 application.
- Author
-
Baltazar-Larios, Fernando, Delgado-Vences, Francisco, and Diaz-Infante, Saul
- Abstract
In this paper, we propose a stochastic model for epidemiology data. The proposed model is obtained as a random perturbation of a suitable parameter in a deterministic SEIR system. This perturbation allows us to obtain a set of coupled stochastic differential equations (SDEs) that still satisfy the conservation law. Afterward, by using Girsanov's Theorem, we calculate the maximum likelihood estimation (MLE) for parameters that represent the symptomatic infection rate, asymptomatic infection rate, and the proportion of symptomatic individuals. These parameters are crucial to obtaining information about the dynamics of the disease. We prove the consistency of the MLE for a fixed time observation window, in which the disease is in its growth phase. The proposed stochastic SEIR model improves the uncertainty quantification of an overestimated MCMC scheme based on its deterministic model to count reported-confirmed COVID-19 cases of Mexico City. Using a particular mechanism to manage missing data, we developed MLEs for some parameters of the stochastic model that improve the description of the variance of the actual data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Analysis of Stress–Strength Modeling Using Median-Ranked Set Sampling with Its Illustration.
- Author
-
Singh, Garib Nath, Alok, Arvind Kumar, and Chandra, Prakash
- Abstract
This study focuses on the estimation of stress–strength reliability models, specifically R1 = P(Y < X) in the first model, where both the strength variable X and the stress variable Y follow the identical distribution, and R2 = P(Z < X) in the second model, where the strength variable X and the stress variable Z have distinct distributions. In the first model, we assume that the stress variable Y and the strength variable X follow generalized exponential distributions. In contrast, in the second model, the strength variable X is assumed to have a generalized exponential distribution, while the stress variable Z follows the Weibull distribution. The research utilizes the median-ranked set sampling (MRSS) technique to investigate different scenarios, operating on the premise that the variables in both models are independent. By employing the maximum likelihood technique and implementing an MRSS design, we derive the reliability estimators for both models, considering situations where the strength and stress variables may have similar or dissimilar set sizes. A simulation study is conducted to validate the precision and accuracy of different estimations. In the majority of cases, the simulation results demonstrate that the reliability estimates for the first model are more precise than those for the second model. Also, the first model's reliability estimate is always higher than that of the second model. Survival data are utilized to illustrate the concepts, enabling the verification of the theoretical findings. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
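A small aside on the first model in result 7: when the strength X and the stress Y are generalized exponential with shape parameters aX and aY and a common rate, R1 = P(Y < X) has the well-known closed form aX/(aX + aY). The sketch below checks this by plain Monte Carlo with made-up parameter values; the median-ranked set sampling design and the MRSS-based estimators of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def rgenexp(alpha, lam, size, rng):
    """Draw from a generalized exponential distribution, F(x) = (1 - exp(-lam * x))**alpha."""
    u = rng.uniform(size=size)
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

alpha_x, alpha_y, lam = 2.5, 1.0, 0.7        # made-up parameter values
x = rgenexp(alpha_x, lam, 200_000, rng)      # strength
y = rgenexp(alpha_y, lam, 200_000, rng)      # stress

r_mc = np.mean(y < x)                        # Monte Carlo estimate of P(Y < X)
r_closed = alpha_x / (alpha_x + alpha_y)     # closed form when both share the rate lam
print(f"Monte Carlo: {r_mc:.4f}, closed form: {r_closed:.4f}")
```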
8. Estimation of Stress–Strength Reliability for Power Transformed Perks Distribution with Applications.
- Author
-
Haridoss, Venugopal, Jose, Sudheep, and Xavier, Thomas
- Abstract
The power transformation of the Perks distribution, its survival, and hazard functions are computed. The shapes of hazard function are obtained analytically which shows that this model has increasing and decreasing hazard rates. In the stress–strength reliability estimation, point and interval estimates are computed using maximum likelihood and asymptotic confidence interval methods, respectively. The performance of the estimates is evaluated through the Monte Carlo simulation and the applications of the proposed distribution to three real datasets have been demonstrated. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. A Cooperative Fault Detection Approach for Stochastic Multi-Agent Systems Using Maximum Likelihood Estimation Method.
- Author
-
Yang, Chen, Li, Yan, and Chen, Qijun
- Abstract
This study addresses the fault detection problem in multi-agent systems (MASs) with additive faults and stochastic uncertainties. The main focus is on enhancing the fault detection capability of each agent through a cooperative fault detection scheme, fostering cooperation between agents in two scenarios. For Gaussian uncertainties, one scheme is developed using the maximum likelihood estimation (MLE) matching expectation maximization (EM) algorithm. Additionally, a novel cooperative fault detection scheme is introduced to handle non-Gaussian uncertainties, where the cooperation mechanism among agents is determined by approximating non-Gaussian uncertainties using the Gaussian mixture model (GMM). The effectiveness and improvements of the proposed cooperative fault detection method are validated through numerical simulations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Maximum likelihood estimation for left-truncated log-logistic distributions with a given truncation point.
- Author
-
Kreer, Markus, Kızılersü, Ayşe, Guscott, Jake, Schmitz, Lukas Christopher, and Thomas, Anthony W.
- Subjects
- MAXIMUM likelihood statistics, PARETO distribution, ASYMPTOTIC normality, RANDOM variables, FUNCTION spaces
- Abstract
For a sample X1, X2, ..., XN of independent identically distributed copies of a log-logistically distributed random variable X, the maximum likelihood estimation is analysed in detail when a left-truncation point xL > 0 is introduced. Due to scaling properties it is sufficient to investigate the case xL = 1. Here the corresponding maximum likelihood equations for a normalised sample (i.e. a sample divided by xL) do not always possess a solution. A simple criterion guarantees the existence of a solution: let E(·) denote the expectation induced by the normalised sample and let β0 = (E[ln X])^(-1) be the inverse of the expectation of the logarithm of the sampled random variable X (which is greater than xL = 1). If this value β0 is bigger than a certain positive number βC, then a solution of the maximum likelihood equation exists. Here the number βC is the unique solution of the moment equation E[X^(-βC)] = 1/2. In the case of existence a profile likelihood function can be constructed and the optimisation problem reduces to one dimension, leading to a robust numerical algorithm. When the maximum likelihood equations do not admit a solution for certain data samples, it is shown that the Pareto distribution is the L1-limit of the degenerate left-truncated log-logistic distribution, where L1(R+) is the usual Banach space of functions whose absolute value is Lebesgue-integrable. A large sample analysis showing consistency and asymptotic normality complements our analysis. Finally, two applications to real world data are presented. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
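The existence criterion quoted in result 10 is easy to evaluate on a given sample: normalise by the truncation point, compute β0 = (E[ln X])^(-1), solve the empirical moment equation E[X^(-βC)] = 1/2 for βC, and check β0 > βC. A minimal sketch, with simulated log-logistic data standing in for real observations:

```python
import numpy as np
from scipy.optimize import brentq

def mle_exists(sample, x_l):
    """Check the existence criterion from result 10 for left-truncated log-logistic MLE.

    sample : observations, all strictly greater than the truncation point x_l
    Returns (beta_0, beta_c, exists) computed from the normalised sample.
    """
    x = np.asarray(sample, dtype=float) / x_l        # normalise so the truncation point is 1
    beta_0 = 1.0 / np.mean(np.log(x))                # beta_0 = (E[ln X])^(-1)
    # beta_c solves the empirical moment equation E[X^(-beta_c)] = 1/2;
    # widen the upper bracket if the sample is concentrated very close to x_l
    g = lambda b: np.mean(x ** (-b)) - 0.5
    beta_c = brentq(g, 1e-9, 1e3)
    return beta_0, beta_c, beta_0 > beta_c

# made-up example: log-logistic draws kept only above a truncation point of 2.0
rng = np.random.default_rng(2)
u = rng.uniform(size=500)
raw = 3.0 * (u / (1 - u)) ** (1 / 2.5)               # log-logistic(scale=3, shape=2.5) draws
data = raw[raw > 2.0]
print(mle_exists(data, x_l=2.0))
```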
11. Wind speed probabilistic forecast based wind turbine selection and siting for urban environment.
- Author
-
Sachar, Shivangi, Shubham, Shubham, Doerffer, Piotr, Ianakiev, Anton, and Flaszyński, Paweł
- Subjects
- DISTRIBUTION (Probability theory), RENEWABLE energy sources, PROBABILITY density function, WIND power, WEIBULL distribution, OFFSHORE wind power plants
- Abstract
Wind energy, being a free source of energy, has become increasingly popular over the past decades and is being studied extensively. Integration of wind turbines is now being expanded to urban and offshore settings, in contrast to the conventional wind farms in relatively open areas. The direct installation of wind turbines poses a potential risk, as it may result in financial losses in scenarios characterized by inadequate wind resource availability. Therefore, wind energy availability analysis in such urban environments is a necessity. This research paper presents an in-depth investigation conducted to predict the exploitable wind energy at four distinct locations within Nottingham, United Kingdom. Subsequently, the most suitable location, Clifton Campus at Nottingham Trent University, is identified, where a comprehensive comparative analysis of power generation from eleven different wind turbine models is performed. The findings derived from this analysis suggest that the QR6 wind turbine emerges as the optimal choice for subsequent experimental investigations to be conducted in partnership with Nottingham Trent University. Furthermore, this study explores the selection of an appropriate probability density function for assessing wind potential, considering seven different distributions, namely Gamma, Weibull, Rayleigh, Log-normal, Genextreme, Gumbel, and Normal. Ultimately, the Weibull probability distribution is selected, and various methodologies are employed to estimate its parameters, which are then ranked using statistical assessments. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
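The distribution-fitting step in result 11 can be sketched with SciPy's built-in maximum likelihood fitting. The wind-speed series below is synthetic, and the crude log-likelihood comparison is only a stand-in for the statistical ranking actually used in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
wind = stats.weibull_min.rvs(c=2.0, scale=6.0, size=2000, random_state=rng)  # synthetic "wind speeds" in m/s

# maximum likelihood fit of a two-parameter Weibull (location pinned at 0)
shape, loc, scale = stats.weibull_min.fit(wind, floc=0)
print(f"shape k = {shape:.2f}, scale c = {scale:.2f} m/s")

# compare a few candidate distributions by maximized log-likelihood
for name, dist in [("weibull", stats.weibull_min), ("gamma", stats.gamma),
                   ("lognorm", stats.lognorm), ("rayleigh", stats.rayleigh)]:
    params = dist.fit(wind, floc=0)
    ll = np.sum(dist.logpdf(wind, *params))
    print(f"{name:8s} log-likelihood: {ll:.1f}")
```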
12. Height Measurement for Meter-Wave MIMO Radar Based on Sparse Array Under Multipath Interference.
- Author
-
Qin, Cong, Zhang, Qin, Zheng, Guimei, Zhang, Gangsheng, and Wang, Shiqiang
- Abstract
For meter-wave multiple-input multiple-output (MIMO) radar, the multipath of target echoes may cause severe errors in height measurement, especially in the case of complex terrain where terrain fluctuation, ground inclination, and multiple reflection points exist. Inspired by a sparse array with greater degrees of freedom and low mutual coupling, a height measurement method based on a sparse array is proposed. First, a practical signal model of MIMO radar based on a sparse array is established. Then, the modified multiple signal classification (MUSIC) and maximum likelihood (ML) estimation algorithms based on two classical sparse arrays (coprime array and nested array) are proposed. To reduce the complexity of the algorithm, a real-valued processing algorithm for generalized MUSIC (GMUSIC) and maximum likelihood is proposed, and a reduced dimension matrix is introduced into the real-valued processing algorithm to further reduce computation complexity. Finally, sufficient simulation results are provided to illustrate the effectiveness and superiority of the proposed technique. The simulation results show that the height measurement accuracy can be efficiently improved by using our proposed technique for both simple and complex terrain. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Reliability inference for dual constant-stress accelerated life test with exponential distribution and progressively Type-II censoring.
- Author
-
Feng, Xuefeng, Tang, Jiayin, Balakrishnan, N., and Tan, Qitao
- Abstract
Accelerated life test provides a feasible and effective way to rapidly derive lifetime information by exposing products to higher-than-normal operating conditions. However, most of the previous research on accelerated life test has focused on the application of a single stress factor and a traditional censoring scheme. This article considers the reliability inference for a dual constant-stress accelerated life test model with exponential distribution and progressively Type-II censoring. Point estimates for model parameters are provided using maximum likelihood estimation and the weighted least squares method based on random variable transformation. In addition, we construct asymptotic confidence intervals, approximate confidence intervals, and bootstrap confidence intervals for the parameters of interest. Finally, extensive simulation studies and an illustrative example are presented to investigate the performance of the proposed methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
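A useful baseline for the censored-data setting of result 13 is the exponential distribution under ordinary Type-II censoring, where the MLE of the rate has a closed form: the number of observed failures divided by the total time on test. A minimal sketch with simulated data (the dual-stress, progressively censored model of the article is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(4)
n, r, true_rate = 50, 30, 0.2             # n units on test, stop at the r-th failure

lifetimes = np.sort(rng.exponential(scale=1 / true_rate, size=n))
observed = lifetimes[:r]                  # the r smallest order statistics are seen
t_stop = observed[-1]                     # the test ends at the r-th failure

# total time on test: observed failures plus (n - r) units censored at t_stop
ttt = observed.sum() + (n - r) * t_stop
rate_mle = r / ttt                        # closed-form MLE of the exponential rate
print(f"true rate {true_rate}, MLE {rate_mle:.3f}")
```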
14. Properties and applications of two-tailed quasi-Lindley distribution.
- Author
-
Kumar, C. Satheesh and Jose, Rosmi
- Subjects
- LAPLACE distribution, MAXIMUM likelihood statistics, RENYI'S entropy, ORDER statistics
- Abstract
Here we consider two-parameter and three-parameter versions of the two-tailed quasi-Lindley distribution and investigate their important properties. An attempt has been made to estimate its parameters by the method of maximum likelihood, along with a brief discussion on the existence of the estimators. Further, the distribution is fitted to certain real-life data sets to illustrate the utility of the proposed models. A simulation study is carried out to assess the performance of likelihood estimators of the parameters of the distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Efficient non-parametric estimation of variable productivity Hawkes processes.
- Author
-
Phillips, Sophie and Schoenberg, Frederic
- Subjects
- NONPARAMETRIC estimation, MAXIMUM likelihood statistics, POINT processes, LEAST squares, FIX-point estimation, DATA binning
- Abstract
Several approaches to estimating the productivity function for a Hawkes point process with variable productivity are discussed, improved upon, and compared in terms of their root-mean-squared error and computational efficiency for various data sizes, and for binned as well as unbinned data. We find that for unbinned data, a regularized version of the analytic maximum likelihood estimator proposed by Schoenberg is the most accurate but is computationally burdensome. The unregularized version of the estimator is faster to compute but has lower accuracy, though both estimators outperform empirical or binned least squares estimators in terms of root-mean-squared error, especially when the mean productivity is 0.2 or greater. For binned data, binned least squares estimates are highly efficient both in terms of computation time and root-mean-squared error. An extension to estimating transmission time density is discussed, and an application to estimating the productivity of Covid-19 in the United States as a function of time from January 2020 to July 2022 is provided. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. A novel statistical approach to COVID-19 variability using the Weibull-Inverse Nadarajah Haghighi distribution.
- Author
-
Ahmad, Aijaz, Alsadat, Najwan, Rather, Aafaq A., Meraou, M.A., and Mohie El-Din, Marwa M.
- Subjects
- PROBABILITY density function, MAXIMUM likelihood statistics, COVID-19, RESEARCH personnel, ENVIRONMENTAL sciences
- Abstract
Researchers have devoted decades to striving to create a plethora of distinctive distributions in order to meet specific objectives. The argument is that traditional distributions have typically been found to lack fit in real-world situations, which include pharmaceutical studies, the field of engineering, hydrology, environmental science, and a number of others. The Weibull-inverse Nadarajah Haghighi (WINH) distribution is developed by combining the Weibull and inverse Nadarajah Haghighi distributions. The proposed distribution's fundamental characteristics have been established and analyzed. Several plots of the distributional properties, notably the probability density function (PDF) with the corresponding cumulative distribution function (CDF), are displayed. The estimation of the model parameters is performed via the MLE procedure. Simulation-based research is conducted to demonstrate the performance of the proposed estimators using measures such as the average bias, variance, and associated mean square error (MSE). Two real datasets, representing mortality due to COVID-19 in France and Canada, are used to illustrate the practicality of the recommended model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Bayesian and non Bayesian inference for extended two parameters model with application in financial and production fields.
- Author
-
Alhelali, Marwan H. and Alsaedi, Basim S.O.
- Subjects
- PROBABILITY density function, DISTRIBUTION (Probability theory), MAXIMUM likelihood statistics, RANDOM variables, INFERENTIAL statistics
- Abstract
In statistical inference, introducing a probability distribution appropriate for modeling complex, skewed and symmetric datasets plays an important role. This article presents a new method, referred to as the exponential transformed approach, aimed at creating fresh probability models. This method entails transforming independent and identically distributed reduced Kies random variables. This article establishes various statistical and distributional properties of this model. Furthermore, the article employs several estimation methods to estimate the unknown parameters of the proposed model. Simulation experiments are conducted to showcase the effectiveness of the proposed estimators. Additionally, two real-world data analyses demonstrate practical applications in financial and production contexts, and it is shown that the recommended distribution has superior performance compared to other existing models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Data Reconciliation for Assessing Compliance of Physicochemical Properties of Petroleum Products in Commercial Transactions.
- Author
-
Moreira, Rosana Medeiros, Silva Rocha, Ariadne Mayra, and de Oliveira, Elcio Cruz
- Abstract
The physicochemical properties of petroleum products in commercial transactions are crucial for quality control in the oil and gas industry. However, different laboratories often produce slightly different measurement results. These variations can be significant when approving or rejecting properties based on regulatory agency and environmental body specifications. A simple arithmetic average is typically used to determine the most probable value in disputes. This study proposed using a Data Reconciliation approach to address the disparity between the projected model and empirical data. An objective function was employed to optimize and evaluate parameters using maximum likelihood estimation, considering the experimental uncertainty values. This study found that the flash point of jet fuel, as determined by the Tag Closed Cup Tester, was within the specified range (maximum of 40 °C). The application of this tool resolved a dispute between a supplier and a customer, as the reconciled value with minimized uncertainty was determined to be 37.5 ± 2.0 °C. Additionally, the study utilized experimental results from 12 accredited laboratories to determine a single reconciled value for the final boiling point of gasoline. Despite the varying experimental uncertainties ranging from 6.0 °C to 13 °C, the reconciled uncertainty was minimized to 2.6 °C. The last case study identified that ASTM D4294 was incompatible with other test methods for evaluating the mass fraction in diesel oil. In this manner, Data Reconciliation enhanced the accuracy and effectively reduced measurement uncertainties, rendering it a potent tool for resolving legal disputes when evaluating the compliance of the physicochemical properties of petroleum products in commercial transactions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Exploring statistical and machine learning methods for modeling probability distribution parameters in downtime length analysis: a paper manufacturing machine case study.
- Author
-
Koković, Vladimir, Pavlović, Kosta, Mijanović, Andjela, Kovačević, Slavko, Mačužić, Ivan, and Božović, Vladimir
- Subjects
- ARTIFICIAL neural networks, VIBRATION (Mechanics), PROBABILITY density function, DISTRIBUTION (Probability theory), DATA libraries
- Abstract
Manufacturing companies focus on improving productivity, reducing costs, and aligning performance metrics with strategic objectives. In industries like paper manufacturing, minimizing equipment downtime is essential for maintaining high throughput. Leveraging the extensive data generated by these facilities offers opportunities for gaining competitive advantages through data-driven insights, revealing trends, patterns, and predicting future performance indicators like unplanned downtime length, which is essential in optimizing maintenance and minimizing potential losses. This paper explores statistical and machine learning techniques for modeling downtime length probability distributions and their correlation with machine vibration measurements. We propose a novel framework, employing advanced data-driven techniques like artificial neural networks (ANNs) to estimate the parameters of probability distributions governing downtime lengths. Our approach specifically focuses on modeling the parameters of these distributions, rather than directly modeling probability density function (PDF) values, as is common in other approaches. Experimental results indicate a significant performance boost, with the proposed method achieving up to 30% superior performance in modeling the distribution of downtime lengths compared to alternative methods. Moreover, this method facilitates unsupervised training, making it suitable for big data repositories of unlabelled data. The framework allows for potential expansion by incorporating additional input variables. In this study, machine vibration velocity measurements are selected for further investigation. The study underscores the potential of advanced data-driven techniques to enable companies to make better-informed decisions regarding their current maintenance practices and to direct improvement programs in industrial settings. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. On the Discretization of the Weibull-G Family of Distributions: Properties, Parameter Estimates, and Applications of a New Discrete Distribution.
- Author
-
Balubaid, Abeer, Klakattawi, Hadeel, and Alsulami, Dawlah
- Abstract
In this article, we introduce a new three-parameter distribution called the discrete Weibull exponential (DWE) distribution, based on the use of a discretization technique for the Weibull-G family of distributions. This distribution is noteworthy, as its probability mass function presents both symmetric and asymmetric shapes. In addition, its related hazard function is tractable, exhibiting a wide range of shapes, including increasing, increasing–constant, uniform, monotonically increasing, and reversed J-shaped. We also discuss some of the properties of the proposed distribution, such as the moments, moment-generating function, dispersion index, Rényi entropy, and order statistics. The maximum likelihood method is employed to estimate the model's unknown parameters, and these estimates are evaluated through simulation studies. Additionally, the effectiveness of the model is examined by applying it to three real data sets. The results demonstrate that, in comparison to the other considered distributions, the proposed distribution provides a better fit to the data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
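The discretization technique underlying result 20 assigns probability S(k) - S(k + 1) to each non-negative integer k, where S is the survival function of the continuous parent distribution. A minimal sketch using a plain Weibull parent (an assumption of ours; the paper works with the more general Weibull-G family):

```python
import numpy as np

def discretize_survival(surv, k_max):
    """Discretize a continuous lifetime distribution: P(X = k) = S(k) - S(k + 1)."""
    k = np.arange(k_max + 1)
    pmf = surv(k) - surv(k + 1)
    return k, pmf

# Weibull survival function S(x) = exp(-(x / scale)**shape)
shape, scale = 1.5, 4.0
surv = lambda x: np.exp(-(np.asarray(x, dtype=float) / scale) ** shape)

k, pmf = discretize_survival(surv, k_max=30)
print(np.round(pmf[:6], 4), "total mass up to k = 30:", pmf.sum().round(4))
```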
21. Optimal Estimation of Reliability Parameters for Modified Frechet-Exponential Distribution Using Progressive Type-II Censored Samples with Mechanical and Medical Data.
- Author
-
Ramadan, Dina A., Farhat, Ahmed T., Bakr, M. E., Balogun, Oluwafemi Samson, and Hasaballah, Mustafa M.
- Abstract
The aim of this research is to estimate the parameters of the modified Frechet-exponential (MFE) distribution using different methods when applied to progressive type-II censored samples. These methods include using the maximum likelihood technique and the Bayesian approach, which were used to determine the values of parameters in addition to calculating the reliability and failure functions at time t. The approximate confidence intervals (ACIs) and credible intervals (CRIs) are derived for these parameters. Two bootstrap techniques of parametric type are provided to compute the bootstrap confidence intervals. Both symmetric loss functions such as the squared error loss (SEL) and asymmetric loss functions such as the linear-exponential (LINEX) loss are used in the Bayesian method to obtain the estimates. The Markov Chain Monte Carlo (MCMC) technique is utilized in the Metropolis–Hastings sampler approach to obtain the unknown parameters using the Bayes approach. Two actual datasets are utilized to examine the various progressive schemes and different estimation methods considered in this paper. Additionally, a simulation study is performed to compare the schemes and estimation techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. The Representative Points of Generalized Alpha Skew- t Distribution and Applications.
- Author
-
Zhou, Yong-Feng, Lin, Yu-Xuan, Fang, Kai-Tai, and Yin, Hong
- Abstract
Assuming the underlying statistical distribution of data is critical in information theory, as it impacts the accuracy and efficiency of communication and the definition of entropy. The real-world data are widely assumed to follow the normal distribution. To better comprehend the skewness of the data, many models more flexible than the normal distribution have been proposed, such as the generalized alpha skew-t (GAST) distribution. This paper studies some properties of the GAST distribution, including the calculation of the moments, and the relationship between the number of peaks and the GAST parameters with some proofs. For complex probability distributions, representative points (RPs) are useful due to the convenience of manipulation, computation and analysis. The relative entropy of two probability distributions could have been a good criterion for the purpose of generating RPs of a specific distribution but is not popularly used due to computational complexity. Hence, this paper only provides three ways to obtain RPs of the GAST distribution, Monte Carlo (MC), quasi-Monte Carlo (QMC), and mean square error (MSE). The three types of RPs are utilized in estimating moments and densities of the GAST distribution with known and unknown parameters. The MSE representative points perform the best among all case studies. For unknown parameter cases, a revised maximum likelihood estimation (MLE) method of parameter estimation is compared with the plain MLE method. It indicates that the revised MLE method is suitable for the GAST distribution having a unimodal or unobvious bimodal pattern. This paper includes two real-data applications in which the GAST model appears adaptable to various types of data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. High-dimensional Bayesian optimization with a combination of Kriging models.
- Author
-
Appriou, Tanguy, Rullière, Didier, and Gaudrie, David
- Abstract
Bayesian Optimization (BO) is a popular approach to solve optimization problems using as few function evaluations as possible. In particular, Efficient Global Optimization (EGO) based on Kriging surrogate models has been successfully applied to many real-world applications in low dimensions (less than 30 design parameters). However, in high dimension, building an accurate Kriging model is difficult, especially when the number of samples is limited as is the case when dealing with numerical simulators. This is due to the inner optimization of the Kriging length-scale hyperparameters which can lead to inaccurate models and impacts the performances of the optimization. In this paper, we introduce a new method for high-dimensional BO which bypasses the length-scales optimization by combining sub-models with random length-scales, and whose expression, obtained in closed-form, avoids any inner optimization. We also describe how to sample suitable length-scales for the sub-models using an entropy-based criterion, in order to avoid degenerated sub-models having either too large or too small length-scales. Finally, the variance of the combination being not directly available, we present a method to compute the prediction variance for any weighting method. We apply our combined Kriging model to high-dimensional BO for analytical test functions and for the design of an electric machine. We show that our method builds more accurate surrogate models than ordinary Kriging when the number of samples is small. This results in faster convergence for BO using the combination. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Uniformization and bounded Taylor series in Newton–Raphson method improves computational performance for a multistate transition model estimation and inference.
- Author
-
Zhu, Yuxi, Brock, Guy, and Li, Lang
- Subjects
- ELECTRONIC health records, MAXIMUM likelihood statistics, INFERENTIAL statistics, PARAMETER estimation, COVARIANCE matrices
- Abstract
Multistate transition models (MSTMs) are valuable tools depicting disease progression. However, due to the complexity of MSTMs, larger sample size and longer follow-up time in real-world data, the computation of statistical estimation and inference for MSTMs becomes challenging. A bounded Taylor series in Newton–Raphson procedure is proposed which leverages the uniformization technique to derive maximum likelihood estimates and corresponding covariance matrix. The proposed method, namely uniformization Taylor-bounded Newton–Raphson, is validated in three simulation studies, which demonstrate the accuracy in parameter estimation, the efficiency in computation time and robustness in terms of different situations. This method is also illustrated using a large electronic medical record data related to statin-induced side effects and discontinuation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
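Uniformization, the device leveraged in result 24, evaluates the transition probability matrix P(t) = exp(Qt) of a continuous-time multistate model as a Poisson-weighted sum of powers of the stochastic matrix I + Q/lambda, with lambda at least as large as the largest exit rate. The sketch below illustrates only this building block on a made-up three-state intensity matrix; it is not the authors' bounded-Taylor Newton-Raphson estimator.

```python
import numpy as np
from scipy.linalg import expm
from scipy.stats import poisson

def transition_matrix_uniformization(Q, t, tol=1e-12):
    """P(t) = exp(Q t) via uniformization: a Poisson mixture of powers of I + Q/lam."""
    lam = np.max(-np.diag(Q))                 # uniformization rate, lam >= max_i |q_ii|
    A = np.eye(Q.shape[0]) + Q / lam          # stochastic "jump" matrix
    P = np.zeros_like(Q)
    term = np.eye(Q.shape[0])                 # holds A**k, built up iteratively
    k = 0
    while poisson.sf(k - 1, lam * t) > tol:   # stop once the neglected Poisson tail is negligible
        P += poisson.pmf(k, lam * t) * term
        term = term @ A
        k += 1
    return P

# made-up 3-state transition intensity matrix (rows sum to zero, state 3 absorbing)
Q = np.array([[-0.30,  0.20,  0.10],
              [ 0.05, -0.25,  0.20],
              [ 0.00,  0.00,  0.00]])
P_unif = transition_matrix_uniformization(Q, t=2.0)
print(np.max(np.abs(P_unif - expm(Q * 2.0))))   # agrees with the direct matrix exponential
```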
25. Inference methods for the Very Flexible Weibull distribution based on progressive type-II censoring.
- Author
-
Brito, Eder S., Ferreira, Paulo H., Tomazella, Vera L. D., Martins Neto, Daniele S. B., and Ehlers, Ricardo S.
- Subjects
- CENSORING (Statistics), MAXIMUM likelihood statistics, BAYES' estimation, WEIBULL distribution, DATA modeling, MARKOV chain Monte Carlo
- Abstract
In this work, we present classical and Bayesian inferential methods based on samples in the presence of progressive type-II censoring under the Very Flexible Weibull (VFW) distribution. The considered distribution is relevant because it is an alternative to traditional non-flexible distributions and also to some flexible distributions already known in the literature, keeping the low amount of two parameters. In addition, studying it in a context of progressive censoring allows attesting to its applicability in data modeling from various areas of industry and technology that can use this censoring methodology. We obtain the maximum likelihood estimators of the model parameters, as well as their asymptotic variation measures. We propose the use of Markov chain Monte Carlo methods for the computation of Bayes estimates. A simulation study is carried out to evaluate the performance of the proposed estimators under different sample sizes and progressive type-II censoring schemes. Finally, the methodology is illustrated through three real data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Classical inference for time series of count data in parameter-driven models.
- Author
-
Marciano, Francisco William P.
- Subjects
- MAXIMUM likelihood statistics, TIME series analysis, CONFIDENCE intervals, MARKOV chain Monte Carlo, DATA modeling
- Abstract
We study estimation on parameter-driven models for time series of counts. This class of models follows the structure of a generalized linear model in which the serial dependency is included in the model by the link function through a time-dependent latent process. The likelihood function for this class of models commonly cannot be calculated explicitly, and computationally intensive methods like importance sampling and Markov chain Monte Carlo are used to estimate the model parameters. Here, we propose a simple and fast estimation procedure for a wide class of models that accommodate both discrete and continuous data. The maximum likelihood methodology is used to obtain the parameter estimates for the models under study. The simplicity of the procedure allows for building bootstrap confidence intervals for the hyperparameters and latent states of parameter-driven models. We perform extensive simulation studies to verify the asymptotic behavior of the parameter estimates, as well as present an application of the proposed procedure to a set of real data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Reliability analysis of multiple repairable systems under imperfect repair and unobserved heterogeneity.
- Author
-
Brito, Éder S., Tomazella, Vera L. D., Ferreira, Paulo H., Louzada Neto, Francisco, and Gonzatto Junior, Oilson A.
- Subjects
- ASYMPTOTIC efficiencies, MAXIMUM likelihood statistics, RELIABILITY in engineering, FAILURE (Psychology), HETEROGENEITY
- Abstract
Imperfect repairs (IRs) are widely applicable in reliability engineering since most equipment is not completely replaced after failure. In this sense, it is necessary to develop methodologies that can describe failure processes and predict the reliability of systems under this type of repair. One of the challenges in this context is to establish reliability models for multiple repairable systems considering unobserved heterogeneity associated with systems failure times and their failure intensity after performing IRs. Thus, in this work, frailty models are proposed to identify unobserved heterogeneity in these failure processes. In this context, we consider the arithmetic reduction of age (ARA) and arithmetic reduction of intensity (ARI) classes of IR models, with constant repair efficiency and a power‐law process distribution to model failure times and a univariate Gamma distributed frailty by all systems failure times. Classical inferential methods are used to estimate the parameters and reliability predictors of systems under IRs. An extensive simulation study is carried out under different scenarios to investigate the suitability of the models and the asymptotic consistency and efficiency properties of the maximum likelihood estimators. Finally, we illustrate the practical relevance of the proposed models on two real data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Reliability analysis of deep space satellites launched 1991–2020: Bulk population and deployable satellite performance analysis.
- Author
-
Grile, Travis M. and Bettinger, Robert A.
- Subjects
- ARTIFICIAL satellites, MAXIMUM likelihood statistics, SYSTEM failures, FAILURE mode & effects analysis, ARTIFICIAL satellite launching
- Abstract
Flight data for deep space satellites launched and operated between 1991 and 2020 is analyzed to generate various reliability metrics. Satellite reliability is first estimated by the Kaplan‐Meier estimator, then parameterized through the Weibull distribution. This general process is applied to a general satellite data set that included all deep space satellites launched between 1991 and 2020, as well as two data subsets. One subset focuses on deployable satellites, while the other introduces a methodology of normalizing satellite lifetimes by satellite design life. Results from the general data set prove deep space satellites suffer from infant mortality while the results from the deployable data subset show deployable deep space satellites are only reliable over short periods of time. Results from the design life normalized data set give promising results, with satellites having a relatively high chance of reaching their design life. Available information regarding specific modes of failure is also leveraged to generate a percent contribution to overall satellite failure for eight distinct failure modes. Satellite failure due to crashing, in‐space propulsion failure, and telemetry system failure are proven to drive both early in life failure and later in life failure, making them the main causes of decreased reliability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
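The first stage of the analysis in result 28, a Kaplan-Meier estimate from right-censored lifetimes (before the parametric Weibull fit), is short enough to sketch by hand. The lifetimes and censoring flags below are invented for illustration; the actual satellite records are not reproduced.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. events[i] = 1 if failure observed, 0 if right-censored."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    uniq = np.unique(times[events == 1])          # distinct observed failure times
    surv, s = [], 1.0
    for t in uniq:
        at_risk = np.sum(times >= t)              # units still under observation just before t
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk               # product-limit update
        surv.append((t, s))
    return surv

# made-up lifetimes in years; 0 marks a satellite still operating (right-censored)
times  = [0.5, 1.2, 2.0, 3.1, 4.0, 4.0, 5.5, 6.0, 7.2, 8.0]
events = [1,   1,   0,   1,   1,   0,   1,   0,   1,   0]
for t, s in kaplan_meier(times, events):
    print(f"t = {t:4.1f}  S(t) = {s:.3f}")
```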
29. Bivariate Epanechnikov-exponential distribution: statistical properties, reliability measures, and applications to computer science data.
- Author
-
Barakat, H. M., Alawady, M. A., Husseiny, I. A., Nagy, M., Mansi, A. H., and Mohamed, M. O.
- Abstract
One important area of statistical theory and its applications to bivariate data modeling is the construction of families of bivariate distributions with specified marginals. This motivates the proposal of a bivariate distribution employing the Farlie-Gumbel-Morgenstern (FGM) copula and Epanechnikov exponential (EP-EX) marginal distribution, denoted by EP-EX-FGM. The EP-EX distribution is a complementing distribution, not a rival, to the exponential (EX) distribution. Its simple function shape and dependence on a single scale parameter make it an ideal choice for marginals in the suggested new bivariate distribution. The statistical properties of the EP-EX-FGM model are examined, including product moments, coefficient of correlation between the internal variables, moment generating function, conditional distribution, concomitants of order statistics (OSs), mean residual life function, and vitality function. In addition, we calculated reliability and information measures including the hazard function, reversed hazard function, positive quadrant dependence feature, bivariate extropy, bivariate weighted extropy, and bivariate cumulative residual extropy. Estimating model parameters is accomplished by utilizing maximum likelihood, asymptotic confidence intervals, and Bayesian approaches. Finally, the advantage of EP-EX-FGM over the bivariate Weibull FGM distribution, bivariate EX-FGM distribution, and bivariate generalized EX-FGM distribution is illustrated using actual data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Analysis of stress-strength reliability with m-step strength levels under type I censoring and Gompertz distribution.
- Author
-
Youssef Temraz, Neama Salah
- Abstract
Because of modern technology, product reliability has increased, making it more challenging to evaluate products in real-world settings and raising the cost of gathering sufficient data about a product's lifetime. Instead of using stress to accelerate failures, the most practical way to solve this problem is to use accelerated life tests, in which test units are subjected to varying degrees of stress. This paper deals with the analysis of stress-strength reliability when the strength variable has changed m levels at predetermined times. It is common for the observed failure time data of items to be partially unavailable in numerous reliability and life-testing studies. In statistical analyses where data is censored, lowering the time and expense involved is vital. Maximum likelihood estimation was introduced for the case where the stress and strength variables follow the Gompertz distribution under Type I censored data. The bootstrap confidence intervals were derived for stress-strength reliability under m levels of the strength variable, applying the Gompertz distribution to model time. A simulation study was introduced to find the maximum likelihood estimates, bootstrap intervals, and credible intervals for stress-strength reliability. Real data was presented to show the application of the model in real life. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Control Charts Based on Zero to k Inflated Power Series Regression Models and Their Applications.
- Author
-
Saboori, Hadi and Doostparast, Mahdi
- Abstract
In many different fields and industries, count data are publicly accessible. Control charts are used in quality studies to track count procedures. These control charts, however, only have a limited impact on zero-inflated data that contains extra zeros. The Zero-inflated power series (ZIPS) models, particularly their crucial sub-models, the Zero-inflated Poisson (ZIP), the Zero-inflated Negative binomial (ZINB), and the Zero-inflated Logarithmic (ZIL) models, are crucial approaches to handling count data, and some control charts based on them have been proposed. However, there are situations when inflation can happen at one or more points other than zero (for instance, at one) or at more than one point (for instance, zero, one, and two). In these situations, the family of zero to k inflated power series (ZKIPS) models must be used instead. In this work, we use a weighted score test statistic to examine upper-sided Shewhart and exponentially weighted moving average control charts. We only conducted numerical experiments on the zero to k Poisson model, which is one of the zero to k power series models, as an example. In ZKIPS models, the exponentially weighted moving average control chart can identify positive changes in the basis distribution's characteristics. By adding random effects, this method, in particular, enables boosting the capability of detecting unnatural heterogeneity in variability. For detecting small to moderate shifts, the proposed strategy is more effective than the current Shewhart chart, according to simulation findings obtained using the Monte Carlo methodology. To show the charts' usefulness, they are also applied to a real example. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Order-restricted statistical inference and optimal censoring scheme for Gompertz distribution based on joint type-II progressive censored data.
- Author
-
Ren, Feiyan, Ye, Tianrui, and Gui, Wenhao
- Subjects
- MONTE Carlo method, FISHER information, DISTRIBUTION (Probability theory), BAYES' estimation, INFERENTIAL statistics, EXPECTATION-maximization algorithms
- Abstract
This paper considers the order-restricted statistical inference for two populations based on the joint type-II progressive censoring scheme. The lifetime distributions of the two populations are supposed to follow the Gompertz distribution with the same shape parameter but different scale parameters. The maximum likelihood estimates of the unknown parameters are derived by employing the Newton-Raphson algorithm and the expectation-maximization algorithm, respectively. The Fisher information matrix is then employed to construct asymptotic confidence intervals. For Bayes estimation, we assume an ordered Beta-Gamma prior for the scale parameters and a Gamma prior for the common shape parameter. Bayes estimations and the highest posterior density credible intervals for unknown parameters under two different loss functions are obtained with the importance sampling technique. To evaluate the performance of order-restricted inference, extensive Monte Carlo simulations are performed and two air-conditioning systems datasets are used to illustrate the proposed inference methods. In addition, the results are compared with the case when there is no order restriction on the parameters. Finally, the optimal censoring scheme is obtained by four optimality criteria. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Revisiting the Briggs Ancient DNA Damage Model: A Fast Maximum Likelihood Method to Estimate Post‐Mortem Damage.
- Author
-
Zhao, Lei, Henriksen, Rasmus Amund, Ramsøe, Abigail, Nielsen, Rasmus, and Korneliussen, Thorfinn Sand
- Subjects
- FOSSIL DNA, MAXIMUM likelihood statistics, DAMAGE models, DNA analysis, DNA sequencing
- Abstract
One essential initial step in the analysis of ancient DNA is to authenticate that the DNA sequencing reads are actually from ancient DNA. This is done by assessing if the reads exhibit typical characteristics of post‐mortem damage (PMD), including cytosine deamination and nicks. We present a novel statistical method implemented in a fast multithreaded programme, ngsBriggs that enables rapid quantification of PMD by estimation of the Briggs ancient damage model parameters (Briggs parameters). Using a multinomial model with maximum likelihood fit, ngsBriggs accurately estimates the parameters of the Briggs model, quantifying the PMD signal from single and double‐stranded DNA regions. We extend the original Briggs model to capture PMD signals for contemporary sequencing platforms and show that ngsBriggs accurately estimates the Briggs parameters across a variety of contamination levels. Classification of reads into ancient or modern reads, for the purpose of decontamination, is significantly more accurate using ngsBriggs than using other methods available. Furthermore, ngsBriggs is substantially faster than other state‐of‐the‐art methods. ngsBriggs offers a practical and accurate method for researchers seeking to authenticate ancient DNA and improve the quality of their data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. Detection and Estimation of Diffuse Signal Components Using the Periodogram.
- Author
-
Selva, Jesus
- Subjects
- MAXIMUM likelihood statistics, CHEBYSHEV polynomials, VECTOR data, INTERPOLATION, DETECTORS
- Abstract
One basic limitation of using the periodogram as a frequency estimator is that any of its significant peaks may result from a diffuse (or spread) frequency component rather than a pure one. Diffuse components are common in applications such as channel estimation, in which a given periodogram peak reveals the presence of a complex multipath distribution (unresolvable propagation paths or diffuse scattering, for example). We present a method to detect the presence of a diffuse component in a given peak based on analyzing the projection of the data vector onto the span of the signature's derivatives up to a given order. Fundamentally, a diffuse component is detected if the energy in the derivatives' subspace is too high at the peak's frequency, and its spread is estimated as the ratio between this last energy and the peak's energy. The method is based on exploiting the signature's Vandermonde structure through the properties of discrete Chebyshev polynomials. We also present an efficient numerical procedure for computing the data component in the derivatives' span based on barycentric interpolation. The paper contains a numerical assessment of the proposed estimator and detector. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Analysis of Carbon Dioxide Value with Extreme Value Theory Using Generalized Extreme Value Distribution.
- Author
-
Khamrot, Pannawit, Phankhieo, Narin, Wachirawongsakorn, Piyada, Piros, Supanaree, and Deetae, Natthinee
- Subjects
- DISTRIBUTION (Probability theory), EXTREME value theory, CARBON emissions, CARBON dioxide analysis, MAXIMUM likelihood statistics, ENVIRONMENTAL risk
- Abstract
This paper applies the generalized extreme value (GEV) distribution using maximum likelihood estimates to analyze extreme carbon dioxide data collected by the Provincial Energy Office of Phitsanulok from 2010 to 2023. The study aims to model return levels for carbon dioxide emissions for the periods of 5, 25, 50, and 100 years, utilizing data from various fuels--Gasohol E85, Gasohol E20, Gasohol 91, Gasohol 95, ULG95, and LPG. By fitting the GEV distribution, this research not only categorizes the behavior of emissions data under different subclasses of the GEV distribution but also confirms the suitability of the GEV model for this dataset. The findings indicate a trend of increasing return levels, suggesting rising peaks in carbon dioxide emissions over time. This model provides a valuable tool for forecasting and managing environmental risks associated with high emission levels. [ABSTRACT FROM AUTHOR]
- Published
- 2024
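The workflow in result 35 amounts to fitting a GEV by maximum likelihood and reading off return levels as upper quantiles: the T-period return level is the (1 - 1/T) quantile of the fitted distribution. A minimal SciPy sketch on synthetic block maxima (the Phitsanulok emission data are not available here; note that SciPy's shape parameter c is the negative of the usual GEV shape xi):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
annual_max = stats.genextreme.rvs(c=-0.1, loc=50.0, scale=8.0, size=60, random_state=rng)  # synthetic block maxima

# maximum likelihood fit of the GEV distribution
c, loc, scale = stats.genextreme.fit(annual_max)

for T in (5, 25, 50, 100):
    level = stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)  # T-period return level
    print(f"{T:3d}-period return level: {level:.1f}")
```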
36. An Analysis of Type-I Generalized Progressive Hybrid Censoring for the One Parameter Logistic-Geometry Lifetime Distribution with Applications.
- Author
-
Nagy, Magdy, Mosilhy, Mohamed Ahmed, Mansi, Ahmed Hamdi, and Abu-Moussa, Mahmoud Hamed
- Subjects
- RETICULUM cell sarcoma, MARKOV chain Monte Carlo, MAXIMUM likelihood statistics, DEATH rate, HAZARD function (Statistics)
- Abstract
Based on Type-I generalized progressive hybrid censored samples (GPHCSs), the parameter estimation for the unit-half logistic-geometry (UHLG) distribution is investigated in this work. Using maximum likelihood estimation (MLE) and Bayesian estimation, the parameters, reliability, and hazard functions of the UHLG distribution under GPHCSs have been assessed. Likewise, the computation is carried out for the asymptotic confidence intervals (ACIs). Furthermore, two bootstrap CIs, bootstrap-p and bootstrap-t, are mentioned. For symmetric loss functions, like squared error loss (SEL), and asymmetric loss functions, such as linear exponential loss (LL) and general entropy loss (GEL), there are specific Bayesian approximations. The Metropolis–Hastings sampler methodology was used to construct the credible intervals (CRIs). In conclusion, a genuine data set measuring the mortality statistics of a group of male mice with reticulum cell sarcoma is regarded as an application of the methods given. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. Bivariate Length-Biased Exponential Distribution under Progressive Type-II Censoring: Incorporating Random Removal and Applications to Industrial and Computer Science Data.
- Author
-
Fayomi, Aisha, Almetwally, Ehab M., and Qura, Maha E.
- Subjects
- DISTRIBUTION (Probability theory), COPULA functions, CONFIDENCE intervals, MAXIMUM likelihood statistics, DATA science
- Abstract
In this paper, we address the analysis of bivariate lifetime data from a length-biased exponential distribution observed under Type II progressive censoring with random removals, where the number of units removed at each failure time follows a binomial distribution. We derive the likelihood function for the progressive Type II censoring scheme with random removals and apply it to the bivariate length-biased exponential distribution. The parameters of the proposed model are estimated using both likelihood and Bayesian methods for point and interval estimators, including asymptotic confidence intervals and bootstrap confidence intervals. We also employ different loss functions to construct Bayesian estimators. Additionally, a simulation study is conducted to compare the performance of censoring schemes. The effectiveness of the proposed methodology is demonstrated through the analysis of two real datasets from the industrial and computer science domains, providing valuable insights for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Parameter Estimation of Uncertain Differential Equations Driven by Threshold Ornstein–Uhlenbeck Process with Application to U.S. Treasury Rate Analysis.
- Author
-
Li, Anshui, Wang, Jiajia, and Zhou, Lianlian
- Subjects
- STOCHASTIC differential equations, MAXIMUM likelihood statistics, DIFFERENTIAL equations, MOMENTS method (Statistics), EVIDENCE gaps
- Abstract
Uncertain differential equations, as an alternative to stochastic differential equations, have proved to be extremely powerful across various fields, especially in finance theory. The issue of parameter estimation for uncertain differential equations is the key step in mathematical modeling and simulation, which is very difficult, especially when the corresponding terms are driven by some complicated uncertain processes. In this paper, we propose the uncertainty counterpart of the threshold Ornstein–Uhlenbeck process in probability, named the uncertain threshold Ornstein–Uhlenbeck process, filling the gaps of the corresponding research in uncertainty theory. We then explore the parameter estimation problem under different scenarios, including cases where certain parameters are known in advance while others remain unknown. Numerical examples are provided to illustrate our method proposed. We also apply the method to study the term structure of the U.S. Treasury rates over a specific period, which can be modeled by the uncertain threshold Ornstein–Uhlenbeck process mentioned in this paper. The paper concludes with brief remarks and possible future directions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Diagnostic analytics for a GARCH model under skew-normal distributions.
- Author
-
Liu, Yonghui, Wang, Jing, Yao, Zhao, Liu, Conan, and Liu, Shuangzhe
- Subjects
- GARCH model, MONTE Carlo method, MAXIMUM likelihood statistics, DATA analytics, EXPECTATION-maximization algorithms, CURVATURE
- Abstract
In this paper, a generalized autoregressive conditional heteroskedasticity model under skew-normal distributions is studied. A maximum likelihood approach is taken and the parameters in the model are estimated based on the expectation-maximization algorithm. Statistical diagnostics are carried out through the local influence technique, with the normal curvature and diagnostic results established for the model under four perturbation schemes to identify possible influential observations. A simulation study is conducted to evaluate the performance of our proposed method and a real-world application is presented as an illustrative example. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
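As background for result 39, the plain Gaussian GARCH(1,1) likelihood can be maximized directly with a generic optimizer. The skew-normal innovations, EM algorithm, and local-influence diagnostics of the paper are not reproduced in this minimal sketch on simulated returns; all parameter values are made up.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)

def simulate_garch(omega, alpha, beta, n, rng):
    """Simulate a Gaussian GARCH(1,1) series r_t = sqrt(h_t) * z_t."""
    r, h = np.zeros(n), np.zeros(n)
    h[0] = omega / (1 - alpha - beta)                # unconditional variance as start
    r[0] = np.sqrt(h[0]) * rng.standard_normal()
    for t in range(1, n):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
        r[t] = np.sqrt(h[t]) * rng.standard_normal()
    return r

def neg_loglik(params, r):
    """Negative Gaussian log-likelihood of a GARCH(1,1) model."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf                                # keep the recursion positive and stationary
    h = np.empty_like(r)
    h[0] = np.var(r)                                 # a common initialisation choice
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

r = simulate_garch(0.1, 0.08, 0.85, 2000, rng)
fit = minimize(neg_loglik, x0=np.array([0.05, 0.05, 0.80]), args=(r,), method="Nelder-Mead")
print("omega, alpha, beta =", np.round(fit.x, 3))
```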
40. Reliability evaluation for Weibull distribution with heavily Type II censored data.
- Author
-
Liu, Mengyu, Zheng, Huiling, and Yang, Jun
- Subjects
- MAXIMUM likelihood statistics, LEAST squares, WEIBULL distribution, ESTIMATION bias, CENSORSHIP
- Abstract
Lifetime data collected from the field are usually heavily censored, in which case getting an accurate reliability evaluation is challenging. For heavily Type-II censored data, the parameter estimation biases of traditional methods (i.e., maximum likelihood estimation (MLE) and least squares estimation (LSE)) are still large, and for Bayesian methods it is hard to specify the priors in practice. Therefore, considering the known range of the shape parameter of the Weibull distribution, this study proposes two novel parameter estimation methods, the three-step MLE method and the hybrid estimation method. For the three-step MLE method, the initial estimates of the shape and scale parameters are first derived using MLE and then updated by the single-parameter MLE method with the range constraint on the shape parameter. For the hybrid estimation method, the shape parameter is estimated by the LSE method with the range constraint on the shape parameter, and the scale parameter estimate is then obtained by MLE. On this basis, two numerical examples are presented to demonstrate the consistency and effectiveness of the proposed methods. Finally, a case study on turbine engines is given to verify the effectiveness and applicability of the proposed methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
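As a rough illustration of the hybrid idea described above, the sketch below estimates the Weibull shape by median-rank least squares, clips it to an assumed plausible range, and then obtains the scale from its closed-form conditional MLE for Type-II censored data. The range [k_lo, k_hi], the sample values, and the function name are assumptions, not the paper's settings.

```python
# Minimal sketch of a hybrid Weibull fit for Type-II censored data (first r of
# n failures observed): shape from least-squares median-rank regression,
# constrained to an assumed range, then the scale from its closed-form
# conditional MLE.  Range and data are illustrative.
import numpy as np

def hybrid_weibull_type2(t_obs, n, k_lo=0.5, k_hi=5.0):
    t_obs = np.sort(np.asarray(t_obs, dtype=float))
    r = len(t_obs)
    # least-squares estimate of the shape via median ranks
    F = (np.arange(1, r + 1) - 0.3) / (n + 0.4)
    x, y = np.log(t_obs), np.log(-np.log(1.0 - F))
    k_ls = np.polyfit(x, y, 1)[0]                 # slope of linearised CDF
    k = float(np.clip(k_ls, k_lo, k_hi))          # range constraint on shape
    # conditional MLE of the scale given the shape (n - r items censored at t_(r))
    scale = ((np.sum(t_obs ** k) + (n - r) * t_obs[-1] ** k) / r) ** (1.0 / k)
    return k, scale

t_obs = [35.0, 48.0, 60.0, 72.0, 81.0]            # first 5 failures out of n = 20
print(hybrid_weibull_type2(t_obs, n=20))
```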
41. A SCALE PARAMETERS AND MODIFIED RELIABILITY ESTIMATION FOR THE INVERSE EXPONENTIAL RAYLEIGH DISTRIBUTION.
- Author
-
AL-Sultany, Shurooq A. K.
- Subjects
- *
RAYLEIGH model , *MAXIMUM likelihood statistics , *SAMPLE size (Statistics) - Abstract
This paper presents methods for estimating the scale parameter and the modified reliability function of the Inverse Exponential Rayleigh distribution, including maximum likelihood, ranked set sampling, and Cramér–von Mises estimation. In all of these estimation methods, the Newton–Raphson iterative numerical method was used. A simulation was then conducted to compare the three methods across six cases and different sample sizes. Comparisons between scale parameter estimates were based on Mean Square Error values, while comparisons between estimates of the modified reliability function were based on Integrated Mean Square Error values. The results show that the Cramér–von Mises (MCV) estimator is the best of the three methods for estimating the modified reliability function. [ABSTRACT FROM AUTHOR] (A hedged sketch of the Cramér–von Mises criterion follows this record.)
- Published
- 2024
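The record does not state the Inverse Exponential Rayleigh CDF, so the sketch below applies the Cramér–von Mises minimum-distance criterion to a stand-in one-parameter Rayleigh scale model; substituting the intended CDF would follow the same pattern. Data, bounds, and names are assumptions.

```python
# Minimal sketch of Cramér–von Mises (CvM) minimum-distance estimation with a
# placeholder Rayleigh scale model standing in for the distribution studied in
# the article.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rayleigh

def cvm_criterion(scale, x):
    x = np.sort(x)
    n = len(x)
    F = rayleigh.cdf(x, scale=scale)
    i = np.arange(1, n + 1)
    # standard CvM distance between model CDF and empirical plotting positions
    return 1.0 / (12 * n) + np.sum((F - (2 * i - 1) / (2 * n)) ** 2)

rng = np.random.default_rng(1)
x = rayleigh.rvs(scale=2.0, size=200, random_state=rng)
res = minimize_scalar(cvm_criterion, bounds=(0.1, 10.0), args=(x,),
                      method="bounded")
print("CvM estimate of scale:", res.x)
```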
42. Inference on process capability index Spmk for a new lifetime distribution.
- Author
-
Karakaya, Kadir
- Subjects
- *
MONTE Carlo method , *PROCESS capability , *MAXIMUM likelihood statistics , *CONTINUOUS distributions , *CONFIDENCE intervals - Abstract
In various applied disciplines, the modeling of continuous data often requires the use of flexible continuous distributions. Meeting this demand calls for the introduction of new continuous distributions that possess desirable characteristics. This paper introduces a new continuous distribution. Several estimators for the unknown parameters of the new distribution are discussed and their efficiency is assessed through Monte Carlo simulations. Furthermore, the process capability index Spmk is examined when the underlying distribution is the proposed distribution. The maximum likelihood estimation of Spmk is also studied, and the asymptotic confidence interval is constructed for Spmk. The simulation results indicate that the estimators for both the unknown parameters of the new distribution and Spmk provide reasonable results. Some practical analyses are also performed on both the new distribution and Spmk. The results of the conducted data analysis indicate that the new distribution yields effective outcomes in modeling lifetime data in the literature. Similarly, the data analyses performed for Spmk illustrate that the new distribution can be utilized for process capability indices by quality controllers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. Reliability estimation and statistical inference under joint progressively Type-II right-censored sampling for certain lifetime distributions.
- Author
-
Lin, Chien-Tai, Chen, Yen-Chou, Yeh, Tzu-Chi, and Ng, Hon Keung Tony
- Abstract
In this article, the parameter estimation of several commonly used two-parameter lifetime distributions, including the Weibull, inverse Gaussian, and Birnbaum–Saunders distributions, based on joint progressively Type-II right-censored samples is studied. Different numerical methods and algorithms are used to compute the maximum likelihood estimates of the unknown model parameters. These methods include the Newton–Raphson method, the stochastic expectation–maximization (SEM) algorithm, and the dual annealing (DA) algorithm. The estimation methods are compared in terms of accuracy (e.g., bias and mean squared error), computational time and effort (e.g., the required number of iterations), the ability to attain the largest value of the likelihood, and convergence issues by means of a Monte Carlo simulation study. Recommendations are made based on the simulated results. A real data set is analyzed for illustrative purposes. These methods are implemented in Python, and the computer programs are available from the authors upon request. [ABSTRACT FROM AUTHOR] (A hedged dual-annealing sketch follows this record.)
- Published
- 2024
- Full Text
- View/download PDF
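The joint progressively Type-II censored likelihoods compared in the record above are more involved than what fits here; as a simplified stand-in, the sketch below maximizes an ordinary right-censored Weibull log-likelihood with scipy's dual annealing, one of the three algorithms the article names. Data, bounds, and the censoring pattern are assumptions.

```python
# Minimal sketch: maximum likelihood for a right-censored Weibull sample via
# scipy.optimize.dual_annealing.  The article's joint progressive Type-II
# censoring structure is not reproduced; data below are placeholders.
import numpy as np
from scipy.optimize import dual_annealing

def neg_loglik(params, t, delta):
    """delta = 1 for observed failures, 0 for right-censored times."""
    k, lam = params
    z = (t / lam) ** k
    obs = delta * (np.log(k / lam) + (k - 1) * np.log(t / lam) - z)
    cens = (1 - delta) * (-z)
    return -np.sum(obs + cens)

rng = np.random.default_rng(2)
t = rng.weibull(1.5, size=100) * 10.0          # placeholder lifetimes
delta = (rng.random(100) < 0.7).astype(float)  # ~30% marked censored (illustrative)
res = dual_annealing(neg_loglik, bounds=[(0.1, 10.0), (0.1, 100.0)],
                     args=(t, delta), seed=3)
print("shape, scale =", res.x)
```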
44. Zero-Inflated Binary Classification Model with Elastic Net Regularization.
- Author
-
Xin, Hua, Lio, Yuhlong, Chen, Hsien-Ching, and Tsai, Tzong-Ru
- Subjects
- *
MACHINE learning , *MAXIMUM likelihood statistics , *EXPECTATION-maximization algorithms , *OPEN-ended questions , *DIABETES - Abstract
Zero inflation and overfitting can reduce the accuracy of machine learning models for characterizing binary data sets. A zero-inflated Bernoulli (ZIBer) model can be the right model for characterizing zero-inflated binary data sets, but overcoming the overfitting problem when using it is still an open question. To mitigate overfitting, the negative log-likelihood function of the ZIBer model with an elastic net regularization penalty is proposed as the loss function. An estimation procedure that minimizes this loss function is developed using the gradient descent method (GDM) with a momentum term in the learning rate. The proposed estimation method has two advantages. First, it is a general method that simultaneously uses L1- and L2-norm penalty terms and includes the ridge and least absolute shrinkage and selection operator methods as special cases. Second, the momentum learning rate accelerates the convergence of the GDM and enhances the computational efficiency of the estimation procedure. The parameter selection strategy is studied, and the performance of the proposed method is evaluated using Monte Carlo simulations. A diabetes example is used as an illustration. [ABSTRACT FROM AUTHOR] (A hedged code sketch follows this record.)
- Published
- 2024
- Full Text
- View/download PDF
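To make the loss-plus-penalty construction above concrete, the sketch below uses a simplified zero-inflated Bernoulli model with a constant inflation probability and a logistic success probability, adds an elastic net penalty on the regression weights, and minimizes the result by gradient descent with momentum. The article additionally lets the inflation probability depend on covariates; the numerical gradients, data, and names here are illustrative assumptions.

```python
# Minimal sketch: zero-inflated Bernoulli negative log-likelihood with an
# elastic net penalty, minimized by gradient descent with momentum.
# Simplified relative to the article (constant inflation probability,
# numerical gradients, placeholder data).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(theta, X, y, lam=0.1, alpha=0.5):
    pi = sigmoid(theta[0])                       # inflation probability
    beta = theta[1:]
    p = sigmoid(X @ beta)
    lik1 = (1 - pi) * p                          # P(Y = 1)
    lik0 = pi + (1 - pi) * (1 - p)               # P(Y = 0)
    nll = -np.sum(y * np.log(lik1 + 1e-12) + (1 - y) * np.log(lik0 + 1e-12))
    penalty = lam * (alpha * np.sum(np.abs(beta))
                     + 0.5 * (1 - alpha) * np.sum(beta ** 2))
    return nll + penalty

def num_grad(f, theta, eps=1e-6):
    g = np.zeros_like(theta)
    for j in range(len(theta)):
        e = np.zeros_like(theta); e[j] = eps
        g[j] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return g

rng = np.random.default_rng(4)
X = rng.standard_normal((300, 3))
y = (rng.random(300) < 0.25).astype(float)       # placeholder zero-inflated labels
theta = np.zeros(4)
velocity = np.zeros(4)
for _ in range(500):                             # gradient descent with momentum
    g = num_grad(lambda th: loss(th, X, y), theta)
    velocity = 0.9 * velocity - 1e-3 * g
    theta = theta + velocity
print("pi =", sigmoid(theta[0]), "beta =", theta[1:])
```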
45. Tensile Properties of Cattail Fibres at Various Phenological Development Stages.
- Author
-
Hossain, Mohammed Shahadat, Rahman, Mashiur, and Cicek, Nazim
- Subjects
- *
MAXIMUM likelihood statistics , *CALCIUM oxalate , *WEIBULL distribution , *INDUSTRIAL capacity , *GROWING season - Abstract
Cattails (Typha latifolia L.) are naturally occurring aquatic macrophytes with significant industrial potential because of their abundance, high-quality fibers, and high fiber yields. This study is the first attempt to investigate how phenological development and plant maturity affect the quality of cattail fibers for composite applications. Fibers from all five growth stages exhibited a Weibull shape parameter greater than 1.0, with a goodness-of-fit exceeding 0.8; these estimates were obtained using both the Least Squares Regression (LSR) and Maximum Likelihood Estimation (MLE) methods, of which MLE provided the most conservative Weibull parameters. Based on the Weibull parameters obtained with all estimators, cattail fibers from all five growth stages appear suitable for composite applications. The consistency of the shape parameter across the growth stages can be attributed to the morphological and molecular development of the cattail fiber during the vegetative period, confirmed through the presence of calcium oxalate (CaOx) plates, the elemental composition, and specific infrared peaks: a peak at 2360 cm−1 contributing to strength, and cellulose peaks at 1635 cm−1, 2920 cm−1, and 3430 cm−1. In conclusion, the mechanical properties of cattail fiber remain similar when the plants are harvested multiple times in a single growing season. [ABSTRACT FROM AUTHOR] (A hedged Weibull-fitting sketch follows this record.)
- Published
- 2024
- Full Text
- View/download PDF
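The two estimators compared in the record above can be illustrated with a short fit of a two-parameter Weibull to fibre strength data: maximum likelihood via scipy and least-squares median-rank regression on the linearised CDF. The strength values below are placeholders, not cattail measurements.

```python
# Minimal sketch: Weibull shape/scale for tensile strengths by (i) MLE and
# (ii) least-squares median-rank regression.  Placeholder data.
import numpy as np
from scipy.stats import weibull_min

strengths = np.array([212.0, 260.0, 305.0, 331.0, 378.0, 402.0, 455.0, 510.0])

# (i) maximum likelihood estimation (location fixed at zero)
shape_mle, _, scale_mle = weibull_min.fit(strengths, floc=0)

# (ii) least-squares regression on the linearised Weibull CDF with median ranks
x = np.sort(strengths)
F = (np.arange(1, len(x) + 1) - 0.3) / (len(x) + 0.4)
slope, intercept = np.polyfit(np.log(x), np.log(-np.log(1 - F)), 1)
shape_lsr = slope
scale_lsr = np.exp(-intercept / slope)

print(f"MLE: shape={shape_mle:.2f}, scale={scale_mle:.1f}")
print(f"LSR: shape={shape_lsr:.2f}, scale={scale_lsr:.1f}")
```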
46. Reliability analysis of two Gompertz populations under joint progressive type-ii censoring scheme based on binomial removal.
- Author
-
Abo-Kasem, O.E., Almetwally, Ehab M., and Abu El Azm, Wael S.
- Subjects
- *
MONTE Carlo method , *CENSORING (Statistics) , *BAYES' estimation , *DISTRIBUTION (Probability theory) , *MAXIMUM likelihood statistics , *MARKOV chain Monte Carlo - Abstract
Analysis of joint censoring schemes has received considerable attention in the last few years. In this paper, maximum likelihood and Bayesian methods of estimation are used to estimate the unknown parameters of two Gompertz populations under a joint progressive Type-II censoring scheme. Bayesian estimates of the unknown parameters are obtained under squared error loss functions with independent gamma priors, and we propose applying the Markov chain Monte Carlo technique to carry out the Bayes estimation procedure. The approximate, bootstrap, and credible confidence intervals for the unknown parameters are also obtained. In addition, the reliability and hazard rate functions of the two Gompertz populations under the joint progressive Type-II censoring scheme are obtained, together with the corresponding approximate confidence intervals. Finally, all the theoretical results are assessed and compared using two real-world data sets and Monte Carlo simulation studies. [ABSTRACT FROM AUTHOR] (A hedged sampling sketch follows this record.)
- Published
- 2024
- Full Text
- View/download PDF
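The record above combines Gompertz likelihoods, gamma priors, and MCMC under a joint progressive censoring scheme; as a much-reduced stand-in, the sketch below runs a random-walk Metropolis sampler for one complete Gompertz sample with independent gamma priors and reports posterior means (the Bayes estimates under squared error loss). The data, priors, and proposal scale are assumptions.

```python
# Minimal sketch: random-walk Metropolis for a single complete Gompertz sample
# with independent gamma priors; posterior means are the Bayes estimates under
# squared error loss.  The article's joint progressive Type-II censoring of
# two populations is not reproduced.
import numpy as np
from scipy.stats import gamma

def loglik(b, c, t):
    # Gompertz hazard h(t) = b * exp(c * t), survival exp(-(b/c)(e^{ct} - 1))
    return np.sum(np.log(b) + c * t - (b / c) * (np.exp(c * t) - 1.0))

def log_post(b, c, t):
    if b <= 0 or c <= 0:
        return -np.inf
    prior = gamma.logpdf(b, a=1.0, scale=1.0) + gamma.logpdf(c, a=1.0, scale=1.0)
    return loglik(b, c, t) + prior

rng = np.random.default_rng(5)
t = rng.exponential(2.0, size=80)            # placeholder lifetimes
b, c = 0.5, 0.5
draws = []
for _ in range(5000):                        # random-walk Metropolis
    b_new = b + 0.05 * rng.standard_normal()
    c_new = c + 0.05 * rng.standard_normal()
    if np.log(rng.random()) < log_post(b_new, c_new, t) - log_post(b, c, t):
        b, c = b_new, c_new
    draws.append((b, c))
draws = np.array(draws[1000:])               # discard burn-in
print("posterior means (b, c):", draws.mean(axis=0))
```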
47. Concentration inequalities of MLE and robust MLE.
- Author
-
Yang, Xiaowei, Liu, Xinqiao, and Wei, Haoyu
- Subjects
- *
MAXIMUM likelihood statistics , *MACHINE learning , *STATISTICS - Abstract
The Maximum Likelihood Estimator (MLE) plays an important role in statistics and machine learning. In this article, for i.i.d. variables, we obtain constant-specified and sharp concentration inequalities and oracle inequalities for the MLE only under exponential moment conditions. Furthermore, in a robust setting, sub-Gaussian-type oracle inequalities for the log-truncated maximum likelihood estimator are derived under a second-moment condition. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. The exponentiated-Weibull proportional hazard regression model with application to censored survival data.
- Author
-
Ishag, Mohamed A.S., Wanjoya, Anthony, Adem, Aggrey, Alsultan, Rehab, Alghamdi, Abdulaziz S., and Afify, Ahmed Z.
- Subjects
PROPORTIONAL hazards models ,MONTE Carlo method ,REGRESSION analysis ,CENSORING (Statistics) ,MAXIMUM likelihood statistics - Abstract
Proportional hazard regression models are widely used statistical tools for analyzing survival data and estimating the effects of covariates on survival times, under the assumption that the covariate effects are constant over time. In this paper, we propose a novel extension of the proportional hazard model that uses an exponentiated-Weibull distribution to model the baseline hazard function. The proposed model offers more flexibility in capturing various shapes of failure rates and accommodates both monotonic and non-monotonic hazard shapes. The performance of the proposed model is evaluated and compared with other commonly used survival models, including the generalized log-logistic, Weibull, Gompertz, and exponentiated exponential PH regression models, using simulation results. The results demonstrate the ability of the introduced model to capture the baseline hazard shapes and to estimate the effects of covariates on the hazard function accurately. Furthermore, two real medical survival data sets are analyzed to illustrate the practical importance of the proposed model in providing accurate predictions of survival outcomes for individual patients. Finally, the survival data analysis reveals that the model is a powerful tool for analyzing complex survival data. [ABSTRACT FROM AUTHOR] (A hedged fitting sketch follows this record.)
- Published
- 2024
- Full Text
- View/download PDF
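To make the baseline-hazard idea above concrete, the sketch below fits a proportional-hazards model with an exponentiated-Weibull baseline (survival S0(t) = 1 − [1 − exp(−(t/λ)^k)]^α) to right-censored data by maximum likelihood with scipy. The data, starting values, and covariate structure are assumptions, and the article's full model may differ.

```python
# Minimal sketch: PH regression with an exponentiated-Weibull baseline hazard,
# fitted by maximum likelihood for right-censored data.  Placeholder data.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t, delta, X):
    k, lam, alpha = np.exp(params[:3])            # positive baseline parameters
    beta = params[3:]
    u = np.exp(-(t / lam) ** k)
    G = 1.0 - u                                   # Weibull CDF
    S0 = 1.0 - G ** alpha                         # exponentiated-Weibull survival
    f0 = alpha * G ** (alpha - 1) * u * (k / lam) * (t / lam) ** (k - 1)
    h0 = f0 / S0                                  # baseline hazard
    eta = X @ beta
    ll = delta * (np.log(h0) + eta) + np.exp(eta) * np.log(S0)
    ll = np.where(np.isfinite(ll), ll, -1e10)     # guard against overflow
    return -np.sum(ll)

rng = np.random.default_rng(6)
n = 200
X = rng.standard_normal((n, 2))
t = rng.weibull(1.5, size=n) * 5.0                # placeholder survival times
delta = (rng.random(n) < 0.7).astype(float)       # ~30% censoring, illustrative
x0 = np.zeros(5)                                  # log k, log lam, log alpha, beta
res = minimize(neg_loglik, x0, args=(t, delta, X), method="Nelder-Mead",
               options={"maxiter": 5000})
k, lam, alpha = np.exp(res.x[:3])
print("baseline (k, lam, alpha) =", (k, lam, alpha), "beta =", res.x[3:])
```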
49. Frequentist and Bayesian approach for the generalized logistic lifetime model with applications to air-conditioning system failure times under joint progressive censoring data.
- Author
-
Hasaballah, Mustafa M., Balogun, Oluwafemi Samson, and Bakr, M. E.
- Subjects
MARKOV chain Monte Carlo ,MONTE Carlo method ,BAYES' estimation ,MAXIMUM likelihood statistics ,INFERENTIAL statistics - Abstract
In this research, based on joint progressive Type-II censored data, we examined the statistical inference of the generalized logistic distribution with different shape and scale parameters. Wherever possible, we explored maximum likelihood estimators for the unknown parameters within the scope of the joint progressive censoring scheme. Bayesian inferences for these parameters were obtained using a gamma prior under the squared error loss function and the linear exponential loss function. Since obtaining the Bayes estimators and the corresponding credible intervals is not straightforward, we recommend using the Markov chain Monte Carlo method to compute them. We performed a real-world data analysis for demonstrative purposes and ran Monte Carlo simulations to compare the performance of all the suggested approaches. [ABSTRACT FROM AUTHOR] (A hedged maximum likelihood sketch follows this record.)
- Published
- 2024
- Full Text
- View/download PDF
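As a simple illustration of the likelihood side of the record above, the sketch below fits the Type-I generalized logistic distribution to an uncensored sample by maximum likelihood with scipy; the joint progressive censoring scheme and the Bayesian analysis are not reproduced, and the data are placeholders.

```python
# Minimal sketch: ordinary (uncensored) MLE for the Type-I generalized logistic
# distribution via scipy.  Placeholder data.
import numpy as np
from scipy.stats import genlogistic

rng = np.random.default_rng(7)
sample = genlogistic.rvs(c=2.0, loc=1.0, scale=0.5, size=300, random_state=rng)
shape, loc, scale = genlogistic.fit(sample)
print(f"MLE: shape={shape:.2f}, loc={loc:.2f}, scale={scale:.2f}")
```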
50. Context-Driven Service Deployment Using Likelihood-Based Approach for Internet of Things Scenarios.
- Author
-
Banerji, Nandan, Paul, Chayan, Debnath, Bikash, Das, Biplab, Chhabra, Gurpreet Singh, Mohanta, Bhabendu Kumar, and Awad, Ali Ismail
- Subjects
MAXIMUM likelihood statistics ,INTERNET of things ,CONSUMPTION (Economics) ,INFORMATION services ,MIDDLEWARE - Abstract
In a context-aware Internet of Things (IoT) environment, the functional contexts of devices and users change over time depending on their service consumption. Each iteration of an IoT middleware algorithm also encounters changes in the contexts due to members joining and leaving; this is the inherent nature of ad hoc IoT scenarios. Individual users have notable preferences in their service consumption patterns, and by leveraging these patterns, the approach presented in this article focuses on how such changes affect performance through functional-context switching over time. It is based on the idea that consumption patterns exhibit certain time-variant correlations. Maximum likelihood estimation (MLE) is used in the proposed approach to capture the impact of these correlations and to study them in depth. The results reveal how the correlation probabilities and the system performance change over time, which also helps delineate the boundaries of certain time-variant correlations in users' consumption patterns. In the proposed approach, the information gleaned from the MLE is used to arrange the service information within a distributed service registry based on users' service usage preferences. Practical simulations were conducted over small (100 nodes), medium (1000 nodes), and relatively large (10,000 nodes) networks. The described approach helps reduce service discovery time and can improve performance in service-oriented IoT scenarios. [ABSTRACT FROM AUTHOR] (A hedged preference-estimation sketch follows this record.)
- Published
- 2024
- Full Text
- View/download PDF
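The record above uses MLE of service-usage preferences to order a service registry; the time-variant correlation analysis is beyond a short snippet, but the sketch below shows the basic step under a simplifying assumption: each user's requests are treated as draws from a categorical distribution, whose MLE is the empirical frequency, and registry entries are ordered by that estimate. The service names and request log are hypothetical.

```python
# Minimal sketch (hypothetical data): categorical MLE of a user's service
# preferences (empirical frequencies) used to order registry lookups.
from collections import Counter

requests = ["temperature", "lighting", "lighting", "security",
            "lighting", "temperature", "lighting"]          # placeholder request log
counts = Counter(requests)
total = sum(counts.values())
mle = {service: n / total for service, n in counts.items()} # empirical frequencies
registry_order = sorted(mle, key=mle.get, reverse=True)     # most-used first
print(mle)
print("registry lookup order:", registry_order)
```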