1,065 results
Search Results
2. Bayesian acceptance sampling plan for simple step-stress model.
- Author
- Prajapati, Deepak and Kundu, Debasis
- Subjects
- *ACCEPTANCE sampling, *CENSORING (Statistics), *STATISTICAL sampling, *TIME pressure
- Abstract
This paper presents a Bayesian acceptance sampling plan for a simple step-stress life-testing model in which the stress-changing time is random and the experimental units are subject to Type-II censoring. It is assumed that the lifetime distribution of each experimental item is exponential at each stress level, with different scale parameters. We first present the Bayes decision function under a specified loss function. Based on the Bayes decision function, the optimal Bayesian sampling plan is determined by minimizing the Bayes risk. We also present optimal Bayesian sampling plans for some specific cases, as well as an order-restricted optimal Bayesian sampling plan. The Bayesian sampling plans presented in this paper are quite beneficial when the experimental items are highly reliable, which is quite common nowadays. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Computational aspects of likelihood-based inference for the univariate generalized hyperbolic distribution.
- Author
- van Wyk, Arnold, Azzalini, Adelchi, and Bekker, Andriette
- Abstract
The generalized hyperbolic distribution is among the more often adopted parametric families in a wide range of application areas, thanks to its high flexibility as the parameters vary and to a plausible stochastic mechanism for its genesis. This high flexibility comes at some cost, however, namely the frequent difficulty of estimating its parameters due to flat areas of the log-likelihood function, so that selected points of the parameter space, while very distant, can be essentially equivalent in terms of data fit. This phenomenon affects not only maximum likelihood estimation but Bayesian methods too, since the target function is little affected by the introduction of a prior distribution. Our interest focuses on maximum likelihood estimation of the generalized hyperbolic distribution in the univariate case. This paper improves upon currently employed computational techniques by presenting an alternative proposal that works effectively in reaching the global maximum of the likelihood function. The paper further illustrates the above-mentioned problems in a number of cases, using both simulated and real data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Outlier detection in cylindrical data based on Mahalanobis distance.
- Author
- Dhamale, Prashant S. and Kashikar, Akanksha S.
- Subjects
- *DATA analysis
- Abstract
Cylindrical data are bivariate data formed from the combination of circular and linear variables. Identifying outliers is a crucial step in any data analysis work. This paper proposes a new distribution-free procedure to detect outliers in cylindrical data using the Mahalanobis distance concept. The use of Mahalanobis distance incorporates the correlation between the components of the cylindrical distribution, which had not been accounted for in the earlier papers on outlier detection in cylindrical data. The threshold for declaring an observation to be an outlier can be obtained via parametric or non-parametric bootstrap, depending on whether the underlying distribution is known or unknown. The performance of the proposed method is examined via extensive simulations from the Johnson-Wehrly distribution. The proposed method is applied to two real datasets, and the outliers are identified in those datasets. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
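As an aside, the distance screen described in entry 4 can be sketched for a plain bivariate sample. This is a sketch only: the function names, the dataset, and the 95% level are illustrative, and the circular-linear structure and Johnson-Wehrly model that the paper treats properly are deliberately ignored here.

```python
import random

def fit_mean_cov_inv(data):
    # sample mean and inverse sample covariance for bivariate data
    n = len(data)
    mean = [sum(x[i] for x in data) / n for i in (0, 1)]
    c = [[0.0, 0.0], [0.0, 0.0]]
    for x in data:
        d = [x[0] - mean[0], x[1] - mean[1]]
        for i in (0, 1):
            for j in (0, 1):
                c[i][j] += d[i] * d[j] / (n - 1)
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    cov_inv = [[c[1][1] / det, -c[0][1] / det],
               [-c[1][0] / det, c[0][0] / det]]
    return mean, cov_inv

def mahalanobis_sq(x, mean, cov_inv):
    # squared Mahalanobis distance d' C^{-1} d
    d = [x[0] - mean[0], x[1] - mean[1]]
    return (d[0] * (cov_inv[0][0] * d[0] + cov_inv[0][1] * d[1])
            + d[1] * (cov_inv[1][0] * d[0] + cov_inv[1][1] * d[1]))

def bootstrap_threshold(data, level=0.95, B=200, seed=0):
    # non-parametric bootstrap of the maximum within-sample distance;
    # an observation whose distance exceeds this quantile is flagged
    rng = random.Random(seed)
    n = len(data)
    maxima = []
    for _ in range(B):
        boot = [rng.choice(data) for _ in range(n)]
        m, ci = fit_mean_cov_inv(boot)
        maxima.append(max(mahalanobis_sq(x, m, ci) for x in boot))
    maxima.sort()
    return maxima[int(level * (B - 1))]
```

The point of using Mahalanobis rather than Euclidean distance is exactly the abstract's: the inverse covariance accounts for the correlation between the two components.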
5. Moment estimation for uncertain regression model with application to factors analysis of grain yield.
- Author
- Liu, Yang
- Subjects
- *FACTOR analysis, *REGRESSION analysis, *GRAIN yields, *STOCHASTIC models, *DATA modeling
- Abstract
Uncertain regression analysis is a powerful analytical tool for modeling the relationships between explanatory variables and the response variable via uncertainty theory. One of the core problems in uncertain regression analysis is estimating the unknown parameters of an uncertain regression model and the uncertain disturbance term. In this paper, the moment estimation of the uncertain regression model is proposed, which determines both the uncertain regression model and the disturbance term at one time. After that, the uncertain hypothesis test is used to check whether the estimated uncertain regression model is appropriate. Furthermore, a real-world example of factor analysis of grain yield is provided to illustrate the moment estimation. Finally, as a byproduct, this paper also indicates that the stochastic regression model cannot model the agricultural data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. The TSCR method for precision estimation of ill-posed mixed additive and multiplicative random error model.
- Author
- Wang, Leyang, Chen, Tao, and Zou, Chuanyi
- Subjects
- *PARAMETER estimation, *TAYLOR'S series, *NONLINEAR functions, *TIKHONOV regularization
- Abstract
Estimating the precision information of parameter estimation fully reflects the quality of parameter estimation. In this paper, we first derive the weighted least-squares regularization iterative (WLSRI) solution and the mean square error (MSE) matrix of the ill-posed mixed additive and multiplicative random error (MAMRE) model. Because the gradual iterative process of the WLSRI solution affects the final parameter estimates and precision information, and leads to a complex nonlinear functional relationship, the traditional Taylor-expansion approximate-function method cannot be applied. This paper therefore introduces the derivative-free third-degree spherical-radial cubature rule (TSCR) method for precision estimation of the ill-posed MAMRE model, which generates a series of equally weighted samples by a fixed sampling strategy and then applies the WLSRI method to them. Finally, the experimental research and analysis illustrate that, compared with existing solutions that ignore the ill-posed problem, the WLSRI method is applicable and obtains reasonable parameter estimates and precision information for the ill-posed MAMRE model, while the TSCR method obtains more accurate parameter estimates and precision information than the WLSRI method, enriching the theoretical research on the precision estimation problem of the ill-posed MAMRE model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Analysis of overlapping count data.
- Author
- Ryan, Kenneth J., Brydon, Michaela S., Leatherman, Erin R., and Hamada, Michael S.
- Subjects
- *MAXIMUM likelihood statistics, *POISSON regression, *REGRESSION analysis, *DATA modeling
- Abstract
Counts of a specific characteristic were obtained within regions defined on an object that was manufactured in a proprietary setting. The count regions were altered during production and resulted in misaligned or overlapping count data. A closed-formula maximum likelihood estimator (MLE) of the new region means is derived using all of the available count data and an independent Poisson model. The MLE is shown to be preferable to estimators constructed using generalized linear models for the overlapping data setting. This closed-form estimator extends to over-dispersed overlapping count data as the quasi-MLE and also performs well with correlated overlapping count data. Standard errors for the estimator are approximated and are validated with a simulation study. Additionally, the methods are extended to overlapping multinomial data. Illustrative examples of the methods are provided throughout the paper and are reproducible with the supplemental R code. Proofs of the paper's results are also included in the supplemental material. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. A review of ridge parameter selection: minimization of the mean squared error vs. mitigation of multicollinearity.
- Author
- García García, Catalina, Salmerón Gómez, Roman, and García Pérez, José
- Subjects
- *MONTE Carlo method, *ESTIMATION bias, *PARAMETER estimation, *PRICE inflation
- Abstract
Ridge Estimation (RE) is a widespread method to overcome the problem of collinearity, defining a class of estimators that depend on the non-negative scalar parameter k. A great number of papers focus on the estimation of this biasing parameter. Traditionally, the mean squared error criterion is used to compare the performance of the different proposed estimators. However, the minimization of the mean squared error (MSE) does not always guarantee the mitigation of collinearity: it is possible, for example, to obtain a variance inflation factor (VIF) higher than 10 for the k that minimizes the MSE. In this paper, we propose the VIF criterion to select the ridge biasing parameter. A Monte Carlo simulation is presented with results that support this idea, and two real-life empirical applications illustrate the contribution of the paper. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
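The VIF criterion discussed in entry 8 is easy to illustrate for two standardized regressors with correlation r, where the ridge VIFs are the diagonal of (R + kI)⁻¹R(R + kI)⁻¹. The grid search and the cap of 10 below are illustrative choices for a sketch, not the paper's algorithm.

```python
def ridge_vif(r, k):
    # VIFs of two standardized regressors with correlation r under
    # ridge parameter k: diagonal of (R + kI)^{-1} R (R + kI)^{-1}
    a = 1.0 + k
    det = a * a - r * r
    inv = [[a / det, -r / det], [-r / det, a / det]]
    R = [[1.0, r], [r, 1.0]]
    def mul(A, B):
        return [[sum(A[i][t] * B[t][j] for t in (0, 1)) for j in (0, 1)]
                for i in (0, 1)]
    M = mul(mul(inv, R), inv)
    return M[0][0], M[1][1]

def smallest_k_below(r, vif_cap=10.0, step=1e-4):
    # grid search for the smallest k whose VIFs all fall below the cap;
    # this is the "mitigate collinearity" criterion, not MSE minimization
    k = 0.0
    while max(ridge_vif(r, k)) > vif_cap:
        k += step
    return k
```

At k = 0 this reduces to the usual VIF 1/(1 − r²); with r = 0.95 that is about 10.26, so a strictly positive k is needed to bring the VIFs under 10 even though the MSE-optimal k might not do so.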
9. A modified ridge estimator in Cox regression model.
- Author
- Algamal, Zakariya Yahya
- Subjects
- *PROPORTIONAL hazards models, *MONTE Carlo method, *MAXIMUM likelihood statistics, *REGRESSION analysis, *MULTICOLLINEARITY
- Abstract
The most widely used approach for survival data is the Cox proportional hazards regression model. Multicollinearity, however, is known to have a detrimental impact on the variance of the maximum likelihood estimator of the Cox regression coefficients. It has been repeatedly shown that the ridge estimator is a desirable shrinkage technique for lessening the consequences of multicollinearity. In this paper, a linearized ridge estimator is developed, transforming the biasing parameter of the ordinary ridge estimator into a linearized version, and its performance in the Cox regression model under multicollinearity is studied. Our Monte Carlo simulation findings and the real-data application indicate that the suggested estimator may significantly reduce the mean squared error in comparison with other competing estimators. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
10. Non-Parametric Estimation for Locally Stationary Integer-Valued Processes.
- Author
- Bendjeddou, Sara and Sadoun, Mohamed
- Subjects
- *ASYMPTOTIC normality, *NONPARAMETRIC estimation, *STATIONARY processes, *KERNEL functions, *INTEGERS
- Abstract
This paper studies the non-parametric negative binomial quasi-maximum likelihood estimation (NBQMLE) for locally stationary integer-valued processes. We consider two locally stationary integer-valued models of negative binomial type, namely the INARCH(p) and INGARCH(p, q) models. Imposing some contraction arguments, we extend the stationary negative-binomial QMLE to a localized one in our non-stationary environment. This estimation method is based on a kernel function and achieves a convergence rate of order √(nh_n). Under some regularity assumptions, the consistency as well as the asymptotic normality of the obtained estimator are established. The performance of the established estimators is evaluated via a simulation study and an application to a real data set. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
11. New methods for phase II monitoring of multivariate simple linear profiles.
- Author
- Ghasemi, Zohre, Zeinal Hamadani, Ali, and Ahmadi Yazdi, Ahmad
- Subjects
- *STATISTICAL process control, *MOVING average process, *PRODUCT quality
- Abstract
In some statistical process control applications, the quality of a process or product is characterized by a function that relates a response variable to one or more explanatory variables, referred to as a "profile". In some cases, multivariate simple linear profiles are required for effective quality modeling. In these profiles, there is a set of correlated response variables, regressed on an explanatory variable. There have been only a limited number of studies on monitoring multivariate simple linear profiles. In this paper, three new methods based on the multivariate homogeneously weighted moving average (MHWMA) control chart are proposed to improve the monitoring of multivariate simple linear profiles. The performance of the proposed methods is evaluated using the simulated average run length (ARL) metric, and a comprehensive comparison is conducted between the proposed methods and the existing ones. The results indicate that the proposed methods function very well under the various conditions considered. The practical application of the proposed control charts is also demonstrated using two real case studies. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
12. Distributed estimation for large-scale expectile regression.
- Author
- Pan, Yingli, Wang, Haoyu, Zhao, Xiaoluo, Xu, Kaidong, and Liu, Zhan
- Subjects
- *DISTRIBUTED algorithms, *DATA warehousing, *REGRESSION analysis, *HETEROSCEDASTICITY, *HEALTH surveys
- Abstract
Analysis of large volumes of data is very complex due not only to the high level of skewness and heteroscedasticity of variance but also to the difficulty of data storage. Expectile regression is a common alternative method for analyzing heterogeneous data. Distributed storage can effectively reduce the storage burden of a single machine. In this paper, we consider fitting a linear expectile regression model to estimate conditional expectiles based on large-scale data. We store the data in a distributed manner and construct a gradient-enhanced loss (GEL) function as a proxy for the global loss function. A distributed algorithm is proposed for the optimization of the GEL function. The asymptotic properties of the proposed estimator are established. Simulation studies are conducted to assess the finite-sample performance of our proposed estimator. An application to the National Health Interview Survey data set demonstrates the practicability of the proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
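Entry 12 builds on expectile regression. The simplest building block, a single τ-expectile of a sample, can be sketched with a fixed-point iteration on its first-order condition; the function name and iteration scheme here are illustrative, not the paper's distributed GEL algorithm.

```python
def expectile(data, tau, tol=1e-12, max_iter=1000):
    # tau-expectile: minimiser of the asymmetric squared loss
    #   sum_i |tau - 1{x_i < e}| * (x_i - e)^2
    # solved by iterating the weighted-mean first-order condition
    e = sum(data) / len(data)
    for _ in range(max_iter):
        w = [tau if x > e else 1.0 - tau for x in data]
        e_new = sum(wi * xi for wi, xi in zip(w, data)) / sum(w)
        if abs(e_new - e) < tol:
            return e_new
        e = e_new
    return e
```

For τ = 0.5 this recovers the ordinary mean; larger τ pulls the estimate toward the upper tail, which is what makes expectile regression useful for skewed, heteroscedastic data.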
13. A new goodness of fit test for gamma distribution with censored observations.
- Author
- Vaisakh, K. M., Sreedevi, E. P., and Kattumannil, Sudheesh K.
- Subjects
- *MONTE Carlo method, *GAMMA distributions, *U-statistics, *STATISTICS, *CENSORSHIP
- Abstract
In the present paper, we develop a new goodness-of-fit test for the gamma distribution based on the fixed point characterization. We use U-statistics theory to derive the test statistic. We discuss how right-censored observations are incorporated into the proposed test. The asymptotic properties of the test statistics in the censored and uncensored cases are studied. We carry out a Monte Carlo simulation study to validate the finite-sample performance of the proposed tests. We also illustrate the test procedures using real data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
14. Large portfolio allocation based on high-dimensional regression and Kendall's Tau.
- Author
- Zhang, Zhifei, Yue, Mu, Huang, Lei, Wang, Qin, and Yang, Baoying
- Subjects
- *SHARPE ratio, *ROBUST optimization, *RETURN on assets, *EMPIRICAL research
- Abstract
In financial investments, portfolio allocation is always one of the most fundamental and challenging tasks. This paper proposes a robust portfolio optimization approach, extending the application of the classical mean-variance (M-VAR) method to high-dimensional situations, in which the asset returns yielded by the proposed method can be enhanced to some extent. It is called Kendall's tau unconstrained shrinkage regression for the M-VAR method (KUSR-MV). Representative empirical studies show that it provides a more robust estimation of high-dimensional portfolio allocation than its competitors. Besides, its Sharpe ratio can be improved while the risk constraint remains well satisfied. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
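A small building block behind entry 14 is Kendall's tau together with the Greiner relation ρ = sin(πτ/2), which converts the rank-based tau into a Pearson-type correlation and is exact for elliptical distributions. This sketch covers only that building block, not the paper's full KUSR-MV procedure.

```python
import math

def kendall_tau(x, y):
    # tau-a: (concordant pairs - discordant pairs) / C(n, 2)
    n = len(x)
    num = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            num += (s > 0) - (s < 0)
    return num / (n * (n - 1) / 2)

def tau_to_pearson(tau):
    # Greiner's relation; robust because tau depends only on ranks,
    # so heavy-tailed returns do not distort the correlation estimate
    return math.sin(math.pi * tau / 2.0)
```

Applying `tau_to_pearson` entry-wise to a matrix of pairwise taus gives a robust correlation-matrix estimate that can feed a mean-variance optimizer.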
15. Comparison of the response time-based effort-moderated IRT model and three-parameter logistic model according to computerized adaptive test performances: a simulation study.
- Author
- Arslan, Yusuf Kemal, Alkan, Afra, and Elhan, Atilla Halil
- Subjects
- *COMPUTER adaptive testing, *ITEM response theory, *FISHER information, *STIMULUS & response (Psychology), *MODEL theory
- Abstract
With developments in technology and information, paper-and-pencil tests are giving way to computerized adaptive tests (CATs). CAT is widely used in the field of health, mainly in psychiatry. Many item response theory models using response time have been proposed in the literature that focus on item difficulty and personal characteristics while ignoring multidimensional interactions; these omissions may bias estimates of individual ability levels. The present simulation study was conducted to compare the CAT performance of the effort-moderated item response theory (EM-IRT) model, which is based on response time, and the three-parameter logistic (3PL) model. While simulating CAT with the EM-IRT model and the 3PL model, the hybrid method was used for ability estimation and maximum Fisher information (MFI) was used for item selection. The CAT process proceeded until the standard error of the estimation was <0.3 or <0.5, or all items in the item bank were used. The number of individuals was specified as 1000, while the number of items was set to 50, 100, and 250. All six scenarios were repeated 1000 times. As the number of items increased and the standard-error stopping criterion decreased, both methods produced results consistent with true ability levels. The CAT with the EM-IRT model estimated the true ability level slightly lower than the CAT with the 3PL model. The EM-IRT model enables measuring response time, which could yield additional data for the physician about the mental and cognitive condition of the patient. The CAT method can be a promising method of telemedicine in the era of the pandemic. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
16. Two new Bayesian-wavelet thresholds estimations of elliptical distribution parameters under non-linear exponential balanced loss.
- Author
- Batvandi, Ziba, Afshari, Mahmoud, and Karamikabir, Hamid
- Subjects
- *BAYES' estimation, *TERTIARY structure, *EXPONENTIAL functions, *TEST validity
- Abstract
The estimation of mean vector parameters is very important in elliptical and spherical models. Among the different methods, Bayesian and shrinkage estimation are of particular interest. In this paper, the estimation of the p-dimensional location parameter for p-variate elliptical and spherical distributions under an asymmetric loss function is investigated. We find the generalized Bayes estimator of the location parameter for elliptical and spherical distributions. We also show the minimaxity and admissibility of the generalized Bayes estimator in the class SS_p(θ, σ²I_p). We introduce two new shrinkage soft-wavelet threshold estimators, based on the Huang (empirical) shrinkage wavelet estimator and Stein's unbiased risk estimator (SURE), for elliptical and spherical distributions under a non-linear exponential-balanced loss function. Finally, we present a simulation study to test the validity of the class of proposed estimators, and a physicochemical properties of tertiary structure data set is used to test the efficiency of these estimators in denoising. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
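Entry 16's wavelet side rests on standard soft thresholding, which can be illustrated together with the Donoho-Johnstone universal threshold σ√(2 log n). This is generic background for the entry, not the paper's proposed Bayesian-wavelet estimators.

```python
import math

def soft_threshold(w, lam):
    # soft-thresholding rule: sign(w) * max(|w| - lam, 0);
    # small coefficients are zeroed, large ones shrunk toward zero
    if abs(w) <= lam:
        return 0.0
    return w - lam if w > 0 else w + lam

def universal_threshold(sigma, n):
    # Donoho-Johnstone universal threshold sigma * sqrt(2 log n),
    # a common default before SURE-type data-driven refinements
    return sigma * math.sqrt(2.0 * math.log(n))
```

In wavelet denoising, one applies `soft_threshold` to each detail coefficient and then inverts the transform; SURE-based rules replace the fixed λ with one minimizing an unbiased risk estimate.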
17. Copula-based measures and tests for conditional asymmetry.
- Author
- Mokhtari, E., Dolati, A., and Dastbaravarde, A.
- Subjects
- *DATA analysis, *SYMMETRY, *STATISTICS
- Abstract
In this paper, we develop copula-based measures and testing procedures for the conditional asymmetry of a random variable given another. The resulting tests are rank-based and thus robust to outliers. The asymptotic properties of the test statistics are investigated. Simulation studies and real data analysis are conducted to illustrate the performance of the proposed tests. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
18. Improving inference in exponential logarithmic distribution.
- Author
- Singh Nayal, Amit, Ramos, Pedro Luiz, Tyagi, Abhishek, and Singh, Bhupendra
- Subjects
- *MONTE Carlo method, *MAXIMUM likelihood statistics, *LEAST squares, *PERCENTILES, *COMPARATIVE studies
- Abstract
This paper presents a comprehensive comparative study of twelve estimation methods for the exponential logarithmic distribution. The methods investigated include the maximum likelihood estimator, moments estimator, modified moments estimator, least squares estimator, weighted least squares estimator, percentiles estimator, maximum product spacings estimator, minimum spacing absolute distance estimator, minimum spacing absolute-log distance estimator, Cramér-von Mises estimator, Anderson-Darling estimator, and the right-tail Anderson-Darling estimator. Through simulation studies, we assess the efficiency of each estimator based on the mean relative estimate, mean squared error, D_abs, and D_max criteria, with preference given to estimators exhibiting a mean relative estimate close to one and smaller values of mean squared error, D_abs, and D_max. Additionally, the practical applicability of these methods is demonstrated through the analysis of three real datasets, showcasing their effectiveness in various scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Optimal dividend policy for a jump-diffusion process with Markov switching.
- Author
- Zhang, Zhenzhong, Hua, Zheng, Tong, Jinying, and Zhao, Xin
- Abstract
In this paper, we focus on the dividend optimization problem for a corporation with a hybrid jump-diffusion process in the presence of fixed and proportional transaction costs. Under some assumptions, a verification theorem is proved to solve the optimal control problem. The optimal dividend policy and the corresponding value functions are found for any finite Markov chain. In addition, a stochastic approximation algorithm is used to estimate the optimal dividend barriers. Finally, some numerical examples are given to illustrate our results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Moment estimation for uncertain moving average model with application to CO2 emission.
- Author
- Liu, Zhe and Liu, Yang
- Subjects
- *UNCERTAINTY (Information theory), *TIME series analysis, *MOMENTS method (Statistics), *PARAMETER estimation, *CARBON emissions
- Abstract
Time series analysis plays a pivotal role in numerous domains, including finance, economics, meteorology, and engineering. When precise observations cannot be obtained due to economic and technological limitations, or when the frequency is unstable due to the complexity of the world, uncertain time series models are recommended over stochastic time series models. Since there are some unknown parameters in uncertain time series models, reasonable estimation of model parameters is fundamental to the development of effective models. Motivated by this, this paper examines a novel parameter estimation method based on the method of moments for the uncertain moving average model, which can be applied to both imprecise and precise observations. Based on the proposed estimation method, the forecast value and confidence interval for a future value are also provided. A numerical example and real data analyses using imprecise and precise observations of carbon emission data are documented to illustrate the effectiveness of our methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Copula-based multivariate control charts for monitoring multiple dependent Weibull processes.
- Author
- Chen, Peile and Zhang, Jiujun
- Subjects
- *MONTE Carlo method, *WEIBULL distribution, *MOVING average process, *QUALITY control charts, *POLYSEMY
- Abstract
Monitoring multivariate time between events (MTBE) data is critical in areas such as manufacturing and service operations. Monitoring of multiple dependent Weibull processes is often required in high-quality processes, yet existing methods for monitoring Weibull time between events (TBE) data have been developed for univariate processes. This paper develops multivariate exponentially weighted moving average (MEWMA) and multivariate cumulative sum (MCUSUM) control charts for monitoring the mean vector of multiple dependent Weibull processes under the Normal, Clayton, Frank, Gumbel, and Joe copula models. The performance of the two charts is evaluated using Monte Carlo simulation based on the average time to signal (ATS) metric for in-control and out-of-control states. Two applications further illustrate the use of the charts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. Over-sampling methods for mixed data in imbalanced problems.
- Author
- Alonso, Hugo and Pinto da Costa, Joaquim Fernando
- Subjects
- *SCALING (Social sciences), *CLASSIFICATION, *COMPARATIVE studies
- Abstract
In practice, it is common to find imbalanced classification problems, where one or more classes have many fewer examples than the others. There are several ways to deal with imbalance in order to improve the classification results in the less represented class(es), and one of them consists in applying re-sampling methods. Furthermore, it is no less common for data sets in imbalanced classification problems to be a mix of nominal, ordinal, quantitative discrete, and continuous data. However, the true nature of the data tends to be ignored, as when ordinal data are treated as nominal. In this paper, we propose several re-sampling methods for mixed data, which take into account the four scales of measurement usually found in real data. They are based on the popular synthetic minority over-sampling technique, or SMOTE. We consider different measures of distance adequate for mixed data. We also introduce new ways of creating the synthetic examples, using all of the nearest neighbors. We show through a comparative study that it pays off to take into account the true nature of the data and to use the new ways of creating synthetic examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
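Entry 22 extends SMOTE to mixed data; the classic continuous-feature version it starts from interpolates between a minority point and one of its nearest minority neighbours. The sketch below uses illustrative names and covers only that baseline, not the paper's mixed-data distance measures.

```python
import math
import random

def smote(minority, n_new, k=3, seed=0):
    # classic SMOTE on continuous features: each synthetic point is a
    # random convex combination of a minority point and one of its
    # k nearest minority neighbours
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted((p for p in minority if p != x),
                            key=lambda p: math.dist(p, x))[:k]
        z = rng.choice(neighbours)
        t = rng.random()
        synthetic.append(tuple(a + t * (b - a) for a, b in zip(x, z)))
    return synthetic
```

The mixed-data extensions in the paper replace the Euclidean distance with measures suited to nominal and ordinal scales and adapt the interpolation step accordingly.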
23. Tests for spatial dependence and temporal heterogeneity in time-varying coefficient spatial autoregressive panel data model.
- Author
- Li, Tizheng and Tan, Yundi
- Subjects
- *INFERENTIAL statistics, *PANEL analysis, *HOME prices, *DATA analysis, *DATA modeling
- Abstract
The time-varying coefficient spatial autoregressive panel data model is a powerful tool for simultaneously dealing with spatial dependence and temporal heterogeneity in spatial panel data analysis. However, little work has been devoted to the related statistical inference issues, which largely limits the application scope of the model. In this paper, we develop two generalized-likelihood-ratio-statistic-based bootstrap tests to detect spatial dependence of the response variable and temporal heterogeneity of the regression relationship, respectively. The simulation studies show that both tests have accurate size and satisfactory power, and are quite robust to non-normality of the error distribution. A house price data set is finally analyzed to demonstrate the application of the proposed tests in detecting spatial dependence and temporal heterogeneity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Asymptotic properties of histogram density estimation for long-span high-frequency data in diffusion processes.
- Author
- Zhang, Shasha, Liang, Dan, Yang, Shanchao, Wu, Zhaoshuo, Yang, Xiangjun, and Yang, Xiutao
- Abstract
This paper mainly focuses on the asymptotic properties of histogram density estimation for long-span high-frequency data in diffusion processes. We use some inequalities for mixing long-span high-frequency data as research tools to derive the asymptotic variance, integrated mean squared error, and globally optimal bandwidth under strong mixing conditions. Furthermore, the asymptotic normality, with the convergence rate of the histogram density estimation, is provided. The simulation results show that as the sample size increases, the fit and asymptotic normality of the histogram density estimation steadily improve, and the estimate is little affected by the sampling interval. The empirical analysis demonstrates that the histogram density estimation not only characterizes the density function of the data but also describes its distributional features. We can identify distribution types using the distributional features of the histogram density estimation, and can fit the Laplace distribution by the least-squares deviation principle, which offers more research avenues for the financial field. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
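The histogram density estimate studied in entry 24 is, in its basic form, just bin counts rescaled so the heights integrate to one. The root-n bin rule below is an illustrative default, not the paper's globally optimal bandwidth.

```python
import math

def histogram_density(data, n_bins=None):
    # histogram density estimate; bin count defaults to a root-n rule
    n = len(data)
    if n_bins is None:
        n_bins = max(1, round(math.sqrt(n)))
    lo, hi = min(data), max(data)
    width = (hi - lo) / n_bins if hi > lo else 1.0
    counts = [0] * n_bins
    for x in data:
        # clamp the maximum into the last bin
        i = min(int((x - lo) / width), n_bins - 1)
        counts[i] += 1
    # heights c_j / (n * width) integrate to one over the data range
    return lo, width, [c / (n * width) for c in counts]
```

The paper's contribution is the asymptotic theory (variance, integrated MSE, optimal width, normality) for this estimator when the data come from a long-span high-frequency diffusion sample rather than an i.i.d. sample.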
25. Jackknife empirical likelihood inference for the lifetime performance index.
- Author
- Estabraqi, Javad, Mahmoudi, Eisa, and Nadeb, Hossein
- Subjects
- *MONTE Carlo method, *MANUFACTURING industries, *CONFIDENCE intervals, *PROBABILITY theory
- Abstract
An important topic in the manufacturing industry is the assessment of lifetime performance. In this paper, we propose the jackknife empirical likelihood (JEL) and some of its developments to construct confidence intervals for the lifetime performance index. We compare the performance of the proposed methods with the normal approximation (NA) and two bootstrap methods in terms of coverage probability, average length, and computational cost via Monte Carlo simulation. Then, we apply the proposed methods to construct confidence intervals based on a real dataset. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
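Entry 25's jackknife empirical likelihood starts from jackknife pseudo-values. Below is a minimal sketch of the pseudo-value construction paired with a normal-approximation interval; the JEL calibration itself is more involved and is not reproduced here.

```python
import math
import statistics

def jackknife_pseudo_values(data, estimator):
    # pseudo-value i: n * T(all data) - (n - 1) * T(all but observation i)
    n = len(data)
    theta = estimator(data)
    return [n * theta - (n - 1) * estimator(data[:i] + data[i + 1:])
            for i in range(n)]

def jackknife_ci(data, estimator, z=1.96):
    # treat pseudo-values as approximately i.i.d. and form a
    # normal-approximation interval from their mean and standard error
    pv = jackknife_pseudo_values(data, estimator)
    m = statistics.mean(pv)
    se = statistics.stdev(pv) / math.sqrt(len(pv))
    return m - z * se, m + z * se
```

For the sample mean the pseudo-values reduce to the observations themselves; JEL instead applies empirical likelihood to the pseudo-values, which is what gives its better coverage for indices like the lifetime performance index.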
26. Ordinal family data analysis with measurement error in covariates.
- Author
- Rana, Subrata and Roy, Surupa
- Subjects
- *ERRORS-in-variables models, *PARAMETER estimation, *MEASUREMENT errors, *REGRESSION analysis, *DATA analysis
- Abstract
Most genetic studies recruit high-risk families, and the discoveries are based on nonrandomly selected groups. We must consider the consequences of this ascertainment process in order to apply the results of genetic research to the general population. Furthermore, in such studies, problems arise in inferring the disease status correctly, since measurements of risk factors observed in the study are often noisy or indirect. In this paper, we examine ascertainment as well as measurement error effects on the parameter estimates. En route, we develop a flexible model that adjusts the standard cumulative logistic regression model for clustered ordinal data by incorporating ascertainment and measurement error corrections. A customized Monte Carlo EM (MCEM) based technique is employed to analyze the model. The novelty of this technique is its flexibility to handle the high-dimensional setup. The simulation studies suggest that the ascertainment adjustment is necessary and effective, and the need for measurement error adjustment in the model is also justified. The methodology is illustrated using small-scale survey data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Model selection for big multivariate time series data using emulators.
- Author
- Wu, Brian
- Subjects
- *TIME series analysis, *CROSS correlation, *BIG data, *LODGING-houses, *HUMIDITY, *KRIGING
- Abstract
Order identification for models of big time series data presents computational challenges. Results from previous studies on big univariate time series suggest that methods based on kriging and optimization can reduce the computing time substantially while providing adequately plausible model orders. In today's world, however, one must analyze multiple big time series simultaneously, such as multiple stocks or humidity measurements in various rooms of a house. This becomes a much bigger computational challenge to address, as one must take into account the cross-correlation between the individual time series. The goal of this paper is to detail a method to fit big multivariate time series. The results show that the proposed technique can substantially decrease computing time while still providing reasonably accurate model orders. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Design of a one-sided adaptive EWMA X̄ control chart using the truncation method in the presence of measurement errors.
- Author
-
Xie, FuPeng, Hu, XueLong, Tang, Anan, and Zhang, Jiujun
- Subjects
- *
MARKOV chain Monte Carlo , *MONTE Carlo method , *ERRORS-in-variables models , *QUALITY control charts , *MOVING average process , *MEASUREMENT errors - Abstract
In this paper, a one-sided adaptive exponentially weighted moving average X̄ scheme utilizing the truncation method (called the one-sided TAEWMA X̄ chart) is proposed for monitoring processes involving measurement errors. Leveraging the linear covariate error model, both the Markov chain model and the Monte Carlo simulation are established to evaluate the run length (RL) properties of the proposed chart. In the presence of measurement errors, an optimal search strategy is developed to identify the optimal design parameters of the scheme. Subsequently, the effects of measurement errors on both zero-state and steady-state average run length (ARL) performance of the proposed scheme are studied. Comparative studies with two other existing schemes reveal that, despite the significant effects of measurement errors on ARL performance, the proposed chart remains superior to the existing competing schemes in shift detection, especially when multiple measurement operations are involved. Finally, an illustrative example is provided to demonstrate the application of the suggested scheme. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
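The truncation idea in the entry above can be sketched in a few lines: observations below the in-control mean are clipped before EWMA smoothing, so only upward deviations accumulate. This minimal sketch omits the adaptive weighting and the linear covariate measurement error model from the paper, and the function name is illustrative:

```python
import numpy as np

def truncated_ewma(x, mu0=0.0, lam=0.2):
    """One-sided EWMA: observations below mu0 are truncated so that
    only upward deviations accumulate in the statistic."""
    z, path = mu0, []
    for xi in x:
        z = (1 - lam) * z + lam * max(xi, mu0)
        path.append(z)
    return np.array(path)

rng = np.random.default_rng(3)
x = rng.standard_normal(50)
x[25:] += 3.0                       # upward mean shift halfway through
z = truncated_ewma(x)
print(z[-1] > z[24])                # the statistic rises after the shift
```

A control chart would compare each z value against an upper control limit chosen for a target in-control average run length.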
29. Spatial lag quantile regression for compositional data.
- Author
-
Zhao, Yizhen, Ma, Xuejun, and Chao, Yue
- Subjects
- *
LINEAR programming , *REGRESSION analysis , *COMPUTER simulation , *DATA modeling , *QUANTILE regression - Abstract
While research on spatial lag quantile regression models exists, the extension incorporating compositional data within this framework remains unexplored. The unique characteristics of compositional data present significant issues in constructing such a model. In this paper, we investigate the estimation problems for the spatial lag quantile regression model with compositional data. We propose the constrained two-stage quantile regression (CTS-QR) and constrained instrumental variable quantile regression (CIV-QR) methods based on linear programming. Numerical simulations show that our proposed methods are more accurate than traditional unconstrained estimation methods. A real data application of our methods is also provided. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Evaluation of threshold selection methods for adaptive wavelet quantile density estimation in the presence of bias.
- Author
-
Shirazi, Esmaeil and Doosti, Hassan
- Subjects
- *
BESOV spaces , *ESTIMATION bias , *DENSITY - Abstract
In this paper, the estimation of the quantile density function based on i.i.d. biased observations is investigated. The bias function is assumed to be positive and bounded. Among the various smoothing methods for selecting the model parameters, hard and block thresholding methods are proposed, and two adaptive estimators based on them are constructed. We evaluate their theoretical performance via the minimax approach over Besov balls. We show that these estimators attain near-optimal and optimal convergence rates under some mild assumptions. Finally, through a simulation study and an application to a real data set, the performance of these estimators is compared to that of other wavelet methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. A novel algorithm for classification using a low rank approximation of kernel-based support vector machines with applications.
- Author
-
Chatrabgoun, O., Esmaeilbeigi, M., Daneshkhah, A., Kamandi, A., and Salimi, N.
- Subjects
- *
SUPPORT vector machines , *MATRIX decomposition , *CLASSIFICATION algorithms , *RECEIVER operating characteristic curves , *STOCK price indexes - Abstract
Support vector machines (SVMs), as a powerful technique for classification, are becoming increasingly popular in a wide range of applications, largely due to their robustness against several types of model assumption violations and outliers. Kernel-based SVMs are very useful for capturing non-linear patterns in the data. However, the kernel-based method can become computationally very challenging because it increases the time required to train the model. This increase in computational time is mainly due to the appearance of the kernel in the quadratic optimization problem (QOP). To tackle this computational complexity, we propose a novel method based on low-rank approximation, adapting a truncated Mercer series to the kernels. The quadratic optimization problem in the kernel-based SVM is then replaced with a much simpler optimization problem. In the proposed approach, the vector computations and matrix decompositions are much faster, which leads to an efficient solution of the QOP and ultimately greater efficiency in classification. We finally present numerical illustrations based on ROC curves and the other classification performance benchmarks considered in this paper to assess the proposed low-rank kernel approximation in the SVM structure. The results show considerable efficiency improvement in classification, with a significant reduction in the computational time required to train and forecast the stock market index (S&P 500) and promoter recognition in DNA sequences. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
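The low-rank idea in the entry above can be illustrated with a truncated eigendecomposition of an RBF kernel matrix, a stand-in for the paper's truncated Mercer series (the exact series construction differs, and the bandwidth and data here are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
gamma = 0.1                              # assumed RBF bandwidth

# Full RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-gamma * sq)

# Rank-r truncation of the eigendecomposition, keeping the r largest
# eigenvalues (np.linalg.eigh returns them in ascending order)
r = 20
w, V = np.linalg.eigh(K)
K_r = (V[:, -r:] * w[-r:]) @ V[:, -r:].T

rel_err = np.linalg.norm(K - K_r) / np.linalg.norm(K)
print(rel_err)   # tiny: the smooth kernel's spectrum decays fast
```

Replacing K with K_r inside the SVM dual turns the n-by-n kernel system into operations on n-by-r factors, which is the source of the training speed-up.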
32. Estimating quality adjusted life years in the absence of standard utility values – a dynamic joint modeling approach.
- Author
-
Deo, Vishal and Grover, Gurprit
- Subjects
- *
UTILITY functions , *SURVIVAL analysis (Biometry) , *MEDICAL economics , *AIDS patients , *QUALITY of life - Abstract
Estimation of Quality Adjusted Life Years (QALYs) is pivotal to cost-effectiveness analysis (CEA) of medical interventions. The popular multi-state decision analytic modeling approach to CEA uses standard utility values assigned to each disease state to estimate QALY. In this paper, we have formulated a new approach to estimate QALY by defining utility as a function of a longitudinal covariate significantly associated with disease progression. The association parameter between the longitudinal covariate and survival times has been estimated through joint modeling of the longitudinal model and the Weibull accelerated failure time survival model. MCMC techniques have been used to predict the expected survival time of each censored case using the fitted model. Time-dependent utility values, calculated using projected values of the longitudinal covariate, have been used to evaluate QALYs for each patient. The proposed methodology has been demonstrated on retrospective survival data of HIV/AIDS patients. A simulation exercise has also been carried out to gauge the predictive capability of the joint model in projecting the values of the longitudinal covariate. Results show that the proposed dynamic approach to estimate QALY can be a promising alternative to the popular multi-state decision analytic modeling approach, especially when the standard utility values are not available. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Analysis of power in preprocessing methodologies for datasets with missing values.
- Author
-
Carvalho, Iago A. and Moreira, Arthur F.
- Subjects
- *
DECISION making , *MISSING data (Statistics) , *INFERENTIAL statistics , *STATISTICS , *ALGORITHMS - Abstract
The empirical evaluation of algorithms usually produces a large set of data that needs to be assessed through an appropriate statistical methodology. Sometimes, the generated dataset has missing entries due to the inability of an algorithm to compute a solution for a given benchmark. These missing entries largely restrict the use of statistical tests, in such a way that classic parametric or non-parametric tests cannot correctly evaluate such datasets. There are some preprocessing methods in the literature to deal with this problem. In this paper, we evaluate four of these methods: the Bi-objective Lexicographical Ranking Scheme, PAR10 scores, the Skillings-Mack test, and the Wittkowski test. We measure the power of Friedman's test when each one of them is used. Our results indicate that the Bi-objective Lexicographical Ranking Scheme or the PAR10 scores should be used when the number of missing entries is small or unknown in advance, while the Skillings-Mack test is recommended when more than 30% of the entries are missing. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
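Of the four preprocessing methods in the entry above, PAR10 is the simplest to state: a missing entry (an unsolved benchmark) is replaced by ten times the time cutoff. A minimal sketch of that convention (the paper's exact handling may differ in details):

```python
def par10(times, cutoff):
    """PAR10 score: runs that failed or exceeded the cutoff (recorded
    here as None) are penalized as 10x the cutoff time."""
    penalized = [t if t is not None and t <= cutoff else 10 * cutoff
                 for t in times]
    return sum(penalized) / len(penalized)

runs = [12.0, 47.5, None, 3.2]          # one missing entry
print(par10(runs, cutoff=60))           # (12 + 47.5 + 600 + 3.2) / 4
```

After this preprocessing the dataset is complete, so rank-based tests such as Friedman's can be applied directly.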
34. Distribution of squared sum of products of independent Nakagami-m random variables.
- Author
-
Samuh, M. H. and Salhab, A. M.
- Subjects
- *
CENTRAL limit theorem , *GAMMA distributions , *WIRELESS communications , *STATISTICS - Abstract
The need for the distribution of combinations of random variables (RVs) arises in many areas of science and engineering. In this paper, closed-form approximations for the distribution of the squared sum of products of independent Nakagami-m RVs are derived. Three different approaches (central limit theorem, Edgeworth expansion, and a one-term gamma approximation) are considered. The Kolmogorov-Smirnov, Anderson-Darling, and Cramer-von Mises statistics are used as quantitative metrics to compare the derived distribution forms with the empirical distribution obtained from a simulation study. Furthermore, an application of the derived distributions in the wireless communication field is presented. As a result, it is shown that the most accurate and simplest closed-form expression is the one obtained by the one-term gamma approximation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
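The one-term gamma approximation favored in the entry above amounts to matching the first two moments of the target variable to a gamma distribution. The sketch below (assuming scipy is available) matches empirical moments from simulation, whereas the paper derives the moments in closed form; all parameter values are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
m, omega, N, n = 2.0, 1.0, 3, 50_000    # illustrative parameters

# Nakagami(m, omega) draws via the square root of a Gamma variable
def nakagami(size):
    return np.sqrt(rng.gamma(shape=m, scale=omega / m, size=size))

# Squared sum of N products of independent Nakagami-m RVs
y = (nakagami((n, N)) * nakagami((n, N))).sum(axis=1) ** 2

# One-term gamma approximation by matching the first two moments
mean, var = y.mean(), y.var()
shape, scale = mean**2 / var, var / mean
approx = stats.gamma(a=shape, scale=scale)

# Kolmogorov-Smirnov distance between the sample and the fit
d = stats.kstest(y, approx.cdf).statistic
print(d)   # small distance -> a good one-term gamma fit
```

The same KS distance computed against the CLT or Edgeworth approximations would reproduce the paper's comparison.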
35. A computationally efficient Gibbs sampler based on data-augmentation strategy for estimating the reparameterized DINA model.
- Author
-
Zhang, Jiwei, Zhang, Zhaoyuan, and Lu, Jing
- Subjects
- *
ABILITY grouping (Education) , *BAYESIAN field theory , *REDUCTION potential , *ALGORITHMS , *GIBBS sampling , *PROBABILITY theory - Abstract
With the increasing demand for precise test feedback, cognitive diagnosis models (CDMs) have attracted more and more attention for fine classification of students with regard to their ability to master given skills. The aim of this paper is to use a highly effective Gibbs algorithm based on auxiliary variables (GAAV) to estimate the deterministic input noisy "and" gate (DINA) model that is widely used for cognitive diagnosis. The applicability of the algorithm to other CDMs is also discussed. Unlike the Metropolis-Hastings algorithm, this new algorithm does not require repeated adjustment of the tuning parameters to achieve an appropriate acceptance probability, and it also overcomes the dependence of the traditional Gibbs sampling algorithm on the conjugate prior distribution. Four simulation studies are conducted, and a detailed analysis of fraction subtraction test data is carried out to further illustrate the proposed methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Solution to the N-player gambler's ruin using recursions based on multigraphs.
- Author
-
Marfil, Ramon Iñigo D. and David, Guido
- Subjects
- *
DISCRETE mathematics , *GRAPH theory , *MARKET capitalization , *LINEAR systems , *MULTIGRAPH , *MARKOV processes - Abstract
This paper studies the N-player gambler's ruin problem where two players are involved in an even-money bet during each round. The objective is to solve for each player's final placing probabilities given their initial wealths, as well as the expected time until ruin. Multigraphs were constructed to model the transitions between chip states, and linear systems were constructed based on these graphs. Solutions for the placing probabilities of each player were obtained from these linear systems. A numerical algorithm is developed to solve the general N-player gambler's ruin for any positive integer chip total. The expected time until ruin for any initial state can be solved using absorbing Markov chains. The solution leads to exact values, and equities in this model depend on the exact number, not just the proportion, of chips each player holds. Compared with the usual lattice model, the multigraph model uses smaller matrices in computations, yielding considerable savings in the number of operations and program run time. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
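The absorbing-Markov-chain step in the entry above is the standard fundamental-matrix computation: the expected time to absorption satisfies t = (I - Q)^{-1} 1, where Q is the transition matrix restricted to transient states. A toy two-player instance (not the paper's multigraph construction) checks out against the classic formula:

```python
import numpy as np

# Expected time until ruin from the fundamental matrix of an
# absorbing Markov chain: t = (I - Q)^{-1} 1, where Q holds the
# transition probabilities among the transient states.
# Toy 2-player case: 4 chips total, fair one-chip bets; the transient
# states are player 1 holding 1, 2 or 3 chips.
Q = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.0]])
t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print(t)   # [3. 4. 3.], matching the classic i*(T-i) formula
```

For N players the state space and Q grow, but the same linear solve applies; the paper's contribution is keeping those matrices small via the multigraph encoding.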
37. Exact and explicit analytical solutions for optimal pension management with general utilities.
- Author
-
Chen, Wenting, Yin, Changhao, and Lv, Kaiyu
- Subjects
- *
PENSION trust management , *UTILITY functions , *ANALYTICAL solutions , *PENSIONS , *NONLINEAR equations - Abstract
This paper considers the optimal management for both the defined-contribution (DC) and defined-benefit (DB) pension plans in a continuous time framework. In particular, analytical solutions for both plans with general utility functions are derived for the first time. The current solutions are written in the form of Taylor series expansions and constructed through the homotopy analysis method (HAM). It is theoretically shown that the series solutions, if convergent, are indeed the exact solutions of the nonlinear HJB equations arising from the optimal pension fund management. Numerical experiments are presented to demonstrate the accuracy and versatility of the current solution approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. A unified model specification for sparse and dense functional/longitudinal data.
- Author
-
Hu, Lixia and Rui, Gao
- Subjects
- *
MONTE Carlo method , *INFERENTIAL statistics , *DATA modeling , *ADDITIVES - Abstract
The semi-varying-coefficient additive model is a flexible nonparametric regression method, including the varying-coefficient model and the additive model as special cases. However, a complex model may lead to an over-fitting phenomenon, which motivates us to develop a set of testing procedures to judge whether a parsimonious submodel is sufficient. Specifically, we propose hypothesis tests to check the time-varying property of varying-coefficient component functions and the linearity of additive component functions, respectively. For repeated measurements data, the choice between the sparse and dense cases is subjective, which may lead to wrong statistical inference. One major contribution of this paper is to introduce consistent testing methodologies in a unified framework covering the sparse, dense, and ultra-dense cases, which avoids a subjective choice of data type in practical applications. Extensive Monte Carlo simulation studies investigating the finite sample performance of the proposed methodologies confirm our asymptotic results. We further illustrate our methodologies via two real-life data applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Comparison of two coefficients of variation: a new Bayesian approach.
- Author
-
Bertolino, F., Columbu, S., Manca, M., and Musio, M.
- Subjects
- *
MARKOV chain Monte Carlo , *HYPOTHESIS , *LITERATURE , *MEASUREMENT - Abstract
The coefficient of variation is a useful indicator for comparing the spread of values between datasets with different units or widely different means. In this paper we address the problem of investigating the equality of the coefficients of variation from two independent populations. In order to do this we rely on the Bayesian Discrepancy Measure recently introduced in the literature. Computing this Bayesian measure of evidence is straightforward when the coefficient of variation is a function of a single parameter of the distribution. In contrast, it becomes difficult when it is a function of more parameters, often requiring the use of MCMC methods. We calculate the Bayesian Discrepancy Measure by considering a variety of distributions whose coefficients of variation depend on more than one parameter. We consider also applications to real data. As far as we know, some of the examined problems have not yet been covered in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. Uncertain vector autoregressive smoothly moving average model.
- Author
-
Shi, Yuxin and Sheng, Yuhong
- Subjects
- *
MOVING average process , *TIME series analysis , *FIX-point estimation , *LEAST squares , *AUTOREGRESSIVE models - Abstract
Uncertain time series analysis is an effective method to predict a variable indexed by time under imprecise observations. Sometimes the time series model is built directly on the data, in which case the prediction result is inaccurate. In this paper, an uncertain vector autoregressive smoothly moving average model (UVARSMA) is given. Least absolute deviation estimation and least squares estimation are given to estimate the unknown parameters. In order to predict effectively, we analyze the residuals and give point and interval estimates for the prediction. The relevant results are compared with those of the uncertain vector autoregressive model. Finally, a practical example on the air index in Beijing, from 9 March 2022 to 23 April 2022, is given to verify the feasibility and accuracy of the new model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Additive multi-task learning models and task diagnostics.
- Author
-
Miller, Nikolay and Zhang, Guoyi
- Subjects
- *
MACHINE learning , *SUPPORT vector machines , *TEST methods , *ADDITIVES - Abstract
This paper develops a model for multi-task machine learning that incorporates per-task parametric and nonparametric effects in an additive way. This allows a practitioner the flexibility of modeling the tasks in a customized manner, increasing model performance compared to other modern multi-task methods, while maintaining a high degree of model explainability. We also introduce novel methods for task diagnostics, which are based on the statistical influence of tasks on the model's performance, and propose testing methods and remedial measures for outlier tasks. Additive multi-task learning model with task diagnostics is examined on a well-known real-world multi-task benchmark dataset and shows a significant performance improvement over other modern multi-task methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. On Periodic Generalized Poisson INAR(1) Model.
- Author
-
Bentarzi, Mohamed and Souakri, Roufaida
- Subjects
- *
TIME series analysis , *LEAST squares , *AUTOREGRESSIVE models , *STATIONARY processes , *CHRONIC myeloid leukemia - Abstract
In this paper, we introduce a first-order Periodic Generalized Poisson Integer-Valued Autoregressive model, PGPINAR(1), which has been shown to be useful for describing the overdispersion, equidispersion, and underdispersion features encountered in periodically correlated integer-valued time series. Some probabilistic and statistical properties are established: the periodically correlated stationarity conditions in the first and second moments are provided, and the closed forms of these moments are derived under these conditions. Moreover, the structure of the periodic autocovariance is obtained. The estimation problem is addressed through the Yule-Walker (YW), the Two-Stage Conditional Least Squares (CLS), and the Conditional Maximum Likelihood (CML) methods. The performance of these methods is assessed through an intensive simulation study, and an application to a real data set is presented. Keywords and phrases: Periodic Generalized Poisson, Integer-Valued Autoregressive, Periodically correlated process, periodically stationary condition. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. Local influence in linear mixed measurement error models with ridge estimation.
- Author
-
Maksaei, Najmieh, Rasekh, Abdolrahman, and Babadi, Babak
- Subjects
- *
ERRORS-in-variables models , *PARAMETER estimation , *LENGTH measurement , *COVARIANCE matrices , *PERFORMANCE theory - Abstract
This paper deals with the assessment of the effects of minor perturbations of data in the linear mixed measurement error models with Ridge estimation, based on the corrected score function. The local influence approach is used for assessing the influence of small perturbations on the parameter estimates. We examine different types of perturbation schemes including the covariance matrix of the conditional errors, case weight, response, and explanatory perturbation, to identify influential observations. A real data application and a simulation study illustrate the performance of the proposed diagnostics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. Pearson chi-squared and unweighted residual sum of square tests of fit for a probit model.
- Author
-
Pho, Kim-Hung and Truong, Buu-Chau
- Subjects
- *
SUM of squares , *GOODNESS-of-fit tests , *GAUSSIAN distribution , *REGRESSION analysis , *DEPENDENT variables - Abstract
Like the logistic model, the probit model is often used to assess the associations between a dichotomous dependent variable and some covariates. The probit model has many important and engaging applications in practice, especially in economics and finance. In this paper, we propose the Pearson chi-squared (PSC) and unweighted residual sum of squares (URSS) tests of fit for a probit model, to enrich and diversify the topic of goodness-of-fit (GOF) testing for regression models. This is a very interesting topic in theory with enormous practical applications. Theoretically, we prove that both proposed statistics are asymptotically standard normal. For numerical study and practical application, several simulations and a real-life fishing data set are investigated in this work. The results also illustrate numerically that the proposed formulas are very reliable, and the findings of the empirical analysis are very consistent with reality, offering meaningful evidence on how to maximize the catch while fishing. Finally, some discussions, conclusions, and future work are also included in this study. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
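The raw Pearson chi-squared statistic underlying the entry above is easy to write down for a probit fit. The sketch below (assuming scipy) treats the coefficients as already fitted, which is an illustrative shortcut; the paper's test further standardizes this statistic to obtain an asymptotically normal quantity:

```python
import numpy as np
from scipy.stats import norm

# Pearson chi-squared GOF statistic for a probit model:
# X2 = sum (y_i - p_i)^2 / (p_i * (1 - p_i)), p_i = Phi(x_i' beta).
# The coefficients below are treated as already fitted (an
# illustrative shortcut; in practice beta comes from an MLE step).
rng = np.random.default_rng(2)
n = 500
x = rng.standard_normal(n)
beta0, beta1 = -0.2, 0.8
p = norm.cdf(beta0 + beta1 * x)
y = rng.binomial(1, p)                  # data generated from the model

X2 = np.sum((y - p) ** 2 / (p * (1 - p)))
print(X2)   # close to n when the model is correctly specified
```

Under correct specification each summand has mean one, so X2 concentrates near n; large deviations from n signal lack of fit.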
45. A new bivariate INAR(1) model with paired Poisson-weighted exponential distributed innovations.
- Author
-
Sajjadnia, Zahra, Sharafi, Maryam, Mamode Khan, Naushad, and Soobhug, Ashwinee Devi
- Subjects
- *
DISTRIBUTION (Probability theory) , *MONTE Carlo method , *MAXIMUM likelihood statistics , *CHRONIC myeloid leukemia - Abstract
This paper proposes a novel bivariate integer-valued autoregressive model of order 1 with paired Poisson Weighted Exponential (PWE) distributed innovations, denoted BINAR(1)-PWE, in two versions: Sarmanov and classical. The CML and CLS estimators of the parameters are obtained, and the performance of the proposed models is assessed through Monte Carlo simulation experiments. The BINAR(1)-PWE is also applied to two real data sets and compared with some bivariate INAR processes. The research findings commend the BINAR(1)-PWE as another suitable alternative for analyzing bivariate series of counts and open avenues to explore Sarmanov-based bivariate models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Estimation of population variance using optional randomized response technique model in the presence of measurement errors.
- Author
-
Gupta, Sat, Aloraini, Badr, Qureshi, Muhammad Nouman, and Khalil, Sadia
- Subjects
- *
ERRORS-in-variables models , *RANDOMIZED response , *MEASUREMENT errors , *COMPUTER simulation , *PRIVACY - Abstract
Measurement errors are an important consideration in sample surveys. Neglecting their influence can give misleading and inaccurate results. In this paper, we propose three variance estimators using the Optional Randomized Response Technique (ORRT) in the presence of measurement error. The performance of the proposed estimators is evaluated through a simulation study and a numerical example. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Estimation of five parameter bivariate modified Weibull singular distribution.
- Author
-
Kumar, Sanjay, Kundu, Debasis, and Mitra, Sharmishtha
- Subjects
- *
LIKELIHOOD ratio tests , *EXPECTATION-maximization algorithms , *WEIBULL distribution , *OPTIMIZATION algorithms , *MAXIMUM likelihood statistics - Abstract
Recently, El-Bassiouny et al. (El-Bassiouny, Shahen, and Abouhawwash 2018) proposed the flexible Marshall-Olkin type bivariate modified Weibull distribution (BMWD) and obtained the maximum likelihood estimates (MLEs) of its five unknown parameters by solving five non-linear equations. The MLEs of the parameters cannot be obtained in closed form. In this paper, we discuss the computation of the MLEs using an efficient EM algorithm, which reduces the problem to a two-dimensional (2-D) optimization problem. The proposed EM algorithm performs better than standard existing optimization algorithms, e.g. optim and nlm. The bivariate goodness-of-fit (GoF) test is discussed based on statistically equivalent blocks (SEBs). Some nonstandard likelihood ratio (LR) tests are considered for testing the relative GoF of the BMWD. An extensive simulation study is carried out; the EM algorithm performs quite well. A real-life data set is analyzed for illustrative purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. Integrating clustering and sequential analysis for improving the spectral density estimation and dependency structure of time series.
- Author
-
Laala, Barkahoum, Elsawah, A. M., Vishwakarma, Gajendra K., and Fang, Kai-Tai
- Subjects
- *
SIGNAL frequency estimation , *TIME series analysis , *ESTIMATION theory , *SPECTRAL energy distribution , *SOCIAL facts - Abstract
Time series models (TSMs) are used to forecast events based on verified historical data and are widely applied to describe natural and social phenomena. The spectral density function (SDF) of a TSM helps determine trends, periodic patterns, and the behavior of observable events, and it enters the degree of memory (DOM), which is used to distinguish independent and dependent structures. Therefore, attention to the estimator of the SDF (SDE) is the cornerstone of, and desired goal in, the statistical study of TSMs. This paper gives a novel perspective for improving the SDE, and thus the dependency structure of TSMs, by integrating clustering and sequential analysis. The accuracy of the lag window estimator (LWE) and the periodogram estimator (PE), which are commonly used as SDEs in practice, is examined both with and without the newly proposed technique in order to gauge its efficacy. Several circumstances, including independent and dependent data structures, TSMs with long and short DOMs, different numbers of clusters, and different dataset sizes, are used to carry out the investigation. The main findings indicate that: (i) the SDEs (PE and LWE) perform better when combined with the new proposed technique than when used alone; (ii) using the new technique in conjunction with the SDEs results in absolutely perfect estimators for independent processes; (iii) high-efficiency estimators for short memory TSMs arise when the new technique is combined with the SDEs in dependent processes, although high efficiency in long memory TSMs (LMTSMs) is not guaranteed; (iv) with the new technique, the behavior of the correlation estimator between SDEs and frequencies improves significantly with increasing DOM values of LMTSMs; and (v) applying the new technique to LMTSMs, the DOM estimator provides results indicating significant dependence. This study demonstrates that there are efficient ways to improve the estimator other than estimation techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. Uncertain vector moving average model based on Welsch loss function.
- Author
-
Xiao, Xiang and Sheng, Yuhong
- Subjects
- *
MOVING average process , *TIME series analysis , *PARTICULATE matter , *FORECASTING - Abstract
In uncertain multivariate time series, the most basic model is the uncertain vector autoregressive (UVAR) model. In this paper, we propose another uncertain multivariate time series model, the uncertain vector moving average (UVMA) model; we transform the UVMA model into a UVAR model and use the Welsch loss function to estimate the unknown parameters. We analyze residuals and forecast future trends. In addition, the best-fit model is determined based on the sum of squared errors. Finally, the robustness of the Welsch estimation and the validity of the UVMA model are illustrated by numerical examples, and our method is also applicable to predicting PM2.5 and PM10 data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. Nonparametric model-assisted estimation from randomized response survey data.
- Author
-
Mostafa, Sayed A.
- Subjects
- *
RANDOMIZED response , *NONPARAMETRIC estimation , *DATA release , *CONSUMPTION (Economics) , *NONRESPONSE (Statistics) - Abstract
The randomized response technique offers an effective way of reducing potential bias resulting from nonresponse and untruthful responses when asking questions about sensitive behaviors or beliefs. The technique is also used for statistical disclosure control of public use data files released by statistical agencies such as the U.S. Census Bureau. In both cases, the technique works by randomizing the actual survey responses using some known randomization model. In the case of sensitive survey questions, the randomization of responses is done by the survey respondents and only the randomized responses are collected, whereas in the case of disclosure control, the survey agency randomizes the responses after collecting the survey data and prior to releasing it for public use. This paper considers estimating the finite population mean from a survey where randomized responses are available for the study variable along with complete non-randomized auxiliary information. We define and study a class of nonparametric model-assisted estimators that make efficient use of the available auxiliary information and account for the complex survey design. The asymptotic properties of the proposed estimators are derived and a bootstrap variance estimator is presented. The finite sample performance of the proposed estimators is studied via extensive simulations accounting for a wide range of forms for the relationship between the study variable and the auxiliary variable. The empirical results support the theoretical analyses and suggest that our proposed estimators are superior to existing estimators in most cases. Furthermore, the proposed methods are illustrated using real data from the 2015 U.S. consumer expenditure survey. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF