14,687 results for "Heteroscedasticity"
Search Results
102. Adaptive tests for ANOVA in Fisher–von Mises–Langevin populations under heteroscedasticity.
- Author
-
Basak, Shreyashi, Pauly, Markus, and Kumar, Somesh
- Subjects
- *ADAPTIVE testing, *HETEROSCEDASTICITY, *LIKELIHOOD ratio tests, *ANALYSIS of variance
- Abstract
Fisher–von Mises–Langevin distributions are widely used for modeling directional data. In this paper, the problem of testing homogeneity of mean directions of several Fisher–von Mises–Langevin populations is considered when the concentration parameters are unknown and heterogeneous. First, an adaptive test based on the likelihood ratio statistic is proposed, with critical points evaluated using a parametric bootstrap. Second, a heuristic test statistic based on pairwise group differences is considered, with a nonparametric bootstrap procedure adapted for evaluating its critical points. Finally, a permutation test is also proposed. An extensive simulation study compares the size and power of these tests with those of tests proposed earlier. Both the parametric and nonparametric bootstrap based tests achieve sizes quite close to the nominal level, whereas the asymptotic and permutation tests exceed it. The bootstrap tests also show very good power performance. The robustness of the tests is studied by considering contamination in Fisher–von Mises–Langevin distributions. R packages implementing all the tests are developed, and a real data set is considered for illustration. [ABSTRACT FROM AUTHOR]
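The parametric-bootstrap calibration described above can be illustrated with a minimal numpy sketch. This is a generic stand-in, not the authors' procedure: normal populations with unequal variances replace the Fisher–von Mises–Langevin ones, and a Welch-type heteroscedastic statistic (a hypothetical choice) replaces the likelihood ratio; only the bootstrap logic — resample under the fitted null, recompute the statistic, compare — carries over.

```python
import numpy as np

rng = np.random.default_rng(0)

def welch_stat(groups):
    """Heteroscedasticity-robust between-group statistic (Welch-type numerator)."""
    w = np.array([len(g) / g.var(ddof=1) for g in groups])  # inverse-variance weights
    xbar = np.array([g.mean() for g in groups])
    grand = (w * xbar).sum() / w.sum()
    return (w * (xbar - grand) ** 2).sum()

def bootstrap_pvalue(groups, B=999):
    """Parametric bootstrap: resample each group under the null
    (common mean, group-specific variance), recompute the statistic."""
    t_obs = welch_stat(groups)
    pooled_mean = np.concatenate(groups).mean()
    count = 0
    for _ in range(B):
        boot = [rng.normal(pooled_mean, g.std(ddof=1), len(g)) for g in groups]
        count += welch_stat(boot) >= t_obs
    return (1 + count) / (B + 1)

groups = [rng.normal(0, 1, 30), rng.normal(0, 3, 25), rng.normal(0.2, 2, 40)]
p = bootstrap_pvalue(groups)
```

The same resampling skeleton applies whatever statistic is plugged in; only the null-model simulator changes.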
- Published
- 2024
- Full Text
- View/download PDF
103. How Does Self-Efficacy and Perceived Organizational Support (POS) Increase Organizational Commitment? Evidence from the Lombok Raya Hotel Mataram.
- Author
-
Setyono, Langgeng
- Subjects
SELF-efficacy, ORGANIZATIONAL commitment, HOTEL employees, HOTELS, HETEROSCEDASTICITY, MULTICOLLINEARITY
- Abstract
This study examines organizational commitment, with an emphasis on organizational dedication. Successful organizations typically have a highly dedicated staff, and several factors, such as self-efficacy and perceived organizational support, affect organizational commitment. This research aims to determine the effect of self-efficacy and perceived organizational support (POS) on organizational commitment in a study of Lombok Raya Hotel employees. The research is quantitative, using both primary and secondary data; 103 respondents who worked at the Lombok Raya Mataram Hotel completed questionnaires. The study employed classical assumption tests (normality, multicollinearity, and heteroskedasticity tests), and hypotheses were tested with t-tests, F-tests, the coefficient of determination (R2), and multiple linear regression. According to the findings, organizational commitment is positively and significantly influenced by self-efficacy, by POS, and by both variables together. Descriptive analysis shows that staff members of the Lombok Raya Mataram Hotel are worried about the future and sustainability of the company or hotel. [ABSTRACT FROM AUTHOR]
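The abstract above does not name the heteroskedasticity test used; a common classical-assumption choice is the Breusch–Pagan LM test, sketched here with numpy on hypothetical data (regress squared OLS residuals on the regressors; LM = n·R² of that auxiliary fit):

```python
import numpy as np

def breusch_pagan(X, y):
    """Breusch-Pagan LM statistic: n * R^2 from regressing squared
    OLS residuals on the regressors (approx. chi-square under the null)."""
    n = len(y)
    Z = np.column_stack([np.ones(n), X])           # add intercept
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e2 = (y - Z @ beta) ** 2                       # squared OLS residuals
    g, *_ = np.linalg.lstsq(Z, e2, rcond=None)     # auxiliary regression
    fitted = Z @ g
    r2 = 1 - ((e2 - fitted) ** 2).sum() / ((e2 - e2.mean()) ** 2).sum()
    return n * r2

rng = np.random.default_rng(1)
x = rng.uniform(1, 5, 200)
y_het = 2 + 0.5 * x + rng.normal(0, x, 200)        # error sd grows with x
lm = breusch_pagan(x, y_het)
```

In practice one would compare `lm` against a chi-square quantile with degrees of freedom equal to the number of non-constant regressors (here, one).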
- Published
- 2024
- Full Text
- View/download PDF
104. Model averaging for right censored data with measurement error.
- Author
-
Liang, Zhongqi, Zhang, Caiya, and Xu, Linjun
- Subjects
MEASUREMENT errors, REGRESSION analysis, CENSORSHIP, CENSORING (Statistics)
- Abstract
This paper studies a novel model averaging estimation issue for linear regression models when the responses are right censored and the covariates are measured with error. A novel weighted Mallows-type criterion is proposed for the considered problem by introducing multiple candidate models. The weight vector for model averaging is selected by minimizing the proposed criterion. Under some regularity conditions, the asymptotic optimality of the selected weight vector is established in terms of its ability to achieve the lowest squared loss asymptotically. Simulation results show that the proposed method is superior to existing related methods, and a real data example illustrates its practical performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
105. The effect of teacher empowerment-based course program on teachers' knowledge of Pancasila character on the Indonesia-Malaysia literacy.
- Author
-
Wijaya, Daya Negri, Yafie, Evania, Hariyono, H., Pratama, Andhika Yudha, Azizah, Alfi Rohmatul, and Nur Azizah, Suti Mega
- Subjects
SELF-efficacy in teachers, PANCASILA, EFFECTIVE teaching, TEACHING methods, QUANTITATIVE research, HETEROSCEDASTICITY
- Abstract
Copyright of Journal of Community Service & Empowerment is the property of Journal of Community Service & Empowerment and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
106. Estimation of Matrix Exponential Unbalanced Panel Data Models with Fixed Effects: An Application to US Outward FDI Stock.
- Author
-
Yang, Ye, Doğan, Osman, and Taşpınar, Süleyman
- Subjects
MATRIX exponential, PANEL analysis, FIXED effects model, DATA modeling, FOREIGN investments
- Abstract
In this article, we consider a matrix exponential unbalanced panel data model that allows for (i) spillover effects using matrix exponential terms, (ii) unobserved heterogeneity across entities and time, and (iii) potential heteroscedasticity in the error terms across entities and time. We adopt a likelihood based direct estimation approach in which we jointly estimate the common parameters and fixed effects. To ensure that our estimator has the standard large sample properties, we show how the score functions should be suitably adjusted under both homoscedasticity and heteroscedasticity. We define our suggested estimator as the root of the adjusted score functions, and therefore our approach can be called the M-estimation approach. For inference, we suggest an analytical bias correction approach involving the sample counterpart and plug-in methods to consistently estimate the variance-covariance matrix of the suggested M-estimator. Through an extensive Monte Carlo study, we show that the suggested M-estimator has good finite sample properties. In an empirical application, we use our model to investigate the third country effects on the U.S. outward foreign direct investment (FDI) stock at the industry level. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
107. Predictive model averaging with parameter instability and heteroskedasticity.
- Author
-
Yin, Anwen
- Subjects
PREDICTION models, HETEROSCEDASTICITY, ASYMPTOTIC distribution, MALVACEAE, FORECASTING
- Abstract
This paper proposes a frequentist model averaging approach in the presence of parameter instability and heteroskedasticity. We derive optimal weights combining the stable and break specifications of a predictive model, with the weights obtained by minimizing the leave‐one‐out cross‐validation information criterion (CV). We characterize the asymptotic distribution of the CV and provide analytical expressions for the feasible optimal CV weights. Our simulations and applications forecasting US and Taiwanese GDP growth demonstrate the superior performance of CV model averaging relative to other methods such as Mallows averaging, approximate Bayesian averaging, and equal weighting. [ABSTRACT FROM AUTHOR]
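A minimal sketch of leave-one-out CV weighting for combining two model specifications. The two nested OLS models below are hypothetical stand-ins for the paper's stable and break specifications; the closed-form weight minimizes the combined LOO squared error and is clipped to [0, 1]:

```python
import numpy as np

def loo_residuals(Z, y):
    """Leave-one-out residuals of OLS via the hat matrix: e_i / (1 - h_ii)."""
    H = Z @ np.linalg.solve(Z.T @ Z, Z.T)
    e = y - H @ y
    return e / (1 - np.diag(H))

def cv_weight(e1, e2):
    """Weight on model 1 minimizing sum((w*e1 + (1-w)*e2)^2), clipped to [0, 1]."""
    d = e1 - e2
    w = -(e2 * d).sum() / (d * d).sum()
    return np.clip(w, 0.0, 1.0)

rng = np.random.default_rng(2)
n = 120
x = rng.normal(size=n)
y = 1 + 0.8 * x + rng.normal(0, 1 + (x > 0), n)   # heteroskedastic errors
Z_small = np.column_stack([np.ones(n)])            # restricted specification
Z_full = np.column_stack([np.ones(n), x])          # richer specification
w = cv_weight(loo_residuals(Z_full, y), loo_residuals(Z_small, y))
```

The combined forecast is then `w * (full-model prediction) + (1 - w) * (restricted prediction)`.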
- Published
- 2024
- Full Text
- View/download PDF
108. Heteroscedasticity of residual spending after risk equalization: a potential source of selection incentives in health insurance markets with premium regulation.
- Author
-
Oskam, Michel, van Kleef, Richard C., and Douven, Rudy
- Subjects
HEALTH insurance premiums, INSURANCE companies, HETEROSCEDASTICITY, CHRONICALLY ill, PROFIT & loss
- Abstract
Many community-rated health insurance markets include risk equalization (also known as risk adjustment) to mitigate risk selection incentives for competing insurers. Empirical evaluations of risk equalization typically quantify selection incentives through predictable profits and losses net of risk equalization for various groups of consumers (e.g. the healthy versus the chronically ill). The underlying assumption is that absence of predictable profits and losses implies absence of selection incentives. This paper questions this assumption. We show that even when risk equalization perfectly compensates insurers for predictable differences in mean spending between groups, selection incentives are likely to remain. The reason is that the uncertainty about residual spending (i.e., spending net of risk equalization) differs across groups, e.g., the risk of substantial losses is larger for the chronically ill than for the healthy. In a risk-rated market, insurers are likely to charge a higher profit mark-up (to cover uncertainty in residual spending) and a higher safety mark-up (to cover the risk of large losses) to chronically ill than to healthy individuals. When such differentiation is not allowed, insurers face incentives to select in favor of the healthy. Although the exact size of these selection incentives depends on contextual factors, our empirical simulations indicate they can be non-trivial. Our findings suggest that – in addition to the equalization of differences in mean spending between the healthy and the chronically ill – policy measures might be needed to diminish (or compensate insurers for) heteroscedasticity of residual spending across groups. [ABSTRACT FROM AUTHOR]
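The paper's central point — equal mean residual spending with unequal residual uncertainty — can be illustrated with a toy simulation. The gamma-distributed spending levels and group sizes below are entirely hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical annual spending (euros): healthy vs chronically ill enrollees
healthy = rng.gamma(shape=2.0, scale=500, size=50_000)    # mean ~1,000
chronic = rng.gamma(shape=2.0, scale=5_000, size=5_000)   # mean ~10,000

# "Perfect" risk equalization pays each group its own mean, so the
# predictable profit/loss is zero for both groups...
resid_healthy = healthy - healthy.mean()
resid_chronic = chronic - chronic.mean()

# ...yet the uncertainty in residual spending still differs sharply,
# which is the selection incentive the paper highlights.
sd_h, sd_c = resid_healthy.std(), resid_chronic.std()
```

Even with zero mean residuals in both groups, the chronically ill group's residual-spending standard deviation is roughly an order of magnitude larger here.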
- Published
- 2024
- Full Text
- View/download PDF
109. Searching for gene-gene interactions through variance quantitative trait loci of 29 continuous Taiwan Biobank phenotypes.
- Author
-
Wan-Yu Lin
- Subjects
LOCUS (Genetics), GENETIC variation, ERYTHROCYTES, GENOME-wide association studies, HUMAN phenotype
- Abstract
Introduction: After the era of genome-wide association studies (GWAS), thousands of genetic variants have been identified to exhibit main effects on human phenotypes. The next critical issue would be to explore the interplay between genes, the so-called "gene-gene interactions" (GxG) or epistasis. An exhaustive search for all single-nucleotide polymorphism (SNP) pairs is not recommended because this will induce a harsh penalty of multiple testing. Limiting the search for epistasis to SNPs reported by previous GWAS may miss essential interactions between SNPs without significant marginal effects. Moreover, most methods are computationally intensive and can be challenging to implement genome-wide. Methods: I here searched for GxG through variance quantitative trait loci (vQTLs) of 29 continuous Taiwan Biobank (TWB) phenotypes. A discovery cohort of 86,536 and a replication cohort of 25,460 TWB individuals were analyzed, respectively. Results: A total of 18 nearly independent vQTLs with linkage disequilibrium measure r² < 0.01 were identified and replicated from nine phenotypes. 15 significant GxG were found with p-values <1.1E-5 (in the discovery cohort) and false discovery rates <2% (in the replication cohort). Among these 15 GxG, 11 were detected for blood traits including red blood cells, hemoglobin, and hematocrit; 2 for total bilirubin; 1 for fasting glucose; and 1 for total cholesterol (TCHO). All GxG were observed for gene pairs on the same chromosome, except for the APOA5 (chromosome 11)—TOMM40 (chromosome 19) interaction for TCHO. Discussion: This study provided a computationally feasible way to search for GxG genome-wide and applied this approach to 29 phenotypes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
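A vQTL screen like the one in entry 109 tests for variance heterogeneity of a phenotype across genotype groups. The abstract does not state the exact statistic used, so the sketch below uses a Brown–Forsythe (median-centred Levene) test, a standard choice for vQTL detection, on hypothetical simulated genotypes and phenotypes:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 6000
geno = rng.integers(0, 3, size=n)          # SNP coded 0/1/2 copies of an allele
# Simulate a variance effect: higher genotype -> larger phenotypic spread
pheno = rng.normal(0, 1 + 0.3 * geno)

# Brown-Forsythe (median-centred Levene) test for variance heterogeneity:
groups = [pheno[geno == g] for g in range(3)]
stat, pval = stats.levene(*groups, center='median')
```

A genome-wide scan would repeat this per SNP, flag significant variance-heterogeneity loci as vQTLs, and then test those loci for GxG.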
110. Estimating correlations among elliptically distributed random variables under any form of heteroskedasticity.
- Author
-
Pelagatti, Matteo and Sbrana, Giacomo
- Subjects
- *RANDOM variables, *RATE of return on stocks, *PEARSON correlation (Statistics), *HETEROSCEDASTICITY, *MOVING average process
- Abstract
The paper introduces a semiparametric estimator of the correlations among elliptically distributed random variables invariant to any form of heteroscedasticity, robust to outliers, and asymptotically normal. Our estimator is particularly fit for financial applications as vectors of stock returns are generally well approximated by heteroskedastic processes with elliptical (conditional) distributions and heavy tails. The superiority of our estimator with respect to Pearson's sample correlation in financial applications is illustrated using simulated data and real high-frequency stock returns. Using simple exponentially weighted moving averages, we extend our estimator to the case of time-varying correlations and compare it to the popular GARCH-DCC model. We show that the two approaches have comparable performances through simulations and a simple application. However, our estimator is extremely fast to compute, computationally robust, and straightforward to implement. [ABSTRACT FROM AUTHOR]
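The exponentially weighted moving-average extension mentioned above can be sketched as a simple recursion. This is a plain EWMA correlation (RiskMetrics-style smoothing constant), not the authors' elliptical-robust estimator; the two correlated "return" series are hypothetical:

```python
import numpy as np

def ewma_correlation(x, y, lam=0.94):
    """Exponentially weighted variances/covariance combined into a
    time-varying correlation estimate (lambda = 0.94 per RiskMetrics)."""
    vx, vy = x.var(), y.var()
    cxy = np.cov(x, y, ddof=0)[0, 1]           # matching ddof keeps |rho| <= 1
    rho = np.empty(len(x))
    for t in range(len(x)):
        vx = lam * vx + (1 - lam) * x[t] ** 2
        vy = lam * vy + (1 - lam) * y[t] ** 2
        cxy = lam * cxy + (1 - lam) * x[t] * y[t]
        rho[t] = cxy / np.sqrt(vx * vy)
    return rho

rng = np.random.default_rng(5)
z = rng.normal(size=1000)
x = z + 0.5 * rng.normal(size=1000)            # two correlated return series
y = z + 0.5 * rng.normal(size=1000)
rho = ewma_correlation(x, y)
```

The single pass over the data is what makes this kind of estimator "extremely fast to compute" relative to fitting a GARCH-DCC model.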
- Published
- 2024
- Full Text
- View/download PDF
111. Comparison of standard long memory time series.
- Author
-
Silva, H. P. T. N., Dissanayake, G. S., and Peiris, T. S. G.
- Subjects
- *HETEROSCEDASTICITY
- Abstract
Standard long memory models are in abundance in the literature today. Selecting the best such model in terms of capturing key requisite features and trends in data becomes a challenge. This paper addresses the issue through a sequence of Monte Carlo experiments on simulated data and introduces an interval estimate of the asymptotic variance of the long-range dependence parameter for the entire family of standard long memory time series considered within the scope of the study. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
112. Assessing the non-inferiority of a new treatment in a three-arm trial with unknown coefficient of variation.
- Author
-
Wu, Wei-Hwa and Hsieh, Hsin-Neng
- Subjects
- *EXPERIMENTAL groups, *SAMPLE size (Statistics), *HETEROSCEDASTICITY, *EMPIRICAL research, *VARIANCES
- Abstract
We consider a non-inferiority test conducted in three-arm trials with the coefficient of variation (CV) in sample means. A procedure for this non-inferiority test is proposed based on the generalized p-value (GPV) method under normality and heteroskedasticity. We demonstrate that, first, the GPV-based method outperforms the comparable alternative, the Delta method, in terms of empirical sizes and powers. Second, under the GPV-based method, when the population mean of each group is estimated by the sample mean with CV, the empirical sizes control the significance level better than when CV is not included, especially when the ratios of the variances of the reference and placebo groups to that of the experimental group are not too small and the sample size is small. Third, when the CV is included in the sample mean, the empirical powers are always higher, regardless of sample size and method. Additionally, when the data violate the normality assumption, the proposed GPV-based method in conjunction with CV appears highly acceptable in simulation results. Finally, the proposed method is illustrated with two data examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
113. A novel Bayesian framework to address unknown heteroscedasticity for the linear regression model.
- Author
-
Altaf, Saima, Rashid, Fareeha, Aslam, Muhammad, and Riasat, Sadaf
- Subjects
- *HETEROSCEDASTICITY, *REGRESSION analysis, *MONTE Carlo method, *LEAST squares, *MARKOV chain Monte Carlo
- Abstract
A common problem encountered in linear regression models is that the error variances are not the same for all observations, that is, there is an issue of heteroscedasticity. To avoid the adverse effects of this issue on the estimates obtained by ordinary least squares, it is usual practice to use the estimated weighted least squares (EWLS) method or some adaptive method, especially when the form of heteroscedasticity is unknown. The current article proposes a novel Bayesian version of the EWLS estimator. The performance of the proposed estimator is evaluated in terms of efficiency using Monte Carlo simulations, and an example is included for demonstration. [ABSTRACT FROM AUTHOR]
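The classical (non-Bayesian) EWLS baseline that the article builds on can be sketched in a few lines. The log-linear variance model below is one common assumption for an unknown heteroskedasticity form, not the article's specification, and the data are hypothetical:

```python
import numpy as np

def ols(Z, y):
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return b

def ewls(X, y):
    """Estimated weighted least squares with an unknown variance function:
    model log(residual^2) on the regressors, then reweight by 1/sigma_hat^2."""
    n = len(y)
    Z = np.column_stack([np.ones(n), X])
    e = y - Z @ ols(Z, y)                      # step 1: OLS residuals
    g = ols(Z, np.log(e ** 2 + 1e-8))          # step 2: log-linear variance model
    w = 1.0 / np.exp(Z @ g)                    # estimated inverse-variance weights
    Zw, yw = Z * np.sqrt(w)[:, None], y * np.sqrt(w)
    return ols(Zw, yw)                         # step 3: weighted fit

rng = np.random.default_rng(6)
x = rng.uniform(0.5, 3, 500)
y = 1 + 2 * x + rng.normal(0, x ** 2, 500)     # error variance increasing in x
beta = ewls(x, y)
```

A Bayesian version would place priors on the regression and variance-model coefficients and sample them jointly (e.g. by MCMC) rather than plugging in point estimates.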
- Published
- 2024
- Full Text
- View/download PDF
114. Nonparametric Conditional Risk Mapping Under Heteroscedasticity.
- Author
-
Fernández-Casal, Rubén, Castillo-Páez, Sergio, and Francisco-Fernández, Mario
- Subjects
- *NONPARAMETRIC estimation, *CONDITIONAL probability, *HETEROSCEDASTICITY, *SIMULATION methods & models, *BANDWIDTHS, *ALGORITHMS
- Abstract
A nonparametric procedure to estimate the conditional probability that a nonstationary geostatistical process exceeds a certain threshold value is proposed. The method consists of a bootstrap algorithm that combines conditional simulation techniques with nonparametric estimates of the trend and the variability. The trend is estimated with the nonparametric local linear estimator, using a bandwidth matrix selected by a method that takes the spatial dependence into account. The variability is modeled by estimating the conditional variance and the variogram from corrected residuals to avoid their biases. The proposed method yields estimates of the conditional exceedance risk at unobserved spatial locations. The performance of the approach is analyzed by simulation and illustrated with an application to a real precipitation data set from the USA. Supplementary materials accompanying this paper appear online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
115. A Nonparametric Bootstrap Method for Heteroscedastic Functional Data.
- Author
-
Fernández-Casal, Rubén, Castillo-Páez, Sergio, and Flores, Miguel
- Subjects
- *CONTINUOUS processing, *OZONE, *FUNCTIONAL analysis, *HETEROSCEDASTICITY, *POLLUTION
- Abstract
The objective is to provide a nonparametric bootstrap method for functional data that consist of independent realizations of a continuous one-dimensional process. The process is assumed to be nonstationary, with a functional mean and a functional variance, and to exhibit dependence. The resampling method is based on nonparametric estimates of the model components. Numerical studies were conducted to check the performance of the proposed procedure by approximating the bias and the standard error of two estimators. A practical application of the proposed approach to pollution data is also included: it is employed to make inference about the annual trend of ground-level ozone concentration at the Yarner Wood monitoring station in the United Kingdom. Supplementary material to this paper is provided online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
116. Modelleme ve Tahmin Amaçlı Veri Ön İşleme Yöntemlerinin Ürün Kurutma Örneği ile Açıklanması [Explaining Data Preprocessing Methods for Modelling and Prediction through a Product Drying Example].
- Author
-
KORKMAZ, Cem and KACAR, İlyas
- Abstract
Although regression is a traditional data processing method, machine and deep learning methods have been widely used in the literature in recent years for both modelling and prediction. However, in order to use these methods efficiently, it is important to perform a preliminary evaluation to understand the data type; such pre-evaluation procedures are described in this study. Experimental uncertainty analysis was performed to determine the measurement uncertainties of the devices and sensors used in the drying experimental setup. Significant and insignificant relationships between variables in the data set were determined with the Pearson correlation matrix. Autocorrelation and partial autocorrelation functions were used to determine the time series lag in the drying data, and an AR(5) series with 5 lags was identified. The data were found to have variable variance due to peaks and troughs in the raw data resulting from the natural behaviour of the drying process. Modelling success was achieved with a normalisation pre-processing step performed without distorting the raw data, showing that better models can be obtained compared to traditional ones. To avoid unnecessary time and computational costs in the trial-and-error method used to determine the number of hidden layers and neurons in the machine learning method, various formulas proposed in the literature were compared. It is shown that the correlation coefficient alone is not sufficient to judge the goodness of a model. In modelling the data in this study, the NARX model was found to converge to the desired value faster and with less error than the ANFIS and LSTM models. In the simulation of a rotary drum dryer, the optimum number of mesh elements was determined as 1137 by mesh independence analysis, preventing unnecessary over-calculation. All of these methods are, of course, already available in statistical science. However, in this study, the methods to be used for modelling and prediction are carefully selected and explained with examples, especially so that young researchers from outside this field can gain speed and comprehend them easily. [ABSTRACT FROM AUTHOR]
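The lag-selection step described above relies on the sample autocorrelation function. A minimal numpy sketch of the sample ACF follows, applied to a hypothetical AR(1) series standing in for the drying data (the study itself identified an AR(5) structure):

```python
import numpy as np

def acf(x, nlags):
    """Sample autocorrelation function of a series at lags 0..nlags."""
    x = x - x.mean()
    denom = (x * x).sum()
    return np.array([1.0] + [(x[k:] * x[:-k]).sum() / denom
                             for k in range(1, nlags + 1)])

rng = np.random.default_rng(7)
n, phi = 2000, 0.9
y = np.zeros(n)
for t in range(1, n):                     # simulate an AR(1) with phi = 0.9
    y[t] = phi * y[t - 1] + rng.normal()
rho = acf(y, 10)
```

For an AR(p) series, it is the partial autocorrelation function that cuts off after lag p; the ACF alone decays geometrically, which is why both are inspected when choosing the lag order.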
- Published
- 2024
- Full Text
- View/download PDF
117. QMLE for periodic absolute value GARCH models.
- Author
-
Slimani, Walid, Lescheb, Ines, and Cherfaoui, Mouloud
- Subjects
- *ABSOLUTE value, *GARCH model, *ASYMPTOTIC normality, *FOREIGN exchange rates, *HETEROSCEDASTICITY, *U.S. dollar
- Abstract
Periodic generalized autoregressive conditionally heteroscedastic (PGARCH) models were introduced by Bollerslev and Ghysels [T. Bollerslev and E. Ghysels, Periodic autoregressive conditional heteroscedasticity, J. Bus. Econom. Statist. 14 (1996), no. 2, 139–151]; these models have gained considerable interest and continue to attract the attention of researchers. This paper is devoted to extending the standard absolute value GARCH (AVGARCH) model to one with periodically time-varying coefficients (PAVGARCH). In this class of models, the parameters are allowed to switch between different regimes; moreover, these models allow asymmetric effects to be incorporated in the volatility. First, we give necessary and sufficient conditions ensuring the existence of stationary solutions (in the periodic sense). Second, a quasi-maximum likelihood (QML) estimation approach for the PAVGARCH model is developed. The strong consistency and asymptotic normality of the estimator are established under mild regularity conditions, requiring strict stationarity and finiteness of moments of some order for the error terms. Next, we present a set of numerical experiments illustrating the practical relevance of our theoretical results. Finally, we apply our model to two foreign exchange rates of the Algerian Dinar: against the European Euro (Euro/Dinar) and the American Dollar (Dollar/Dinar). This empirical work shows that our approach outperforms alternatives and fits the data well. [ABSTRACT FROM AUTHOR]
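The building block of the model above, a standard (non-periodic) AVGARCH(1,1), can be simulated in a few lines; the conditional standard deviation, not the variance, is linear in the lagged absolute innovation. The parameter values are hypothetical, and the periodic extension would simply let (omega, alpha, beta) vary with the season t mod S:

```python
import numpy as np

def simulate_avgarch(n, omega=0.1, alpha=0.15, beta=0.8, seed=8):
    """Simulate AVGARCH(1,1): sigma_t = omega + alpha*|eps_{t-1}| + beta*sigma_{t-1},
    eps_t = sigma_t * z_t with standard normal z_t."""
    rng = np.random.default_rng(seed)
    z = rng.normal(size=n)
    eps, sigma = np.empty(n), np.empty(n)
    # start sigma at its approximate stationary mean, using E|z| = sqrt(2/pi)
    sigma[0] = omega / (1 - alpha * np.sqrt(2 / np.pi) - beta)
    eps[0] = sigma[0] * z[0]
    for t in range(1, n):
        sigma[t] = omega + alpha * abs(eps[t - 1]) + beta * sigma[t - 1]
        eps[t] = sigma[t] * z[t]
    return eps, sigma

eps, sigma = simulate_avgarch(5000)
```

QML estimation would then maximize the Gaussian likelihood of `eps` given the recursion, regardless of the true innovation distribution.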
- Published
- 2024
- Full Text
- View/download PDF
118. Robust tests for multivariate repeated measures with small samples.
- Author
-
Zeng, Ting and Harrar, Solomon W.
- Subjects
- *MULTIVARIATE analysis, *UNIVARIATE analysis, *HETEROSCEDASTICITY, *EXPERIMENTAL design, *COVARIANCE matrices, *FACTORIAL experiment designs, *FACTORIALS
- Abstract
Multivariate repeated measures data naturally arise in clinical trials and other fields such as biomedical science, public health, agriculture, social science and so on. For data of this type, the classical approach is to conduct multivariate analysis of variance (MANOVA) based on Wilks' Lambda and other multivariate statistics, which require the assumptions of multivariate normality and homogeneity of within-cell covariance matrices. However, data being analyzed nowadays show marked departure from multivariate normality and homogeneity. This paper proposes a finite-sample test by modifying the sums of squares matrices to make them insensitive to the heterogeneity in MANOVA. The proposed test is invariant to affine transformation and robust against nonnormality. The proposed method can be used in various experimental designs, for example, factorial design and crossover design. Under various simulation settings, the proposed method outperforms the classical Doubly Multivariate Model and Multivariate Mixed Model proposed elsewhere, especially for unbalanced sample sizes with heteroscedasticity. The applications of the proposed method are illustrated with ophthalmology data in factorial and crossover designs. The proposed method successfully identified and validated a significant main effect and demonstrated that univariate analysis could be oversensitive to small but clinically unimportant interactions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
119. Structural nonlinear damage identification based on the information distance of GNPAX/GARCH model and its experimental study.
- Author
-
Zuo, Heng and Guo, Huiyong
- Subjects
GARCH model, STRUCTURAL health monitoring, HETEROSCEDASTICITY, POLYNOMIAL approximation, STRUCTURAL frames, SYSTEM identification
- Abstract
In the structural health monitoring (SHM) of civil engineering, most of the structural damage is nonlinear damage, such as breathing cracks and bolt looseness. Under the excitation of external loads, the time-domain response data of the structure produced by these nonlinear damages have nonlinear features. In order to solve the time-domain nonlinear damage identification problem of complex structures, this paper proposes a nonlinear damage identification method based on the information distance of GNPAX/GARCH (general expression of system identification for linear and nonlinear with polynomial approximation and exogenous inputs/generalized autoregressive conditional heteroskedasticity) model. First, an order determination method based on Bayesian optimization to select the order of the GNPAX/GARCH model was proposed, and the GNPAX/GARCH model was established for damage identification. Then, the redundant structural items of GNPAX/GARCH model were removed by the model optimization method based on the structural pruning algorithm. Finally, the information distance of the GNPAX/GARCH model conditional heteroscedasticity series between the baseline state and test state was derived, and the structural damage source locations were determined according to the information distance. A three-story frame structure experiment and a stand structure experiment were used to verify the effectiveness of the proposed method. The results show that the proposed method can effectively identify the nonlinear damages caused by the component breathing crack and joint bolt looseness, verifying its robustness to the nonlinear damage identification of the multi-story and multi-span complex structures. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
120. Weighted weak convergence of the sequential tail empirical process for heteroscedastic time series with an application to extreme value index estimation.
- Author
-
Jennessen, Tobias and Bücher, Axel
- Subjects
EXTREME value theory, EMPIRICAL research, HETEROSCEDASTICITY, CENTRAL limit theorem, TIME series analysis, STOCHASTIC models
- Abstract
The sequential tail empirical process is analyzed in a stochastic model allowing for serially dependent observations and heteroscedasticity of extremes in the sense of Einmahl et al. (J. R. Stat. Soc. Ser. B. Stat. Methodol. 78(1), 31–51, 2016). Weighted weak convergence of the sequential tail empirical process is established. As an application, a central limit theorem for an estimator of the extreme value index is proven. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
121. Estimating Linear Dynamic Panels with Recentered Moments.
- Author
-
Bao, Yong
- Subjects
DEPENDENT variables, HETEROSCEDASTICITY, UNITS of time
- Abstract
This paper proposes estimating linear dynamic panels by explicitly exploiting the endogeneity of lagged dependent variables and expressing the cross-moments between the endogenous lagged dependent variables and disturbances in terms of model parameters. These moments, when recentered, form the basis for model estimation. The resulting estimator's asymptotic properties are derived under different asymptotic regimes (a large number of cross-sectional units or long time spans), stability conditions (with or without a unit root), and error characteristics (homoskedasticity or heteroskedasticity of different forms). Monte Carlo experiments show that it has very good finite-sample performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
122. Building a Sustainable GARCH Model to Forecast Rubber Price: Modified Huber Weighting Function Approach.
- Author
-
Ghani, Intan Martina Md. and Rahim, Hanafi A.
- Subjects
GARCH model, PRICES, RUBBER, VALUE (Economics), OUTLIER detection, HETEROSCEDASTICITY, FORECASTING
- Abstract
Copyright of Baghdad Science Journal is the property of Republic of Iraq Ministry of Higher Education & Scientific Research (MOHESR) and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
123. Weighted linear regression estimation and verification of rock shear strength parameters.
- Author
-
HU Yunhan and LI Bin
- Subjects
SHEAR strength, HETEROSCEDASTICITY, ROCK analysis, REGRESSION analysis, LINEAR statistical models, GAUSSIAN distribution
- Abstract
This paper studies the optimal estimation of rock shear strength parameters, introducing a weighted linear regression method to address the heteroscedasticity present in rock shear strength data. Linear regression analyses of rock shear strength generally assume that the observation errors of the dependent variable follow the same normal distribution, ignoring the heteroscedasticity in triaxial strength data. Taking open-pit sandstone from southwestern Sichuan Province as the research object, triaxial strength data of the sandstone under different confining pressures are obtained, and the weighted linear regression method is used to resolve the heteroscedasticity in the data. The paper first introduces the theoretical basis of rock shear strength and the related statistical methods, then elaborates the concept and scope of application of weighted linear regression, proposes its theoretical formulation and test method, and explains how the weights in the weighted regression are defined and calculated. Finally, the approach is verified on the experimental data from the southwestern Sichuan open-pit sandstone. The results demonstrate the effectiveness of the weighted least squares method for estimating rock shear strength parameters and can provide a theoretical basis for the selection of shear strength parameters. [ABSTRACT FROM AUTHOR]
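The weighting scheme described above can be sketched with numpy: estimate a variance per confining-pressure level, weight each observation by the reciprocal, and fit the linear strength criterion by weighted least squares. The triaxial data below are hypothetical, and the Mohr–Coulomb back-calculation assumes the standard linear relation sigma1 = a + b*sigma3 with b = (1+sin phi)/(1-sin phi):

```python
import numpy as np

# Hypothetical triaxial results: peak strength sigma1 (MPa) at each
# confining pressure sigma3 (MPa), four specimens per level; the
# scatter grows with confinement (heteroscedasticity).
sigma3 = np.repeat([2.0, 4.0, 8.0, 12.0], 4)
rng = np.random.default_rng(9)
sigma1 = 30 + 4.5 * sigma3 + rng.normal(0, 0.5 * sigma3)

# Estimate group variances and use their reciprocals as weights:
levels = np.unique(sigma3)
var_hat = {s: sigma1[sigma3 == s].var(ddof=1) for s in levels}
w = np.array([1.0 / var_hat[s] for s in sigma3])

# Weighted least squares for sigma1 = a + b*sigma3:
Z = np.column_stack([np.ones_like(sigma3), sigma3])
a, b = np.linalg.solve(Z.T @ (w[:, None] * Z), Z.T @ (w * sigma1))

# Back out Mohr-Coulomb parameters from the fitted line:
phi = np.degrees(np.arcsin((b - 1) / (b + 1)))
cohesion = a * (1 - np.sin(np.radians(phi))) / (2 * np.cos(np.radians(phi)))
```

Down-weighting the high-confinement (high-variance) specimens is exactly what corrects the heteroscedasticity that an unweighted fit would ignore.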
- Published
- 2024
- Full Text
- View/download PDF
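The weighted regression step described in entry 123 can be sketched with the closed-form WLS estimator. The data, weights, and coefficient values below are illustrative assumptions, not the paper's sandstone measurements:

```python
import numpy as np

# Sketch of weighted linear regression (WLS) for heteroscedastic strength
# data, in the spirit of entry 123. All numbers are synthetic illustrations.
rng = np.random.default_rng(0)

sigma3 = np.linspace(1.0, 10.0, 40)       # confining pressure (illustrative)
true_a, true_b = 5.0, 3.0
noise_sd = 0.2 * sigma3                   # error spread grows with pressure
sigma1 = true_a + true_b * sigma3 + rng.normal(0.0, noise_sd)

X = np.column_stack([np.ones_like(sigma3), sigma3])
w = 1.0 / noise_sd**2                     # weight = inverse error variance

# WLS closed form: beta = (X' W X)^{-1} X' W y
W = np.diag(w)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ sigma1)
beta_ols = np.linalg.lstsq(X, sigma1, rcond=None)[0]

print(beta_wls)   # close to the true [5.0, 3.0]
print(beta_ols)
```

The point of the weighting is that noisy high-pressure observations are down-weighted, so the WLS slope is estimated mostly from the low-variance part of the data.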
124. Identifying The Knowledge Spillover Hotspot and its Role in Neighbouring Country Innovation.
- Author
-
Darfo-Oduro, Raymond and Stejskal, Jan
- Subjects
- *
ECONOMIC policy , *DATA analysis , *ECONOMIC development , *HETEROSCEDASTICITY , *INFORMATION economy - Abstract
The question of how to finance countries' innovation activities has taken center stage in economic policy discussions within countries and among regional bodies. Such discussions require a policy direction that presents alternative ways of financing innovation activities at lower cost, given the dwindling resources available to countries and regional bodies for innovation. One important way of dealing with this challenge is to invest the limited resources in countries and sectors with the potential for higher knowledge spillover to benefit other countries and sectors. In this study, therefore, we investigate which countries are knowledge spillover hotspots in Europe and how they affect neighbouring countries' innovation performance. For policy aimed at improving innovation performance in Europe, identifying the knowledge spillover hotspot countries will guide European regional bodies to concentrate innovation investments in countries with the potential for high knowledge spillovers for the benefit of other countries. The study specifically investigated R&D spillover and explicit knowledge spillover hotspots in the manufacturing sectors of Europe and their effects. Data for the study are unstructured and sourced from the World Bank, with the longest span being 2005 to 2020 and the shortest 2013 to 2017. In all, nine countries were sampled based on data availability: Poland, Germany, Slovakia, Slovenia, Lithuania, Belgium, France, Spain and the Czech Republic. The study employed panel data analysis. Based on the Hausman test, a fixed effects model was chosen over a random effects model. The results show that, after controlling for institutional and economic factors and ensuring robustness against heteroskedasticity and autocorrelation, the R&D spillover hotspots in Europe are Germany and Slovakia, whereas the explicit knowledge spillover hotspots are Poland, Slovenia, Lithuania and France.
The results also show that the relationship between knowledge spillover from the hotspot countries and surrounding countries' innovation varies: for some hotspots the relationship is linear, while for others it is nonlinear. The study also confirms that explicit knowledge is more susceptible to knowledge spillover. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
125. The impacts of inflation and inflation uncertainty in evaluating the effectiveness of monetary policy: The case of ASEAN-5.
- Author
-
Hang, Yun Jie, Tham, Chi Cheng, Sek, Siok Kun, and Sim, Khang Yi
- Subjects
- *
MONETARY policy , *PRICE inflation , *INFLATIONARY universe , *ECONOMIC policy , *ECONOMIC equilibrium , *HETEROSCEDASTICITY - Abstract
Previous studies have reported inconclusive results on the nexus between inflation and its uncertainty. On the other hand, the interaction among inflation, the output gap, and monetary policy requires exploration, as the results can help evaluate the effectiveness of monetary policy and economic stability. Hence, this study investigates the inflation-inflation uncertainty nexus and the interaction mentioned above in ASEAN-5 (Indonesia, Malaysia, the Philippines, Singapore, and Thailand). Multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) models, namely Diagonal VECH (DVECH), Diagonal Baba, Engle, Kraft and Kroner (Diagonal BEKK), and Constant Conditional Correlation (CCC), are applied. The data are monthly, ranging from January 1986 to December 2018. The highest persistency in inflation uncertainty is found in the Philippines and the lowest in Malaysia. The output gap causes a large change in the policy rate, but inflation does not. The results imply that the output gap is the main policy concern in ASEAN-5. Output gap and monetary policy uncertainties are determined by short-run disturbances in the Philippines, Indonesia, and Thailand, while inflation uncertainty is determined by long-run disturbances or uncertainty persistency. In terms of policy performance, the monetary policy in ASEAN-5 countries has a low influence on the output gap and inflation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
126. Accounting the US-China trade war and COVID-19 effects in forecasting gold price using ARIMAX-GARCH model.
- Author
-
Andreas, Christopher, Nugroho, Hariawan Widi, Ulyah, Siti Maghfirotul, Mardianto, M. Fariz Fadillah, and Pusporani, Elly
- Subjects
- *
GOLD sales & prices , *INTERNATIONAL trade disputes , *BOX-Jenkins forecasting , *TIME series analysis , *HETEROSCEDASTICITY , *COVID-19 pandemic - Abstract
Investor demand for gold as a safe investment tool tended to increase during the Covid-19 pandemic. In addition, global uncertainty due to the trade war between the US and China also affects the movement of gold prices, making gold prices tend to fluctuate. Modeling and prediction of gold prices are therefore very important for minimizing risk factors. Data covering 2001-2021 were analyzed. It was found that the US-China trade war, as well as the Covid-19 pandemic, had a significant impact on the movement of gold prices. Using these two variables as exogenous variables in the Autoregressive Integrated Moving Average with Exogenous Variable (ARIMAX) approach yields a model that violates a classical assumption of parametric time series analysis: serial correlation in the squared residuals, indicating heteroskedasticity. To overcome this problem, the Autoregressive Conditional Heteroscedasticity – Generalized Autoregressive Conditional Heteroscedasticity (ARCH-GARCH) approach is used to capture the effect of volatility. The final model is ARIMAX(4,2,1)-GARCH(0,2) with MAPE values of 3.53% and 5.76% on the training and testing data, respectively. Thus, the model is able to predict gold prices with high accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
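The GARCH component used in entry 126 reduces to a conditional-variance recursion. A minimal sketch follows; the parameter values `omega`, `alpha`, `beta` and the simulated returns are illustrative assumptions, not estimates from the gold-price study:

```python
import numpy as np

# Minimal GARCH(1,1) conditional-variance filter:
#   h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}
def garch11_variance(returns, omega, alpha, beta):
    h = np.empty_like(returns)
    h[0] = np.var(returns)            # common initialisation choice
    for t in range(1, len(returns)):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    return h

rng = np.random.default_rng(1)
r = rng.normal(0.0, 1.0, 500)         # stand-in return series
h = garch11_variance(r, omega=0.1, alpha=0.1, beta=0.8)
print(h[:3])
```

With these parameters the implied unconditional variance is omega / (1 - alpha - beta) = 1.0, matching the simulated series, so the filtered variances hover around 1.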
127. The Treatment of Uncertainty in Hydrometric Observations: A Probabilistic Description of Streamflow Records
- Author
-
Oliveira, Debora Y and Vrugt, Jasper A
- Subjects
discharge ,heteroscedasticity ,variance estimation ,diagnostic analysis ,Physical Geography and Environmental Geoscience ,Civil Engineering ,Environmental Engineering - Abstract
In this paper, we introduce a relatively simple data-driven method for the representation of the uncertainty in daily discharge records. The proposed method relies only on hourly discharge data and takes advantage of a nonparametric difference-based estimator in the characterization of random errors in discharge time series. We illustrate with corrupted streamflow data that the nonparametric estimator provides an accurate characterization of the nature (homoscedastic or heteroscedastic) and magnitude of these errors. In addition, we demonstrate the practical usefulness of the estimator using discharge time series of 500+ watersheds of the Catchment Attributes and MEteorology for Large-sample Studies data set. This analysis reveals that the magnitude of errors of aleatory nature in the investigated discharge records is rather small (less than 3% for 80% of the records). We then combine the effect of random errors and measurement frequency into a daily variance estimate, which serves as input to a streamflow generation approach. This procedure produces replicates of the discharge record which portray accurately the assigned streamflow uncertainty, preserve key statistical properties of the discharge record and are hydrologically realistic. The proposed method facilitates Bayesian analysis and supports tasks such as model diagnostics, data assimilation, uncertainty quantification and regionalization.
- Published
- 2022
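The nonparametric difference-based estimator underlying entry 127 exploits the fact that first-differencing a slowly varying signal nearly cancels the signal but doubles the noise variance. A sketch with synthetic data (the sinusoidal "signal" and the noise level are assumptions for illustration, not the CAMELS discharge records):

```python
import numpy as np

# Difference-based variance estimate: for y_i = s_i + e_i with s smooth,
# Var(e) ~= mean((y_{i+1} - y_i)^2) / 2, since diff(s) is tiny while
# Var(diff(e)) = 2 * Var(e).
def diff_variance(y):
    d = np.diff(y)
    return np.mean(d ** 2) / 2.0

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 2000)
signal = np.sin(2 * np.pi * t)        # smooth stand-in "streamflow"
noise_sd = 0.1
y = signal + rng.normal(0.0, noise_sd, t.size)

print(diff_variance(y))               # close to noise_sd**2 = 0.01
```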
128. Assessing Potential Heteroscedasticity in Psychological Data: A GAMLSS approach
- Author
-
Correa, Juan C., Kneib, Thomas, Ospina, Raydonal, Tejada, Julian, and Marmolejo-Ramos, Fernando
- Subjects
heteroscedasticity ,gamlss ,scientific evidence. ,Psychology ,BF1-990 - Abstract
This paper provides a tutorial for analyzing psychological research data with GAMLSS, an R package that uses the family of generalized additive models for location, scale, and shape. These models extend the capacities of traditional parametric and non-parametric tools that primarily rely on the first moment of the statistical distribution. When psychological data fails the assumption of homoscedasticity, the GAMLSS approach might yield less biased estimates while offering more insights about the data when considering sources of heteroscedasticity. The supplemental material and data help newcomers understand the implementation of this approach in a straightforward step-by-step procedure.
- Published
- 2023
- Full Text
- View/download PDF
129. Does investor's sentiment affect industries' return? – A case of selected Indian industries
- Author
-
Rohilla, Amit, Tripathi, Neeta, and Bhandari, Varun
- Published
- 2023
- Full Text
- View/download PDF
130. Robust control chart for nonlinear conditionally heteroscedastic time series based on Huber support vector regression.
- Author
-
Kim, Chang Kyeom, Yoon, Min Hyeok, and Lee, Sangyeol
- Subjects
- *
QUALITY control charts , *HETEROSCEDASTICITY , *ROBUST control , *NASDAQ composite index , *MONTE Carlo method , *STOCK price indexes , *TIME series analysis , *STATISTICAL bootstrapping - Abstract
This study proposes a control chart that monitors conditionally heteroscedastic time series by integrating the Huber support vector regression (HSVR) and the one-class classification (OCC) method. For this task, we consider the model that incorporates nonlinearity to the generalized autoregressive conditionally heteroscedastic (GARCH) time series, named HSVR-GARCH, to robustly estimate the conditional volatility when the structure of time series is not specified with parameters. Using the squared residuals, we construct the OCC-based control chart that does not require any posterior modifications of residuals unlike previous studies. Monte Carlo simulations reveal that deploying squared residuals from the HSVR-GARCH model to control charts can be immensely beneficial when the underlying model becomes more complicated and contaminated with noises. Moreover, a real data analysis with the Nasdaq composite index and Korea Composite Stock Price Index (KOSPI) datasets further disclose the validity of using the bootstrap method in constructing control charts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
131. Chaos, Fractionality, Nonlinear Contagion, and Causality Dynamics of the Metaverse, Energy Consumption, and Environmental Pollution: Markov-Switching Generalized Autoregressive Conditional Heteroskedasticity Copula and Causality Methods.
- Author
-
Bildirici, Melike, Ersin, Özgür Ömer, and Ibrahim, Blend
- Subjects
- *
CAUSATION (Philosophy) , *POLLUTION , *SHARED virtual environments , *HETEROSCEDASTICITY , *ENERGY consumption , *KOLMOGOROV complexity - Abstract
Metaverse (MV) technology introduces new tools for users each day. MV companies have a significant share of total stock markets today, and their size is increasing. However, MV technologies are questioned as to whether they contribute to environmental pollution through their increasing energy consumption (EC). This study explores complex nonlinear contagion with tail dependence and causality between MV stocks, EC, and environmental pollution proxied with carbon dioxide emissions (CO2), using a decade-long daily dataset covering 18 May 2012–16 March 2023. The Mandelbrot–Wallis and Lo's rescaled range (R/S) tests confirm long-term dependence and fractionality; the largest Lyapunov exponents, Shannon and Havrda–Charvát–Tsallis (HCT) entropy tests, followed by the Kolmogorov–Sinai (KS) complexity measure, confirm chaos, entropy, and complexity. The Brock, Dechert, and Scheinkman (BDS) test of independence confirms nonlinearity, and White's test for heteroskedasticity of nonlinear forms and Engle's autoregressive conditional heteroskedasticity test confirm heteroskedasticity, in addition to fractionality and chaos. In modeling, the marginal distributions are modeled with Markov-Switching Generalized Autoregressive Conditional Heteroskedasticity Copula (MS-GARCH–Copula) processes with two regimes for low and high volatility, and asymmetric tail dependence between MV, EC, and CO2 is found in all regimes. The findings indicate relatively higher contagion, with larger copula parameters, in high-volatility regimes. Nonlinear causality is modeled under regime-switching heteroskedasticity, and the results indicate unidirectional causality from MV to EC, from MV to CO2, and from EC to CO2, in addition to bidirectional causality between MV and EC, which amplifies the effects on air pollution. The findings of this paper offer vital insights into the MV, EC, and CO2 nexus under chaos, fractionality, and nonlinearity. Important policy recommendations are generated. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
132. An empirical analysis of pork price fluctuations in China with the autoregressive conditional heteroscedasticity model.
- Author
-
Ting Jin and Lei Li
- Subjects
- *
ARCH model (Econometrics) , *PRICE fluctuations , *PORK , *PRICE indexes , *HETEROSCEDASTICITY , *PRICES , *ARCHES , *PORK industry , *DEMAND forecasting - Abstract
Pork price fluctuations are closely related to the national economy and people's livelihoods in China. Based on monthly pork prices in China from January 2011 to August 2022, this study uses ARCH family models to assess the characteristics and patterns of these fluctuations. The pork price fluctuations show obvious clustering, with external shock information from the previous month affecting the pork price in the following period; the pork market price is characterized by risk compensation, with the high risk of pork supply driving the pork price up. In addition, the pork price fluctuations are characterized by asymmetry, with good news having a greater impact on the pork price than bad news. Due to the pork industry's low entry threshold and the existence of sunk costs, positive information on the pork market has a stronger impact on price fluctuations than negative information. To guide pork supply, we recommend improving monitoring and early-warning mechanisms in the pork market to identify the pork price volatility threshold and measure the price volatility. In addition, price index insurance products should be continually strengthened, with different types of insurance products being offered to meet the insurance demand of the various sectors in the pig meat supply chain. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
133. Production analysis with asymmetric noise.
- Author
-
Badunenko, Oleg and Henderson, Daniel J.
- Subjects
NOISE ,INDUSTRIAL costs ,HETEROSCEDASTICITY - Abstract
Symmetric noise is the prevailing assumption in production analysis, but it is often violated in practice. Not only does asymmetric noise cause least-squares models to be inefficient, it can hide important features of the data which may be useful to the firm/policymaker. Here, we outline how to introduce asymmetric noise into a production or cost framework as well as develop a model to introduce inefficiency into said models. We derive closed-form solutions for the convolution of the noise and inefficiency distributions, the log-likelihood function, and inefficiency, as well as show how to introduce determinants of heteroskedasticity, efficiency and skewness to allow for heterogenous results. We perform a Monte Carlo study and profile analysis to examine the finite sample performance of the proposed estimators. We outline R and Stata packages that we have developed and apply to three empirical applications to show how our methods lead to improved fit, explain features of the data hidden by assuming symmetry, and how our approach is still able to estimate efficiency scores when the least-squares model exhibits the well-known "wrong skewness" problem in production analysis. The proposed models are useful for modeling risk linked to the outcome variable by allowing error asymmetry with or without inefficiency. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
134. Debt to the Penny and US Dollar Index: a lead-lag relationship of the US economy under impacts of the Covid-19 outbreak.
- Author
-
Nguyen, Bao Khac Quoc, Phan, Nguyet Thi Bich, and Le, Van
- Subjects
U.S. dollar ,COVID-19 pandemic ,ECONOMIC indicators ,BUDGET ,HETEROSCEDASTICITY ,FOREIGN exchange - Abstract
Purpose: This study investigates the interactions between the US daily public debt and currency power under impacts of the Covid-19 crisis. Design/methodology/approach: The authors employ multivariate generalized autoregressive conditional heteroskedasticity (MGARCH) modeling to explore the interactions between daily changes in the US Debt to the Penny and the US Dollar Index. The data sets run from April 01, 1993, to May 27, 2022, in which noticeable points include the Covid-19 outbreak (January 01, 2020) and the US vaccination campaign commencement (December 14, 2020). Findings: The authors find that the daily change in public debt positively affects the USD index return, and the past performance of currency power significantly mitigates the Debt to the Penny. Due to the Covid-19 outbreak, the impact of public debt on currency power became negative. This effect remains unchanged after the pandemic. These findings indicate that policy-makers could feasibly obtain both budget stability and currency power objectives in pursuit of either public debt sustainability or power of currency. However, such policies should take into account that public debt can be a negative influence during crisis periods. Originality/value: The authors propose a pioneering approach to explore the relationship between leading and lagging indicators of an economy as characterized by their daily data sets. Accordingly, the empirical findings of this study inspire future research in relation to public debt and its connections with several economic indicators. Peer review: The peer review history for this article is available at: https://publons.com/publon/10.1108/IJSE-08-2022-0581 [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
135. Another look at bandwidth-free inference: a sample splitting approach.
- Author
-
Zhang, Yi and Shao, Xiaofeng
- Subjects
ECONOMETRICS ,CONFORMANCE testing ,SAMPLE size (Statistics) ,HETEROSCEDASTICITY ,STATISTICS - Abstract
The bandwidth-free tests for a multi-dimensional parameter have attracted considerable attention in econometrics and statistics literature. These tests can be conveniently implemented due to their tuning-parameter free nature and possess more accurate size as compared to the traditional heteroskedasticity and autocorrelation consistent-based approaches. However, when sample size is small/medium, these bandwidth-free tests exhibit large size distortion when both the dimension of the parameter and the magnitude of temporal dependence are moderate, making them unreliable to use in practice. In this paper, we propose a sample splitting-based approach to reduce the dimension of the parameter to one for the subsequent bandwidth-free inference. Our SS–SN (sample splitting plus self-normalisation) idea is broadly applicable to many testing problems for time series, including mean testing, testing for zero autocorrelation, and testing for a change point in multivariate mean, among others. Specifically, we propose two types of SS–SN test statistics and derive their limiting distributions under both the null and alternatives and show their effectiveness in alleviating size distortion via simulations. In addition, we obtain the limiting distributions for both SS–SN test statistics in the multivariate mean testing problem when the dimension is allowed to diverge. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
136. Towards more credible conceptual replications under heteroscedasticity and unbalanced designs.
- Author
-
Mateu, Pedro, Applegate, Brooks, and Coryn, Chris L.
- Subjects
MONTE Carlo method ,NULL hypothesis ,HETEROSCEDASTICITY - Abstract
Theory cannot be fully validated unless original results have been replicated, resulting in conclusion consistency. Replications are the strongest source of evidence to verify research findings and knowledge claims. In the social sciences, replication studies often fail, and thus there is a continuing need for replication studies to confirm tentative facts, expand knowledge to gain new understanding, and verify hypotheses. Failure to replicate in the social and behavioral sciences sometimes arises from dissimilarity between the hypotheses formulated in original and replication studies. Alternatively, failure to replicate also occurs when the same hypothesis is tested, but done so in the absence of knowledge from previous investigations, as when original study effect sizes are not considered in replication studies. To increase the replicability of research findings, this paper demonstrates that applying two one-sided tests to evaluate a replication question provides a superior means for conducting replications, assuming all other methodological procedures remain as similar as possible. Furthermore, this paper explores the impact of heteroscedasticity and unbalanced designs in replication studies in four paired conditions of variance and sample size. Two Monte Carlo simulations, each with two stages, were conducted to investigate conclusion consistency among different replication procedures to determine the repeatability of an observed effect. Overall, the proposed approach yielded a higher proportion of successful replications than the conventional approach (testing the original null hypothesis of no effect). Thus, findings can be confirmed by replications; in the absence of confirmation, there cannot be a final statement about any theory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
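The two one-sided tests (TOST) procedure advocated in entry 136 can be sketched with a large-sample z approximation; the effect size, standard error, and equivalence margin below are hypothetical numbers, not values from the paper's simulations:

```python
import math

# TOST declares a replication "successful" if its effect is statistically
# inside an equivalence band around the original effect.
def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def tost_z(effect, se, lower, upper):
    """TOST p-value: the larger of the two one-sided z-test p-values."""
    p_lower = 1.0 - norm_cdf((effect - lower) / se)   # H0: effect <= lower
    p_upper = norm_cdf((effect - upper) / se)         # H0: effect >= upper
    return max(p_lower, p_upper)

# Replication estimate 0.45 (SE 0.05) vs. equivalence band 0.50 +/- 0.15
p = tost_z(effect=0.45, se=0.05, lower=0.35, upper=0.65)
print(round(p, 4))   # 0.0228: both one-sided tests reject, so the
                     # effect is declared equivalent at the 5% level
```

Note the logic is inverted relative to the conventional null-hypothesis test: a *small* TOST p-value supports replication success.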
137. Inactivation kinetics of selected pathogenic and non-pathogenic bacteria by aqueous ozone to validate minimum usage in purified water.
- Author
-
Yuqian Lou, Kasler, David R., Hawkins, Zach L., Zhen Li, Sannito, Dan, Fritz, Ronald D., and Yousef, Ahmed E.
- Subjects
PATHOGENIC bacteria ,OZONE ,COLIFORMS ,ESCHERICHIA coli ,ENTEROCOCCUS faecium ,BOTTLED water ,SALMONELLA enterica - Abstract
Ozone is often used as an antimicrobial agent at the final step in purified water processing. When used in purified bottled water manufacturing, residual ozone should not exceed 0.4 mg/L, per US-FDA regulations. These regulations require the control of Escherichia coli and other coliform bacteria; however, non-coliform pathogens can contaminate bottled water. Hence, it is prudent to test the efficacy of ozone against such pathogens to determine if the regulated ozone level adequately ensures the safety of the product. Inactivation of selected pathogenic and non-pathogenic bacteria in purified water was investigated as a function of ozone dose, expressed in Ct units (mg O3·min/L). Bacterial species tested were Enterococcus faecium, E. coli (two serotypes), Listeria monocytogenes (three strains), Pseudomonas aeruginosa, and Salmonella enterica (three serovars). The resulting dose (Ct)-response (reduction in population log10 CFU/mL) relationships were mostly linear, with obvious heteroscedasticity. This heteroscedastic relationship required developing a novel statistical approach to analyze these data so that the lower bound of the dose-response relationships could be determined and appropriate predictive models for such a bound could be formulated. An example of this analysis was determining the 95%-confidence lower-bound equation for the pooled dose-responses of all tested species; the model can be presented as follows: log population reduction = 3.80 Ct + 1.84. Based on this relationship, applying ozone at a Ct of 0.832 and 21°C achieves ≥ 5-log reduction in the population of any of the tested pathogenic and non-pathogenic bacteria. This dose can be implemented by applying ozone at 0.832 mg/L for 1 min, 0.416 mg/L for 2 min, or other combinations. The study also proved the suitability of E. faecium ATCC 8459 as a surrogate strain for the tested pathogens for validating water decontamination processes by ozone. In conclusion, the study findings can be usefully implemented in processing validation of purified water and possibly other water types. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
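The lower-bound model reported in entry 137 can be inverted to recover the ozone dose needed for a target log reduction. A minimal numerical check of the abstract's figures:

```python
# Entry 137's 95%-confidence lower-bound model:
#   log population reduction = 3.80 * Ct + 1.84
def log_reduction(ct):
    return 3.80 * ct + 1.84

def ct_for_reduction(target_logs):
    # Invert the linear model for the required dose (Ct, mg O3*min/L)
    return (target_logs - 1.84) / 3.80

ct_5log = ct_for_reduction(5.0)
print(round(ct_5log, 3))              # 0.832, matching the abstract
print(log_reduction(0.832) >= 5.0)    # True
```

Because Ct is a concentration-time product, the same dose can be delivered as 0.832 mg/L for 1 min or 0.416 mg/L for 2 min, exactly as the abstract states.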
138. Heteroscedasticity identification and variable selection via multiple quantile regression.
- Author
-
Wang, Mingqiu, Kang, Xiaoning, Liang, Jiajuan, Wang, Kun, and Wu, Yuanshan
- Subjects
- *
QUANTILE regression , *HETEROSCEDASTICITY , *REGRESSION analysis - Abstract
High-dimensional data often display heteroscedasticity. If the heteroscedasticity is neglected in the regression model, it will produce inefficient inference for the regression coefficients. Quantile regression is not only robust to outliers, but also accommodates heteroscedasticity. This paper aims to simultaneously carry out variable selection and heteroscedasticity identification for the linear location-scale model under a unified framework. We develop a regularized multiple quantile regression approach simultaneously identifying the heteroscedasticity, seeking common features of quantile coefficients and eliminating irrelevant variables. We also establish the theoretical properties of the proposed method under some regularity conditions. Simulation studies are conducted to evaluate the finite sample performance of the proposed method, showing that it is able to identify the covariates that affect the variability of the response. We further apply the proposed method to analyse the Wage data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
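The building block of the multiple quantile regression method in entry 138 is the pinball (check) loss. A minimal illustration with synthetic normal data (an assumption for demonstration): minimising the pinball loss over a constant recovers the empirical tau-quantile, which is why the loss is robust to outliers and sensitive to scale changes.

```python
import numpy as np

# Pinball (check) loss for residuals u at quantile level tau:
#   rho_tau(u) = tau * u      if u >= 0
#              = (tau - 1) * u otherwise
def pinball_loss(u, tau):
    return np.mean(np.where(u >= 0, tau * u, (tau - 1) * u))

rng = np.random.default_rng(3)
y = rng.normal(0.0, 1.0, 2001)
tau = 0.75

# Brute-force minimisation over a grid of constants
grid = np.linspace(y.min(), y.max(), 2001)
losses = [pinball_loss(y - c, tau) for c in grid]
c_star = grid[int(np.argmin(losses))]

print(c_star, np.quantile(y, tau))   # the two values nearly coincide
```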
139. Partial Derivatives Estimation of Multivariate Variance Function in Heteroscedastic Model via Wavelet Method.
- Author
-
Kou, Junke and Zhang, Hao
- Subjects
- *
DERIVATIVES (Mathematics) , *HETEROSCEDASTICITY , *NONPARAMETRIC estimation , *WAVELET transforms - Abstract
Conventional research on derivative function estimation focuses only on one-dimensional functions. This paper considers the estimation of partial derivatives of a multivariate variance function in a heteroscedastic model. A wavelet estimator of the partial derivatives is proposed, and its convergence rates under different estimation errors are discussed. It turns out that the strong convergence rate of the wavelet estimator is the same as the optimal uniform almost sure convergence rate for nonparametric function problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
140. Impact of external shocks on international corn price fluctuations.
- Author
-
SHUAI LIU, DINGYU LIU, and SIBO GE
- Subjects
- *
ABSOLUTE return funds , *PRICE fluctuations , *CORN prices , *FUTURES market , *INTERNATIONAL finance , *HETEROSCEDASTICITY - Abstract
In recent years, external shocks represented by COVID-19 have caused significant fluctuations in global corn prices. Based on weekly international corn prices from 2020 to 2023, this paper constructs autoregressive conditional heteroskedasticity (ARCH) class and time-varying parameter vector autoregression (TVP-VAR) models. After analysing the characteristics of corn price fluctuations, it further analyses the influence of external uncertainties such as COVID-19, international finance, the corn futures market, and international corn exports on corn price fluctuations. The results show that international corn price fluctuations always exhibit significant asymmetry. Nevertheless, the influence of past changes on the future gradually disappears, and the corn market is not characterised by high risk and high return, given the flat or declining absolute returns during periods of high volatility. All the selected external shocks also have a time-varying impact on corn price fluctuations, differing in size, direction, and duration. The external shocks led by COVID-19 had a transmission effect on other factors and then affected corn price fluctuations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
141. Fast optimization methods for high-dimensional row-sparse multivariate quantile linear regression.
- Author
-
Chen, Bingzhen and Chen, Canyi
- Subjects
- *
QUANTILE regression , *LEAST squares , *REGRESSION analysis , *HETEROSCEDASTICITY , *DATA analysis - Abstract
Sparse quantile regression is a useful tool for variable selection, robust estimation and heteroscedasticity treatment in high-dimensional data analysis. Because the quantile loss is non-smooth, computation is heavier than for least-squares models. In the literature, there are various numerical methods for linear quantile regression, such as the ADMM developed in Gu et al. [Technometrics. 2017;60(3):319–331]. However, the computation of multivariate quantile regression has not yet been fully resolved, especially when the dimension is high. Motivated by this, we focus on the design of fast numerical algorithms for the row-sparse multivariate quantile regression model. By virtue of the proximal operator and Majorize–Minimization, four smoothed algorithms are designed. For all the obtained algorithms, we analyse their convergence and parameter selection. We conduct extensive simulations and analyses of four real data sets. Finally, we conclude that the smoothed method is faster than the non-smooth method, especially when the number of predictors is large. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
142. Using stochastic frontier analysis instead of data envelopment analysis in modelling investment performance.
- Author
-
Lamb, John D. and Tee, Kai-Hong
- Subjects
- *
DATA envelopment analysis , *STOCHASTIC frontier analysis , *INVESTMENT analysis , *MEASUREMENT errors , *HETEROSCEDASTICITY - Abstract
We introduce methods to apply stochastic frontier analysis (SFA) to financial assets as an alternative to data envelopment analysis, because SFA allows us to fit a frontier with noisy data. In contrast to conventional SFA, we wish to deal with estimation risk, heteroscedasticity in noise and inefficiency terms. We investigate measurement error in the risk and return measures using a simulation–extrapolation method and develop residual plots to test model fit. We find that shrinkage estimators for estimation risk make a striking difference to model fit, dealing with measurement error only improves confidence in the model, and the residual plots are vital for establishing model fit. The methods are important because they allow us to fit a frontier under the assumption that the risks and returns are not known exactly. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
143. Modeling Risk Factors for Intraindividual Variability: A Mixed-Effects Beta-Binomial Model Applied to Cognitive Function in Older People in the English Longitudinal Study of Ageing.
- Author
-
Parker, Richard M A, Tilling, Kate, Terrera, Graciela Muniz, and Barrett, Jessica K
- Subjects
- *
MEMORY , *INDIVIDUALITY , *TASK performance , *INTERVIEWING , *ACTIVITIES of daily living , *RISK assessment , *SEX distribution , *AGING , *DESCRIPTIVE statistics , *RESEARCH funding , *STATISTICAL models , *COGNITION in old age , *LONGITUDINAL method , *EDUCATIONAL attainment - Abstract
Cognitive functioning in older age profoundly impacts quality of life and health. While most research on cognition in older age has focused on mean levels, intraindividual variability (IIV) around this may have risk factors and outcomes independent of the mean value. Investigating risk factors associated with IIV has typically involved deriving a summary statistic for each person from residual error around a fitted mean. However, this ignores uncertainty in the estimates, prohibits exploring associations with time-varying factors, and is biased by floor/ceiling effects. To address this, we propose a mixed-effects location scale beta-binomial model for estimating average probability and IIV in a word recall test in the English Longitudinal Study of Ageing. After adjusting for mean performance, an analysis of 9,873 individuals across up to 7 waves (mean = 3.4) spanning 2002–2015 found IIV to be greater at older ages, with lower education, in females, with more difficulties in activities of daily living, in later birth cohorts, and when interviewers recorded issues potentially affecting test performance. Our study introduces a novel method for identifying groups with greater IIV in bounded discrete outcomes. Our findings have implications for daily functioning and care, and further work is needed to identify the impact on future health outcomes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
144. Estimation in the Presence of Heteroskedasticity of Unknown Form: A Lasso-based Approach.
- Author
-
González-Coya, Emilio and Perron, Pierre
- Subjects
- *
HETEROSCEDASTICITY , *TIME series analysis , *REGRESSION analysis , *STATISTICAL sampling , *COVARIANCE matrices - Abstract
We study the Feasible Generalized Least-Squares (FGLS) estimation of the parameters of a linear regression model in the presence of heteroskedasticity of unknown form in the errors. We suggest a Lasso-based procedure to estimate the skedastic function of the residuals. The advantage of using Lasso is that it can handle a large number of potential covariates, yet still yields a parsimonious specification. Using extensive simulation experiments, we show that our suggested procedure always provides improvements in the precision of the parameter of interest (lower mean-squared errors) when heteroskedasticity is present and is equivalent to OLS when there is none. It also performs better than previously suggested procedures. Since the fitted value of the skedastic function falls short of the true specification, we form confidence intervals using a bias-corrected version of the usual heteroskedasticity-robust covariance matrix estimator. These have the correct size and substantially shorter length than when using OLS. Our method is applicable to both cross-section (with a random sample) and time series models, though here we concentrate on the former. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
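A stripped-down version of this FGLS pipeline can be sketched as follows. The coordinate-descent Lasso, the log-squared-residual regression, and the tuning value `lam` are illustrative choices under generic assumptions, not the authors' exact procedure:

```python
import numpy as np

def soft_threshold(z, lam):
    return np.sign(z) * max(abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso for (1/2n)||y - Xb||^2 + lam * ||b||_1.

    In practice an intercept is left unpenalized; omitted here for brevity.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]        # partial residual
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

def lasso_fgls(X, y, lam=0.05):
    """FGLS with a Lasso-estimated skedastic function (illustrative sketch)."""
    # Step 1: OLS residuals
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta_ols
    # Step 2: Lasso fit of log squared residuals -> skedastic function
    g = lasso_cd(X, np.log(e ** 2 + 1e-8), lam)
    w = 1.0 / np.exp(X @ g)                        # inverse-variance weights
    # Step 3: weighted least squares with the estimated weights
    sw = np.sqrt(w)
    beta_fgls, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta_fgls
```

With `lam = 0` the Lasso step reduces to OLS; larger values zero out irrelevant covariates of the skedastic function.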
145. How certain are we about the role of uncertainty in the economy?
- Author
-
Herwartz, Helmut and Lange, Alexander
- Subjects
- *
BUSINESS cycles , *RECESSIONS , *AUTOREGRESSIVE models , *HETEROSCEDASTICITY - Abstract
While causes and consequences of uncertainty in the US economy have attracted considerable interest, the literature still lacks a consensus on several aspects. To name two matters of debate, it remains unclear whether uncertainty shocks are a source or the result of recessions and whether uncertainty shocks have adverse (or even stimulating) effects on the economy. We find that ambiguous results in these regards can be traced back to the selection of an appropriate identification strategy in structural vector autoregressive models. We find that both macroeconomic and financial uncertainty are exogenous to business cycle fluctuations and cause economic slowdowns. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
146. Estimating Fluctuating Volatility Using Advanced Garch Models: Evidence from Denmark Stock Exchange.
- Author
-
Kumari, Puja, Meher, Bharat Kumar, Birau, Ramona, Anand, Abhishek, Paswan, Mukesh, Simion, Mircea Laurentiu, Lupu (Filip), Gabriela Ana Maria, and Nioata (Chireac), Roxana-Mihaela
- Subjects
GARCH model ,STOCKS (Finance) ,STOCK price indexes ,LAGRANGE multiplier ,PRICES - Abstract
In the stock market, volatility describes the degree to which asset prices oscillate and determines the level of risk or uncertainty. The foremost objective of the present analysis is to model the behaviour of the Denmark stock market using data from December 20, 2016, to September 20, 2023, through the application of GARCH-family models, including GARCH/TARCH, EGARCH, Component ARCH (1,1), and PARCH. The analysis used a sample of 1,668 daily observations of the OMXC 25 (OMX Copenhagen 25) stock index, representing the Denmark stock market. The statistical techniques employed include the Phillips-Perron and Augmented Dickey-Fuller tests, the Kwiatkowski-Phillips-Schmidt-Shin test statistic, the ARCH Lagrange Multiplier (LM) test, and the PARCH model, all implemented in the E-Views 12 econometrics package. This empirical investigation adds to the corpus of financial econometrics and emphasizes the importance of precisely and painstakingly modelling the behaviour of stock markets. The findings and research methodologies covered in this paper will improve our ability to forecast market movements and make informed decisions in a turbulent financial climate, and will serve as a solid foundation for future investigations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
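The ARCH-LM test mentioned above can be computed by hand: regress squared residuals on their own lags and take n·R² as the test statistic, which is asymptotically chi-square with `lags` degrees of freedom under the null of no ARCH effects. The sketch below (with an illustrative lag choice) is a generic implementation, not tied to the E-Views output in the paper:

```python
import numpy as np

def arch_lm(resid, lags=5):
    """Engle's ARCH-LM statistic: n * R^2 from regressing e_t^2 on
    e_{t-1}^2, ..., e_{t-lags}^2 plus an intercept.
    """
    e2 = np.asarray(resid, dtype=float) ** 2
    T = len(e2)
    n = T - lags
    y = e2[lags:]
    # Design matrix: intercept plus the `lags` lagged squared residuals
    X = np.column_stack(
        [np.ones(n)] + [e2[lags - i: T - i] for i in range(1, lags + 1)]
    )
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    ssr = np.sum((y - X @ beta) ** 2)
    sst = np.sum((y - y.mean()) ** 2)
    return n * (1.0 - ssr / sst)
```

A value above the chi-square critical point (e.g. about 11.07 for 5 lags at the 5% level) signals ARCH effects, motivating a GARCH-type model.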
147. THE IMPLEMENTATION OF THE ALEXANDER-GOVERN TEST IN FACTORIAL DESIGN ANALYSIS.
- Author
-
Ishak Latfi, Nurul Syafiqah and Abdullah, Suhaida
- Subjects
FACTORIAL experiment designs ,ANALYSIS of variance ,FALSE positive error ,GAUSSIAN distribution ,HETEROSCEDASTICITY - Abstract
This study evaluates the performance of the Alexander-Govern test (AG test), Analysis of Variance (ANOVA), and the t-test by analyzing their Type I error rates. The AG test is regarded as reliably controlling the Type I error rate, being insensitive to heteroscedasticity under a normal distribution. Simulation research was carried out using Statistical Analysis Software (SAS) to assess the effectiveness of the tests based on the Type I error rate. To create conditions that could highlight the strengths and weaknesses of each test, three variables were manipulated: sample size, variance heterogeneity, and type of pairings. The AG test performed convincingly, controlling the Type I error rate better than ANOVA under all conditions of heterogeneous variances, whereas ANOVA performed best only when the variances were homogeneous. A real data experiment was applied to validate the results: in the battery life design experiment, p-values from the AG test and ANOVA were computed and compared. The AG test provides valid results, testing both the main effect and the interaction effect, as ANOVA does. Given its good performance in the simulation study, the AG test can be considered a good alternative to ANOVA when the assumption of homogeneity of variances is violated in a factorial design. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
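The AG statistic itself is short to compute. The sketch below is a plain-Python rendition based on the published Alexander-Govern (1994) formulas with Hill's (1970) normalizing transformation, not the SAS code used in the study:

```python
import math

def alexander_govern(groups):
    """Alexander-Govern statistic for equality of k means under
    heteroscedasticity; asymptotically chi-square with k - 1 df.
    """
    stats = []
    for g in groups:
        n = len(g)
        mean = sum(g) / n
        var = sum((x - mean) ** 2 for x in g) / (n - 1)
        stats.append((mean, var / n, n - 1))          # mean, squared SE, df
    inv = [1.0 / se2 for _, se2, _ in stats]
    total = sum(inv)
    grand = sum((w / total) * m for (m, _, _), w in zip(stats, inv))
    A = 0.0
    for mean, se2, nu in stats:
        t = (mean - grand) / math.sqrt(se2)
        # Hill's normalizing transformation of the t statistic
        a = nu - 0.5
        b = 48.0 * a * a
        c = math.copysign(math.sqrt(a * math.log(1.0 + t * t / nu)), t)
        z = (c + (c ** 3 + 3.0 * c) / b
             - (4.0 * c ** 7 + 33.0 * c ** 5 + 240.0 * c ** 3 + 855.0 * c)
             / (10.0 * b * b + 8.0 * b * c ** 4 + 1000.0 * b))
        A += z * z
    return A
```

The statistic is compared with the chi-square critical value on k − 1 degrees of freedom; no pooled variance is ever formed, which is what makes the test robust to heterogeneous variances.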
148. STUDY ON THE PROFITABILITY OF AGRICULTURAL ENTERPRISES IN UKRAINE DURING THE RUSSIAN MILITARY INVASION OF UKRAINE.
- Author
-
TSESLIV, OLGA, DUNAIEVA, TAMARA, YERESHKO, JULIA, and TSESLIV, OLEKSANDR
- Subjects
RUSSIAN invasion of Ukraine, 2022- ,CORPORATE profits ,AGRICULTURAL industries ,PRICE fluctuations ,NONLINEAR regression ,AGRICULTURAL forecasts - Abstract
This paper examines the effectiveness of grouping agricultural enterprises according to wheat harvested area and assesses their profitability. We developed linear and non-linear regression equations to predict the income of these groups of enterprises. The methodology is designed for cases when future market prices are probabilistic in nature; with it, the necessary production volumes can be calculated under price fluctuations. We used the Goldfeld–Quandt parametric test to test the model for heteroscedasticity. Calculations show that agricultural holdings are indeed inefficient, and preference should be given to enterprises with medium crop areas. Applying the method of Lagrange multipliers to the agricultural enterprise optimization problem makes it possible to increase profitability. The case of price risk, when future market prices are not deterministic, is considered; managerial decisions should then be guided by two criteria: maximizing the expected total net income and minimizing the variance of the total net income. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
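The Goldfeld-Quandt test used here is simple to reproduce: sort observations by the variable suspected of driving the variance, drop a middle band, fit OLS separately to the low and high subsamples, and compare residual variances. The sketch below (with an illustrative `drop_frac`) is a generic implementation, not the authors' computation:

```python
import numpy as np

def goldfeld_quandt(X, y, order_by, drop_frac=0.2):
    """Goldfeld-Quandt F statistic for heteroscedasticity.

    Returns the ratio of residual variances (high subsample / low
    subsample); under homoscedastic normal errors it follows an F
    distribution with (n1 - p, n1 - p) degrees of freedom.
    """
    idx = np.argsort(order_by)
    X, y = X[idx], y[idx]
    n, p = X.shape
    m = int(n * drop_frac)          # middle observations to discard
    n1 = (n - m) // 2

    def ssr(Xs, ys):
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        r = ys - Xs @ beta
        return float(r @ r)

    df = n1 - p
    return (ssr(X[n - n1:], y[n - n1:]) / df) / (ssr(X[:n1], y[:n1]) / df)
```

A ratio far above 1 indicates that the error variance grows with the ordering variable.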
149. Investigating the Effect of Democracy and Governance Quality on Income Inequality: Evidence from BRICS.
- Author
-
Basumatary, Iragdao Raja and Das, Manjit
- Subjects
INCOME inequality ,LEAST squares ,HETEROSCEDASTICITY ,DEMOCRACY - Abstract
This paper empirically investigates the influence of democracy and governance quality on income inequality in the rapidly growing emerging BRICS (Brazil, Russia, India, China, and South Africa) countries during the period 1996–2020. The study employed feasible generalized least squares (FGLS), panel corrected standard errors (PCSE), and the Driscoll-Kraay (DK) standard error estimation method to deal with the problems of autocorrelation, heteroskedasticity, and cross-sectional dependence and to find the effect of democracy and governance quality on income inequality. The results of the study indicate that democracy in BRICS countries exacerbates income inequality, while governance quality helps reduce income inequality. These insights offer valuable implications for decision-makers in crafting policies within these spheres. [ABSTRACT FROM AUTHOR]
- Published
- 2024
150. Comparison of Value at Risk (VaR) Multivariate Forecast Models.
- Author
-
Müller, Fernanda Maria and Righi, Marcelo Brutti
- Subjects
VALUE at risk ,MARGINAL distributions ,FORECASTING ,HETEROSCEDASTICITY ,GAUSSIAN distribution ,PORTFOLIO performance - Abstract
We investigate the performance of VaR (Value at Risk) forecasts, considering different multivariate models: HS (Historical Simulation), DCC-GARCH (Dynamic Conditional Correlation-Generalized Autoregressive Conditional Heteroskedasticity) with normal and Student's t distribution, GO-GARCH (Generalized Orthogonal-Generalized Autoregressive Conditional Heteroskedasticity), and copulas Vine (C-Vine, D-Vine, and R-Vine). For copula models, we consider that marginal distribution follow normal, Student's t and skewed Student's t distribution. We assessed the performance of the models using stocks belonging to the Ibovespa index during the period from January 2012 to April 2022. We build portfolios with 6 and 12 stocks considering two strategies to form the portfolio weights. We use a rolling estimation window of 500 and 1000 observations and 1%, 2.5%, and 5% as significance levels for the risk estimation. To evaluate the quality of the risk forecasts, we compute the realized loss and cost. Our results show that the performance of the models is sensitive to the use of different significance levels, rolling windows, and strategies to determine portfolio weights. Furthermore, we find that the model that presents the best trade-off between the costs from risk overestimation and underestimation does not coincide with the model suggested by the realized loss. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
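Of the models compared, Historical Simulation is the simplest and makes a useful baseline: VaR is just an empirical quantile of the return history over the rolling window. The quantile-indexing convention below is one of several in use, so treat the sketch as illustrative rather than the authors' exact estimator:

```python
def historical_var(returns, alpha=0.05):
    """One-period Value at Risk by Historical Simulation (HS).

    Takes the empirical alpha-quantile of the return history and
    reports it as a positive loss number. Indexing convention:
    the k-th smallest return with k = floor(alpha * n).
    """
    ordered = sorted(returns)
    k = max(int(len(ordered) * alpha) - 1, 0)
    return -ordered[k]
```

In a rolling-window backtest such as the one described above, this function would be re-evaluated on each 500- or 1000-observation window of portfolio returns; model-based alternatives (DCC-GARCH, GO-GARCH, vine copulas) replace the empirical quantile with a quantile of the fitted conditional distribution.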