26 results for "F-test"
Search Results
2. A nonparametric test for homogeneity of variances.
- Author
-
Villaseñor, J.A. and González-Estrada, E.
- Abstract
A nonparametric test is proposed for the two-sample variance equality problem based on the fact that the covariance of U = X + Y and W = X − Y is zero when X and Y have equal variances. It is shown that the test statistic has an asymptotic standard normal distribution, which is useful for obtaining critical values for moderate sample sizes. The results of a Monte Carlo simulation experiment conducted in order to study the power properties of the test are presented. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
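The covariance identity behind this test, Cov(X + Y, X − Y) = Var(X) − Var(Y), is easy to check numerically. Below is a minimal simulation sketch of that identity only, not the paper's standardized test statistic; sample sizes and distributions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(0.0, 2.0, n)       # Var(X) = 4
y_eq = rng.normal(5.0, 2.0, n)    # Var(Y) = 4: equal variances
y_ne = rng.normal(5.0, 3.0, n)    # Var(Y) = 9: unequal variances

# Cov(X + Y, X - Y) = Var(X) - Var(Y), regardless of Cov(X, Y)
cov_eq = np.cov(x + y_eq, x - y_eq)[0, 1]   # near 0
cov_ne = np.cov(x + y_ne, x - y_ne)[0, 1]   # near 4 - 9 = -5
```

With 10⁵ draws the first covariance is within sampling noise of zero while the second sits near −5; the paper's contribution is a standardized version of this quantity with an asymptotic N(0, 1) null distribution.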
3. On detecting the effect of exposure mixture.
- Author
-
Liu, Xinhua and Jin, Zhezhen
- Subjects
- *
INDEPENDENT variables , *STATISTICAL software - Abstract
To study the effect of an exposure mixture on continuous health outcomes, one can use a linear model with a weighted sum of multiple standardized exposure variables as an index predictor, with its coefficient representing the overall effect. The unknown weights typically range between zero and one, indicating the contributions of individual exposures to the overall effect. Because the weight parameters are present only when the parameter for the overall effect is non-zero, testing hypotheses on the overall effect can be challenging, especially when the number of exposure variables is above two. This paper presents a working-model-based approach to estimate the parameter for the overall effect and to test specific hypotheses, including two tests for detecting the overall effect and one test for detecting unequal weights when the overall effect is evident. The statistics are computationally easy, and one can apply existing statistical software to perform the analysis. A simulation study shows that the proposed estimators for the parameters of interest may have better finite sample performance than some other estimators. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
4. Minimal sample size in balanced ANOVA models of crossed, nested, and mixed classifications.
- Author
-
Spangl, Bernhard, Kaiblinger, Norbert, Ruckdeschel, Peter, and Rasch, Dieter
- Subjects
- *
OPTIMAL designs (Statistics) , *SAMPLE size (Statistics) , *ANALYSIS of variance , *CLASSIFICATION - Abstract
We consider balanced one-way, two-way, and three-way ANOVA models to test the hypothesis that the fixed factor A has no effect. The other factors are fixed or random. We determine the noncentrality parameter for the exact F-test, describe its minimal value by a sharp lower bound, and thus we can guarantee the worst-case power for the F-test. These results allow us to compute the minimal sample size, i.e. the minimal number of experiments needed. We also provide a structural result for the minimum sample size, proving a conjecture on the optimal experimental design. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
5. A Minimax Optimal Ridge-Type Set Test for Global Hypothesis With Applications in Whole Genome Sequencing Association Studies.
- Author
-
Liu, Yaowu, Li, Zilin, and Lin, Xihong
- Subjects
- *
WHOLE genome sequencing , *NUCLEOTIDE sequencing , *FALSE positive error , *HYPOTHESIS , *TEST scoring - Abstract
Testing a global hypothesis for a set of variables is a fundamental problem in statistics with a wide range of applications. A few well-known classical tests include Hotelling's T² test, the F-test, and the empirical Bayes based score test. These classical tests, however, are not robust to the signal strength and can suffer a substantial loss of power when signals are weak or moderate, a situation commonly encountered in contemporary applications. In this article, we propose a minimax optimal ridge-type set test (MORST), a simple and generic method for testing a global hypothesis. The power of MORST is robust and considerably higher than that of the classical tests when the strength of signals is weak or moderate. At the same time, MORST requires only a slight increase in computation over these existing tests, making it applicable to the analysis of massive genome-wide data. We also provide generalizations of MORST that parallel the traditional Wald test and Rao's score test in asymptotic settings. Extensive simulations demonstrated the robust power of MORST and that its Type I error was well controlled. We applied MORST to the analysis of whole-genome sequencing data from the Atherosclerosis Risk in Communities study, where MORST detected 20%–250% more signal regions than the classical tests. Supplementary materials for this article are available online. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
6. Student satisfaction with honours programme in Azerbaijan.
- Author
-
Abizada, Azar and Mirzaliyeva, Fizza
- Subjects
- *
STUDENTS , *HIGHER education , *F-test (Mathematical statistics) , *T-test (Statistics) - Abstract
In 2014, the Ministry of Education of the Azerbaijan Republic launched an honours programme in several universities to introduce an advanced curriculum and interactive teaching methodology. The purpose of this research is to evaluate the success of the programme via the honours students' satisfaction with the delivery of its promised purposes, such as the curriculum, the teaching methodology and the programme administration in general. Preliminary results show that honours students are satisfied with each of those variables. Moreover, to understand whether the satisfaction of the students is purely due to the structure of the honours programmes, the satisfaction level of honours students was compared with that of non-honours students from the corresponding degree programmes. The results showed that honours students are more satisfied than the non-honours students in all the above-mentioned criteria. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
7. Paleomagnetic evidence for the emplacement mechanism of an enigmatic boulder accumulation on a coastal cliff top in New South Wales: implications for the Australian Megatsunami Hypothesis.
- Author
-
Thomas, D. N. and Schmidt, P. W.
- Subjects
- *
PALEOMAGNETISM , *MAGNETIZATION , *SOIL creep , *SHIELDS (Geology) , *BOULDERS - Abstract
Paleomagnetic sampling and measurement of a boulder accumulation on Little Beecroft Head on the Illawarra coastline of New South Wales was undertaken to evaluate potential emplacement mechanisms. This deposit is of central importance in the Australian Megatsunami Hypothesis (AMH) debate, but to date, there has been no unequivocal determination of its provenance. The most likely emplacement mechanisms are by slow collapse during denudation of overlying strata, storm wave overwash or a combination of these. Characteristic Remanent Magnetisation (ChRM) directions were obtained from 15 individual boulders and the in situ bedrock platform on which they currently rest. The in situ Permian bedrock has a normal polarity mean ChRM direction of D/I = 1.6°/-66.7° (α95 = 5.2°; k = 33.9) that is statistically indistinguishable from the Present Earth Field direction at the site. The magnetisation is most likely due to Cenozoic/recent weathering, which is common in surficial rocks throughout the Sydney Basin. ChRM directions for the boulders are stable but scattered, although not random, and the mean boulder direction is indistinguishable in geographic (i.e. current in situ) coordinates, at the 5% significance level, from the mean direction of the in situ bedrock. Further statistical tests confirm that the scatter in the mean directions of the boulders and the in situ bedrock is different, at the 5% significance level, with the boulder mean being more scattered. At an individual boulder level, some blocks have mean ChRM directions that are statistically indistinguishable from the mean in situ rock ChRM direction, whereas others are distinguishable at the 5% significance level. These results indicate that the boulders were magnetised prior to emplacement but were not moved far from their original positions during emplacement. The emplacement age is constrained to the last ca 780 000 years. 
These observations strongly support the hypothesis that the Little Beecroft Head boulder deposit was emplaced by a non-catastrophic mechanism, namely slow collapse during denudation of pre-existing cliff material or overtopping from severe storms, which occur regularly on the east coast of New South Wales. Even if a catastrophic wave were responsible, the results constrain the age of that event to be older than 780 000 years. Therefore, the results presented here are not supportive of the AMH as it currently stands. Further paleomagnetic work, on similar deposits along the Illawarra coastline and from elsewhere in Australia, is needed to evaluate the interpretations presented here. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
8. Empirical likelihood-based robust tests for genetic association analysis with quantitative traits.
- Author
-
Xiong, Wenjun, Su, You, and Ding, Juan
- Subjects
- *
MEDICAL genetics , *CHROMOSOMES , *GENES , *LIKELIHOOD ratio tests , *NUCLEOTIDES , *CHROMOSOME polymorphism - Abstract
Genome-wide association studies (GWAS) are effective in investigating the loci related with complex diseases. For most of these studies, the genetic inheritance model is not known in advance and therefore robust tests are preferred. Empirical likelihood (EL) method is well known for its flexibility and nonparametric properties, but is rarely investigated in GWAS. In this study, we develop EL-based test statistics to detect the association of a disease and genetic loci while the genetic model is unknown. The performance of proposed tests is evaluated by simulations and compared with several existing methods. For illustration, we apply these tests to identify the single nucleotide polymorphisms associated with alkaline phosphatase level on mouse chromosome 6. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
9. Outlier-free merging of homogeneous groups of pre-classified observations under contamination.
- Author
-
Cerasa, Andrea and Cerioli, Andrea
- Subjects
- *
FRAUD prevention , *MAXIMUM likelihood statistics , *INTERNATIONAL trade , *ROBUST statistics , *REGRESSION analysis - Abstract
We study the problem of merging homogeneous groups of pre-classified observations from a robust perspective motivated by the anti-fraud analysis of international trade data. This problem may be seen as a clustering task which exploits preliminary information on the potential clusters, available in the form of group-wise linear regressions. Robustness is then needed because of the sensitivity of likelihood-based regression methods to deviations from the postulated model. Through simulations run under different contamination scenarios, we assess the impact of outliers both on group-wise regression fitting and on the quality of the final clusters. We also compare alternative robust methods that can be adopted to detect the outliers and thus to clean the data. One major conclusion of our study is that the use of robust procedures for preliminary outlier detection is generally recommended, except perhaps when contamination is weak and the identification of cluster labels is more important than the estimation of group-specific population parameters. We also apply the methodology to find homogeneous groups of transactions in one empirical example that illustrates our motivating anti-fraud framework. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
10. Uncertainty evaluation and validation of the test specimen size reduction in the determination of water content in crude oils by coulometric Karl Fischer titration.
- Author
-
de Almeida, F. C., Biazon, C. L., and de Oliveira, E. C.
- Subjects
- *
PETROLEUM , *SIZE reduction of materials , *COULOMETRY , *KARL Fischer technique , *F-test (Mathematical statistics) - Abstract
The authors aim to evaluate the uncertainty and to validate the test specimen size reduction in the determination of water content in crude oils by coulometric Karl Fischer titration. The most relevant uncertainty sources are the volume of the microsyringe (resolution and temperature effect) and the repeatability. The F-test and t-test show that the reduction of the test specimen size has no significant effect. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
11. How the Maximal Evidence of P -Values Against Point Null Hypotheses Depends on Sample Size.
- Author
-
Held, Leonhard and Ott, Manuela
- Subjects
EYEWITNESS accounts ,NULL hypothesis ,FALSE positive error ,SAMPLING (Process) ,EXPERIMENTAL design - Abstract
Minimum Bayes factors are commonly used to transform two-sided p-values into lower bounds on the posterior probability of the null hypothesis. Several proposals exist in the literature, but none of them depends on the sample size. However, the evidence of a p-value against a point null hypothesis is known to depend on the sample size. In this article, we consider p-values in the linear model and propose new minimum Bayes factors that depend on sample size and converge to existing bounds as the sample size goes to infinity. It turns out that the maximal evidence of an exact two-sided p-value increases with decreasing sample size. The effect of adjusting minimum Bayes factors for sample size is shown in two applications. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
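One well-known sample-size-independent bound of the kind this article generalizes is the −e·p·ln p calibration of Sellke, Bayarri, and Berger. A minimal sketch (the function name is illustrative):

```python
import math

def min_bayes_factor(p):
    """Sellke-Bayarri-Berger lower bound on the Bayes factor for a
    two-sided p-value; sample-size independent, valid for p < 1/e."""
    return -math.e * p * math.log(p) if p < 1.0 / math.e else 1.0

# A p-value of 0.05 caps the evidence against the null at roughly
# min BF ~ 0.41, i.e. at best about 2.5 to 1 against H0.
```

The article's point is that bounds like this one, which ignore n, understate the maximal evidence when the sample size is small.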
12. Predicting the uniaxial compressive strength of cemented paste backfill from ultrasonic pulse velocity test.
- Author
-
Yılmaz, Tekin and Ercikdi, Bayram
- Subjects
- *
MATERIALS compression testing , *CONTROLLED low-strength materials (Cement) , *METAL tailings , *ULTRASONIC measurement , *SCANNING electron microscopy - Abstract
The aim of this study is to investigate the predictability of the uniaxial compressive strength (UCS) of cemented paste backfill (CPB) prepared from three different tailings (Tailings T1, Tailings T2 and Tailings T3) using the ultrasonic pulse velocity (UPV) test. For this purpose, 180 CPB samples with diameter × height of 5 × 10 cm (similar to NX size), prepared at different binder dosages and consistencies, were subjected to the UPV and UCS tests at 7–56 days of curing. The effects of binder dosage and consistency on the UPV and UCS properties of the CPB samples were investigated, and UCS values were correlated with the corresponding UPV data. Microstructural analyses were also performed on CPB samples in order to understand the effect of microstructure (i.e. total porosity) on the UPV data. The UPV and UCS of CPB samples increased with increasing binder dosage and decreasing consistency, irrespective of the tailings type and curing period. Changes in the mixture properties were observed to affect the UPV properties of CPB to a lesser extent, while their effect on the UCS of CPB was significant. Empirical equations were produced for each mixture in order to predict the UCS of CPB from UPV. The validity of the equations was also checked by t- and F-tests. The results showed that a linear relation appeared to exist between UPV and UCS with high correlation coefficients (r ≥ 0.79), and all models were valid by statistical analysis. Mercury intrusion porosimetry (MIP) and scanning electron microscope (SEM) analyses revealed that the UPV properties of CPB samples were highly associated with their respective microstructural properties (i.e. total porosity). The major output of this study is that the UPV test can be effectively used for a preliminary prediction of the strength of CPB. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
13. Investigating efficacy of robust M-estimation of deformation from observation differences.
- Author
-
Nowel, K.
- Abstract
Generalized robust M-estimation of deformation from observation differences (GREDOD) is a robust method for deformation analysis of geodetic control networks. This method has been developed based on the well-known robust iterative weighted similarity transformation (IWST) method. Hence, in the GREDOD method, as in the IWST method, the L1-norm weight function is the weight function for the displacement vector, and the displacement vector components are the weight function variables. The L1-norm weight function for variables in the form of the displacement vector components is the simplest and most natural solution, but it is not known whether it is the most efficacious solution for the GREDOD method. To assess this, the current study tested different robust weight functions, both for variables in the widely used form of the displacement vector components and for variables in the form of displacement lengths. All solutions were tested on the basis of simulated two-epoch observations of the absolute control network of the Montsalvens dam in Switzerland. The efficacy measure for individual solutions was the mean success rate (MSR). [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
14. Nonparametric Tests for Evaluation of Biosimilarity in Variability of Follow-on Biologics.
- Author
-
Zhang, Nan, Yang, Jun, Chow, Shein-Chung, and Chi, Eric
- Subjects
- *
BIOLOGICALS , *NONPARAMETRIC estimation , *BIOLOGICAL products ,PATIENT Protection & Affordable Care Act - Abstract
As more biologic products are going off patent protection, the development of follow-on biologic products (also known as biosimilars) has gained much attention from both the biotechnology industry and regulatory agencies. Unlike small molecules, the development of biologic products is not only more complicated but also sensitive to a small change in procedure/environment during the manufacturing process. In practice, biologics are expected to have much larger variation, which will potentially impact the product quality and potency. Thus, it is suggested that the assessment of biosimilarity between biologic products should take variability into consideration, in addition to average biosimilarity of endpoints of interest. In this article, we propose the use of nonparametric tests for evaluation of biosimilarity in variability between the follow-on biologic product and the reference product. Extensive simulations are conducted to compare the relative performance of the proposed methods with the adapted parametric F-test in terms of correctly concluding biosimilarity in variability. Under normality assumption, the proposed nonparametric tests are found to be comparably well with the adapted F-test. However, the proposed methods are more robust when the assumption is violated. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
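A generic nonparametric alternative to the F-test for variability, in the spirit of (though not identical to) the tests proposed here, is a permutation test on centered samples. The sketch below is an illustrative assumption, not the paper's methods; the function name and statistic choice are hypothetical:

```python
import numpy as np

def perm_var_test(a, b, n_perm=2000, seed=0):
    """Two-sided permutation test for equality of variances, using the
    absolute log variance ratio as the test statistic."""
    rng = np.random.default_rng(seed)
    a = a - a.mean()          # center each sample so that pooling
    b = b - b.mean()          # permutes only spread, not location
    obs = abs(np.log(a.var(ddof=1) / b.var(ddof=1)))
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[: len(a)], pooled[len(a):]
        count += abs(np.log(pa.var(ddof=1) / pb.var(ddof=1))) >= obs
    return (count + 1) / (n_perm + 1)
```

Unlike the parametric F-test, a test of this form does not lean on normality, which is the robustness property the abstract emphasizes.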
15. A Heteroskedasticity-Robust F -Test Statistic for Individual Effects.
- Author
-
Orme, Chris D. and Yamagata, Takashi
- Subjects
- *
HETEROSCEDASTICITY , *F-test (Mathematical statistics) , *ASYMPTOTIC distribution , *PANEL analysis , *STATISTICAL errors , *MATHEMATICAL transformations - Abstract
We derive the asymptotic distribution of the standard F-test statistic for fixed effects, in static linear panel data models, under both non-normality and heteroskedasticity of the error terms, when the cross-section dimension is large but the time series dimension is fixed. It is shown that a simple linear transformation of the F-test statistic yields asymptotically valid inferences and under local fixed (or correlated) individual effects, this heteroskedasticity-robust F-test enjoys higher asymptotic power than a suitably robustified Random Effects test. Wild bootstrap versions of these tests are considered which, in a Monte Carlo study, provide more reliable inference in finite samples. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
16. Testing linear factor models on individual stocks using the average F -test.
- Author
-
Hwang, Soosung and Satchell, Stephen E.
- Subjects
LINEAR statistical models ,STOCKS (Finance) ,CAPITAL assets pricing model ,MULTIVARIATE analysis ,PERTURBATION theory ,ANALYSIS of covariance - Abstract
In this paper, we propose the average F-statistic for testing linear asset pricing models. The average pricing error, captured in the statistic, is of more interest than the ex post maximum pricing error of the multivariate F-statistic, which is associated with extreme long and short positions and excessively sensitive to small perturbations in the estimates of asset means and covariances. The average F-test can be applied to thousands of individual stocks and thus is free from the information loss or data-snooping biases that come from grouping. This test is robust to ellipticity and, more importantly, our simulation and bootstrapping results show that the power of the average F-test continues to increase as the number of stocks increases. Empirical tests using individual stocks from 1967 to 2006 demonstrate that the popular four-factor model (i.e. Fama–French three factors and momentum) is rejected in two sub-periods, from 1967 to 1971 and from 1982 to 1986. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
17. Construction of a parsimonious kinetic model to capture microbial dynamics via parameter estimation.
- Author
-
Feng, Xueyang, Tang, Yinjie J., and Dolan, Kirk D.
- Subjects
- *
PARSIMONIOUS models , *PARAMETER estimation , *CHEMICAL synthesis , *BIOREMEDIATION , *CHEMICAL kinetics , *SEQUENTIAL analysis , *MATHEMATICAL models - Abstract
Understanding microbial kinetic behaviour is important for bioprocessing engineering, such as chemical synthesis and bioremediation. However, development of proper models to capture complicated microbial kinetics is a challenging task. In this study, we demonstrate a comparative analysis of the complexity and accuracy tradeoff for modelling the growth of Shewanella oneidensis MR-1 in a batch culture. Based on a series of analyses, including residual analysis, scaled sensitivity coefficient analysis, parameter correlation analysis and the F-test, we estimated model parameters to construct a parsimonious Monod-based model that used the fewest parameters for the best simulation of Shewanella growth using different carbon substrates. Sequential analysis was also applied to identify the time window for estimating each parameter in the kinetic model. This study shows that statistics-based parameter estimation is an efficient method to successively reconstruct and fine-tune kinetic models for complex biological systems. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
18. Random Effects Coefficient of Determination for Mixed and Meta-Analysis Models.
- Author
-
Demidenko, Eugene, Sargent, James, and Onega, Tracy
- Subjects
- *
COEFFICIENTS (Statistics) , *META-analysis , *STATISTICAL models , *ESTIMATION theory , *VARIANCES , *REGRESSION analysis - Abstract
The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If the coefficient is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value apart from 0 indicates evidence of the variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of the random effects is very large and the random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for the coefficient in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S.; (2) cumulative time watching alcohol-related scenes in movies among young U.S. teens, as a risk factor for early drinking onset; and (3) the classic example of the meta-analysis model for the combination of 13 studies on tuberculosis vaccine. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
19. Sequential testing of process capability indices.
- Author
-
Hussein, Abdulkadir, Ahmed, S. Ejaz, and Bhatti, S.
- Subjects
- *
SEQUENTIAL analysis , *WIENER processes , *APPROXIMATION theory , *ALGORITHMS , *MONTE Carlo method , *PERFORMANCE evaluation , *CALIBRATION - Abstract
A Process Capability Index (PCI) is a numeric summary that compares the behaviour of a product or process characteristic with engineering specifications. We propose a sequential procedure for testing whether two processes are equally capable by using the PCI Cpm. We employ a non-sequential Wald-type statistic and provide its sequential version via Brownian motion approximations. We point out that, as a byproduct, the non-sequential Wald-type statistic used here provides an easily computable alternative to Boyles' approximate F-test [Boyles, The Taguchi Capability Index, J. Quality Technol. 23 (1991), pp. 17–26]. We give an algorithm for conducting the sequential test and examine its performance using Monte Carlo simulations. Finally, we illustrate the method by testing the capability improvement of an industrial process before and after calibration, based on published data. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
20. Interval Estimation for the Difference of Two Independent Variances.
- Author
-
Herbert, Robert D., Hayen, Andrew, Macaskill, Petra, and Walter, Stephen D.
- Subjects
- *
INTERVAL analysis , *INDEPENDENT component analysis , *ESTIMATION theory , *VARIANCES , *SIMULATION methods & models , *ROBUST control , *HOMOSCEDASTICITY - Abstract
Analytical methods for interval estimation of differences between variances have not been described. A simple analytical method is given for interval estimation of the difference between variances of two independent samples. It is shown, using simulations, that confidence intervals generated with this method have close to nominal coverage even when sample sizes are small and unequal and observations are highly skewed and leptokurtic, provided the difference in variances is not very large. The method is also adapted for testing the hypothesis of no difference between variances. The test is robust but slightly less powerful than Bonett's test with small samples. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
21. Design and Sample Size for Evaluating Combinations of Drugs of Linear and Loglinear Dose-Response Curves.
- Author
-
Fang, Hong-Bin, Tian, Guo-Liang, Li, Wei, and Tan, Ming
- Subjects
- *
DOSE-effect relationship in pharmacology , *DRUG synergism , *DRUG antagonism , *COMBINATION drug therapy , *CLINICAL drug trials , *DRUG development , *SAMPLE size (Statistics) - Abstract
The study of drug combinations has become important in drug development due to its potential for efficacy at lower, less toxic doses and the need to move new therapies rapidly into clinical trials. The goal is to identify which combinations are additive, synergistic, or antagonistic. Although a statistical framework exists for finding the doses and sample sizes needed to detect departure from additivity, e.g., the power-maximized F-test, different classes of drugs with different dose-response shapes require different derivations for calculating sample size and finding doses. Motivated by two anticancer combination studies that we are involved with, this article proposes a dose-finding and sample size method for detecting departures from additivity of two drugs with linear and log-linear single-drug dose-response curves. The first study involves a combination of two drugs, where one single-drug dose-response curve is linear and the other is log-linear. The second study involves combinations of drugs whose single-drug dose-response curves are linear. The experiment had been planned with the common fixed-ratio design before we were consulted, but the resulting data missed the synergistic combinations. However, the experiment based on the proposed design was able to identify the synergistic combinations as anticipated. Thus we summarize the analysis of the data collected according to the proposed design and discuss why the commonly used fixed-ratio method failed, along with the implications of the proposed method for other combination studies. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
22. Power of double-sampling tests for General Linear Hypotheses.
- Author
-
Causeur, David and Husson, François
- Subjects
- *
STATISTICAL sampling , *STATISTICS , *APPROXIMATION theory , *REGRESSION analysis , *MATHEMATICAL optimization , *NUMERICAL analysis - Abstract
In this paper, testing procedures based on double-sampling are proposed that yield gains in terms of power for the tests of General Linear Hypotheses. The distribution of a test statistic, involving both the measurements of the outcome on the smaller sample and of the covariates on the wider sample, is first derived. Then, approximations are provided in order to allow for a formal comparison between the powers of double-sampling and single-sampling strategies. Furthermore, it is shown how to allocate the measurements of the outcome and the covariates in order to maximize the power of the tests for a given experimental cost. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
23. On the geometry of F, Wald, LR, and LM tests in linear regression models.
- Author
-
Siniksaran, Enis
- Subjects
- *
GEOMETRY , *LAGRANGE equations , *GRAPHICAL projection , *ANGLES , *STATISTICS , *REGRESSION analysis - Abstract
In this article, we examine the F, Wald, LR, and LM test statistics in the linear regression model using vector geometry. These four statistics are expressed as a function of a single random variable: the angle between the vectors of unrestricted and restricted residuals. The exact and nominal sampling distributions of this angle are derived to illuminate some facts about the four statistics. Alternatively, we suggest that the angle itself can be used as a test statistic. A Mathematica program is also written to carry out the approach. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
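The geometric relationship the abstract describes reduces, for the F statistic, to F = ((n − k)/q)·tan²θ, where θ is the angle between the unrestricted and restricted residual vectors, q the number of restrictions, and k the number of unrestricted regressors. This identity can be verified directly; the simulated dataset below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.1 * x2 + rng.normal(size=n)

X_u = np.column_stack([np.ones(n), x1, x2])   # unrestricted design
X_r = np.column_stack([np.ones(n), x1])       # restricted: coefficient on x2 = 0
e_u = y - X_u @ np.linalg.lstsq(X_u, y, rcond=None)[0]
e_r = y - X_r @ np.linalg.lstsq(X_r, y, rcond=None)[0]

q, k = 1, X_u.shape[1]                        # 1 restriction, k regressors
rss_u, rss_r = e_u @ e_u, e_r @ e_r
F_classic = ((rss_r - rss_u) / q) / (rss_u / (n - k))

# cos(theta) between the residual vectors equals |e_u| / |e_r|,
# because e_r - e_u is orthogonal to e_u
cos_t = (e_u @ e_r) / np.sqrt(rss_u * rss_r)
F_angle = ((n - k) / q) * (1.0 / cos_t**2 - 1.0)   # ((n-k)/q) * tan^2(theta)
```

The two computations of F agree to machine precision, which is the sense in which the angle itself carries all the information in the test.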
24. Finite sample performance of sequential designs for model identification.
- Author
-
Dette, Holger and Kwiecien, Robert
- Subjects
- *
REGRESSION analysis , *MULTIVARIATE analysis , *MATHEMATICS , *SEQUENTIAL analysis , *MATHEMATICAL statistics - Abstract
Classical regression analysis is usually performed in two steps. In the first step, an appropriate model is identified to describe the data generating process and in the second step, statistical inference is performed in the identified model. An intuitively appealing approach to the design of experiment for these different purposes are sequential strategies, which use parts of the sample for model identification and adapt the design according to the outcome of the identification steps. In this article, we investigate the finite sample properties of two sequential design strategies, which were recently proposed in the literature. A detailed comparison of sequential designs for model discrimination in several regression models is given by means of a simulation study. Some non-sequential designs are also included in the study. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
25. Reducing production variability using factorial optimisation: A case study from the food-packaging industry.
- Author
-
Macak, Tomas, Hron, Jan, and Ticha, Ivana
- Subjects
- *
PACKAGING industry , *FOOD packaging , *OPTIMAL designs (Statistics) , *PRODUCTION (Economic theory) , *SUPPLY chain management - Abstract
In industry, many phenomena and events arise that cannot be predicted because they represent random changes in production. Such random effects and events can significantly influence all aspects of the manufacturing process. The optimal design of any production system can be generated only from a complete, correct set of input information. However, such a requirement is unrealistic, as manufacturing systems are affected by several random factors. Current practice is based on determining the worst possible conditions in which a production system could run (the longest possible duration of outages in the supply chain, extreme weather conditions in agriculture, etc.). This article aimed to identify the factors influencing the variability of the manufacturing process in the field of CNC (Computer Numerical Control) machining for food production. A secondary aim was to minimise the variability of the manufacturing process using a factorial design. The variability reduction was verified using statistical F-tests. The study on reducing variability in production was performed at the Czech Yuncheng Plate Making Co., Ltd., a professional rotogravure cylinder-making company. The cylinders are used for the food industry. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
26. Reinforcing Sampling Distributions Through a Randomization-Based Activity for Introducing ANOVA.
- Author
-
Taylor, Laura and Doehler, Kirsten
- Subjects
- *
STATISTICAL sampling , *RANDOMIZATION (Statistics) , *ANALYSIS of variance , *F-test (Mathematical statistics) , *STATISTICAL hypothesis testing - Abstract
This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course participated in this study. When the topic of ANOVA was introduced, two sections were randomly assigned to participate in a traditional approach (one for each instructor), while the other two sections participated in a randomization-based method. Students were administered a pre-test and a post-test that asked them to apply hypothesis testing concepts to a new scenario. Students also responded to some basic conceptual questions. This study showed mixed results related to students' understanding of the logic of inference, but the activity utilized shows some promising gains in student understanding of sampling distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF