178 results
Search Results
2. Rejoinder.
- Author
-
Zhou, Yang, Xue, Lirong, Shi, Zhengyu, Wu, Libo, and Fan, Jianqing
- Subjects
- *
STATISTICS , *BIG data - Abstract
We gratefully thank Professors Wei Tu, Bei Jiang, and Linglong Kong, and Professor Sudipto Banerjee, for their insightful comments on and discussion of our paper. We agree that converting data to similar resolutions would lose some statistical information from our data. The inference of the occupancy status of houses sharing one location might require more information at the individual level, which is the primary data barrier in social science. [Extracted from the article]
- Published
- 2022
- Full Text
- View/download PDF
3. Comment on: "Confidence Intervals for Nonparametric Empirical Bayes Analysis" by Ignatiadis and Wager.
- Author
-
Imbens, Guido
- Subjects
- *
EMPIRICAL Bayes methods , *VALUE-added assessment (Education) , *CLINICAL drug trials , *STATISTICS , *PROBABILITY theory - Abstract
In these cases getting accurate confidence intervals is of first order importance, and the methods Ignatiadis and Wager develop are likely to be useful. I want to congratulate Nikolaos Ignatiadis and Stefan Wager on a very stimulating paper on a timely and important topic. [Extracted from the article]
- Published
- 2022
- Full Text
- View/download PDF
4. An Evaluation of Model-Dependent and Probability-Sampling Inferences in Sample Surveys.
- Author
-
Hansen, Morris H., Madow, William G., and Tepping, Benjamin J.
- Subjects
- *
STATISTICAL sampling , *PROBABILITY theory , *SURVEYS , *POPULATION , *SCALING laws (Statistical physics) , *STATISTICS , *ESTIMATION theory - Abstract
In this paper we are concerned with inferences from a sample survey to a finite population. We contrast inferences that are dependent on an assumed model with inferences based on the randomization induced by the sample selection plan. Randomization consistency for finite population estimators is defined and adopted as a requirement of probability sampling. A numerical example is examined to illustrate the dangers in the use of model-dependent estimators even when the model is apparently consonant with the sample data. The paper concludes with a summary of principles that we believe should guide the practitioner of sample surveys of finite populations. [ABSTRACT FROM AUTHOR]
- Published
- 1983
- Full Text
- View/download PDF
5. Comment: John H. Schuenemeyer.
- Author
-
Schuenemeyer, John H.
- Subjects
- *
ESTIMATION theory , *NATURAL resources , *PETROLEUM products , *NATURAL gas , *PETROLEUM , *PETROLEUM engineering , *STATISTICS - Abstract
In this article, the author comments on the research paper of John J. Wiorkowski titled "Estimating Volumes of Remaining Fossil Fuel Resources: A Critical Review," published in the September 1981 issue of "Journal of the American Statistical Association." According to the author, the strength of Wiorkowski's paper is in bringing the important problem of petroleum resource estimation to the attention of statisticians. The major weaknesses are in the organization and classification of models and the gaps in coverage. The major issues in resource estimation that Wiorkowski discusses are models for estimating the total oil and gas remaining to be found and a closely related problem, estimating the amount of appreciation in known fields. The problem with the discussion of the models is the lack of a proper classification scheme. Although Wiorkowski mentions the model of researchers E. Barouch and G.M. Kaufman, he fails to note that it is one of a general class of models that are based on the discovery process and fails to emphasize that it is a play model, whereas the others he discusses are used at a higher level of aggregation.
- Published
- 1981
- Full Text
- View/download PDF
6. Comment.
- Author
-
Kempthorne, Oscar
- Subjects
- *
MATHEMATICAL statistics , *STOCHASTIC processes , *ECONOMICS , *RANDOM variables , *PROBABILITY theory , *STATISTICIANS , *ANALYSIS of variance , *RANDOM sets , *STATISTICS - Abstract
The article presents the author's comments on a paper by D. Basu on the randomization analysis of experimental data. Basu writes entertainingly, perhaps, but not informatively. Basu's paper discusses prerandomization, postrandomization, and unrecorded randomization. This discussion is irrelevant, but it is useful, perhaps, to make a remark. It also discusses the sufficiency principle; as Basu has written, this is a data-reduction principle. Basu also discusses R.A. Fisher's randomization test. It is obvious that the population in a randomization test of a randomized experiment is "the product of the statistician's imagination." With respect to Basu's writing on "the physical act of randomization," the author believes Basu is merely plain wrong. The paper also describes a randomized pair trial. In his paper, Basu gives a hypothetical interchange between a statistician and a scientist; the author suggests that this serves no useful purpose. The author finds the lack of knowledge that underlies Basu's thesis rather surprising, incongruous, and deplorable.
- Published
- 1980
- Full Text
- View/download PDF
7. Density Estimation and Bump-Hunting by the Penalized Likelihood Method Exemplified by Scattering and Meteorite Data: Comment.
- Author
-
Parzen, Emanuel
- Subjects
- *
CLUSTER analysis (Statistics) , *ESTIMATION theory , *PROBABILITY theory , *STATISTICAL sampling , *DENSITY functionals , *ECONOMISTS , *STATISTICAL correlation , *MULTIVARIATE analysis , *LEAST squares , *SAMPLE size (Statistics) , *STATISTICS - Abstract
The author is pleased to discuss a paper on the estimation of probability density functions and the location of bumps. There is an extensive literature on density estimation, but many statisticians seem doubtful about the usefulness of these techniques because their application seems subjective and complicated. A major criticism the author would make of this paper is that it does not help to dispel this negative attitude of statisticians toward density estimation. One cannot help but be impressed by the ingenuity of I. J. Good and R. A. Gaskins and even to believe that they may be able to successfully fit probability densities to data. The existence of bumps at the extremes of a sample can be investigated for large sample sizes by treating the bottom and top ends of the original sample as two new samples to be analyzed by themselves. Statistical scientists should heed the flippant advice of Sir Arthur Eddington: "Never trust an experimental result until it has been confirmed by theory."
- Published
- 1980
- Full Text
- View/download PDF
8. A Bayesian Look at Inverse Linear Regression.
- Author
-
Hoadley, Bruce
- Subjects
- *
REGRESSION analysis , *INVERSE functions , *MATHEMATICAL statistics , *BAYESIAN analysis , *STATISTICS , *STATISTICAL decision making , *PROBABILITY theory - Abstract
The model considered in this paper is simple linear regression (E y_i = β1 + β2 x_i, i = 1, ..., n), and the problem is to make statistical inferences about an unknown value of x corresponding to one or more additional observed values of y. The maximum likelihood estimator x̂ of x and the classical (1 − α)100% confidence set S for x have some undesirable properties. For example, x̂ has infinite mean square error and P{S = (−∞, +∞)} > 0. The purpose of this paper is to demonstrate that insight and understanding, as well as a useful class of solutions, can be obtained by looking at the problem from a Bayesian point of view. A result which follows from a general Bayes solution is that the inverse estimator [4] is Bayes with respect to a particular informative prior. [ABSTRACT FROM AUTHOR]
- Published
- 1970
- Full Text
- View/download PDF
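The classical estimator that Hoadley contrasts with the Bayes solutions can be sketched in a few lines: fit the line by least squares, then invert it at the new observed y. The function name and toy data below are illustrative assumptions, not from the paper.

```python
import numpy as np

def inverse_regression_estimate(x, y, y_new):
    """Classical (maximum likelihood) calibration estimate: fit
    E[y] = b1 + b2*x by least squares, then invert the fitted line
    at an observed y_new to estimate the unknown x."""
    b2, b1 = np.polyfit(x, y, 1)  # slope, intercept
    return (y_new - b1) / b2

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 3.0 * x  # exact line y = 2 + 3x, so the inversion is exact
print(inverse_regression_estimate(x, y, 11.0))  # recovers x ≈ 3
```

The paper's point is that this simple estimate behaves badly (infinite mean square error); the Bayesian treatment regularizes it.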
9. MOMENTS OF THE DISTRIBUTION OF SAMPLE SIZE IN A SPRT.
- Author
-
Ghosh, B. K.
- Subjects
- *
MOMENTS method (Statistics) , *STATISTICAL sampling , *ARITHMETIC , *DISTRIBUTION (Probability theory) , *DIFFERENTIABLE functions , *PROBABILITY theory , *APPROXIMATION theory , *EQUATIONS , *STATISTICS - Abstract
The article discusses moments of the distribution of the sample size N in a sequential probability ratio test (SPRT). The present paper provides the variance and the third and fourth moments of N. The details are worked out in five common applications of the SPRT. The relation of the variance of N to the truncation of a SPRT is also discussed in the paper. A. Wald indicated in passing how one can obtain the moments of N, but in the only published work in which the author has encountered a general expression for the variance of N, that expression is incorrect. The correct expression can be obtained either by using J. Wolfowitz's results or by differentiating Wald's fundamental identity twice. In many practical applications of the SPRT, the moments are differentiable functions of a real-valued parameter. The limiting expressions for the moments can then be determined by standard methods of mathematical analysis. However, for the third and fourth moments the actual technique may involve an excessive amount of arithmetic.
- Published
- 1969
- Full Text
- View/download PDF
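The moments of N that Ghosh derives analytically can be checked by Monte Carlo. The sketch below simulates a Wald SPRT for Bernoulli data; the test problem, error rates, and function name are our illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sprt_sample_size(p0, p1, alpha, beta, p_true, rng):
    """Run one Wald SPRT of H0: p = p0 vs H1: p = p1 on Bernoulli
    observations and return the sample size N at termination."""
    a = np.log(beta / (1 - alpha))        # lower log-boundary
    b = np.log((1 - beta) / alpha)        # upper log-boundary
    llr, n = 0.0, 0
    while a < llr < b:
        xobs = rng.random() < p_true
        llr += np.log(p1 / p0) if xobs else np.log((1 - p1) / (1 - p0))
        n += 1
    return n

rng = np.random.default_rng(0)
sizes = [sprt_sample_size(0.4, 0.6, 0.05, 0.05, 0.5, rng) for _ in range(2000)]
print(np.mean(sizes), np.var(sizes))  # Monte Carlo estimates of E[N], Var[N]
```

With p_true midway between the hypotheses (zero drift), the walk terminates slowly, which is exactly the regime where the higher moments of N matter.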
10. A COMPARISON BETWEEN THE POWER OF THE DURBIN-WATSON TEST AND THE POWER OF THE BLUS TEST.
- Author
-
Abrahamse, A. P. J. and Koerts, J.
- Subjects
- *
PROBABILITY theory , *DISTRIBUTION (Probability theory) , *STATISTICS , *VON Neumann algebras , *HYPOTHESIS , *STATISTICAL hypothesis testing , *DECISION making - Abstract
In an earlier paper [5] the authors compared the power of the BLUS test with the probability of a correct decision of the Durbin-Watson bounds test. A method to compute the distribution of the Von Neumann ratio under the null hypothesis and under the alternative hypothesis was given. In the present paper the latter method is used to tabulate the BLUS-test statistic and to compute the exact significance points of the Durbin-Watson test for several examples. Powers of both tests are computed and compared. It appears that, for the cases considered, the power of the exact Durbin-Watson test exceeds that of the BLUS procedure, while the latter is greater than the probability of a correct decision in the Durbin-Watson bounds test. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
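The Durbin-Watson statistic itself is simple to compute from a vector of regression residuals. The sketch below shows only the definition; computing the exact significance points, as the paper does, requires the null distribution of the statistic and is not attempted here.

```python
import numpy as np

def durbin_watson(resid):
    """Durbin-Watson statistic d = sum_t (e_t - e_{t-1})^2 / sum_t e_t^2.
    Values near 2 are consistent with no first-order autocorrelation;
    values well below 2 suggest positive autocorrelation."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(0)
print(durbin_watson(rng.standard_normal(500)))  # white noise: near 2
```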
11. THE EXCEEDANCE TEST FOR TRUNCATION OF A SUPPLIER'S DATA.
- Author
-
Deely, J. J., Amos, D. E., and Steck, G. P.
- Subjects
- *
DISTRIBUTORS (Commerce) , *STATISTICAL sampling , *DATABASE management , *SUPPLIERS , *ELECTRONIC data processing , *SUPPLY chains , *SAMPLE size (Statistics) , *STATISTICS , *TESTING - Abstract
The purpose of this paper is to present an easily applied test useful in determining whether or not a supplier's data have been truncated. The proposed test has the following desirable properties: (i) it is the uniformly most powerful rank test, (ii) it is asymptotically uniformly most powerful, and (iii) power computations can easily be made for arbitrary sample sizes, formulas for such computations being given in the paper. Although formulated in the context of verifying a supplier's data, the test can be applied to other situations in which false representation of data in the form of truncation is important. Such is the case, for example, in reliability demonstrations or legal suits involving physical measurements. [ABSTRACT FROM AUTHOR]
- Published
- 1969
- Full Text
- View/download PDF
12. SPECTRAL EVALUATION OF BLS AND CENSUS REVISED SEASONAL ADJUSTMENT PROCEDURES.
- Author
-
Rosenblatt, Harry M.
- Subjects
- *
SEASONAL employment , *UNEMPLOYMENT , *ECONOMIC seasonal variations , *EMPLOYMENT , *STATISTICS on the working class , *SURVEYS , *LABOR , *STATISTICS - Abstract
This paper presents spectral criteria to evaluate seasonal adjustment procedures, and applies them to the estimated spectral properties of the monthly unemployment series, as adjusted by past and present methods of the Bureau of Labor Statistics (BLS) and of the Census Bureau. It also re-examines some conclusions reported in the Gordon Committee Report (1962) and in Nerlove (1964) on past methods of adjustment. The paper concludes that (1) the loss in spectral power between the unadjusted and adjusted series spectra over most of the frequency range could arise with a satisfactory adjustment, (2) the excessive loss of spectral power at seasonal and trend-cycle frequencies present in past methods of adjustment has been reduced in present methods, (3) the effect of deviations from desired spectral properties on the uses of seasonally adjusted data must be examined in the time domain; for certain applications they have been found to be unimportant. [ABSTRACT FROM AUTHOR]
- Published
- 1968
- Full Text
- View/download PDF
13. METHOD OF CONSTRUCTION OF ATTRITION LIFE TABLES FOR THE SINGLE POPULATION BASED ON TWO SUCCESSIVE CENSUSES.
- Author
-
Kumar, Joginder
- Subjects
- *
CONSTRUCTION & demolition debris , *DISTRIBUTION (Probability theory) , *DEMOGRAPHIC surveys , *ESTIMATION theory , *MARRIAGE , *PENANCE , *STATISTICS , *SURVEYS - Abstract
This paper describes a method of estimating the number of marriages during an inter-censal period among a group of never-married persons. Two alternative procedures are suggested: the first based on data on the never-married category and the second based on data on the ever-married category. The essential data needed are marital distributions by quinquennial age-groups from two successive censuses. The marital data from the 1951 and 1961 censuses of India are used to discuss the methodological problems involved in the estimation of marriage frequencies and in the construction of nuptiality tables based on them. Marriage and death are taken as the two attrition factors for decrement of single population. Gross and Net Attrition Tables for the single population of India are constructed for each sex. Section 5 describes various nuptiality measures derived from the Attrition Tables. It is suggested that the approach developed in this paper can be utilized for reliable marital and, subsequently, fertility projections. [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
14. TIES IN PAIRED-COMPARISON EXPERIMENTS: A GENERALIZATION OF THE BRADLEY-TERRY MODEL.
- Author
-
Rao, P. V. and Kupper, L. L.
- Subjects
- *
PROBABILITY theory , *MATHEMATICS , *BAYESIAN analysis , *PARAMETERS (Statistics) , *STATISTICS - Abstract
The Bradley-Terry model for a paired-comparison experiment with t treatments postulates a set of t 'true' treatment ratings π1, π2, ..., πt such that πi ≥ 0, Σπi = 1, and the probability of preferring treatment i to treatment j is πi(πi + πj)⁻¹. Thus, according to this model, every comparison of two treatments results in a definite preference for one of the two. This is an unrealistic restriction, since when there is no difference between the responses due to two treatments, any method of expressing preference for one over the other is somewhat arbitrary. This paper considers a modification of the Bradley-Terry model by introducing an additional parameter, called the threshold parameter, into the model. This permits 'ties' in the model. The problem of estimation and tests of hypotheses for the parameters of the modified model is also dealt with in the paper. [ABSTRACT FROM AUTHOR]
- Published
- 1967
- Full Text
- View/download PDF
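A minimal sketch of the modified model: the abstract does not give the formula, so the parameterization below (the published Rao-Kupper form, with threshold parameter θ ≥ 1) is stated as an assumption. Setting θ = 1 recovers the tie-free Bradley-Terry model.

```python
def rao_kupper_probs(pi_i, pi_j, theta):
    """Rao-Kupper extension of Bradley-Terry: with ratings pi_i, pi_j
    and threshold theta >= 1,
        P(i preferred to j) = pi_i / (pi_i + theta * pi_j),
        P(j preferred to i) = pi_j / (pi_j + theta * pi_i),
    and the tie probability is the remainder."""
    p_i = pi_i / (pi_i + theta * pi_j)
    p_j = pi_j / (pi_j + theta * pi_i)
    return p_i, p_j, 1.0 - p_i - p_j

print(rao_kupper_probs(0.6, 0.4, 1.5))  # (P(i>j), P(j>i), P(tie))
```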
15. ORDER STATISTICS ESTIMATORS OF THE LOCATION OF THE CAUCHY DISTRIBUTION.
- Author
-
Barnett, V. D.
- Subjects
- *
ESTIMATION theory , *DISTRIBUTION (Probability theory) , *STATISTICS , *ARITHMETIC mean , *ORDER statistics , *VARIANCES , *STATISTICAL sampling , *MEDIAN (Mathematics) - Abstract
In a recent paper in this Journal, Rothenberg, Fisher and Tilanus [1] discuss a class of estimators of the location parameter of the Cauchy distribution, taking the form of the arithmetic average of a central subset of the sample order statistics. They show that the average of roughly the middle quarter of the ordered sample has minimum asymptotic variance within this class, and that asymptotically it eliminates about 36 percent of the efficiency loss of the median (the most commonly used estimator) in comparison to the maximum likelihood estimator (m.l.e.). Of course both the m.l.e. and the best linear unbiased estimator based on the order statistics (BLUE) achieve full asymptotic efficiency in the Cramer-Rao sense, and there can be no dispute about the relative merits of the three estimators asymptotically, or about the inferiority of the median (with asymptotic efficiency 8/π² ≈ 0.81, compared with about 0.88 for the estimator of Rothenberg et al.). In any practical situation, however, we will be concerned with estimation from samples of finite size, and asymptotic properties will not necessarily give any guidance here. We are essentially concerned with two points in assessing the relative merits of estimators in small samples: their ease of application, and their "small-sample efficiency," which is conveniently measured as the ratio of the Cramer-Rao lower bound to the variance of the estimator. In this paper various estimators of the location of the Cauchy distribution are compared in these two respects for samples of up to 20 observations. The small-sample properties of the m.l.e. have been extensively discussed elsewhere (Barnett [2]) and relevant results are summarized where necessary. The main purpose of the paper is to discuss general linear estimators based on the order statistics, and to assess their utility in the present context.
Since this paper was prepared a further interesting 'quick estimator', b… [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
16. SOME PROBABILITIES, EXPECTATIONS AND VARIANCES FOR THE SIZE OF LARGEST CLUSTERS AND SMALLEST INTERVALS.
- Author
-
Naus, J. I.
- Subjects
- *
UNIFORM distribution (Probability theory) , *DISTRIBUTION (Probability theory) , *ANALYSIS of variance , *VARIANCES , *ESTIMATION theory , *STATISTICS , *PROBABILITY theory - Abstract
Given N points independently drawn from the uniform distribution on (0, 1), let p_n be the size of the smallest interval that contains n out of the N points; let n_p be the largest number of points to be found in any subinterval of (0, 1) of length p. This paper uses a result of Karlin, McGregor, Barton and Mallows to determine the distribution of n_p for p = 1/k, k an integer. The paper gives simple determinations for the expectations and variances of p_n for all fixed n > (N + 1)/2, and of n_{1/2}. The distribution and expectation of n_p are estimated and tabulated for the cases p = 0.1(0.1)0.9, N = 2(1)10. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
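Both quantities in the abstract are easy to compute for a given sample; deriving their distributions is the hard part the paper addresses. The function names below are ours.

```python
import numpy as np

def smallest_interval(points, n):
    """p_n: length of the shortest interval containing n of the N points.
    After sorting, the shortest such interval spans n consecutive order
    statistics, so we scan the gaps x[(i+n-1)] - x[i]."""
    x = np.sort(points)
    return np.min(x[n - 1:] - x[: len(x) - n + 1])

def largest_cluster(points, p):
    """n_p: the largest number of points in any subinterval of length p.
    The maximum is attained by a window whose left end is a data point."""
    x = np.sort(points)
    return int(np.max(np.searchsorted(x, x + p, side='right') - np.arange(len(x))))

pts = np.array([0.1, 0.15, 0.2, 0.8])
print(smallest_interval(pts, 3), largest_cluster(pts, 0.11))
```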
17. ON ROBUST PROCEDURES.
- Author
-
Gastwirth, Joseph L.
- Subjects
- *
DENSITY functionals , *ROBUST statistics , *DISTRIBUTION (Probability theory) , *PITMAN'S measure of closeness , *STATISTICAL sampling , *STATISTICS , *PARAMETER estimation , *RANKING - Abstract
This paper discusses a procedure for finding robust estimators of the location parameter of symmetric unimodal distributions. The estimators are based on robust rank tests, and the methods used are applicable to other one-parameter problems. To every density function there corresponds an asymptotically most powerful rank test (a.m.p.r.t.). For a set F of density functions, the maximin rank test R maximizes the minimum limiting Pitman efficiency of R relative to the a.m.p.r.t. for each member of F. This maximin test R can be used to construct estimators according to the proposal of Hodges and Lehmann; it generates an estimator T in the following manner: if the test based on R is the a.m.p.r.t. for samples from a density function g(x − θ), then the estimator T will be the best linear unbiased estimate (b.l.u.e.) of the location parameter for samples from g(x). Unfortunately, the estimator T is not necessarily consistent for all members of F. A class of rank tests which generate linear combinations of a few order statistics is introduced, and a simple estimator using the 33 1/3rd, 50th and 66 2/3rd percentiles is proposed. The relationship of the present paper to the work of Huber is discussed, and it is shown that the b.l.u.e. corresponding to his least favorable distribution is the trimmed mean. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
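The simple percentile estimator the abstract proposes can be sketched as follows. The abstract names only the percentiles; the weights 0.3, 0.4, 0.3 are the ones usually associated with Gastwirth's estimator and are an assumption here.

```python
import numpy as np

def gastwirth(sample):
    """Robust location estimate: a weighted average of the 33 1/3rd,
    50th and 66 2/3rd sample percentiles (assumed weights 0.3/0.4/0.3)."""
    q = np.percentile(sample, [100 / 3, 50, 200 / 3])
    return 0.3 * q[0] + 0.4 * q[1] + 0.3 * q[2]

print(gastwirth([-2, -1, 0, 1, 2]))  # symmetric sample: estimate near 0
```

Because it uses only three central order statistics, the estimator is insensitive to heavy tails, which is the point of the robustness discussion above.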
18. A NOTE ON THE ESTIMATION OF THE LOCATION PARAMETER OF THE CAUCHY DISTRIBUTION.
- Author
-
Bloch, Daniel
- Subjects
- *
ASYMPTOTIC theory in estimation theory , *ESTIMATION theory , *CAUCHY integrals , *LOCATION analysis , *ASYMPTOTIC expansions , *DISTRIBUTION (Probability theory) , *STATISTICAL sampling , *VARIANCES , *STATISTICS - Abstract
Recently Professors T. J. Rothenberg, F. M. Fisher, and C. B. Tilanus published a paper proposing the class of trimmed means as estimators of the location parameter of the Cauchy distribution [5]. They showed that the asymptotic sampling variance of the estimators in this class is essentially minimized by using the middle 24% of the sample order statistics. The corresponding estimate has an asymptotic relative efficiency to the best estimator for complete samples (A.R.E.) of .87796 as compared to an A.R.E. of .81057 for the sample median. In this paper a few "quick estimators" are considered as estimators for the location parameter of the Cauchy Distribution. A "quick estimate" is a linear combination (a weighted average) of one or more order statistics. Our goal is to find a simple estimator, i.e. an estimator based on only a few order statistics, which has an A.R.E. of at least 90%. We found an estimator based on five order statistics which is considerably better than the optimum trimmed mean (using the middle 24% of the sample order statistics) and much better than the sample median. The A.R.E. of the optimum censored estimate with censored fractiles .38 and .62 is also found, and a comparison between the trimmed, censored, and proposed estimators is made. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
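The optimum trimmed mean that this entry and entry 15 take as the benchmark (the average of the middle 24% of the sample order statistics) is straightforward to compute. The simulation below is our illustration, not from the paper.

```python
import numpy as np

def middle_fraction_mean(sample, frac=0.24):
    """Trimmed mean using the middle `frac` of the sample order
    statistics: sort, drop (1 - frac)/2 of the points from each end,
    and average what remains."""
    x = np.sort(np.asarray(sample, dtype=float))
    k = int(round(len(x) * (1 - frac) / 2))  # points trimmed per end
    return x[k:len(x) - k].mean()

rng = np.random.default_rng(1)
sample = rng.standard_cauchy(1001) + 5.0  # Cauchy centered at 5
print(middle_fraction_mean(sample))  # estimates the location parameter
```

Trimming is essential here: the ordinary sample mean of Cauchy data does not converge at all, while this central average is consistent for the location.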
19. SYSTEMATIC SAMPLING WITH UNEQUAL PROBABILITY AND WITHOUT REPLACEMENT.
- Author
-
Hartley, H. O.
- Subjects
- *
ESTIMATION theory , *STATISTICS , *PROBABILITY theory , *STATISTICAL sampling , *ANALYSIS of variance , *SAMPLE size (Statistics) - Abstract
Given a population of N units, it is required to draw a sample of n distinct units in such a way that the probability for the ith unit to be in the sample is proportional to its 'size' x_i. From the alternative methods of achieving this we consider here only the so-called systematic method which, to the best of our knowledge, was first developed by W. G. Madow (1949): the units in the population are listed in a 'particular' order, their x_i accumulated, and a systematic selection of n elements from a 'random start' is then made on the accumulation. In a more recent paper (H. O. Hartley and J. N. K. Rao (1962)) an asymptotic estimation theory (for large N) associated with this procedure was developed for the case when the order of the listed units is random. In this paper we draw attention to certain properties of Madow's estimator: we utilize the fact that with systematic sampling the total number of different samples is N (rather than the binomial coefficient C(N, n), as with completely random sampling). This simplification in the definition of the variance of the estimator in repeated sampling enables us to identify the exact variance of Madow's estimator with a 'between sample mean square' in a special analysis of variance (see section 4) and compare it with the variance of the pps estimator in sampling with replacement as well as in other sampling procedures. We also develop two approximate methods of variance estimation (see section 5). We pay particular attention to the case when the units are listed in the order of their size.
With this particular arrangement our method can be described as 'systematic with random start', and the gain in precision that we accomplish has, of course, analogues in systematic sampling with equal probabilities employing ratio estimators, in which there is a relation between the ratio r_i = y_i/x_i and x_i. Compared with other methods, the present procedure combines the advantage of ease of systematic sample selection with the availability of exact variance formulas for any n and N. Moreover, it usually leads to a more efficient estimate. Its shortcoming resides in the fact that the estimation of the variance is based on certain assumptions. [ABSTRACT FROM AUTHOR]
- Published
- 1966
- Full Text
- View/download PDF
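Madow's selection scheme as described in the abstract is a few lines of code: cumulate the sizes, pick a random start in the first sampling interval, then step through the accumulation. Names are ours, and we assume no unit's size exceeds the sampling interval, so no unit can be selected twice.

```python
import numpy as np

def systematic_pps(sizes, n, rng):
    """Systematic PPS selection (Madow): accumulate the x_i, take a
    random start u in [0, T/n), and select the units whose cumulative
    interval contains u, u + T/n, u + 2T/n, ...  Each unit's inclusion
    probability is then proportional to its size x_i."""
    x = np.asarray(sizes, dtype=float)
    step = x.sum() / n
    cum = np.cumsum(x)
    marks = rng.random() * step + step * np.arange(n)
    return np.searchsorted(cum, marks, side='left')

rng = np.random.default_rng(0)
print(systematic_pps([1, 2, 3, 4], 2, rng))  # indices of the 2 selected units
```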
20. SOME SCHEFFE-TYPE TESTS FOR SOME BEHRENS-FISHER-TYPE REGRESSION PROBLEMS.
- Author
-
Potthoff, Richard F.
- Subjects
- *
REGRESSION analysis , *TEACHING , *GAUSSIAN distribution , *HYPOTHESIS , *VARIANCES , *STATISTICS , *METHODOLOGY - Abstract
In educational and psychological applications as well as in other applications, it may be necessary to make certain comparisons of two regression lines when the variances are unequal. Such problems arise, for example, in studies comparing two alternative curriculums or two different teaching methods. By generalizing an idea which Scheffe used to obtain a test for the Behrens-Fisher problem, this paper develops some tests for comparing two regression lines when the two sets of error terms are normally distributed but with two different variances. Scheffe's test itself is a randomized test, but in this paper we present both randomized and non-randomized tests. Both simple and multiple regression are considered, but the simple regression tests are computationally easier than the multiple regression tests. The basic test statistic which is used is the ordinary t-statistic. Essentially two types of problems are dealt with: (A) determining whether the two regression lines are identical when they are known to be parallel; and (B) determining whether the two regression lines are parallel. Confidence bounds as well as tests of hypotheses are available. For Problem A, a minimax estimator of the distance between the two lines is obtained. In addition to the Scheffe-type tests, we also consider some tests based on an approach of Welch and Hajek. A numerical example is presented. [ABSTRACT FROM AUTHOR]
- Published
- 1965
- Full Text
- View/download PDF
21. SAMUEL S. WILKS.
- Author
-
Stephan, Frederick F., Tukey, John W., Mosteller, Frederick, Mood, Alex M., Hansen, Morris H., Simon, Leslie E., and Dixon, W. J.
- Subjects
- *
STATISTICIANS , *MATHEMATICAL statistics , *RANDOM variables , *MATHEMATICIANS , *NONPARAMETRIC statistics , *STATISTICAL sampling , *STATISTICS - Abstract
The article presents information about Samuel S. Wilks, a great contributor to statistics. Wilks' behavior toward applications was peculiarly split: he encouraged his students to work on applications, not always an easy thing to do in a strongly theoretical mathematics department; he often told students about applications that he regarded as "neat" or "cute" or "clever"; the statistical colloquium he guided often had speakers on practical applications of mathematical statistics; and he himself wrote some practical papers in statistics, but in the classroom he rarely discussed applications. Repeatedly, however, practical problems explicitly influenced both his own publications and those of his students, and Wilks often used applications as motivation for the discussion of distribution theory, usually going well beyond the needs of the original problem. The papers to be discussed exhibit, incompletely and fragmentarily, a major influence on the work of the man and his students. Wilks watched the development of the statistical theory of order statistics closely; indeed, he wrote a masterful summary of the literature. The general area involves the study of the statistical properties of ordered measurements.
- Published
- 1965
- Full Text
- View/download PDF
22. ESTIMATION OF MULTIPLE CONTRASTS USING t-DISTRIBUTIONS.
- Author
-
Dunn, Olive Jean and Massey Jr, Frank J.
- Subjects
- *
TIME series analysis , *CHARACTERISTIC functions , *MATHEMATICAL statistics , *PROBABILITY theory , *CONFIDENCE intervals , *DISTRIBUTION (Probability theory) , *MATHEMATICAL models , *STATISTICAL sampling , *MULTIVARIATE analysis , *STATISTICS - Abstract
Various methods based on Student t variates have been suggested and used for obtaining simultaneous confidence intervals for several means, or for several contrasts among means. Determination of an overall confidence level for such intervals involves evaluating the probability mass of a multivariate t distribution over a hypercube centered at the origin, with sides paralleling the coordinate planes, or obtaining bounds for this probability mass. Since such distributions involve many nuisance parameters, an impossible number of tables would be necessary in order to make exact confidence intervals. In the virtual absence of tables, approximations and bounds become important. In this paper, an attempt has been made to investigate the adequacy of certain suggested approximations [2], [5], [8] by computing the exact distributions for some particular cases. These exact distributions have been compared with approximations. This paper is concerned with two-sided confidence intervals, rather than one-sided intervals. [ABSTRACT FROM AUTHOR]
- Published
- 1965
- Full Text
- View/download PDF
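One conservative bound relevant to this setting is the Bonferroni inequality, which yields simultaneous t-intervals without any multivariate t tables. The sketch below is a generic illustration (function name and inputs are ours), not the specific approximations compared in the paper.

```python
import numpy as np
from scipy import stats

def bonferroni_intervals(means, se, df, k, alpha=0.05):
    """Conservative simultaneous CIs for k contrasts: each interval uses
    the alpha/(2k) critical value of Student's t, so by the Bonferroni
    inequality the joint coverage is at least 1 - alpha."""
    t = stats.t.ppf(1 - alpha / (2 * k), df)
    means, se = np.asarray(means, dtype=float), np.asarray(se, dtype=float)
    return np.column_stack([means - t * se, means + t * se])

print(bonferroni_intervals([1.0, 2.0], [0.5, 0.5], df=20, k=2))
```

The paper's comparison of exact multivariate t probabilities against such approximations measures exactly how much width this conservatism costs.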
23. A COMPARISON OF A MODIFIED 'HANNAN' AND THE BLS SEASONAL ADJUSTMENT FILTERS.
- Author
-
Nerlove, Marc
- Subjects
- *
REGRESSION analysis , *STATISTICS on the working class , *LABOR , *MATHEMATICAL statistics , *STATISTICS , *TIME series analysis - Abstract
In a previous paper [12], an attempt was made to show how spectral techniques could be used to compare the effects of two seasonal adjustment procedures on the series to which they were applied. The two procedures compared were: (a) the technique currently used by the Bureau of Labor Statistics for seasonally adjusting employment, unemployment, and labor force monthly statistics, and (b) the so-called "residual" method, proposed by Brittain [2], Samuelson [16], and others. Spectra of the original and the seasonally adjusted series and the cross spectrum of the two were used to aid in the assessment of whether either procedure removed more than could be considered seasonal, introduced spurious regularities, and/or distorted temporal relationships. It was concluded that both techniques removed more than seasonal effects from, and produced some temporal distortion in, the series to which they were applied. Neither method appeared to be superior to the other. It is the purpose of this paper to carry the previous analysis one step further and to compare the BLS procedure with a modified version of the regression method of seasonal adjustment suggested by Cowden [3] and Mendershausen [11], and recently revived by Hannan [9, 10] in an exceptionally sophisticated form. In addition to Hannan's work along the lines suggested, Nettheim [13] and Rosenblatt [15] have made studies. Rosenblatt [15] has carried out analyses similar to those reported here. [ABSTRACT FROM AUTHOR]
- Published
- 1965
- Full Text
- View/download PDF
24. R.A. FISHER AND THE LAST FIFTY YEARS OF STATISTICAL METHODOLOGY.
- Author
-
Bartlett, M. S.
- Subjects
- *
STATISTICS , *ECONOMICS , *INTERVAL analysis , *GENETICS , *STATISTICAL sampling , *ESTIMATION theory - Abstract
In this article, the author will talk about researcher R.A. Fisher's contributions to statistics. It is well-known that his work in genetics was of comparable status. It is largely represented by his book "The Genetical Theory of Natural Selection," though in his subsequent work his further association with ecological and experimental studies in evolutionary genetics, and his share in the development of studies in the human blood groups, might especially be recalled. Fisher's contributions to statistics began with a paper in 1912 advocating the method of maximum likelihood for fitting frequency curves, although the first paper of substance was his 1915 paper in the journal "Biometrika," on the sampling distribution of the correlation coefficient. Fisher tried to pose problems of analysis as the reduction and simplification of statistical data. He put forward his well-known concept of amount of information in estimation theory, such that information might be lost, but never gained, by analysis. His concept has been of great practical value, especially in large sample theory.
- Published
- 1965
- Full Text
- View/download PDF
25. A HISTORY OF DISTRIBUTION SAMPLING PRIOR TO THE ERA OF THE COMPUTER AND ITS RELEVANCE TO SIMULATION.
- Author
-
Teichroew, Daniel
- Subjects
- *
SIMULATION methods & models , *PROBABILITY theory , *STATISTICAL sampling , *DISTRIBUTION (Probability theory) , *STATISTICS , *TIME series analysis , *DIGITAL computer simulation , *METHODOLOGY - Abstract
The use of simulation as a technique for attacking difficult problems has increased greatly with the availability of the digital computer. This is illustrated by the large number of references in Shubik's (1960) bibliography[sup 2] and in the large number of studies published since then. Simulation is essentially an extension of a technique known as empirical sampling, or distribution sampling, which has been used in the field of statistics for many years. The limitations of the technique, which are well known to statisticians, are apparently not as well known, or at least not as well recognized, by those using simulation today. The first part of this paper contains a historical survey of distribution sampling as used by statisticians. The material was originally prepared in 1953 and is reproduced here in slightly revised form to bring this history to the attention of present-day simulators, so that its lessons can more readily be incorporated in the development of methodology today. The second part of this paper discusses the relevance of empirical sampling to the present-day state of the art of simulation. The technique of generating random numbers, developed for empirical sampling, can be applied directly to simulation. However, in other respects simulation is more difficult than empirical sampling, and here the theory of distribution sampling does not have much to offer. The difficulties are due to lack of independence among time series, non-stationarity of the time series, and the large number of parameters. [ABSTRACT FROM AUTHOR]
- Published
- 1965
- Full Text
- View/download PDF
26. Rejoinder.
- Author
-
Wood, Simon N., Pya, Natalya, and Säfken, Benjamin
- Subjects
- *
BOUNDARY element methods , *NUMERICAL analysis , *STATISTICS , *MATHEMATICAL statistics , *REGRESSION analysis , *ANALYSIS of covariance - Abstract
The article focuses on the study regarding the boundary of the smoothing parameter space in statistical analysis. It mentions several papers by different authors that feature different approaches and methods for determining smoothing parameters on the edge of the feasible parameter space. It also describes the researchers' proposed fixes, which substantially improve the transition to smooth estimates at nonzero smoothing penalties.
- Published
- 2016
- Full Text
- View/download PDF
27. Comment.
- Author
-
Dempster, A. P.
- Subjects
- *
PROBABILITY theory , *FUZZY sets , *FUZZY logic , *FUZZY systems , *STATISTICS - Abstract
This article comments on a paper by Nozer D. Singpurwalla and Jane M. Booker which aimed to develop a line of argument demonstrating that probability theory has a sufficiently rich structure for incorporating fuzzy sets within its framework. Google reports about 670,000 websites that connect with the phrase "fuzzy logic." From a mathematical perspective, the concept of a fuzzy set is somewhat disconcerting, because a fuzzy set is generally not a set, but rather a variable. A crucial issue, debated already in the earliest days of fuzzy logic, concerns whether or not membership values are just probabilities in disguise. Certainly, the original motivating examples deal primarily with formalizing variables that correspond to natural-language concepts in everyday use. The paper's section 4 introduces Lotfi Zadeh's rejection of the law of the excluded middle, with reference to differentiating between membership values and probabilities. The commenter's reading of the paper's section 4, and of formula (1) in particular, suggests that Zadeh in 1968 might well have approved a solution along the lines of the preceding paragraph, except that he would have used the posterior expectation of m(x) as a summary of the posterior distribution.
- Published
- 2004
- Full Text
- View/download PDF
28. Comment.
- Author
-
Lindley, D. V. and Laviolette, Michael
- Subjects
- *
PROBABILITY theory , *FUZZY sets , *FUZZY logic , *FUZZY systems , *MATHEMATICS , *STATISTICS - Abstract
This article comments on a paper by Nozer D. Singpurwalla and Jane M. Booker which aimed to develop a line of argument that probability theory has a sufficiently rich structure for incorporating fuzzy sets within its framework. According to the commenter, the paper provides a real advance in the understanding of fuzzy sets by providing a sensible connection between membership functions and likelihood, and thereby probability. He says the basic question, as modified by the paper, is answered most convincingly. However, the answer raises an apparent conflict between the two calculi of fuzzy logic and probability. What is now needed is a second article to resolve this conflict. The commenter's conjecture is that the resolution will show that the rules of fuzzy logic are untenable. According to him, his principal reason for the conjecture is that the rules of the probability calculus follow from simple, obvious assumptions, whereas those of the fuzzy calculus have been arbitrarily selected. He points out that this arbitrariness is fine for a pure mathematician. But, according to him, an applied mathematician, faced with the reality of uncertainty in the real world, must take into account that world and cannot ignore the basic assumptions about uncertainty that underlie the probability calculus and reflect basic truths about the world.
- Published
- 2004
- Full Text
- View/download PDF
29. On Priors With a Kullback--Leibler Property.
- Author
-
Walker, Stephen, Damien, Paul, and Lenk, Peter
- Subjects
- *
BAYESIAN analysis , *STATISTICAL decision making , *STATISTICS , *DENSITY functionals , *MATHEMATICAL models - Abstract
In this paper, we highlight properties of Bayesian models in which the prior puts positive mass on all Kullback-Leibler neighborhoods of all densities. These properties are concerned with model choice via the Bayes factor, density estimation and the maximization of expected utility for decision problems. In four illustrations we focus on the Bayes factor and show that whatever models are being compared, the [log(Bayes factor)]/[sample size] converges to a non-random number which has a nice interpretation. A parametric versus semiparametric model comparison provides a fifth illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
30. Methods and Criteria for Model Selection.
- Author
-
Kadane, Joseph B. and Lazar, Nicole A.
- Subjects
- *
STATISTICS , *BAYESIAN analysis , *DECISION making , *SCIENCE - Abstract
Model selection is an important part of any statistical analysis and, indeed, is central to the pursuit of science in general. Many authors have examined the question of model selection from both frequentist and Bayesian perspectives, and many tools for selecting the "best model" have been suggested in the literature. This paper considers the various proposals from a Bayesian decision-theoretic perspective. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
31. Markov Chain Monte Carlo: 10 Years and Still Running!
- Author
-
Cappé, Olivier and Robert, Christian P.
- Subjects
- *
MARKOV processes , *MONTE Carlo method , *ALGORITHMS , *PROBABILITY theory , *STATISTICS - Abstract
This article presents the Markov chain Monte Carlo (MCMC) statistical technique. The impact on the discipline is deep and durable, because these methods have opened new horizons in the scale of the problems that one can deal with, thus enhancing the position of statistics in most applied fields. The MCMC revolution has in particular boosted Bayesian statistics to new heights by providing a virtually universal tool for dealing with integration problems. This can be seen in the explosion of papers dealing with complex models, hierarchical modelings, nonparametric Bayesian estimation, and spatial statistics. This trend has also created new synergies with mathematicians and probabilists, as well as econometricians, engineers, ecologists, astronomers, and others, for theoretical requests and practical implications of MCMC techniques. The main factor in the success of MCMC algorithms is that they can be implemented with little effort in a large variety of settings. This is obviously true of the Gibbs sampler, which requires only that the relevant conditional distributions be available for sampling, as shown by the BUGS software. We mentioned BUGS and CODA as existing software dedicated to MCMC algorithms, but much remains to be done before MCMC becomes part of commercial software.
- Published
- 2000
- Full Text
- View/download PDF
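The abstract's point that the Gibbs sampler needs only the full conditional distributions can be illustrated with a toy example not taken from the article (the bivariate-normal target, correlation value, and function name are illustrative choices): for a standard bivariate normal with correlation rho, each coordinate's conditional given the other is univariate normal, so alternating draws from the two conditionals yields a Markov chain whose stationary distribution is the joint.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is univariate normal:
        x | y ~ N(rho * y, 1 - rho^2)
        y | x ~ N(rho * x, 1 - rho^2)
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    x, y = 0.0, 0.0                  # arbitrary starting point
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)   # draw x | y
        y = rng.gauss(rho * x, sd)   # draw y | x
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8)
xs = [x for x, _ in samples[500:]]   # drop a short burn-in
print(sum(xs) / len(xs))             # sample mean, close to 0
```

After burn-in, the empirical correlation of the retained draws approximates the target value 0.8; this "draw each block from its conditional" pattern is exactly what makes automation in the style of BUGS possible.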
32. The State of Statistical Process Control as We Proceed into the 21st Century.
- Author
-
Stoumbos, Zachary G., Reynolds Jr., Marion R., Ryan, Thomas P., and Woodall, William H.
- Subjects
- *
STATISTICAL process control , *STATISTICS , *QUALITY control , *MANUFACTURED products , *MANUFACTURING processes , *EMPLOYEE empowerment , *COMPUTER integrated manufacturing systems , *MOTION control devices , *PRODUCTION engineering , *ESTIMATION theory - Abstract
This article discusses the state of statistical process control (SPC) as we proceed into the 21st century. SPC refers to statistical methods used extensively to monitor and improve the quality and productivity of manufacturing processes and service operations. SPC primarily involves the implementation of control charts, which are used to detect any change in a process that may affect the quality of the output. The first control charts were developed by Walter A. Shewhart in the 1920s. These simple Shewhart charts have dominated applications to date. The process-monitoring problem can be described in general terms. Murphy's law explains the purpose of monitoring: over time, something will inevitably change and possibly cause deterioration in process quality. The Shewhart charts were designed to make it relatively easy for process personnel without statistical training to set up, apply, and interpret the charts using only a pencil and paper for calculations. Bayesian procedures appear to be naturally suited for process monitoring. The standard approach to sampling for a control chart is to use a fixed sampling rate in which samples of fixed size are obtained using a fixed-length sampling interval.
- Published
- 2000
- Full Text
- View/download PDF
33. Capture-Recapture Models.
- Author
-
Pollock, Kenneth H.
- Subjects
- *
ANIMAL populations , *WILD animal collecting , *SURVEYS , *STATISTICAL sampling , *STATISTICAL sampling software , *BIOMETRY , *POPULATION research , *COMPUTER software , *STATISTICS - Abstract
The article presents information on a study which described capture-recapture models for the estimation of demographic parameters of wild animal populations. The capture-recapture models are now also widely used in a variety of other applications such as the census undercount, incidence of disease, criminality, homelessness and computer bugs. Although they have earlier historical roots, capture-recapture models are basically a twentieth-century phenomenon. Often capture-recapture studies have a long duration, rendering the closed models impractical. Thus there has been a need for the development of models that allow for additions and deletions. Another focus of research has been to develop combinations of different sampling methods. One early example used a robust design that combines both open and closed models in one analysis. Since then many papers have used this design for many reasons, for example, to allow for unequal catchability, to separate recruitment from immigration and to estimate temporary emigration.
- Published
- 2000
- Full Text
- View/download PDF
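As a minimal illustration of the closed-population estimation the abstract refers to (a textbook sketch, not from the article; the counts below are made up), the classical Lincoln-Petersen idea estimates abundance from the fraction of marked animals that are recaptured; Chapman's variant adds one to each count to reduce small-sample bias.

```python
def lincoln_petersen(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen estimate of population size.

    n1: animals captured and marked in the first sample
    n2: animals captured in the second sample
    m2: marked animals recaptured in the second sample
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# 100 marked, 80 caught later, 20 of them marked
print(lincoln_petersen(100, 80, 20))  # roughly 389 animals
```

Open-population models (additions and deletions) and the robust design mentioned in the abstract generalize this two-sample logic across many capture occasions.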
34. Rejoinder.
- Author
-
Morris, Carl N.
- Subjects
- *
STATISTICS , *BAYESIAN analysis , *EMPIRICAL research , *INFERENCE (Logic) , *STATISTICIANS , *DISTRIBUTION (Economic theory) , *THOUGHT & thinking - Abstract
In this article the author presents his views on the comments made by various statisticians on his research paper entitled "Parametric Empirical Bayes Inference: Theory and Applications." The author feels honored to have six such distinguished statisticians share their thoughts on the article. All think deeply about the foundations of statistics, as well as its methods. They seem to agree that procedures like those proposed in the paper are underutilized. He remarks that many of the commenters discuss his paper primarily in terms of its relationship to Bayesian thinking. It will be easiest for the author to discuss the general Bayes empirical Bayes issue before turning to more detailed comments. Empirical Bayes says that if the class Π of all possible prior distributions contains more than one prior, then the inference rule should do well against the entire range of priors in Π. He says that statisticians need not choose between the Bayesian and frequentist, or between the Bayesian and empirical Bayesian, viewpoints; all have something to offer, none being adequate in itself.
- Published
- 1983
- Full Text
- View/download PDF
35. Statistical Evidence of Discrimination: Rejoinder.
- Author
-
Kaye, David
- Subjects
- *
ANTI-discrimination laws , *EMPLOYMENT discrimination , *STATISTICS , *ACTIONS & defenses (Law) , *APPELLATE courts , *DECISION making - Abstract
The author expresses his delight that his somewhat haphazard survey of discrimination law and statistical analysis has sparked some controversy, and he responds to some of the criticism. Lea Brilmayer, in a discussion, makes two major points: that the author's description of legal doctrine is not organized around the phrase "disparate treatment," and that his legal analysis rejects discriminatory intent as an essential element of discrimination. The author's paper identified four points at which discrimination can occur. The first three involve "disparate treatment." At no point did he maintain that disparate treatment pertains only to "a decision that purports to be ad hoc." As indicated, the concept also pertains to a facially neutral rule applied in a discriminatory fashion, and he would now add that discrimination in the formulation of a rule is another version of disparate treatment. The final category of cases does not involve discriminatory treatment, but rather disparate impact, as the paper itself noted.
- Published
- 1982
- Full Text
- View/download PDF
36. Representing Points in Many Dimensions by Trees and Castles: Comment.
- Author
-
Jacob, Robert J. K.
- Subjects
- *
DIMENSIONAL analysis , *GRAPHICAL modeling (Statistics) , *MULTIVARIATE analysis , *STATISTICS , *TREES , *CASTLES , *GRAPHIC methods , *MULTIDIMENSIONAL databases , *COORDINATES - Abstract
The article presents the author's comments on the paper entitled "Representing Points in Many Dimensions by Trees and Castles." On closer inspection, though, the author finds that their method really comprises two separate ideas. The paper does not make this distinction, but, by doing so, he will show it is possible to isolate the better idea and use it by itself to good effect. There are two steps one must go through to construct a two-dimensional graphical display for multidimensional data points: first, one must choose a two-dimensional graphical symbol, and then one must assign the coordinates of the multidimensional data to the parameters that describe the construction of that symbol. B. Kleiner and J.A. Hartigan's method can be decomposed into a way to perform each of these two steps; for the first, they choose trees or castles as the display; for the second, they assign the data coordinates to the tree or castle construction parameters in a way designed to depict a hierarchical organization obtained from clustering the coordinates.
- Published
- 1981
- Full Text
- View/download PDF
37. Comment.
- Author
-
Rubin, Donald B.
- Subjects
- *
MATHEMATICAL statistics , *STOCHASTIC processes , *RANDOM variables , *STATISTICS , *PROBABILITY theory , *ANALYSIS of variance , *RANDOM sets - Abstract
The article presents the author's comments on researcher D. Basu's paper on randomization analysis of experimental data. Basu's paper on researcher R.A. Fisher's randomization test for experimental data (FRTED) is certainly entertaining. Although much of the paper is devoted to the thesis that Fisher changed his views on FRTED, apparently the primary point of the paper is to argue that FRTED is "not logically viable." Admittedly, FRTED is not the ultimate statistical weapon, even in randomized experiments, but calling it illogical is rather bizarre. Basu criticizes FRTED through two primary arguments. His first line of criticism follows from his attack on a nonparametric test labeled as "Fisher's randomization test." Basu's second line of criticism of FRTED takes the form of a discussion between a statistician and a scientist. The author sees nothing illogical about FRTED; it is relevant for those rare situations when a purely confirmatory test of an a priori sharp hypothesis is to be made using an a priori defined statistic having an associated a priori definition of extremeness. FRTED cannot adequately handle the full variety of real data problems that practicing statisticians face when drawing causal inferences, and for this reason it might be illogical to try to rely solely on it in practice.
- Published
- 1980
- Full Text
- View/download PDF
38. Comment.
- Author
-
Hinkley, David V.
- Subjects
- *
MATHEMATICAL statistics , *STOCHASTIC processes , *RANDOM variables , *STATISTICS , *PROBABILITY theory , *ANALYSIS of variance , *RANDOM sets - Abstract
The article presents the author's comments on researcher D. Basu's paper on randomization analysis of experimental data. Basu has provided researchers with an interesting and provocative critique of significance tests related to randomized experiments. It does seem to be true that there is not a unified mathematical theory of significance tests developed by researcher R.A. Fisher. Nevertheless, it is important to point out a fallacy in Basu's criticism of the nonunique significance level. After confessing to a "ruthless cross-examination" of the wrong topic, the non-Fisherian nonparametric tests, Basu suggests that Fisher's silence in 1956 may be used to condemn the randomization test. The empirical evidence confronting Fisher certainly suggested the necessity of randomization in most field experiments, if the standard methods of analysis were to be used. The final substantial issue of Basu's paper is that of the ancillarity of the design outcome. Technically Basu is quite correct: if the randomization has validated a parametric model, the design outcome is then ancillary by design. It would, however, be as well not to forget the purpose of an ancillary statistic.
- Published
- 1980
- Full Text
- View/download PDF
39. Comment.
- Author
-
Kruskal, William
- Subjects
- *
STATISTICS , *WEATHER control , *MULTIPLICITY (Mathematics) , *STATISTICAL hypothesis testing , *PROBABILITY theory , *RANDOM variables , *RAIN-making - Abstract
This article presents the views of the author on a paper by Roscoe R. Braham, Jr. which focused on involvement of statisticians in weather modification work. At two or three points in Braham's paper, distinctions are made between physical and statistical experiments (and observational programs), or between physical and statistical modes of thought. The comparison is sometimes to the discredit of statistics. In one broad sense of the word "statistics," such a distinction is otiose, for the meteorologist certainly makes inferences from quantitative data and thus does statistics, whether or not the word is used. So presumably some narrower sense of "statistics" is intended. Another sense in which the distinction might be intended is that of divergence between the result of a conventional statistical analysis, perhaps a significance test, on the one hand, and standard physical knowledge or possibly the intuition of one or more meteorologists on the other hand. Certainly cumulative scientific theory and the intuitions of scientists should be given great weight; yet scientific intuitions often vary widely, and there are many cases in which accepted doctrine has turned out to be wrong, sometimes after carrying out controlled randomized trials.
- Published
- 1979
- Full Text
- View/download PDF
40. Comment.
- Subjects
- *
CONTINGENCY tables , *MISSING data (Statistics) , *MULTIVARIATE analysis , *STATISTICS , *ESTIMATION theory , *LOG-linear models , *STANDARDIZATION - Abstract
In this article the author comments on the research paper entitled "The Design and Analysis of the Observational Study--A Review," by Sonja M. McKinlay. The author remarks that although McKinlay devotes a large part of her review to developments in the analysis and interpretation of contingency tables, she also refers readers to papers by other statisticians for a historical perspective. Since the loglinear model approach now dominates the statistical literature on contingency table analysis, researchers often seem to ignore the ideas of statistician K. Pearson, which according to the author is a mistake. While the multivariate generalizations of the statistician's cross-product ratio, or loglinear model, approach were fermenting, the technique of standardization to eliminate the effects of categorical covariates received considerable attention in the epidemiological literature. The major advances in the literature on multidimensional contingency tables in the 1960's coincided with the emergence of interest in and the availability of high-speed computers, and this work received substantial impetus from several large-scale data analysis projects.
- Published
- 1975
41. A Voyage of Discovery.
- Author
-
Billard, Lynne
- Subjects
- *
STATISTICS , *ASSOCIATIONS, institutions, etc. , *PERIODICALS , *CENSUS , *ARITHMETIC mean , *VARIATIONAL principles , *DISTRIBUTION (Probability theory) , *STATISTICAL correlation - Abstract
This article highlights the historical events that took place within 50 years since the American Statistical Association was founded in 1839 and the Journal of the American Statistical Association (JASA) was first published in 1888. For the first years, JASA contained almost exclusively nonmathematical papers. Many were mere repositories of extensive data sets, including many compilations from census counts with interpretations of what these data purportedly revealed. Others were from investigations undertaken by sociologists, economists, political scientists and historians. One area that attracted theoretical attention dealt with the concepts of averages, variation and distributions. The second area that received theoretical attention during these years was correlation and related concepts.
- Published
- 1997
- Full Text
- View/download PDF
42. Constrained Bayes Estimation With Applications.
- Author
-
Ghosh, Malay
- Subjects
- *
BAYESIAN analysis , *ANALYSIS of variance , *MATHEMATICAL models , *REGRESSION analysis , *MATHEMATICAL statistics , *STATISTICS - Abstract
Bayesian techniques are widely used today for simultaneous estimation of several parameters in compound decision problems. Often, however, the main objective is to produce an ensemble of parameter estimates whose histogram is in some sense close to the histogram of the population parameters. This is, for example, the situation in subgroup analysis, where the problem is not only to estimate the different components of a parameter vector, but also to identify the parameters that are above, and those that are below, a certain specified cutoff point. We have proposed in this paper Bayes estimates, in a very general context, that meet this need. These estimates are obtained by matching the first two moments of the histogram of the estimates with the posterior expectations of the first two moments of the histogram of the parameters, and minimizing, subject to these conditions, the posterior expectation of the Euclidean distance between the estimates and the parameters. Several applications of the main result are provided in the normal and other models. Also, the results are applied to an actual data set. [ABSTRACT FROM AUTHOR]
- Published
- 1992
- Full Text
- View/download PDF
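The moment-matching construction described in the abstract can be sketched for the simplest special case, a normal-normal model with a common posterior variance (an illustrative reconstruction under stated assumptions, not code from the paper; the model, data, and function name are mine): posterior means are first computed by shrinkage toward the prior mean, then re-spread about their average so that the first two sample moments of the estimates match the posterior expectation of the first two sample moments of the parameter ensemble.

```python
import math

def constrained_bayes_normal(y, mu, tau2, sigma2):
    """Constrained Bayes estimates for the normal-normal model
    y_i ~ N(theta_i, sigma2), theta_i ~ N(mu, tau2), independently.

    Posterior means shrink each y_i toward mu; the constrained Bayes
    adjustment re-spreads them so the ensemble's first two sample
    moments match the posterior expectation of the first two sample
    moments of the theta_i.
    """
    m = len(y)
    shrink = sigma2 / (sigma2 + tau2)            # shrinkage factor B
    post_mean = [shrink * mu + (1 - shrink) * yi for yi in y]
    post_var = sigma2 * tau2 / (sigma2 + tau2)   # common posterior variance
    center = sum(post_mean) / m
    ss = sum((e - center) ** 2 for e in post_mean)
    # stretch factor chosen so the spread of the estimates equals
    # E[ sum_i (theta_i - theta_bar)^2 | data ] = ss + (m - 1) * post_var
    a = math.sqrt(1.0 + (m - 1) * post_var / ss)
    return [center + a * (e - center) for e in post_mean]

y = [2.0, 4.5, 7.0, 1.0, 5.5]
est = constrained_bayes_normal(y, mu=4.0, tau2=2.0, sigma2=1.0)
print(est)
```

The adjusted estimates keep the same average as the posterior means but have a larger spread, counteracting the familiar over-shrinkage of Bayes estimates when the whole histogram of parameters is of interest.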
43. Combined Rank Tests for Randomly Censored Paired Data.
- Author
-
Albers, Willem
- Subjects
- *
ASYMPTOTIC distribution , *STATISTICS , *DISTRIBUTION (Probability theory) , *ASYMPTOTIC expansions , *STATISTICAL hypothesis testing , *ESTIMATION theory , *LEAST squares - Abstract
Many authors have dealt with the problem of extending ordinary two-sample rank tests to cases where censoring occurs. Albers and Akritas (1987) proposed simple tests for this purpose, based on the idea of making separate rankings for uncensored and censored observations and subsequently combining the resulting two rank statistics. Similarly, some papers appeared that studied the problem of censoring for the paired-data case rather than the two-sample case. This article indicates how the approach of Albers and Akritas can be adapted to the paired-data case. The tests involved are based not on the ranks of the differences, but on the differences of the ranks in the pooled sample. In this way, use is made of interblock information. The asymptotic distribution of the new statistic is obtained under the null hypothesis and contiguous location alternatives. By way of example, Wilcoxon- and Savage-type scores are introduced that are optimal for logistic location and exponential scale alternatives, respectively. The two corresponding tests are applied to some real data, and comparisons are made with the results obtained with competing tests on the same data set. [ABSTRACT FROM AUTHOR]
- Published
- 1988
- Full Text
- View/download PDF
44. Optimal Step-Type Designs for Comparing Test Treatments With a Control.
- Author
-
Ching-Shui Cheng, Majumdar, Dibyen, Stufken, John, and Ture, Tahsin Erkan
- Subjects
- *
BLOCK designs , *EXPERIMENTAL design , *OPTIMAL designs (Statistics) , *LEAST squares , *MATHEMATICAL statistics , *STATISTICS , *STATISTICAL correlation - Abstract
The problem of obtaining A-optimal designs for comparing v test treatments with a control in b blocks of size k each is considered. A condition on the parameters (v, b, k) is identified for which optimal step-type designs can be obtained. Families of such designs are given. Methods of searching for highly efficient designs are proposed for situations in which it is difficult to determine an A-optimal design. Under the usual additive homoscedastic model, an A-optimal design minimizes the average variance of the least squares estimators of the control-test treatment comparisons. Majumdar and Notz (1983) gave a method for finding A-optimal designs. Their optimal designs can basically be of two types, using the terminology of Hedayat and Majumdar (1984): rectangular (R), in which every block has the same number of replications of the control, and step (S), in which some blocks contain the control t times and the others t + 1 times. Optimal R-type designs were studied by Hedayat and Majumdar (1985). Families of such designs, particularly when each block has one replication of the control, were given in that paper. In this article, we intend to study optimal S-type designs. Step-type designs are more complicated than rectangular-type designs; the latter are balanced incomplete block designs in the test treatments augmented by an equal number of controls in each block, but the former do not have such a simple characterization. Consequently, both the optimality and the construction of such designs are more involved. [ABSTRACT FROM AUTHOR]
- Published
- 1988
- Full Text
- View/download PDF
45. Computable MINQUE-Type Estimates of Variance Components.
- Author
-
Westfall, Peter H.
- Subjects
- *
ANALYSIS of variance , *ESTIMATION theory , *INVARIANTS (Mathematics) , *VARIANCES , *MATHEMATICAL optimization , *STATISTICS , *GAUSSIAN distribution , *STATISTICAL correlation , *REGRESSION analysis - Abstract
The minimum norm quadratic unbiased estimator-type (MINQUE-type) estimates considered in this article are obtained by requiring identical values for the ratios of the a priori variances to the a priori error variance and letting this common value tend to infinity. The resulting estimates are invariant quadratic unbiased estimators with certain parametric and nonparametric optimality properties: assuming normally distributed random effects, the efficiency of the proposed estimates relative to the minimum variance quadratic unbiased estimates (MIVQUE's) approaches unity when the true variance ratios are identical and tend to infinity. Assuming nonnormal effect distributions in the model with two variance components, the estimates are asymptotically efficient: in a sequence of designs where the number of classes and the number of observations on each class approach infinity, it is shown that the asymptotic variances of the estimates are equivalent to the theoretical minimum variances for invariant quadratic unbiased estimators. The result is interesting and useful since the usual analysis of variance (ANOVA) estimate of between-classes variance has strictly larger asymptotic variance for the unbalanced one-way model. Commonly considered estimates result from this procedure; the usual residual mean square (assuming that all non-error effects are fixed) is the resulting estimate of the error variance. In the one-way model the resulting estimates coincide with estimates considered by Thomas and Hultquist (1978), Burdick and Graybill (1984), Ahrens, Kleffe, and Tenzler (1981), and Kaplan (1982). In particular, the convergence of the MINQUE estimates was proved in the latter two papers in the context of the unbalanced one-way model. The procedure yields computationally convenient estimates in the general mixed ANOVA model. Computing formulas are... [ABSTRACT FROM AUTHOR]
- Published
- 1987
- Full Text
- View/download PDF
46. Applications of Transportation Theory to Statistical Problems.
- Author
-
Causey, Beverley D., Cox, Lawrence H., and Ernst, Lawrence R.
- Subjects
- *
STATISTICAL sampling , *TRANSPORTATION , *STATISTICS , *PROBABILITY theory , *DISTRIBUTION (Probability theory) , *MATHEMATICAL statistics - Abstract
The two-dimensional controlled selection problem and the problem of maximizing the overlap of old and new primary sampling units after restratification and change of selection probabilities have been studied for several decades but have never been completely solved until now. Using transportation theory, complete solutions are obtained here for these and other problems. The solution to the controlled selection problem is based on a specific transportation model that was originally developed, in a previous paper by Cox and Ernst (1982), to solve completely the controlled rounding problem, namely the problem of optimally rounding real-valued entries in a two-way tabular array to adjacent integer values in a manner that preserves the tabular (additive) structure of the array. This model is also applied to other statistical problems, such as raking and statistical disclosure for frequency count tabulations and microdata. [ABSTRACT FROM AUTHOR]
- Published
- 1985
- Full Text
- View/download PDF
47. A Statistical Model for Positron Emission Tomography: Rejoinder.
- Author
-
Vardi, Y., Shepp, L. A., and Kaufman, L.
- Subjects
- *
POSITRON emission tomography , *RESEARCH , *STATISTICS , *ALGORITHMS , *DIAGNOSTIC imaging , *POSITRON emission , *X-rays - Abstract
The article presents a rejoinder to comments made on the authors' article "A Statistical Model for Positron Emission Tomography." Being a very active, multidisciplinary field of research, it is hard to keep track of who suggested what and in what context, so in attempting to answer scholar Donald B. Rubin's question the authors are likely to overlook some contributors. The first use of a pixel model for an organ appeared in scholar G.N. Hounsfield's original development of X-ray computerized tomography. In 1978 scholars gave a short account of this early development in the field, including images produced using Hounsfield's algorithms. The fact that the physics of positron emission tomography (PET) is different from, and inherently more stochastic than, that of X-ray transmission tomography has been common knowledge. Realizing that the additional randomness of PET calls for a more statistical way of thinking, scholars A.J. Rockmore and A. Macovski suggested estimating the emission density in PET using a maximum-likelihood approach. Their paper fell short of identifying the connection between the tube counts and the emission density. This omission on their part is probably a result of being unaware of the development of statistical methods for incomplete data.
- Published
- 1985
- Full Text
- View/download PDF
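The maximum-likelihood approach discussed in the rejoinder above leads to the well-known EM iteration for PET, in which each pixel's emission intensity is rescaled by a backprojected ratio of observed to predicted tube counts. The sketch below applies that update to a toy system of my own construction (the detection-probability matrix and counts are assumptions, not data from the paper); when each pixel's detection probabilities sum to one, every EM step preserves the total count and never decreases the Poisson log-likelihood:

```python
import math

def em_update(lam, p, n):
    """One EM step for the PET emission model n[i] ~ Poisson(sum_j p[i][j]*lam[j]),
    where p[i][j] is the probability that an emission in pixel j is detected in
    tube i; here the columns of p are assumed to sum to one."""
    m, k = len(n), len(lam)
    d = [sum(p[i][j] * lam[j] for j in range(k)) for i in range(m)]  # predicted counts
    return [lam[j] * sum(n[i] * p[i][j] / d[i] for i in range(m)) for j in range(k)]

def log_likelihood(lam, p, n):
    # Poisson log-likelihood up to an additive constant
    m, k = len(n), len(lam)
    d = [sum(p[i][j] * lam[j] for j in range(k)) for i in range(m)]
    return sum(n[i] * math.log(d[i]) - d[i] for i in range(m))

# Toy problem: 2 pixels observed through 3 tubes; columns of p sum to 1
p = [[0.5, 0.2], [0.3, 0.3], [0.2, 0.5]]
n = [10, 6, 4]
lam = [1.0, 1.0]
for _ in range(50):
    lam = em_update(lam, p, n)
```

The multiplicative form of the update keeps the intensities nonnegative automatically, which is one reason this iteration became standard in emission tomography.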
48. Rejoinder.
- Author
-
Pratt, J.W. and Schlaifer, Robert
- Subjects
- *
MATHEMATICAL models , *CAUSATION (Philosophy) , *STRUCTURAL frame models , *ESTIMATION theory , *MATHEMATICAL statistics , *MATHEMATICAL variables , *RESEARCH , *OBSERVATION (Psychology) , *STATISTICAL correlation , *PROBABILITY theory , *STATISTICS - Abstract
The article presents the authors' reply to comments on various papers related to nonexperimental data and estimation of structural effects. The authors thank researcher John Geweke for his instructive comments and in reply only want to make sure that the relation between Geweke's third paragraph and the authors' article is correctly understood. The authors' language did suggest, however, that meaningful statements about correlation with excluded variables would usually involve a very large sufficient set, and in that case the authors find Geweke's example instructive. In response to the comment of researcher A.P. Dawid, the authors are glad that he likes parts of their paper, but they disagree with Dawid's comment on observational studies. The authors feel that selection of observations is not the important new feature in observational studies. The one and only new feature that is important is that the "treatments" or "factors" are not randomized. Selection of observations is not even new, because missing values can make estimates of causal effects inconsistent even when treatments are randomized.
- Published
- 1984
- Full Text
- View/download PDF
49. Comment.
- Author
-
Boardman, Thomas J.
- Subjects
- *
HEART transplantation , *TRANSPLANTATION of organs, tissues, etc. , *STATISTICS , *EXPERIMENTAL design - Abstract
The article comments on the paper by Murray Aitkin, Nan Laird and Brian Francis which presents an analysis of survival of patients in the Stanford Heart Transplantation Program, published in the June 1983 issue of the "Journal of the American Statistical Association." The author always approaches an article that purports to consider an analysis of a classical data set with a great deal of interest. He wonders: "Will the authors be able to discover new and useful results that others have been unable to detect in the past?" Of course, authors take a chance using a popular data set to demonstrate their new methodology. The problem is that perhaps the data set really does not have too much going for it in the first place, even though the authors' new approach to analysis may have merit. Such may be the case here. When an experimenter visits with a statistical consultant concerning a proper experimental design to investigate the effectiveness of a new treatment, the consultant is called on to propose valid methods for demonstrating the repeatability of the effect of the new treatment over the standard one. Students in beginning courses in experimental design often have trouble understanding the concept and importance of replication. As statisticians they need to be sure that they do not overlook this concept as it applies to verification of a new statistical methodology. In this article, of course, the authors have chosen to emphasize the analysis of a classic data set rather than the evaluation of a new methodology, a task that cannot be done on one data set.
- Published
- 1983
- Full Text
- View/download PDF
50. Comment.
- Author
-
Crowley, John and Storer, Barry E.
- Subjects
- *
HEART transplantation , *TRANSPLANTATION of organs, tissues, etc. , *PARAMETER estimation , *MATHEMATICAL models , *STATISTICS ,CARDIAC surgery patients - Abstract
The article comments on the paper by Murray Aitkin, Nan Laird and Brian Francis which presents an analysis of survival of patients in the Stanford Heart Transplantation Program, published in the June 1983 issue of the "Journal of the American Statistical Association." The author congratulates the authors for their careful and thorough analysis of the Stanford heart transplant survival data. There are many statistical issues involved in this area, and also many important medical and ethical questions. The comments are concentrated on the statistical problems, returning only briefly at the end to the current activity regarding larger societal concerns with heart transplantation. Aitkin, Laird, and Francis have made a number of choices in their approach to this reanalysis, and the author would like to comment on several of them. Each transition is examined separately. Aitkin, Laird, and Francis are careful to analyze waiting time for a heart, pretransplant survival, and post-transplant survival separately, before proceeding to comparisons. In particular, this approach enables them to detect a trend in pretransplant survival with calendar time of acceptance, more recent patients being the more healthy. This is an important complication for any assessment of trends in post-transplant survival. However, more could have been done along these lines.
- Published
- 1983
- Full Text
- View/download PDF