Summary: Analysis of variance.

Introduction: Some theory of linear vector spaces can be applied, at a not too hard mathematical level, to problems of analysis of variance. It then becomes possible to define several much-used notions (main effect, interaction, confounding, orthogonality), and many experimental designs and their analysis become rather transparent, partly as a consequence of simple notation.

§1. A function (y1, ..., yN) defined over N points (e.g. experimental units in agricultural field trials) assigns a real number to each of these points. Such functions can be added and multiplied by a real number: they can be considered as vectors and then form an N-dimensional vector space E. Subspaces are: the 1-dimensional space of constant functions (of general means); the space of functions that are constant within the classes of a classification of the N points (fig. 2). If the classification corresponds to some influence A (amount of phosphorus added to the field; variety; fertility), then this space, denoted by A, is called the space of impure main effects of the influence A. An independent basis, and hence the dimension, of A are determined. The number of degrees of freedom of a subspace of E is defined to be its dimension. A space of impure interactions of two (or more) influences A and B is defined likewise.

§2. The stochastic variables y1, ..., yN have a normal distribution with expectation values y̌1, ..., y̌N and all with the same variance σ². They are combined into a stochastic N-dimensional vector y with expectation vector y̌. It makes sense to define a metric in the vector space, determined by a scalar product (a, b) = a1 b1 + a2 b2 + ... + aN bN. The length of a vector a is √(a, a); the angle φ between a and b satisfies cos φ = (a, b)/√((a, a)(b, b)). The orthogonal projection of a vector y on a linear subspace A is denoted by yA. (These definitions are written out in the first sketch appended after this summary.)

§3. The space A* of pure main effects of the influence A is the linear subspace of A perpendicular to the space of general means.

§4. The space of pure interactions of A and B is defined to be the subspace of the space A × B of impure interactions perpendicular to the spaces A and B.

§5. Two influences or interactions are confounded if their (pure) spaces meet in a space of dimension > 0. Confounding may be complete or partial.

§6. All orthogonal two-way classifications are determined.

§7. Statistical considerations are given. In particular the F-test is mentioned.

§8. Two examples of orthogonal two-way classifications are given. (A numerical sketch of such a decomposition is appended below.)

§9. How to deal with non-orthogonal classifications. An iteration technique to approximate likelihood estimates for two main effects, given by Stevens and Hamming, is proved and understood in terms of vectors. (An alternating-projection sketch of this kind of iteration is appended below.)

§10. An example with two effects, one of which is linear fertility, is given in detail.

§11. Latin squares.

§12. Factorial designs.
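
The following LaTeX fragment writes out the definitions summarized in §§2-4. The class-mean form of the projection yA and of the general mean are standard consequences of those definitions, added here for concreteness rather than quoted from the article.

    % Scalar product, length and angle (§2)
    \[
      (a,b) = \sum_{i=1}^{N} a_i b_i, \qquad
      \|a\| = \sqrt{(a,a)}, \qquad
      \cos\varphi = \frac{(a,b)}{\sqrt{(a,a)\,(b,b)}} .
    \]
    % Projection of y on the space A of functions constant within the
    % classes of a classification: on the class k(i) of point i, which
    % contains n_{k(i)} points, y_A takes the class-mean value
    \[
      (y_A)_i = \frac{1}{n_{k(i)}} \sum_{j:\,k(j)=k(i)} y_j .
    \]
    % Pure main effects (§3): A* is the part of A orthogonal to the
    % 1-dimensional space G of general means, so
    \[
      y_{A^*} = y_A - y_G, \qquad
      (y_G)_i = \bar{y} = \frac{1}{N}\sum_{j=1}^{N} y_j .
    \]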
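
Next, a minimal numerical sketch of the decomposition for a balanced, hence orthogonal (§6), two-way classification. The data, level numbers and variable names are invented for illustration; the code is not taken from the article.

    import numpy as np

    # Hypothetical balanced two-way layout: a levels of A, b levels of B,
    # r replicates per cell; y is the N-vector of observations (N = a*b*r).
    rng = np.random.default_rng(0)
    a, b, r = 3, 4, 2
    A_lab = np.repeat(np.arange(a), b * r)          # level of A for each point
    B_lab = np.tile(np.repeat(np.arange(b), r), a)  # level of B for each point
    y = rng.normal(size=a * b * r)

    def project_on_classes(y, labels):
        """Orthogonal projection of y on the space of functions that are
        constant within the classes of the given classification: replace
        every value by its class mean."""
        out = np.empty_like(y)
        for k in np.unique(labels):
            out[labels == k] = y[labels == k].mean()
        return out

    y_G  = np.full_like(y, y.mean())                 # space of general means
    y_A  = project_on_classes(y, A_lab)              # impure main effects of A
    y_B  = project_on_classes(y, B_lab)              # impure main effects of B
    y_AB = project_on_classes(y, A_lab * b + B_lab)  # impure interactions (cells)

    # Pure effects: subtract the projections on the lower spaces (§3, §4).
    pure_A  = y_A  - y_G
    pure_B  = y_B  - y_G
    pure_AB = y_AB - y_A - y_B + y_G
    resid   = y    - y_AB

    # Degrees of freedom = dimensions of the corresponding subspaces.
    df = {"A": a - 1, "B": b - 1, "AxB": (a - 1) * (b - 1),
          "resid": a * b * (r - 1)}
    ss = {"A": pure_A @ pure_A, "B": pure_B @ pure_B,
          "AxB": pure_AB @ pure_AB, "resid": resid @ resid}

    # In the balanced case the pure subspaces are mutually orthogonal, so the
    # sums of squares add up to the total corrected sum of squares.
    assert np.isclose(sum(ss.values()), (y - y_G) @ (y - y_G))

    # F-ratio for the pure main effect of A (§7).
    F_A = (ss["A"] / df["A"]) / (ss["resid"] / df["resid"])
    print(ss, df, F_A)

Because the layout is balanced, the pure subspaces are mutually orthogonal and the sums of squares add, which is what the assert checks; degrees of freedom are read off as the dimensions of the subspaces (§1).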
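
Finally, the Stevens-Hamming iteration itself is not reproduced in the summary; the following is only a generic alternating-projection (backfitting) sketch of the kind of fitting §9 alludes to, for an additive two-way model with unequal class sizes and invented data.

    import numpy as np

    # Hypothetical non-orthogonal (unbalanced) two-way layout without
    # interaction: y_i = mu + alpha[row_i] + beta[col_i] + error.  Unequal
    # cell counts make the row and column spaces non-orthogonal, so the
    # least-squares fit is approached by alternating projections.
    rng = np.random.default_rng(1)
    rows = np.array([0, 0, 0, 1, 1, 2, 2, 2, 2])   # row classification
    cols = np.array([0, 1, 1, 0, 2, 1, 2, 2, 0])   # column classification
    y = rng.normal(size=rows.size)

    def class_means(v, labels, n_classes):
        """Mean of v within each class (projection on the classification space)."""
        return np.array([v[labels == k].mean() for k in range(n_classes)])

    nr, nc = rows.max() + 1, cols.max() + 1
    mu = y.mean()
    alpha = np.zeros(nr)
    beta = np.zeros(nc)

    for _ in range(200):
        # Project the residual after removing column effects on the row
        # space, then the residual after removing row effects on the
        # column space; repeat until the estimates stop changing.
        alpha_new = class_means(y - mu - beta[cols], rows, nr)
        beta_new = class_means(y - mu - alpha_new[rows], cols, nc)
        done = (np.max(np.abs(alpha_new - alpha)) +
                np.max(np.abs(beta_new - beta)) < 1e-12)
        alpha, beta = alpha_new, beta_new
        if done:
            break

    fitted = mu + alpha[rows] + beta[cols]
    print("row effects:", alpha, "column effects:", beta)
    print("residual SS:", np.sum((y - fitted) ** 2))

Each step projects the current residual on one classification space; for a connected layout the fitted values converge to the least-squares fit of the additive model, which under the normality assumption of §2 is the maximum-likelihood fit. The details of Stevens and Hamming's own scheme may differ from this sketch.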