Note on a problem of Ragnar Frisch

what are the conditions under which the regression of x₁ on x₂ is linear, ξ, α, and β being independent variables and a and b being constants but unknown. The problem requires the condition for linearity of regression of x₁ on x₂ whatever may be the values of a and b.

A partial answer was given by H. V. Allen,¹ who has proved that if the first two moments of α and all the moments of ξ and β are finite, then the necessary and sufficient condition for the linearity of regression of x₁ on x₂, whatever a and b may be, is that both ξ and β should be normally distributed. A fairly complete answer to this problem was given by the present author in a thesis submitted to the Calcutta University in 1943. The restrictions imposed on ξ, α, and β are that their first moments exist. The proof is now extended to a more general case, which is discussed below together with a general problem that may be of interest to economists.

Let x₁ and x₂ be two variables with the joint probability density G(x₁, x₂) and marginal distributions G₁(x₁) and G₂(x₂). If the regression of x₁ on x₂ is linear and the mean values of x₁ and x₂ exist, in which case they may be supposed to be zero without loss of generality, then the regression equation may be written as
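Allen's condition can be illustrated numerically. The sketch below assumes the structural form x₁ = ξ + aα, x₂ = ξ + bβ (the defining equations precede this excerpt, so this form, and all variance choices, are illustrative assumptions, not taken from the note): with ξ and β normal and α non-normal but mean-zero, the conditional means of x₁ within bins of x₂ should fall on a straight line.

```python
# Monte Carlo sketch of the linearity-of-regression condition.
# Assumed model (not from the excerpt): x1 = xi + a*alpha, x2 = xi + b*beta,
# with xi, alpha, beta independent.
import numpy as np

rng = np.random.default_rng(0)
n = 400_000
a, b = 1.0, 1.0

xi = rng.standard_normal(n)              # normal, as the condition requires
beta = rng.standard_normal(n)            # normal, as the condition requires
alpha = rng.exponential(1.0, n) - 1.0    # non-normal, mean zero: only its mean matters

x1 = xi + a * alpha
x2 = xi + b * beta

# With xi and beta normal (unit variance), (xi, x2) is bivariate normal, so
# E(x1 | x2) = E(xi | x2) = [Var(xi) / (Var(xi) + b^2 Var(beta))] * x2.
slope = 1.0 / (1.0 + b ** 2)

# Empirical check: conditional means of x1 within quantile bins of x2
# should lie on the line slope * x2 (up to sampling noise).
edges = np.quantile(x2, np.linspace(0.05, 0.95, 19))
idx = np.digitize(x2, edges)
for k in range(1, 19):                   # interior bins, ~20,000 points each
    mask = idx == k
    dev = x1[mask].mean() - slope * x2[mask].mean()
    assert abs(dev) < 0.05, dev
print("conditional means of x1 are linear in x2")
```

Note that α was drawn from a (centred) exponential law: in this model α enters the conditional expectation only through its mean, which is why Allen needs normality of ξ and β but not of α.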