ON MINIMIZING A CONVEX FUNCTION SUBJECT TO LINEAR INEQUALITIES

SUMMARY

The minimization of a convex function of several variables subject to linear inequalities is discussed briefly in general terms. Dantzig's Simplex Method is extended to yield finite algorithms for minimizing either a convex quadratic function or the sum of the t largest of a set of linear functions, and the solution of a generalization of the latter problem is indicated. In the last two sections a form of linear programming with random variables as coefficients is described, and shown to involve the minimization of a convex function.

1. INTRODUCTION

Linear programming has been studied extensively in the last few years, as indicated by Vajda (1955). Various authors have mentioned the possibility of relaxing the requirement of linearity, but the practical problems of non-linear programming do not seem to have been considered in any detail.*

This paper is concerned with some aspects of the simplest form of non-linear programming: the minimization of a convex function of several variables subject to linear inequalities. In principle this can always be done using the method of steepest descents, but this will rarely be practical in its primitive form. We therefore consider special methods for some important particular classes of such functions.

In Section 2 the Simplex Method, originally developed by Dantzig (1951) for linear programming, is outlined in terms sufficiently general to cover the applications to non-linear programming considered in the next two sections.

In Section 3 we show how to minimize a convex quadratic function. This enables one to use what amounts to the Newton-Raphson Method for minimizing a well-behaved general convex function: one finds a feasible solution of the constraints, and at each stage minimizes the quadratic function whose first and second derivatives at the current feasible solution are the same as those of the given function.
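In modern notation (ours, not the paper's), the step just described can be written explicitly. With f the given convex function, x_k the current feasible solution, and the linear inequalities written as A x >= b, each stage solves

\[
x_{k+1} \;=\; \arg\min_{x} \Big\{\, f(x_k) + \nabla f(x_k)^{\mathsf T}(x - x_k) + \tfrac12\,(x - x_k)^{\mathsf T}\,\nabla^2 f(x_k)\,(x - x_k) \,\Big\} \quad \text{subject to } A x \ge b,
\]

i.e. the quadratic function agreeing with f to second order at x_k is minimized over the original feasible region.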
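The following is a minimal computational sketch of this scheme, not Beale's Simplex-based algorithm: each quadratic subproblem is handed to an off-the-shelf solver, and the names constrained_newton, grad, hess, A and b are our illustrative assumptions rather than the paper's notation.

    import numpy as np
    from scipy.optimize import minimize

    def constrained_newton(f, grad, hess, A, b, x0, tol=1e-8, max_iter=50):
        """Minimize smooth convex f subject to A @ x >= b, from a feasible x0."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g, H = grad(x), hess(x)
            # Quadratic model of f about x, as a function of the step d:
            # Q(d) = g.d + (1/2) d.H.d  (the constant f(x) does not affect the minimizer).
            Q = lambda d: g @ d + 0.5 * d @ H @ d
            Qgrad = lambda d: g + H @ d
            # Feasibility of the new point x + d: A @ (x + d) - b >= 0.
            cons = {"type": "ineq", "fun": lambda d: A @ (x + d) - b, "jac": lambda d: A}
            step = minimize(Q, np.zeros_like(x), jac=Qgrad,
                            method="SLSQP", constraints=[cons]).x
            if np.linalg.norm(step) < tol:  # no further progress: stop
                break
            x = x + step
        return x

    # Example: minimize (x1 - 2)^2 + (x2 - 3)^2 subject to x1 + x2 <= 3,
    # written as A @ x >= b with A = [[-1, -1]], b = [-3].
    f = lambda x: (x[0] - 2) ** 2 + (x[1] - 3) ** 2
    grad = lambda x: np.array([2 * (x[0] - 2), 2 * (x[1] - 3)])
    hess = lambda x: 2 * np.eye(2)
    A, b = np.array([[-1.0, -1.0]]), np.array([-3.0])
    print(constrained_newton(f, grad, hess, A, b, x0=np.zeros(2)))  # approx [1. 2.]

Since f is itself quadratic in this example, a single subproblem already reaches the constrained minimum; for a general well-behaved convex function the loop repeats the quadratic minimization at each new feasible point, exactly as described above.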