On the calculation of Jacobian matrices by the Markowitz rule

The evaluation of derivative vectors can be performed with optimal computational complexity by the forward or reverse mode of automatic differentiation. This approach may be applied to evaluate first and higher derivatives of any vector function that is defined as the composition of easily differentiated elementary functions, typically in the form of a computer program. The more general task of efficiently evaluating Jacobians or other derivative matrices leads to a combinatorial optimization problem, which is conjectured to be NP-hard. Here, we examine this vertex elimination problem and solve it approximately, using a greedy heuristic. Numerical experiments show the resulting Markowitz scheme for Jacobian evaluation to be more efficient than column-by-column or row-by-row evaluation using the forward or the reverse mode, respectively.
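As a rough illustration of the greedy heuristic referred to above, the sketch below eliminates intermediate vertices of a linearized computational graph, at each step picking the vertex whose Markowitz degree (number of predecessors times number of successors) is smallest. The function name, edge representation, and tie-breaking are illustrative assumptions, not the paper's actual implementation.

```python
from itertools import product

def eliminate_jacobian(edges, intermediates):
    """Accumulate a Jacobian by vertex elimination with a greedy Markowitz rule.

    edges: dict mapping (i, j) -> local partial derivative of v_j w.r.t. v_i
    intermediates: set of intermediate vertex ids to eliminate
    Returns the edges between remaining vertices and the multiplication count.
    (Hypothetical sketch; names and data layout are assumptions.)
    """
    edges = dict(edges)
    remaining = set(intermediates)
    mults = 0
    while remaining:
        # Markowitz cost of v: (#predecessors) * (#successors),
        # i.e. the number of multiplications its elimination incurs
        def cost(v):
            preds = sum(1 for (i, j) in edges if j == v)
            succs = sum(1 for (j, k) in edges if j == v)
            return preds * succs
        v = min(remaining, key=cost)
        preds = [i for (i, j) in edges if j == v]
        succs = [k for (j, k) in edges if j == v]
        for i, k in product(preds, succs):
            # fill-in by the chain rule: edge i -> k accumulates c_{v,k} * c_{i,v}
            edges[(i, k)] = edges.get((i, k), 0.0) + edges[(v, k)] * edges[(i, v)]
            mults += 1
        # drop all edges incident to the eliminated vertex
        edges = {e: c for e, c in edges.items() if v not in e}
        remaining.discard(v)
    return edges, mults
```

For instance, eliminating the single intermediate vertex 3 from the chain with partials `{(1, 3): 2.0, (2, 3): 3.0, (3, 4): 5.0}` yields the Jacobian entries `{(1, 4): 10.0, (2, 4): 15.0}` at a cost of two multiplications.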