In and Out of SSA: a Denotational Specification

We present non-standard denotational specifications of the SSA form and of its conversion processes from and to imperative programming languages. We thereby provide a strong mathematical foundation for this intermediate code representation, which is used in modern compilers such as GCC or Intel CC. More specifically, we provide (1) a new functional approach to SSA, the Static Single Assignment form, together with its denotational semantics, (2) a collecting denotational semantics for Imp, a simple imperative language, (3) a non-standard denotational semantics specifying the conversion of Imp to SSA and (4) a non-standard denotational semantics for the reverse SSA-to-Imp conversion process. These translations are proven correct, ensuring that the structure of the memory states manipulated by imperative constructs is preserved in the middle ends of compilers that use the SSA form as their control-flow and data representation. Interestingly, as unexpected by-products of our conversion procedures, we offer (1) a new proof of the reducibility of the RAM computing model to the domain of Kleene's partial recursive functions, to which SSA is strongly related, and, on a more practical note, (2) a new algorithm for program slicing in imperative programming languages. All these specifications have been prototyped in GNU Common Lisp. These fundamental results prove that the widely used SSA technology is sound. Our formal denotational framework further suggests that the SSA form could become a target of choice for other analysis and optimization techniques such as abstract interpretation or partial evaluation. Indeed, since the SSA form is language-independent, the resulting optimizations would automatically be enabled for any source language supported by compilers such as GCC.
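To give a concrete flavor of the Imp-to-SSA conversion mentioned above, the following is a minimal, hypothetical sketch in Common Lisp (the paper's prototyping language) of the well-known functional reading of SSA: a loop header becomes a local tail-recursive function whose parameters play the role of phi-nodes merging the entry-edge and back-edge values. The names sum-to-10, loop-header and the numbered variables are illustrative only and do not come from the paper.

    ;; Imp-like source program:
    ;;   i := 0; s := 0;
    ;;   while i < 10 do s := s + i; i := i + 1 done
    ;; Functional/SSA reading: each variable is assigned exactly once,
    ;; and the loop-header parameters i1, s1 act as phi-nodes.
    (defun sum-to-10 ()
      (labels ((loop-header (i1 s1)          ; i1 ~ phi(i0, i2), s1 ~ phi(s0, s2)
                 (if (< i1 10)
                     (let ((s2 (+ s1 i1))    ; single assignment of s2
                           (i2 (+ i1 1)))    ; single assignment of i2
                       (loop-header i2 s2))  ; back edge supplies the phi arguments
                     s1)))                   ; loop exit yields the final sum
        (loop-header 0 0)))                  ; entry edge: i0 = 0, s0 = 0

    (format t "~a~%" (sum-to-10))            ; prints 45

Each scalar is defined exactly once and control-flow merges are expressed as function calls; this single-assignment discipline is what the Imp-to-SSA and SSA-to-Imp conversions sketched in the abstract translate back and forth.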
