Client-Specific Equivalence Checking

Software is often built by integrating components created by different teams or even different organizations. Since developers typically have little insight into changes made to the components they depend on, maintaining the correctness and robustness of the entire system is challenging. In this paper, we investigate the effect of component changes on the behavior of their clients. We observe that changes in a component are often irrelevant to a particular client and thus can be adopted without delay or negative effects. Following this observation, we formulate the notion of client-specific equivalence checking (CSE) and develop an automated technique optimized for checking such equivalence. We evaluate our technique on a set of benchmarks, including those from the existing literature on equivalence checking, and show its applicability and effectiveness.
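To make the notion concrete, here is a minimal, hypothetical sketch (the function names and values are our own illustration, not taken from the paper): two versions of a library routine differ semantically in general, yet a particular client only exercises inputs on which they agree, so the two versions are client-specific equivalent for that client even though they are not equivalent overall.

```c
#include <assert.h>
#include <stdio.h>

/* Version 1 of the component: clamps negative values to zero. */
static int clamp_v1(int x) {
    return x < 0 ? 0 : x;
}

/* Version 2: additionally caps values above 100.
 * Not equivalent to clamp_v1 in general (they differ for x > 100). */
static int clamp_v2(int x) {
    if (x < 0) return 0;
    if (x > 100) return 100;
    return x;
}

/* The client only ever passes percentages in [0, 100], so the
 * change introduced in version 2 is unobservable at this call site. */
static int client(int percent /* invariant: 0 <= percent <= 100 */) {
    return clamp_v2(percent); /* same result as with clamp_v1 */
}

int main(void) {
    for (int p = 0; p <= 100; p++) {
        /* On every input the client can produce, the two versions agree:
         * this is the client-specific equivalence the paper checks. */
        assert(clamp_v1(p) == clamp_v2(p));
        (void)client(p);
    }
    printf("versions agree on all client-reachable inputs\n");
    return 0;
}
```

A general equivalence checker would reject this upgrade because the versions disagree on some inputs; a client-specific check, restricted to the inputs reachable from `client`, can accept it.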
