A Case Study of Performance Degradation Attributable to Run-Time Bounds Checks on C++ Vector Access

Programmers routinely omit run-time safety checks from applications because they assume such checks degrade performance. The simplest example is the use of arrays or array-like data structures that do not enforce the constraint that indices must be within bounds. This report documents an attempt to measure the performance penalty incurred by two different implementations of bounds checking in C and C++, using a simple benchmark on a desktop PC with a modern superscalar CPU. The benchmark consisted of a loop that wrote to array elements in sequential order. With this configuration, relative to the best performance observed for any access method in C or C++, a mean degradation of only (0.881 ± 0.009) % was measured for a standard bounds-checking access method in C++. This case study showed the need for further work to develop and refine measurement methods and to perform more comparisons of this type. Comparisons across different use cases, configurations, programming languages, and environments are needed to determine under what circumstances (if any) the performance advantage of unchecked access is actually sufficient to outweigh the negative consequences for security and software quality.
