Using Checksums to Detect Data Corruption

In this paper, we consider the problem of malicious, intentional corruption of data in a database by an attacker operating outside the scope of the database management system. Although detecting an attacker who changes a set of database values at the disk level is straightforward (achievable by attaching signatures to each block of data), a more sophisticated attacker may corrupt the data by replacing current blocks with copies of old block images, compromising the integrity of the data. To prevent this attack from succeeding, we provide a defense mechanism that greatly increases the intruder's workload while keeping the system cost of an authorized update low. Our algorithm calculates and maintains two levels of signatures (checksum values) on blocks of data. The signatures are grouped in a manner that forces an extended chain of block copying for any unauthorized update. Using the available information on block sizes, block reference patterns, and the number of concurrently active transactions in the database, we calculate the length of this chain of copying and show that the intruder must perform a prohibitive amount of work to go undetected. Our technique therefore makes this type of attack very unlikely. Previous work has not addressed protection against such a knowledgeable and well-equipped intruder operating outside the database management system.
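The abstract describes the scheme only at a high level. The sketch below is a minimal illustration of the two-level checksum idea, assuming SHA-256 as the signature function and a fixed, hypothetical group size; the function names and the simple consecutive grouping are placeholders, not the paper's actual grouping rule, which is chosen specifically so that restoring an old image of one block forces a chain of further block restorations.

```python
# Illustrative two-level block-checksum sketch (not the paper's implementation).
# Assumptions: SHA-256 as the signature function, GROUP_SIZE = 4, and a simple
# consecutive grouping of level-1 checksums into level-2 groups.
import hashlib

GROUP_SIZE = 4  # hypothetical number of level-1 checksums per level-2 group


def block_checksum(block: bytes) -> bytes:
    """Level-1 signature over a single data block."""
    return hashlib.sha256(block).digest()


def group_checksum(checksums: list[bytes]) -> bytes:
    """Level-2 signature over a group of level-1 checksums."""
    return hashlib.sha256(b"".join(checksums)).digest()


def sign_database(blocks: list[bytes]) -> tuple[list[bytes], list[bytes]]:
    """Compute both signature levels for all data blocks.

    Because each level-2 signature covers several level-1 checksums,
    consistently restoring an old image of one block also requires restoring
    the signature data covering it, which in turn covers other blocks --
    the chained copying that the paper's grouping is designed to force.
    """
    level1 = [block_checksum(b) for b in blocks]
    level2 = [
        group_checksum(level1[i:i + GROUP_SIZE])
        for i in range(0, len(level1), GROUP_SIZE)
    ]
    return level1, level2


def verify(blocks: list[bytes],
           level1: list[bytes],
           level2: list[bytes]) -> bool:
    """Recompute both levels and compare them against the stored signatures."""
    fresh1, fresh2 = sign_database(blocks)
    return fresh1 == level1 and fresh2 == level2
```

For example, calling `sign_database` after each authorized update and `verify` at audit time flags any block whose content no longer matches either signature level; the cost of an authorized update stays proportional to one block checksum plus one group checksum.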
