Minimum description length principles for detection and classification of FTP exploits

In this paper we build on the principle of "conservation of complexity", analyzed by Evans et al. (2001), which measures protocol redundancy and pattern content as a metric for information assurance. We first analyze complexity estimators as a tool for detecting FTP exploits, and we present and analyze results showing the utility of complexity-based information assurance for detecting exploits over the File Transfer Protocol. We show that complexity metrics can distinguish FTP exploits from normal sessions within some margin of error. We then derive a new heuristic for complexity estimation based on minimum description length (MDL) principles and use it to develop a complexity estimator and compression algorithm built on grammar inference. This estimator is used to provide meaningful models of unknown data sets. Finally, we demonstrate the ability of our complexity-based approach to classify protocol behavior using similarity distance metrics computed against known behaviors.
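To make the classification step concrete, the sketch below approximates Kolmogorov complexity by compressed length and assigns a session to the known behavior with the smallest normalized compression distance (NCD), in the spirit of the similarity metric of [4]. The paper's own estimator is grammar-inference based; here zlib is used only as a stand-in compressor, and the function names, sample FTP payloads, and labels are illustrative assumptions rather than the authors' implementation.

```python
import zlib
from typing import Dict


def complexity(data: bytes) -> int:
    """Estimate Kolmogorov complexity by compressed length (zlib as a stand-in compressor)."""
    return len(zlib.compress(data, 9))


def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two byte strings."""
    cx, cy, cxy = complexity(x), complexity(y), complexity(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)


def classify(session: bytes, references: Dict[str, bytes]) -> str:
    """Assign a session to the reference behavior with the smallest NCD."""
    return min(references, key=lambda label: ncd(session, references[label]))


if __name__ == "__main__":
    # Hypothetical reference behaviors: a benign FTP command sequence and a
    # crude buffer-overflow-style payload standing in for an exploit trace.
    references = {
        "normal": b"USER alice\r\nPASS secret\r\nCWD /pub\r\nRETR file.txt\r\nQUIT\r\n",
        "exploit": b"USER " + b"A" * 200 + b"\r\n",
    }
    probe = b"USER bob\r\nPASS hunter2\r\nLIST\r\nQUIT\r\n"
    print(classify(probe, references))  # expected to print "normal" for this probe
```

A stronger general-purpose compressor, or a grammar-based MDL estimator of the kind described above, could be substituted for zlib without changing the classification logic.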

[1] Thomas M. Cover et al., Elements of Information Theory, 2005.

[2] Paul M. B. Vitányi et al., "Meaningful Information," IEEE Transactions on Information Theory, 2001.

[3] Stephen F. Bush et al., "Information assurance through Kolmogorov complexity," Proceedings of the DARPA Information Survivability Conference and Exposition II (DISCEX'01), 2001.

[4] Bin Ma et al., "The similarity metric," IEEE Transactions on Information Theory, 2001.

[5] Stephen F. Bush et al., "Kolmogorov complexity estimation and application for information system security," 2003.

[6] Péter Gács et al., "Algorithmic statistics," IEEE Transactions on Information Theory, 2000.