A Foundation for Cyber Experimentation

Author: Evan Lawrence Stoner
Committee Chair: Marco Carvalho, Ph.D.

Despite the variety of environments and tools available to computer scientists performing cyber experimentation, the community lacks a single cohesive platform capable of unifying these tools across environments and research domains. In pursuit of this unifying platform, we introduce a new framework that is agnostic to the underlying environment, tools, and domain. We show the effectiveness of this framework by (1) demonstrating how an example experiment can be described within the framework and what benefits this offers, (2) presenting a prototype implementation of the framework, and (3) offering a collection of approaches to automated experimentation that the framework accommodates.
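The abstract does not reproduce the framework's concrete syntax, but the kind of environment-agnostic experiment description it alludes to might be sketched as follows. This is a minimal illustration, not the framework's actual API: every name here (`Experiment`, `Step`, `run`, `dry_run_backend`) is a hypothetical stand-in.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: an experiment described independently of the
# environment that executes it. All names are illustrative assumptions,
# not the thesis framework's real interface.

@dataclass
class Step:
    name: str
    action: str                        # abstract action, e.g. "ping"
    params: dict = field(default_factory=dict)

@dataclass
class Experiment:
    name: str
    steps: list

def run(experiment, backend):
    """Execute each abstract step via an environment-specific backend."""
    return [backend(step) for step in experiment.steps]

# A trivial "backend" that only records what it would do; a real backend
# could instead drive a physical testbed, an emulator, or a simulator,
# leaving the experiment description itself unchanged.
def dry_run_backend(step):
    return f"{step.action}({step.params.get('target', '')})"

exp = Experiment(
    name="ping-sweep",
    steps=[
        Step("probe", "ping", {"target": "10.0.0.1"}),
        Step("probe", "ping", {"target": "10.0.0.2"}),
    ],
)
print(run(exp, dry_run_backend))
```

Decoupling the experiment description from the backend that executes it is one plausible reading of "agnostic to the underlying environment": the same `Experiment` object could be replayed against different testbeds for repeatability.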
