FlipIt: The Game of “Stealthy Takeover”

Abstract

Recent targeted attacks have increased significantly in sophistication, undermining the fundamental assumptions on which most cryptographic primitives rely for security. For instance, attackers launching an Advanced Persistent Threat (APT) can steal full cryptographic keys, violating the very secrecy of "secret" keys that cryptographers assume in designing secure protocols. In this article, we introduce a game-theoretic framework for modeling various computer security scenarios prevalent today, including targeted attacks. We are particularly interested in situations in which an attacker periodically compromises a system or critical resource completely, learns all its secret information, and is not immediately detected by the system owner or defender.

We propose a two-player game between an attacker and a defender called FlipIt, or The Game of "Stealthy Takeover." In FlipIt, players compete to control a shared resource. Unlike most existing games, FlipIt allows players to move at any given time, taking control of the resource. The identity of the player controlling the resource, however, is not revealed until a player actually moves. To move, a player pays a certain move cost. The objective of each player is to control the resource a large fraction of the time while minimizing his total move cost. FlipIt provides a simple and elegant framework in which we can formally reason about the interaction between attackers and defenders in practical scenarios.

In this article, we restrict ourselves to games in which one of the players (the defender) plays with a renewal strategy, one in which the intervals between consecutive moves are chosen independently at random from a fixed probability distribution. We consider attacker strategies of increasing sophistication, ranging from simple periodic strategies (with moves spaced at equal time intervals) to more complex adaptive strategies, in which moves are determined based on feedback received during the game. For different classes of strategies employed by the attacker, we determine strongly dominant strategies for both players (when they exist), that is, strategies that achieve higher benefit than all other strategies in a particular class. When strongly dominant strategies do not exist, our goal is to characterize the residual game consisting of strategies that are not strongly dominated by other strategies. We also prove equivalence or strict inclusion of certain classes of strategies under different conditions.

Our analysis of different FlipIt variants teaches cryptographers, system designers, and the community at large some valuable lessons:

1. Systems should be designed under the assumption of repeated total compromise, including theft of cryptographic keys. FlipIt provides guidance on how to implement a cost-effective defensive strategy.

2. Aggressive play by one player can motivate the opponent to drop out of the game (essentially not to play at all). Therefore, moving fast is a good defensive strategy, but it can only be implemented if move costs are low. We believe that virtualization has a huge potential in this respect.

3. Close monitoring of one's resources is beneficial in detecting potential attacks faster, gaining insight into the attacker's strategies, and scheduling defensive moves more effectively.

Interestingly, FlipIt finds applications in other security realms besides the modeling of targeted attacks. Examples include cryptographic key rotation, password changing policies, refreshing virtual machines, and cloud auditing.
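To make the game mechanics concrete, below is a minimal simulation sketch. It is not taken from the article: the exponential distribution for the defender's renewal strategy, the parameter names, and the specific move costs are illustrative assumptions. It plays a defender using a renewal strategy against an attacker using a periodic strategy with a random phase, and reports each player's benefit, computed here as the fraction of time in control minus the per-move cost times the number of moves, averaged over the game length.

```python
import random

def simulate_flipit(T=100_000.0, defender_rate=1.0, attacker_period=1.0,
                    cost_d=0.1, cost_a=0.1, seed=0):
    """Simulate one run of FlipIt on the interval [0, T].

    Defender: renewal strategy with exponentially distributed inter-move
              times (rate = defender_rate); the exponential choice is an
              illustrative assumption, not the article's prescription.
    Attacker: periodic strategy with period attacker_period and random phase.
    Benefit per player: fraction of time in control minus
                        (move cost) * (number of moves) / T.
    """
    rng = random.Random(seed)

    # Defender move times: i.i.d. exponential gaps (a renewal strategy).
    defender_moves, t = [], 0.0
    while True:
        t += rng.expovariate(defender_rate)
        if t > T:
            break
        defender_moves.append(t)

    # Attacker move times: equally spaced, with a random starting phase.
    attacker_moves, t = [], rng.uniform(0.0, attacker_period)
    while t <= T:
        attacker_moves.append(t)
        t += attacker_period

    # Merge both move sequences and track who controls the resource.
    events = [(m, 'D') for m in defender_moves] + [(m, 'A') for m in attacker_moves]
    events.sort()

    control, last_time = 'D', 0.0            # defender controls at time 0
    time_in_control = {'D': 0.0, 'A': 0.0}
    for when, player in events:
        time_in_control[control] += when - last_time
        control, last_time = player, when    # mover takes (or keeps) control
    time_in_control[control] += T - last_time

    benefit_d = time_in_control['D'] / T - cost_d * len(defender_moves) / T
    benefit_a = time_in_control['A'] / T - cost_a * len(attacker_moves) / T
    return benefit_d, benefit_a

if __name__ == "__main__":
    print(simulate_flipit())
```

Varying defender_rate and attacker_period in this sketch illustrates the trade-off the abstract alludes to: moving faster increases a player's fraction of time in control but also increases total move cost, so aggressive play only pays off when the per-move cost is low.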
