An Initial Study of Targeted Personality Models in the FlipIt Game

Game theory typically assumes rational behavior in solution concepts such as Nash equilibrium. However, this assumption is often violated when human agents interact in real-world scenarios such as cybersecurity. Many human factors drive decision making, and they vary significantly across individuals, leading to substantial individual differences in behavior. Predicting these differences can help a defender anticipate the actions of different attacker types and tailor defender strategies to each type. We conducted an initial study of this idea using a behavioral version of the FlipIt game. We show that there are identifiable differences in behavior among groups (e.g., individuals with different Dark Triad personality scores), but our initial attempts to capture these differences with simple, known behavioral models do not lead to significantly improved defender strategies. This suggests that richer behavioral models are needed to effectively predict behavior and target strategies in these more complex cybersecurity games.
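
To make the setting concrete, below is a minimal Monte Carlo sketch of a FlipIt-style interaction between two periodic players whose phases are drawn uniformly at random. It assumes the standard FlipIt benefit (fraction of time in control minus per-flip cost scaled by flip rate); function names, parameter values, and the choice of periodic strategies are illustrative only, not the implementation used in the study.

    import random

    def phase_flips(rng, period, horizon):
        """Flip times for a periodic strategy whose phase is uniform in [0, period)."""
        t = rng.uniform(0.0, period)
        flips = []
        while t < horizon:
            flips.append(t)
            t += period
        return flips

    def simulate_flipit(defender_period, attacker_period, defender_cost, attacker_cost,
                        horizon=1000.0, trials=500, seed=0):
        """Monte Carlo estimate of average FlipIt payoffs: each player's payoff is the
        fraction of time it controls the resource minus its flip cost times its flip rate."""
        rng = random.Random(seed)
        d_sum = a_sum = 0.0
        for _ in range(trials):
            d_flips = phase_flips(rng, defender_period, horizon)
            a_flips = phase_flips(rng, attacker_period, horizon)
            # Merge both players' flips in time order and track who controls the resource.
            events = sorted([(t, "D") for t in d_flips] + [(t, "A") for t in a_flips])
            owner, last_t, d_time = "D", 0.0, 0.0   # defender controls the resource at t = 0
            for t, player in events:
                if owner == "D":
                    d_time += t - last_t
                owner, last_t = player, t           # flipping a resource you already own has no effect
            if owner == "D":
                d_time += horizon - last_t
            a_time = horizon - d_time
            d_sum += d_time / horizon - defender_cost * len(d_flips) / horizon
            a_sum += a_time / horizon - attacker_cost * len(a_flips) / horizon
        return d_sum / trials, a_sum / trials

    if __name__ == "__main__":
        d_payoff, a_payoff = simulate_flipit(defender_period=2.0, attacker_period=3.0,
                                             defender_cost=0.5, attacker_cost=0.8)
        print(f"defender payoff ~ {d_payoff:.3f}, attacker payoff ~ {a_payoff:.3f}")

A behavioral attacker model (e.g., one whose flip period depends on an estimated personality profile) could be swapped in for the fixed attacker_period above to compare defender payoffs against different attacker types.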
