A Theoretical Analysis: Physical Unclonable Functions and the Software Protection Problem

Physical Unclonable Functions (PUFs), also known as Physical One-Way Functions (P-OWFs), are physical systems whose responses to input stimuli are easy to measure but hard to clone. Their unclonability rests on the accepted hardness of replicating the multitude of uncontrollable manufacturing characteristics that shape each device, and it makes PUFs useful for problems such as device authentication, software protection and licensing, and certified execution. In this paper, we investigate the effectiveness of PUFs for software protection in hostile offline settings. We show that traditional, non-computational (black-box) PUFs cannot solve the software protection problem in this context. We then provide two real-world adversary models (a weak and a strong variant) and security definitions for each. We propose schemes secure against the weak adversary, and we show that no scheme can be secure against the strong adversary without the use of trusted hardware. Finally, we present a protection scheme, based on trusted hardware, that is secure against strong adversaries.
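The device-authentication use mentioned above can be illustrated with a minimal simulation. This is a hypothetical sketch, not any scheme from the paper: a keyed hash stands in for the device-unique challenge-to-response mapping that a real PUF derives from manufacturing variation, and the `SimulatedPUF`, `enroll`, and `authenticate` names are ours.

```python
import hashlib
import os

class SimulatedPUF:
    """Toy stand-in for a physical PUF: a keyed hash models the
    device-unique, hard-to-clone challenge->response mapping.
    (A real PUF derives this behavior from physical randomness;
    here the secret key merely simulates that uniqueness.)"""

    def __init__(self):
        self._secret = os.urandom(16)  # models intrinsic manufacturing variation

    def respond(self, challenge: bytes) -> bytes:
        return hashlib.sha256(self._secret + challenge).digest()

def enroll(puf: SimulatedPUF, n: int = 4):
    """Trusted enrollment phase: the verifier records n
    challenge-response pairs (CRPs) while it has the genuine device."""
    return [(c, puf.respond(c)) for c in (os.urandom(16) for _ in range(n))]

def authenticate(puf: SimulatedPUF, crps: list) -> bool:
    """Field phase: the device is accepted iff it reproduces the stored
    response. Each CRP is consumed after one use to prevent replay."""
    challenge, expected = crps.pop()
    return puf.respond(challenge) == expected
```

Because an attacker without the physical device cannot predict responses to unused challenges, a cloned or substituted device fails authentication; this captures the unclonability property informally, though it says nothing yet about the offline software-protection setting the paper analyzes.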
