Physical Turing Machines and the Formalization of Physical Cryptography

In this paper, we introduce two formal means by which physical adversarial actions and features can be modeled in cryptography and security: the concept of a “physical Turing machine (PhTM or φ-TM)” and that of a “technology” on which the PhTM operates. We show by two examples how these concepts can be applied. First, we sketch their use in formalizing physical adversarial computations (quantum computation [4], optical techniques [26, 15], etc.) in classical cryptography, which an adversary might carry out to attack complexity-based schemes. Second, we work out in more detail the application of PhTMs to the formal treatment of physical unclonable functions and of physical cryptography in general, in which disordered, unclonable physical objects are used for cryptographic purposes. PhTMs allow a rigorous formal expression of the required properties of these objects (such as their physical unclonability) and enable us to conduct formal reductionist proofs in this field. The hybrid nature of PhTMs thereby allows us to combine physical and computational assumptions within one proof. As an example, we give a formal proof for a physical scheme that combines a classical digital signature with an unclonable, unique object in order to “label” or “tag” valuable objects securely and in a forgery-proof manner. We stress that PhTMs as introduced in this paper cannot directly and straightforwardly answer the question of which physical tasks are ultimately feasible or infeasible in our universe. But such an expectation would be unreasonably high; recall that classical Turing machines likewise do not allow us to draw a simple line between feasible and infeasible computations, as the P vs. NP question shows. Rather, PhTMs provide a formal backbone in which relevant physical security features can be expressed and security proofs can be conducted.
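The labeling scheme mentioned above can be illustrated by a minimal sketch. All names here are hypothetical; the unclonable object's fingerprint is simulated by random bytes, and HMAC-SHA256 stands in for the digital signature purely to keep the example self-contained (a real scheme would use a public-key signature so that verifiers need no secret).

```python
import hmac
import hashlib
import os

def measure_fingerprint(obj):
    # In practice: a physical measurement of the disordered,
    # unclonable object. Here we simply read a stored value.
    return obj["fingerprint"]

def issue_label(signing_key, obj):
    # The issuer signs the object's unique fingerprint;
    # HMAC is a stand-in for a real digital signature.
    fp = measure_fingerprint(obj)
    return hmac.new(signing_key, fp, hashlib.sha256).digest()

def verify_label(signing_key, obj, tag):
    # A verifier re-measures the fingerprint and checks the tag.
    fp = measure_fingerprint(obj)
    expected = hmac.new(signing_key, fp, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

key = os.urandom(32)
genuine = {"fingerprint": os.urandom(64)}
# Physical unclonability: an attacker cannot reproduce the fingerprint,
# so a clone carries a different one.
clone = {"fingerprint": os.urandom(64)}

tag = issue_label(key, genuine)
print(verify_label(key, genuine, tag))  # True
print(verify_label(key, clone, tag))    # False
```

Security rests on two independent assumptions, mirroring the hybrid nature of PhTM-based proofs: a physical one (the fingerprint cannot be cloned) and a computational one (the signature cannot be forged).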
Apart from the applications sketched in this paper, many other uses of PhTMs suggest themselves, for example in defining security against side-channel or invasive attacks, or in developing a “physical” structural complexity theory; these are left to future work.

[1] Raphael Overbeck et al., Post-Quantum Signatures, IACR Cryptology ePrint Archive, 2004.

[2] Darko Kirovski et al., RF-DNA: Radio-Frequency Certificates of Authenticity, CHES, 2007.

[3] Ulrich Rührmair et al., SIMPL Systems: On a Public Key Variant of Physical Unclonable Functions, IACR Cryptology ePrint Archive, 2009.

[4] Adi Shamir et al., Analysis and Optimization of the TWINKLE Factoring Device, EUROCRYPT, 2000.

[5] Ulrich Rührmair et al., Strong PUFs: Models, Constructions, and Security Proofs, Towards Hardware-Intrinsic Security, 2010.

[6] U. Maurer et al., Secret Key Agreement by Public Discussion from Common Information, IEEE Transactions on Information Theory, 1993.

[7] Miodrag Potkonjak et al., Hardware-Based Public-Key Cryptography with Public Physically Unclonable Functions, Information Hiding, 2009.

[8] Darko Kirovski, Anti-counterfeiting: Mixing the Physical and the Digital World, Towards Hardware-Intrinsic Security, 2010.

[9] Adi Shamir, Factoring Large Numbers with the Twinkle Device (Extended Abstract), CHES, 1999.

[10] Ulrich Rührmair et al., Oblivious Transfer Based on Physical Unclonable Functions, TRUST, 2010.

[11] R. Pappu et al., Physical One-Way Functions, Science, 2002.

[12] Cliff Wang et al., Introduction to Hardware Security and Trust, 2011.

[13] Marten van Dijk et al., A Technique to Build a Secret Key in Integrated Circuits for Identification and Authentication Applications, Symposium on VLSI Circuits, 2004.

[14] Frank Sehnke et al., On the Foundations of Physical Unclonable Functions, IACR Cryptology ePrint Archive, 2009.

[15] Srinivas Devadas et al., Security Based on Physical Unclonability and Disorder, 2012.

[16] Stefan Katzenbeisser et al., Physically Uncloneable Functions in the Universal Composition Framework, CRYPTO, 2011.

[17] Srinivas Devadas et al., Silicon Physical Random Functions, CCS '02, 2002.

[18] Srinivas Devadas et al., Identification and Authentication of Integrated Circuits, Concurrency: Practice and Experience, 2004.

[19] Jie Chen et al., A DNA-based, Biomolecular Cryptography Design, ISCAS '03, 2003.

[20] John H. Reif et al., DNA-based Cryptography, Aspects of Molecular Computing, 1999.

[21] Frederik Armknecht et al., A Formalization of the Security Features of Physical Functions, IEEE Symposium on Security and Privacy, 2011.

[22] Ulrich Rührmair et al., PUFs in Security Protocols: Attack Models and Security Evaluations, IEEE Symposium on Security and Privacy, 2013.

[23] Rafail Ostrovsky et al., Universally Composable Secure Computation with (Malicious) Physically Uncloneable Functions, IACR Cryptology ePrint Archive, 2012.

[24] Catherine Taylor Clelland et al., Hiding Messages in DNA Microdots, Nature, 1999.

[25] G. Edward Suh et al., Extracting Secret Keys from Integrated Circuits, IEEE Transactions on VLSI Systems, 2005.

[26] Blaise L. P. Gassend et al., Physical Random Functions, 2003.

[27] Andrew Chi-Chih Yao et al., Classical Physics and the Church–Turing Thesis, JACM, 2003.

[28] Stephen A. Benton et al., Physical One-Way Functions, 2001.

[29] Scott Aaronson, NP-complete Problems and Physical Reality, Electronic Colloquium on Computational Complexity, 2005.

[30] Ueli Maurer, Conditionally-Perfect Secrecy and a Provably-Secure Randomized Cipher, Journal of Cryptology, 2004.