Transfer-Path-Based Hardware-Reuse Strong PUF Achieving Modeling Attack Resilience With >200 Million Training CRPs

Bibliographic Details
Published in: IEEE Transactions on Information Forensics and Security, p. 1
Main Authors: Xu, Chongyao; Zhang, Jieyun; Law, Man-Kay; Zhao, Xiaojin; Mak, Pui-In; Martins, Rui P.
Format: Journal Article
Language: English
Published: IEEE, 01.01.2023

Summary: This paper presents a hardware-reuse strong physical unclonable function (PUF) based on the intrinsic transfer paths (TPs) of a conventional digital multiplier to achieve strong modeling attack resilience. With the multiplier input employed as the PUF challenge and the path delay as the entropy source, all valid propagation paths between distinct input/output pairs can serve as PUF primitives. We quantize the path delay using a time-to-digital converter (TDC) and select suitable TDC output bits as the PUF response. We further propose a lightweight dynamic obfuscation algorithm (DOA) and a secure mutual authentication protocol to counteract modeling attacks. The proposed strong PUF, built on a 32×32 multiplier and implemented in a Xilinx ZYNQ-7000 SoC, features a total of 2048 intrinsic PUF primitives and achieves a response stream (RS) with an average of 1024 responses per TDC output bit per challenge. With Bit(5) and Bit(6) of the TDC output selected for response generation, the PUF demonstrates a measured reliability and uniqueness of up to 98.31% and 49.34%, respectively, with randomness validated by the NIST SP800-22 tests. Under machine learning (ML)-based modeling attacks using an artificial neural network (ANN), the measured prediction accuracy for both Bit(5) and Bit(6) remains at ~50% even with a training set of more than 200 million CRPs.
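The response path sketched in the abstract (challenge → multiplier path delay → TDC quantization → selection of Bit(5)/Bit(6)) can be illustrated with a minimal behavioral sketch. The sketch below is an illustration only, not the authors' FPGA implementation: the delay model (a hash-derived pseudo-random value standing in for device-specific process variation), the 8-bit TDC width, and all function names are assumptions introduced here for clarity.

```python
import hashlib

def simulated_path_delay(challenge: int, seed: bytes = b"device-0") -> float:
    """Hypothetical stand-in for the multiplier's process-dependent path delay.

    On real silicon the delay comes from manufacturing variation; here it is
    derived from a hash of a per-device seed and the challenge so the sketch
    is self-contained and repeatable. Returns a normalized delay in [0, 1).
    """
    digest = hashlib.sha256(seed + challenge.to_bytes(8, "little")).digest()
    return int.from_bytes(digest[:4], "little") / 2**32

def tdc_quantize(delay: float, bits: int = 8) -> int:
    """Quantize the normalized path delay into an integer TDC code (assumed 8-bit)."""
    return int(delay * (1 << bits)) & ((1 << bits) - 1)

def puf_response(challenge: int, selected_bits=(5, 6)) -> tuple:
    """Map a challenge (two multiplier operands packed into one integer) to the
    selected TDC output bits used as the PUF response."""
    code = tdc_quantize(simulated_path_delay(challenge))
    return tuple((code >> b) & 1 for b in selected_bits)

if __name__ == "__main__":
    # Two 32-bit multiplier operands packed into a single 64-bit challenge.
    a, b = 0x1234ABCD, 0x0F0F0F0F
    challenge = (a << 32) | b
    print(puf_response(challenge))  # e.g. (1, 0); value depends on the simulated delay
```

In this toy model the challenge space and the bit-selection step mirror the structure described in the abstract; how the paper's DOA and mutual authentication protocol transform challenges and responses is not detailed in the abstract and is therefore not modeled here.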
ISSN: 1556-6013, 1556-6021
DOI: 10.1109/TIFS.2023.3263621