Fine-Grained Aging-Induced Delay Prediction Based on the Monitoring of Run-Time Stress

Published

Journal Article

Run-time solutions based on online monitoring and adaptation are required for resilience in nanoscale integrated circuits, as design-time solutions and guard bands are no longer sufficient. Bias temperature instability (BTI)-induced transistor aging, one of the major reliability threats in nanoscale very large scale integration (VLSI), degrades path delay over time and can lead to timing failures. Chip health monitoring is therefore necessary to track delay changes on a per-chip basis over the lifetime of the chip. However, direct monitoring based on actual measurement of path delays can only track a coarse-grained aging trend in a reactive manner, which is unsuitable for proactive, fine-grained adaptation. In this paper, we propose a low-cost, fine-grained approach for monitoring workload-induced stress, based on machine-learning techniques, that accurately predicts aging-induced delay. We integrate space and time sampling of selected flip-flops into the run-time monitoring infrastructure in order to reduce the cost of monitoring the workload. The prediction model is trained offline using support-vector regression and implemented in software. By tracking aging trends, this approach enables proactive adaptation techniques that mitigate further circuit aging. Simulation results for realistic open-source benchmark circuits highlight the accuracy of the proposed approach.
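The sketch below illustrates, in broad strokes, the kind of offline-trained support-vector regression described in the abstract: sampled flip-flop signal probabilities (a proxy for workload-induced BTI stress) are mapped to a predicted aging-induced delay shift. It is not the authors' implementation; the feature layout, synthetic data, and hyperparameters are illustrative assumptions using scikit-learn.

```python
# Minimal sketch (assumptions, not the paper's implementation): train an SVR
# offline on stress features, then query it in software at run time.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic training data: each row holds time-averaged signal probabilities
# of a few sampled flip-flops for one workload interval; the target is the
# corresponding delay degradation obtained from aging-aware timing analysis.
X_train = rng.uniform(0.0, 1.0, size=(200, 8))                   # 200 intervals, 8 monitored flip-flops
y_train = 5.0 * X_train.mean(axis=1) + rng.normal(0, 0.1, 200)   # toy stress-to-delay relation

# Offline training: scale features, then fit an RBF-kernel SVR.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(X_train, y_train)

# Run-time use: signal probabilities gathered by the on-chip monitoring
# infrastructure are fed to the software model to predict delay degradation.
X_runtime = rng.uniform(0.0, 1.0, size=(1, 8))
predicted_delay_shift = model.predict(X_runtime)
print(f"Predicted aging-induced delay degradation: {predicted_delay_shift[0]:.3f} (arbitrary units)")
```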

Cited Authors

  • Vijayan, A; Koneru, A; Kiamehr, S; Chakrabarty, K; Tahoori, MB

Published Date

  • May 1, 2018

Published In

  • IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems

Volume / Issue

  • 37 / 5

Start / End Page

  • 1064 - 1075

International Standard Serial Number (ISSN)

  • 0278-0070

Digital Object Identifier (DOI)

  • 10.1109/TCAD.2016.2620903

Citation Source

  • Scopus