Thwarting Replication Attack against Memristor-Based Neuromorphic Computing System
Neuromorphic architectures are widely used for advanced data processing and often implement proprietary algorithms. In an adversarial scenario, however, such systems may face elaborate security attacks, including learning attacks. In this article, we prevent an attacker with physical access from learning the proprietary algorithm implemented by the neuromorphic hardware. To this end, we leverage the obsolescence effect in memristors to judiciously reduce the output accuracy for any unauthorized user, while for a legitimate user we regulate the obsolescence effect to keep the output accuracy within a suitable range. We extensively examine the feasibility of the proposed method on four datasets, experimenting under different settings, such as activation functions, and under constraints such as process variations, and we estimate the calibration overhead. We also analyze the security-versus-cost and performance-versus-resistance-range tradeoffs for different applications. We then show that the defense remains valid even if the attacker has prior knowledge of the defense mechanism. Overall, our methodology is compatible with mainstream classification applications, memristor devices, and security and performance constraints.
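The abstract gives no code, but the core idea can be illustrated with a rough sketch: classifier weights stored as memristor conductances drift over time (the obsolescence effect), so an unauthorized user reads out a degraded model, while a legitimate user recalibrates the devices before drift accumulates. The following Python sketch is an assumption-laden toy model; the drift law, rates, and synthetic task below are illustrative choices, not the authors' experimental setup.

```python
# Toy model (NOT the paper's implementation) of obsolescence-based protection:
# weights are stored as conductances that decay at device-dependent rates;
# a legitimate user reprograms (calibrates) them periodically, an attacker
# who copies the array after long drift obtains a degraded classifier.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary task: points labeled by a random hyperplane (assumption).
DIM = 16
w_true = rng.normal(size=DIM)
X = rng.normal(size=(2000, DIM))
y = (X @ w_true > 0).astype(int)

# "Trained" weights programmed into the crossbar: the true hyperplane
# plus small programming noise (stand-in for offline training).
w_prog = w_true + 0.05 * rng.normal(size=DIM)

def accuracy(w):
    """Classification accuracy of a linear threshold unit with weights w."""
    return np.mean((X @ w > 0).astype(int) == y)

def apply_drift(w, t):
    """Hypothetical obsolescence model: each conductance decays
    multiplicatively with a device-dependent (log-normal) rate, so
    relative weight magnitudes are scrambled as time t grows."""
    rates = rng.lognormal(mean=0.0, sigma=0.3, size=w.shape)
    return w * np.exp(-rates * t)

print(f"fresh accuracy:      {accuracy(w_prog):.3f}")

# Unauthorized user: reads the array only after long, unregulated drift.
w_stale = apply_drift(w_prog, t=5.0)
print(f"stale accuracy:      {accuracy(w_stale):.3f}")

# Legitimate user: reprograms the target conductances, so only a short
# drift interval elapses before use and accuracy stays in a usable range.
w_recal = apply_drift(w_prog.copy(), t=0.1)
print(f"calibrated accuracy: {accuracy(w_recal):.3f}")
```

Under this toy model, the long-drifted weights lose their relative magnitudes and accuracy falls well below the freshly calibrated case, mirroring the accuracy gap between unauthorized and legitimate users that the article describes.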
Related Subject Headings
- Computer Hardware & Architecture
- 4607 Graphics, augmented reality and games
- 4009 Electronics, sensors and digital hardware
- 1006 Computer Hardware
- 0906 Electrical and Electronic Engineering