
Training itself: Mixed-signal training acceleration for memristor-based neural network

Conference Publication
Li, B; Wang, Y; Weng, Y; Chen, Y; Yang, H
Published in: Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC
March 27, 2014

The artificial neural network (ANN) is among the most widely used methods in data processing applications, and the memristor-based neural network offers a power-efficient hardware realization of ANNs. The training phase is the critical operation of a memristor-based neural network. However, the traditional training method is time-consuming and energy-inefficient: users must first compute the memristor parameters on a digital computing system and then tune each memristor to the corresponding state. In this work, we introduce a mixed-signal training acceleration framework that realizes self-training of the memristor-based neural network. We first modify the original stochastic gradient descent algorithm by approximating calculations and designing an alternative computing method. We then propose a mixed-signal acceleration architecture for the modified training algorithm by equipping the original memristor-based neural network architecture with the copy-crossbar technique, weight update units, sign calculation units, and other assistant units. Experiments on the MNIST database demonstrate that the proposed mixed-signal acceleration is 3 orders of magnitude faster and 4 orders of magnitude more energy-efficient than the CPU implementation counterpart, at the cost of a slight decrease in recognition accuracy (< 5%). © 2014 IEEE.
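As a rough illustration of the kind of gradient approximation the abstract alludes to (the paper's exact modified algorithm is not reproduced here), the sketch below contrasts a full SGD update with a sign-only update of fixed magnitude. Sign-only updates map naturally onto memristor crossbars, where conductances are adjusted by fixed-width programming pulses; the toy single-layer model and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer linear model: 4 inputs -> 1 output, learning a known target.
W = rng.normal(scale=0.1, size=4)
target_W = np.array([0.5, -0.3, 0.8, 0.1])
step = 0.01  # fixed update magnitude, analogous to one programming pulse

for _ in range(2000):
    x = rng.normal(size=4)
    y_pred = W @ x                # forward pass
    y_true = target_W @ x
    grad = (y_pred - y_true) * x  # full SGD gradient of squared error
    W -= step * np.sign(grad)     # sign-only update: direction kept, magnitude fixed

# After training, W oscillates within a few step sizes of target_W.
```

Because only the sign of each gradient component is needed, the analog hardware can replace a full multiply-accumulate-and-scale pipeline with simple comparators, which is one plausible reason such approximations trade a small accuracy loss for large speed and energy gains.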


Published In

Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC

DOI

10.1109/ASPDAC.2014.6742916
Publication Date

March 27, 2014

Start / End Page

361 / 366

Citation

Li, B., Wang, Y., Weng, Y., Chen, Y., & Yang, H. (2014). Training itself: Mixed-signal training acceleration for memristor-based neural network. In Proceedings of the Asia and South Pacific Design Automation Conference, ASP-DAC (pp. 361–366). https://doi.org/10.1109/ASPDAC.2014.6742916
