Lifetime Enhancement for RRAM-based Computing-In-Memory Engine Considering Aging and Thermal Effects
RRAM-based computing-in-memory engines provide a promising platform for accelerating deep neural networks. However, the programming process applies high voltages to the RRAM cells and thus narrows their valid conductance ranges relative to the fresh state, an effect known as aging. Consequently, the conductances that RRAM cells are expected to reach, corresponding to the weights obtained after training, may fall outside these valid ranges, potentially leading to significant accuracy degradation. In addition, an uneven temperature distribution caused by differing cell conductances further accelerates the aging effect. Moreover, the uneven temperatures can cause an accuracy discrepancy between the tuning process and inference, reducing the lifetime of such accelerators even further. In this paper, we propose to counter aging and thermal effects by distributing aging stress and high-temperature RRAM cells evenly during both software training and hardware mapping, thereby extending the lifetime of computing-in-memory engines. Experimental results demonstrate a lifetime enhancement of up to 453 times while maintaining classification accuracy.