Reliable and Energy Efficient MLC STT-RAM Buffer for CNN Accelerators
eScholarship
Open Access Publications from the University of California

  • Author(s): Jasemi, Masoomeh; Hessabi, Shaahin; Bagherzadeh, Nader; et al.
Creative Commons Attribution (CC BY) 4.0 license
Abstract

We propose a lightweight scheme in which the formation of a data block is changed so that it tolerates soft errors significantly better than the baseline. The key insight behind our work is that CNN weights are normalized between -1 and 1 after each convolutional layer, which leaves one bit unused in the half-precision floating-point representation. By taking advantage of this unused bit, we create a backup of the most significant bit to protect it against soft errors. In addition, in MLC STT-RAMs both the cost of memory operations (reads and writes) and the reliability of a cell are content-dependent: some bit patterns require larger currents and longer write times, and they are also more susceptible to soft errors. We therefore rearrange the data block to minimize the number of costly bit patterns. Combining these two techniques preserves the accuracy of an error-free baseline while reducing read and write energy by 9% and 6%, respectively.
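The unused-bit backup can be illustrated with a small sketch. The abstract does not specify which bit is unused; a natural assumption is the most significant exponent bit of the IEEE-754 half-precision format (bit 14), which is always zero for magnitudes below 2 and so is free when weights lie in [-1, 1]. The function names below (`protect`, `recover`) are illustrative, not from the paper.

```python
import struct

def fp16_bits(x):
    # Pack a float into IEEE-754 half precision; return the 16-bit pattern.
    return struct.unpack('<H', struct.pack('<e', x))[0]

def bits_fp16(bits):
    # Interpret a 16-bit pattern as an IEEE-754 half-precision float.
    return struct.unpack('<e', struct.pack('<H', bits))[0]

def protect(bits):
    # Assumed scheme: for |x| < 2 the exponent MSB (bit 14) is always 0,
    # so it can hold a backup copy of the sign bit (bit 15, the MSB).
    sign = (bits >> 15) & 1
    return bits | (sign << 14)

def recover(bits):
    # Restore the sign bit from its backup in bit 14, then clear the
    # backup so the pattern is a valid half-precision value again.
    backup = (bits >> 14) & 1
    return (bits & 0x3FFF) | (backup << 15)
```

For example, if a soft error flips the sign bit of a stored weight, `recover` repairs it from the backup: `bits_fp16(recover(protect(fp16_bits(-0.5)) ^ 0x8000))` yields `-0.5` again. The paper's actual encoding and error model may differ; this only demonstrates the unused-bit idea.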

