The energy efficiency of neuromorphic hardware is greatly affected by the
energy cost of storing, accessing, and updating synaptic parameters. Various methods
of memory organization targeting energy-efficient digital accelerators have
been investigated in the past; however, they do not fully capture the
energy costs at the system level. To address this shortcoming and to account for
various overheads, we synthesize the controller and memory for different
encoding schemes and extract the energy costs from these synthesized blocks.
Additionally, we introduce functional encoding for structured connectivity,
such as that of convolutional layers. Functional encoding offers a 58%
reduction in the energy required for the backward pass and weight update in such
layers, compared to existing index-based solutions. We show that for a two-layer
spiking neural network trained to retain a spatio-temporal pattern, a bitmap-based
(PB-BMP) organization encodes the sparser networks more efficiently.
This form of encoding delivers a 1.37x improvement in energy efficiency at
the cost of a 4% degradation in network retention accuracy, as measured by
the van Rossum distance.
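
To make the idea concrete, the following is a minimal sketch (not taken from the
paper; the function names, the row-major address layout, and the unpadded
'valid' convolution are our assumptions) of how functional encoding can replace
stored per-synapse indices with on-the-fly address arithmetic in a
convolutional layer:

```python
def conv_weight_address(c_out, c_in, ky, kx, C_in, K):
    # Flat weight-memory address of one kernel element, computed
    # arithmetically from coordinates: no per-synapse index is stored.
    return ((c_out * C_in + c_in) * K + ky) * K + kx

def fan_in(c_out, y, x, C_in, K):
    # Enumerate (presynaptic neuron, weight address) pairs feeding the
    # output neuron at (c_out, y, x) of an unpadded KxK convolution; a
    # backward pass and weight update can walk the same arithmetic
    # instead of reading an index memory.
    for c_in in range(C_in):
        for ky in range(K):
            for kx in range(K):
                yield (c_in, y + ky, x + kx), conv_weight_address(
                    c_out, c_in, ky, kx, C_in, K)
```

Because the synapse-to-address mapping is a closed-form function of the layer
shape, no index memory needs to be read at all, which suggests where the
savings relative to index-based schemes can come from.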