UC San Diego Electronic Theses and Dissertations

Energy-efficient Event-based Vision Sensors and Compute-In-Memory Architectures for Neuromorphic and Machine Learning Applications

Abstract

Neuromorphic engineering pursues the design of electronic systems that emulate the function and structural organization of biological neural systems in silicon integrated circuits embodying similar physical principles. The work in this dissertation presents advances in the field of neuromorphic engineering by demonstrating the design and applications of energy-efficient event-based sensors, compute-in-memory architectures, event-based learning algorithms, and asynchronous data converters.

This dissertation focuses on neuromorphic very large scale integration (VLSI) architecture and algorithm design for the implementation of sensors and processors that are highly energy-efficient, emulating brain function through event-based sensory processing. In particular, three novel contributions are presented that work toward the goal of integrated visual cortical processing on silicon hardware. First, a novel hybrid approach to vision sensing, called the query-driven dynamic vision sensor (qDVS), is presented; it achieves the best energy efficiency reported to date, and various applications enabled by such sensors are demonstrated with improved performance compared to conventional sensors. Second, an integrated compute-in-memory (CIM) architecture is presented that combines an emerging device, resistive random access memory (ReRAM), with complementary metal-oxide-semiconductor (CMOS) technology. This design achieves the highest versatility in terms of reconfigurable dataflow and multiple modes of neuron activation using a single topology, along with the best energy efficiency reported to date for CMOS-ReRAM CIM architectures. Third, a learning rule called the inverted spike-timing-dependent plasticity (iSTDP) rule is presented, which can learn temporal patterns using only spike event timing information.
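
To make the compute-in-memory principle concrete, the following is a minimal numerical sketch, not taken from the dissertation: a crossbar stores weights as conductances, input voltages are applied along the rows, and each column current is the dot product of the inputs and that column's conductances. All array sizes and values below are hypothetical.

```python
# Illustrative sketch of analog matrix-vector multiplication in a ReRAM crossbar.
# This is not the dissertation's circuit; sizes and values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Conductance matrix G (siemens): rows = input word lines, columns = output bit lines.
G = rng.uniform(1e-6, 1e-4, size=(8, 4))

# Input voltages (volts) applied to the rows, encoding an activation vector.
v_in = rng.uniform(0.0, 0.2, size=8)

# Ohm's law gives per-cell currents V_i * G_ij; Kirchhoff's current law sums
# them along each column, so the column currents form a matrix-vector product.
i_out = v_in @ G

print("Column currents (A):", i_out)
```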

Combining these three contributions makes it possible to realize a preliminary form of biological vision in hardware: the artificial silicon retina (the qDVS) provides event-based visual stimulus to primary visual cortex layers implemented as convolutional neural networks (CNNs) on the CIM architecture, which can in turn deploy event-based learning algorithms such as iSTDP for temporal pattern recognition.
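
As a purely illustrative end-to-end sketch of such a pipeline, event tuples from an event-based sensor can be accumulated into a 2D surface and filtered by a small kernel standing in for one CIM-mapped CNN layer. The event format, resolution, and kernel below are assumptions for illustration, not the dissertation's implementation.

```python
# Toy event-to-feature pipeline: accumulate sensor events, then apply one filter.
# Purely illustrative; the event format, resolution, and kernel are assumptions.
import numpy as np

H, W = 16, 16
# Hypothetical events as (x, y, timestamp_s, polarity) tuples.
events = [(3, 4, 0.001, +1), (3, 5, 0.002, +1), (10, 12, 0.003, -1)]

# Accumulate signed event counts into a frame-like event surface.
surface = np.zeros((H, W))
for x, y, t, p in events:
    surface[y, x] += p

# Symmetric 3x3 edge-enhancing kernel as a stand-in for learned CNN weights.
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float)

# Valid-mode 2D convolution written out explicitly with numpy.
out = np.zeros((H - 2, W - 2))
for i in range(out.shape[0]):
    for j in range(out.shape[1]):
        out[i, j] = np.sum(surface[i:i + 3, j:j + 3] * kernel)

print("Filter response range:", out.min(), out.max())
```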
