Reconstruction of visually stable perception from saccadic retinal inputs using corollary discharge signals-driven convLSTM neural networks

Creative Commons Attribution 4.0 (CC BY 4.0) license
Abstract

While subjective visual experience is remarkably stable and coherent, the underlying sensory data are incomplete and heavily shaped by the eyes' saccadic rhythm. In this work, we show that a deep recurrent neural network can effectively reconstruct vibrant images from restricted retinal inputs during active vision. Our method includes the creation of a synthetic retinal-input dataset containing intensity, color, and event-camera-generated motion data. We demonstrate that both long short-term memory and corollary discharge signals are essential for image stabilization, and that the system's sensitivity to noise is consistent with recent experimental findings. Our study contributes to the development of realistic, dynamic models of image reconstruction, providing insights into the complexities of active visual perception.
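To make the architecture described in the abstract concrete, the sketch below shows one plausible way a ConvLSTM cell could be driven by corollary discharge signals: a copy of the planned eye-movement command is concatenated with the retinal input channels before the recurrent gate convolution. This is an illustrative assumption, not the authors' implementation; the channel counts, the fusion-by-concatenation scheme, and the `CDConvLSTMCell` name are hypothetical.

```python
# Minimal sketch (assumed architecture, not the paper's code): a ConvLSTM cell
# whose gates are conditioned on a corollary discharge (CD) signal alongside
# the saccadic retinal input at each time step.
import torch
import torch.nn as nn


class CDConvLSTMCell(nn.Module):
    def __init__(self, in_channels, hidden_channels, cd_channels, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2
        # A single convolution produces all four gates (input, forget, cell, output).
        self.gates = nn.Conv2d(
            in_channels + cd_channels + hidden_channels,
            4 * hidden_channels,
            kernel_size,
            padding=padding,
        )
        self.hidden_channels = hidden_channels

    def forward(self, x, cd, state):
        # x:  retinal input frame          (B, C_in, H, W)
        # cd: corollary discharge map      (B, C_cd, H, W), e.g. the planned
        #     saccade vector broadcast over space (an assumption)
        # state: (h, c) recurrent hidden and cell states
        h, c = state
        z = torch.cat([x, cd, h], dim=1)
        i, f, g, o = torch.chunk(self.gates(z), 4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        c = f * c + i * g          # update cell state
        h = o * torch.tanh(c)      # update hidden state
        return h, c


if __name__ == "__main__":
    B, H, W = 2, 64, 64
    cell = CDConvLSTMCell(in_channels=4, hidden_channels=16, cd_channels=2)
    h = torch.zeros(B, 16, H, W)
    c = torch.zeros(B, 16, H, W)
    readout = nn.Conv2d(16, 3, kernel_size=1)   # map hidden state to an RGB frame
    for t in range(5):                          # short sequence of saccadic inputs
        x = torch.randn(B, 4, H, W)             # intensity + color + event channels (assumed layout)
        cd = torch.randn(B, 2, H, W)            # planned eye-movement signal (assumed encoding)
        h, c = cell(x, cd, (h, c))
        frame = readout(h)                      # reconstructed, stabilized image
    print(frame.shape)                          # torch.Size([2, 3, 64, 64])
```

Concatenating the corollary discharge map with the input is only one possible conditioning strategy; gating or modulating the hidden state with the saccade command would be an equally reasonable reading of the abstract.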
