eScholarship
Open Access Publications from the University of California

UC Berkeley

UC Berkeley Electronic Theses and Dissertations

Analysis and applications of the Locally Competitive Algorithm

Abstract

The Locally Competitive Algorithm (LCA) is a recurrent neural network for performing sparse coding and dictionary learning of natural signals. The network itself lacks much of the anatomical structure we see in real brains, but it does include lateral connectivity and iterative, or recurrent, computation. Together these produce population nonlinearities that facilitate desirable coding properties, including improved robustness, selectivity, and efficiency compared to more traditional architectures. When trained on images of natural scenes, the network also exhibits many response properties that overlap with those of biological neurons. This has encouraged scientists to use it as a means of conceptualizing theories of neural coding and exploring their consequences. Probing the network enables us to observe and test theories that account for neurophysiological phenomena. In this thesis, we provide motivation for investigating the model, a detailed derivation of it, an analysis of its response properties, and some extensions. The core computational principles in the LCA are distinct from those used in most production artificial neural networks, and we argue that there is much to be gained from incorporating LCA-like computations in place of feed-forward, pointwise-nonlinear model neurons.
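The abstract refers to lateral connectivity and recurrent computation without spelling out the dynamics. As a rough orientation only, the sketch below implements the standard LCA dynamics of Rozell et al. (2008), on which this line of work builds: membrane potentials integrate a feed-forward drive, a leak term, and lateral inhibition from the thresholded coefficients of competing units. The function name, parameter defaults, and the choice of a soft threshold are assumptions of this illustration, not details taken from the thesis.

```python
import numpy as np

def lca_sparse_code(x, Phi, lam=0.1, tau=10.0, dt=1.0, n_steps=200):
    """Minimal sketch of LCA sparse coding dynamics (Rozell et al., 2008).

    x   : (n,) input signal
    Phi : (n, m) dictionary, assumed to have unit-norm columns
    lam : sparsity threshold
    Returns the sparse coefficient vector a after n_steps of integration.
    """
    m = Phi.shape[1]
    b = Phi.T @ x                     # feed-forward drive from the input
    G = Phi.T @ Phi - np.eye(m)       # lateral inhibition (competition) weights
    u = np.zeros(m)                   # membrane potentials

    def soft_threshold(u):
        # pointwise soft threshold: active coefficients only above lam
        return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

    for _ in range(n_steps):
        a = soft_threshold(u)
        du = b - u - G @ a            # leaky integration plus lateral inhibition
        u = u + (dt / tau) * du
    return soft_threshold(u)
```

With the soft-threshold choice, the coefficients at convergence approximately minimize the familiar L1-penalized reconstruction objective (1/2)‖x − Φa‖² + λ‖a‖₁; the lateral term G @ a is what makes the computation a population-level competition rather than a pointwise feed-forward nonlinearity.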
