A Neural Network Model of Complementary Learning Systems

Abstract

We introduce a computational model capturing the high-level features of the complementary learning systems (CLS) framework. In particular, we model the integration of episodic memory with statistical learning in an end-to-end trainable neural network architecture. We model episodic memory with a nonparametric module which can retrieve past observations in response to a given observation, and statistical learning with a parametric module which performs inference on the given observation. We demonstrate on vision and control tasks that our model is able to leverage the respective advantages of nonparametric and parametric learning strategies, and that its behavior aligns with a variety of behavioral and neural data. In particular, our model performs consistently with results indicating that episodic memory systems in the hippocampus aid early learning and transfer generalization. We also find qualitative results consistent with findings that neural traces of memories of similar events converge over time. Furthermore, without explicit instruction or incentive, the behavior of our model naturally aligns with results suggesting that the usage of episodic systems wanes over the course of learning. These results suggest that key features of the CLS framework emerge in a task-optimized model containing statistical and episodic learning components, supporting several hypotheses of the framework.
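To make the architecture described above concrete, the sketch below shows one plausible way to pair a parametric (statistical-learning) network with a nonparametric episodic-memory module that retrieves stored past observations by similarity to the current one. This is a minimal illustration, not the authors' implementation: the class names (EpisodicMemory, CLSModel), the cosine-similarity k-nearest-neighbor retrieval, the softmax weighting of retrieved items, and all dimensions are assumptions introduced here for clarity.

```python
# Hypothetical sketch of a CLS-style model: a parametric encoder/head plus a
# nonparametric episodic memory. Details are assumed, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EpisodicMemory:
    """Nonparametric module: stores embeddings of past observations and
    retrieves a similarity-weighted mix of the k most similar ones."""

    def __init__(self, dim, capacity=10_000, k=5):
        self.keys = torch.empty(0, dim)    # stored observation embeddings
        self.values = torch.empty(0, dim)  # stored associated content
        self.capacity, self.k = capacity, k

    def write(self, key, value):
        # Append new entries, keeping only the most recent `capacity` items.
        self.keys = torch.cat([self.keys, key.detach()])[-self.capacity:]
        self.values = torch.cat([self.values, value.detach()])[-self.capacity:]

    def read(self, query):
        # Return zeros when the memory is empty (nothing to retrieve yet).
        if self.keys.shape[0] == 0:
            return torch.zeros_like(query)
        # Cosine similarity between each query and every stored key.
        sims = F.cosine_similarity(query.unsqueeze(1), self.keys.unsqueeze(0), dim=-1)
        k = min(self.k, self.keys.shape[0])
        topk = sims.topk(k, dim=-1)
        weights = F.softmax(topk.values, dim=-1)  # weight neighbors by similarity
        return (weights.unsqueeze(-1) * self.values[topk.indices]).sum(dim=1)


class CLSModel(nn.Module):
    """Parametric pathway (encoder + head) combined with nonparametric
    retrieval; gradients flow through the parametric parts only."""

    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hid_dim)    # statistical-learning module
        self.memory = EpisodicMemory(hid_dim)        # episodic-memory module
        self.head = nn.Linear(2 * hid_dim, out_dim)  # combines both pathways

    def forward(self, x):
        h = torch.relu(self.encoder(x))
        retrieved = self.memory.read(h)
        out = self.head(torch.cat([h, retrieved], dim=-1))
        self.memory.write(h, h)  # store the current observation for later retrieval
        return out
```

Under these assumptions, the episodic module supplies useful predictions after only a few stored examples, while the parametric pathway improves gradually with training, mirroring the division of labor the abstract describes.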
