
Using Fast Weights to Deblur Old Memories

Abstract

Connectionist models usually have a single weight on each connection. Some interesting new properties emerge if each connection has two weights: a slowly changing, plastic weight which stores long-term knowledge, and a fast-changing, elastic weight which stores temporary knowledge and spontaneously decays towards zero. If a network learns a set of associations and these associations are then "blurred" by subsequent learning, all the original associations can be "deblurred" by rehearsing on just a few of them. The rehearsal allows the fast weights to take on values that temporarily cancel out the changes in the slow weights caused by the subsequent learning.
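The abstract does not spell out the update equations, but the core idea can be illustrated with a minimal sketch: a linear associator in which each connection carries a slow weight plus a fast weight, the effective weight is their sum, both are trained with a simple delta rule, and the fast weights decay towards zero. The class name, learning rates, and decay constant below are illustrative assumptions, not values from the paper.

```python
import numpy as np

class FastSlowAssociator:
    """Linear associator with a slow (plastic) and a fast (decaying) weight
    on every connection. A sketch of the two-weight idea, not the paper's
    exact formulation."""

    def __init__(self, n_in, n_out, slow_lr=0.01, fast_lr=0.5, fast_decay=0.95):
        self.W_slow = np.zeros((n_out, n_in))   # long-term knowledge
        self.W_fast = np.zeros((n_out, n_in))   # temporary knowledge, decays to zero
        self.slow_lr = slow_lr                  # assumed learning rates / decay,
        self.fast_lr = fast_lr                  # chosen only for illustration
        self.fast_decay = fast_decay

    def predict(self, x):
        # The effective weight on each connection is the sum of its
        # slow and fast components.
        return (self.W_slow + self.W_fast) @ x

    def train_step(self, x, target):
        # Delta-rule update applied to both weight sets; the fast weights
        # also decay spontaneously towards zero on every step.
        error = target - self.predict(x)
        self.W_slow += self.slow_lr * np.outer(error, x)
        self.W_fast *= self.fast_decay
        self.W_fast += self.fast_lr * np.outer(error, x)
```

In this sketch, "blurring" corresponds to training on new associations after an original set has been learned, which perturbs the slow weights; rehearsing a few of the original pairs then lets the rapidly adapting fast weights take on values that temporarily cancel that perturbation, improving recall of the whole original set rather than only the rehearsed pairs.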
