Cascaded Back-Propagation on Dynamic Connectionist Networks
Abstract
The Back-Propagation algorithm of Rumelhart, Hinton, and Williams (1986) is a powerful learning technique that can adjust weights in connectionist networks composed of multiple layers of perceptron-like units. This paper describes a variation of this technique applied to networks with constrained multiplicative connections. Instead of learning the weights to compute a single function, it learns the weights for a network whose outputs are the weights for a second network, which can then compute multiple functions. The technique is elucidated by example and then extended into the realm of sequence learning, as a prelude to work on connectionist induction of grammars. Finally, a host of issues regarding this form of computation is raised.
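The cascaded arrangement described above can be illustrated with a minimal sketch: an outer (context) network produces, via multiplicative connections, the weight matrix of an inner (function) network, so that different context inputs select different functions of the same input. The code below is only an assumed illustration of this idea, not the paper's exact architecture or training procedure; names such as `cascaded_forward` and `W_context`, the layer sizes, and the sigmoid nonlinearity are hypothetical choices for the example.

```python
import numpy as np

# Illustrative sketch (assumed, not the paper's exact formulation):
# an outer "context" network maps a context vector to the weights of an
# inner "function" network, so one learned parameter set yields a family
# of input-output functions.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

n_context, n_in, n_out = 3, 4, 2  # hypothetical layer sizes

# Parameters of the outer network; its output is reshaped into the
# weight matrix of the inner network (a multiplicative connection,
# since inner weights are products of outer weights and context units).
W_context = rng.normal(scale=0.1, size=(n_in * n_out, n_context))

def cascaded_forward(context, x):
    # Outer network computes the inner network's weights from the context.
    w_inner = (W_context @ context).reshape(n_out, n_in)
    # Inner network computes the actual function of the input x.
    return sigmoid(w_inner @ x)

# Different context vectors select different functions of the same input.
x = rng.normal(size=n_in)
print(cascaded_forward(np.array([1.0, 0.0, 0.0]), x))
print(cascaded_forward(np.array([0.0, 1.0, 0.0]), x))
```

In a full cascaded back-propagation scheme, error gradients would be propagated through the inner network's weights back into the outer network's parameters; the sketch shows only the forward pass.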