Open Access Publications from the University of California

Latent Event-Predictive Encodings through Counterfactual Regularization

  • Author(s): Humaidan, Dania; Otte, Sebastian; Gumbsch, Christian; Wu, Charley M; Butz, Martin V.

A critical challenge for any intelligent system is to infer structure from continuous data streams. Theories of event-predictive cognition suggest that the brain segments sensorimotor information into compact event encodings, which are used to anticipate and interpret environmental dynamics. Here, we introduce a SUrprise-GAted Recurrent neural network (SUGAR) using a novel form of counterfactual regularization. We test the model on a hierarchical sequence prediction task, where sequences are generated by alternating hidden graph structures. Our model learns to both compress the temporal dynamics of the task into latent event-predictive encodings and anticipate event transitions at the right moments, given noisy hidden signals about them. The addition of the counterfactual regularization term ensures fluid transitions from one latent code to the next, whereby the resulting latent codes exhibit compositional properties. The implemented mechanisms offer a host of useful applications in other domains, including hierarchical reasoning, planning, and decision making.
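To make the surprise-gating idea concrete, the sketch below shows a minimal, hypothetical illustration (not the SUGAR architecture itself, and without the counterfactual regularization term): a latent event code is revised only when prediction error exceeds a threshold, so the gate opens at transitions between alternating hidden dynamics. All function names and parameters here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two hidden event dynamics (hypothetical stand-ins for the paper's
# alternating graph structures): event A adds +1 per step, event B adds -1.
def generate_sequence(n_events=4, steps_per_event=10):
    xs, x = [], 0.0
    for e in range(n_events):
        delta = 1.0 if e % 2 == 0 else -1.0
        for _ in range(steps_per_event):
            x += delta + rng.normal(0, 0.05)
            xs.append(x)
    return np.array(xs)

# Surprise-gated latent update: the latent code is the estimated per-step
# delta; it is revised only when the prediction error ("surprise")
# exceeds a threshold, i.e., at inferred event transitions.
def surprise_gated_filter(xs, threshold=0.5):
    code = 0.0   # latent event code (estimated per-step delta)
    gates = []   # 1 where the gate opened (event transition inferred)
    prev = xs[0]
    for x in xs[1:]:
        pred = prev + code
        surprise = abs(x - pred)
        if surprise > threshold:
            code += x - pred   # gate open: revise the event code
            gates.append(1)
        else:
            gates.append(0)    # gate closed: keep the current code
        prev = x
    return np.array(gates)

gates = surprise_gated_filter(generate_sequence())
```

With four events of ten steps each, the gate opens once at the start (the code is uninitialized) and once at each of the three event boundaries, while the small observation noise alone never triggers a revision.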
