eScholarship
Open Access Publications from the University of California

UCLA Electronic Theses and Dissertations

Adaptive Filter Attention

Abstract

This work introduces Adaptive Filter Attention (AFA), a novel attention mechanism that explicitly incorporates a learnable linear time-invariant dynamics model into the attention computation to improve the estimation of noisy trajectories in stochastic dynamical systems. Unlike standard self-attention, which relies on nonlocal averaging based solely on feature similarity, AFA computes attention weights using a state-space model of how latent states evolve over time. The similarity of each query-key pair is scored by how well the key aligns with the query under the learned dynamics, so interactions are weighted by their consistency with the system's temporal evolution. This structure enables AFA to distinguish signal from noise based on the plausibility of dynamic transitions. By incorporating a learnable dynamics model, AFA functions as a model-based adaptive filter, introducing a principled inductive bias that should improve generalization in time series modeling.
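The core idea, attention weights derived from consistency with a learned linear time-invariant dynamics model rather than raw feature similarity, can be illustrated with a minimal sketch. This is not the dissertation's implementation; the matrix-exponential propagation, the squared-residual score, and the `noise_scale` parameter are illustrative assumptions standing in for the thesis's full state-space formulation.

```python
import numpy as np
from scipy.linalg import expm


def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)


def afa_attention(queries, keys, values, times, A, noise_scale=1.0):
    """Sketch of dynamics-aware attention (hypothetical form).

    queries, keys, values: (T, d) arrays of latent states
    times: (T,) observation times
    A: (d, d) learnable LTI dynamics matrix for dx/dt = A x
    """
    T, d = queries.shape
    scores = np.empty((T, T))
    for i in range(T):
        for j in range(T):
            dt = times[i] - times[j]
            # Propagate key j to the query's time under the learned dynamics:
            # x(t_i) ≈ expm(A * dt) @ x(t_j).
            propagated = expm(A * dt) @ keys[j]
            # Score the pair by how well the propagated key matches the query;
            # implausible transitions get large residuals and low weight.
            scores[i, j] = -np.sum((queries[i] - propagated) ** 2) / noise_scale
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ values


rng = np.random.default_rng(0)
T, d = 5, 3
q = rng.normal(size=(T, d))
k = rng.normal(size=(T, d))
v = rng.normal(size=(T, d))
times = np.arange(T, dtype=float)
A = -0.1 * np.eye(d)  # a stable, hypothetical dynamics matrix
out = afa_attention(q, k, v, times, A)
```

Standard dot-product attention would replace the residual score with `queries[i] @ keys[j]`; the dynamics-consistency score is what lets the mechanism behave as an adaptive filter over the trajectory.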