Brain-Inspired Recurrent Neural Networks for Biomedical Applications
- Yin, Yue
- Advisor(s): Nenadic, Zoran
Abstract
Training artificial recurrent neural networks (RNNs) has traditionally relied on an offline gradient-based learning method: backpropagation through time (BPTT). However, this technique is computationally expensive and biologically implausible, as the activity of the entire time sequence must be stored for the backward gradient computation. Reservoir computing, a brain-inspired learning framework, avoids this problem by allowing online adaptation, in which parameters are updated at every time step. In this dissertation, we first improve the stability of a state-of-the-art reservoir computing technique (full-FORCE) by dynamically coupling the data with the network (teacher forcing). This coupling has the same effect as learning online. We also extend full-FORCE to multiple layers while permitting online, local weight updates for each layer. Next, we introduce a new recurrent neural network that implements a continuous local learning rule and is compatible with modern machine learning libraries for gradient-based optimization. This RNN incorporates synaptic dynamics, features a deep architecture, and enables online learning for weight modification. We test these new models on real-world biomedical time-series tasks, including multi-dimensional movement pattern generation, electroencephalogram (EEG) signal classification, and blood glucose level prediction, and demonstrate improved performance.
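To make the contrast with BPTT concrete, the sketch below illustrates the kind of online, per-time-step learning that FORCE-style reservoir computing performs: a fixed random recurrent reservoir is driven by a teaching signal (teacher forcing), and only the linear readout is trained with a recursive-least-squares update applied at every step, so no activity history is stored for a backward pass. This is a minimal illustrative toy, not the dissertation's full-FORCE or multilayer models; the network sizes, the sinusoidal target, and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes; the dissertation's actual networks differ.
N, T, dt = 200, 2000, 0.01
J = 1.5 * rng.standard_normal((N, N)) / np.sqrt(N)   # fixed random recurrent weights
w_in = rng.standard_normal(N)                        # input (teacher-forcing) weights
w = np.zeros(N)                                      # linear readout, learned online
P = np.eye(N)                                        # RLS inverse-correlation matrix

x = 0.1 * rng.standard_normal(N)                     # reservoir state
target = np.sin(2 * np.pi * dt * np.arange(T))       # toy 1-D teaching signal

errors = []
for t in range(T):
    r = np.tanh(x)
    z = w @ r                        # readout prediction before this step's update
    e = z - target[t]
    errors.append(abs(e))
    # FORCE step: recursive-least-squares update at every time point,
    # using only the current state r -- no stored time sequence.
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w -= e * k
    # Leaky reservoir dynamics driven by the teaching signal (teacher forcing).
    x += dt * (-x + J @ r + w_in * target[t])

early, late = np.mean(errors[:200]), np.mean(errors[-200:])
print(early, late)                   # readout error shrinks as learning proceeds
```

Because each update depends only on the current reservoir state, memory cost is constant in the sequence length, which is the property the abstract contrasts with BPTT.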