Fast subspace tracking and neural network learning by a novel information criterion
- Author(s): Miao, Yongfeng; Hua, Yingbo; et al.
We introduce a novel information criterion (NIC) for searching for the optimum weights of a two-layer linear neural network (NN). The NIC exhibits a single global maximum, attained if and only if the weights span the (desired) principal subspace of a covariance matrix. The other stationary points of the NIC are (unstable) saddle points. We develop an adaptive algorithm based on the NIC for estimating and tracking the principal subspace of a vector sequence. The NIC algorithm provides fast on-line learning of the optimum weights for the two-layer linear NN. We establish the connections between the NIC algorithm and conventional mean-square-error (MSE) based algorithms such as Oja's algorithm (Oja 1989), LMSER, PAST, APEX, and GHA. The NIC algorithm has several key advantages, including faster convergence, which we illustrate through analysis and simulation.
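To make the setting concrete, the following is a minimal sketch of one of the MSE-based baselines the abstract names, Oja's subspace rule (Oja 1989), applied to tracking the principal subspace of a vector stream with a two-layer linear NN. This is not the NIC update itself; the dimensions, step size `eta`, and synthetic covariance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 2  # input dimension, subspace dimension (illustrative)

# Build a covariance with two dominant principal directions:
# samples x = A z (z standard normal) have covariance U diag(eigs) U^T.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.array([10.0, 8.0] + [0.1] * (n - r))
A = U * np.sqrt(eigs)

W = 0.1 * rng.standard_normal((n, r))  # two-layer linear NN weights
eta = 0.005                            # small constant step size

for _ in range(10000):
    x = A @ rng.standard_normal(n)     # one streaming input vector
    y = W.T @ x                        # hidden-layer (compressed) output
    # Oja's subspace update: W <- W + eta (x y^T - W y y^T)
    W += eta * (np.outer(x, y) - W @ np.outer(y, y))

# Alignment between span(W) and the true principal subspace:
# 1.0 means the subspaces coincide.
Q, _ = np.linalg.qr(W)
P = U[:, :r]
overlap = np.linalg.norm(P.T @ Q) ** 2 / r
print(round(overlap, 2))
```

Because the top two eigenvalues (10 and 8) are well separated from the rest (0.1), the learned weights converge to a basis of the principal subspace; the NIC algorithm described in the paper targets the same fixed point but with faster convergence.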