Noisy Signal Correlation and Neural Network Pruning
- Moore, Eli
- Advisor(s): Chaudhuri, Rishidev
Abstract
Noise is ever-present in communication networks, both in technology and in biology. Cell phone networks, radio systems, and the brain all exhibit signals accompanied by random fluctuations that seem to obfuscate transmitted data. Phone calls are dropped, radio signals compete with one another, and synaptic currents vary stochastically as neurons attempt to communicate. However, inherent randomness is not necessarily a nuisance: many randomized algorithms offer greater computational efficiency than their deterministic counterparts without sacrificing performance. What can be done to ensure that information is appropriately transmitted through noisy channels? In what contexts can noise be used as a tool for computation? These questions are explored throughout this dissertation. In Chapter 2, I explore the problem of time-lagged sequence correlation minimization, which is critical for robust signal identification and synchronization in the presence of noise. I classify the family of sequence pairs that minimize a time-lagged correlation inequality and compute correlation properties of these sequences. In Chapter 3, I investigate synaptic pruning in the brain, a process in which unimportant neural connections are removed. I develop a noise-driven, biologically plausible, unsupervised pruning algorithm with strong theoretical guarantees for a class of recurrent neural networks. In Chapter 4, I extend these ideas to pruning the weights of nonlinear artificial neural networks used in machine learning. Here, I develop a randomized pruning algorithm that applies to a wider class of neural networks and provide analytic error bounds on the output of feed-forward neural networks with pruned weight matrices.