Adding biological constraints to deep neural networks reduces their capacity to learn unstructured data
Abstract
Deep neural networks (DNNs) are becoming increasingly popular as a model of the human visual system. However, they show behaviours that are uncharacteristic of humans, including the ability to learn arbitrary data, such as images with pixel values drawn randomly from a Gaussian distribution. We investigated whether this behaviour is due to the learning and memory capacity of DNNs being too high for the training task. We reduced the capacity of DNNs by incorporating biologically motivated constraints – an information bottleneck, internal noise and sigmoid activations – in order to diminish the learning of arbitrary data, without significantly degrading performance on natural images. Internal noise reliably produced the desired behaviour, while a bottleneck had limited impact. Combining all three constraints yielded an even greater reduction in learning capacity. Furthermore, we tested whether these constraints contribute to a network's ability to generalize by helping it develop more robust internal representations. However, none of the methods could consistently improve generalization.
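The three constraints named in the abstract could be prototyped roughly as follows. This is a minimal PyTorch sketch, not the authors' implementation: the network architecture, layer widths, bottleneck dimension and noise level are illustrative assumptions, chosen only to show where a low-dimensional bottleneck, additive internal noise and sigmoid activations would enter a model.

```python
import torch
import torch.nn as nn


class ConstrainedNet(nn.Module):
    """Toy classifier with the three capacity-limiting constraints."""

    def __init__(self, in_features=3 * 32 * 32, n_classes=10,
                 bottleneck_dim=32, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Linear(in_features, 256)
        self.bottleneck = nn.Linear(256, bottleneck_dim)  # information bottleneck: narrow layer
        self.classifier = nn.Linear(bottleneck_dim, n_classes)
        self.act = nn.Sigmoid()                           # sigmoid activations instead of ReLU

    def add_noise(self, x):
        # Internal noise: additive Gaussian noise on hidden activations (training only).
        if self.training and self.noise_std > 0:
            x = x + torch.randn_like(x) * self.noise_std
        return x

    def forward(self, x):
        x = x.flatten(1)
        x = self.act(self.encoder(x))
        x = self.add_noise(x)
        x = self.act(self.bottleneck(x))
        x = self.add_noise(x)
        return self.classifier(x)


if __name__ == "__main__":
    net = ConstrainedNet()
    dummy = torch.randn(4, 3, 32, 32)   # CIFAR-sized inputs, as an example
    print(net(dummy).shape)             # torch.Size([4, 10])
```

Each constraint can also be toggled independently (e.g. `noise_std=0` or a wide `bottleneck_dim`), which is the kind of manipulation the abstract's comparison of individual versus combined constraints implies.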