Stochasticity and limited precision of synaptic weights in neural network models is a key aspect of both biological and hardware modeling of learning processes. Here we show that a neural network model with stochastic binary weights naturally gives prominence to exponentially rare dense regions of solutions with a number of desirable properties such as robustness and good generalization performance, while typical solutions are isolated and hard to find. Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses. Both analytical and numerical results are presented. An algorithmic extension aimed at training discrete deep neural networks is also investigated.
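The procedure described in the abstract can be illustrated with a minimal sketch: each binary weight is parametrized by a real field, gradient descent is performed on those fields, and a binary solution is read off at the end. This is an illustrative toy version for a teacher-generated perceptron problem, not the authors' actual algorithm; the parametrization P(w_i = +1) = (1 + tanh(h_i))/2 and the perceptron-style update rule are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy perceptron problem: N inputs, P random +-1 patterns,
# labels generated by a binary teacher (so a binary solution exists).
N, P = 101, 60
teacher = rng.choice([-1.0, 1.0], size=N)
X = rng.choice([-1.0, 1.0], size=(P, N))
y = np.sign(X @ teacher)

# Real fields h parametrize independent distributions over binary weights:
# P(w_i = +1) = (1 + tanh(h_i)) / 2, so the mean weight is m_i = tanh(h_i).
h = np.zeros(N)
lr = 0.1

for epoch in range(500):
    m = np.tanh(h)
    # Perceptron-style update on the mean weights: for each misclassified
    # pattern, push h in the direction that raises the expected margin.
    margins = y * (X @ m)
    wrong = margins <= 0
    if not wrong.any():
        break
    grad = (y[wrong, None] * X[wrong]).sum(axis=0)
    # Chain rule through m = tanh(h): dm/dh = 1 - m^2.
    h += lr * grad * (1.0 - m**2)

# A binary solution is obtained by clipping the fields to +-1.
w = np.sign(np.tanh(h))
w[w == 0] = 1.0
train_acc = float(np.mean(np.sign(X @ w) == y))
```

The key point mirrored from the paper is that optimization happens entirely in the real-valued fields `h`, while the final network uses only the clipped binary weights `w`; on this small separable instance the clipped solution typically classifies most training patterns correctly, though the toy update carries no guarantee.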
|Title:||Role of Synaptic Stochasticity in Training Low-Precision Neural Networks|
|Publication date:||2018|
|Digital Object Identifier (DOI):||10.1103/PhysRevLett.120.268103|
|Appears in type:||1.1 Journal article|