An Analysis on the Performance of Silicon Implementations of Backpropagation Algorithms for Artificial Neural Networks / Reyneri, Leonardo; Filippi, E. - In: IEEE TRANSACTIONS ON COMPUTERS. - ISSN 0018-9340. - Electronic. - Vol. 40, no. 12 (1991), pp. 1380-1389. [DOI: 10.1109/12.106223]
An Analysis on the Performance of Silicon Implementations of Backpropagation Algorithms for Artificial Neural Networks
Reyneri, Leonardo; Filippi, E.
1991
Abstract
The effects of silicon implementation on the backpropagation learning rule in artificial neural systems are examined. The effects on learning performance of limited weight resolution, range limitations, and the steepness of the activation function are considered. A minimum resolution of about 20 to 22 bits is generally required, but this figure can be reduced to about 14 to 15 bits by a proper choice of the learning parameter η, which yields good performance in the presence of limited resolution. This performance can be further improved by using a modified batch backpropagation rule. Theoretical analysis is compared with ad hoc simulations, and the results are discussed in detail.
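
As a rough, hypothetical illustration of the effect described in the abstract (not taken from the paper), the Python sketch below trains a tiny network with backpropagation while storing the weights on a signed fixed-point grid of b bits over an assumed range; updates smaller than one quantization step are simply lost, which is the mechanism behind the resolution requirements discussed above. The XOR task, the weight range w_max, the function names, and the learning-rate values are all illustrative assumptions.

# Hypothetical sketch (not the paper's setup): backpropagation on XOR
# with weights kept on a signed fixed-point grid of `bits` bits over
# [-w_max, w_max].  Updates smaller than one quantization step vanish.

import numpy as np

def quantize(w, bits, w_max=4.0):
    """Round weights to a signed fixed-point grid with `bits` bits."""
    step = 2.0 * w_max / (2 ** bits)          # size of one quantization level
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(bits, eta, epochs=5000, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = quantize(rng.normal(0, 1, (2, 4)), bits)   # input -> hidden
    W2 = quantize(rng.normal(0, 1, (4, 1)), bits)   # hidden -> output

    for _ in range(epochs):
        # forward pass
        H = sigmoid(X @ W1)
        Y = sigmoid(H @ W2)
        # backward pass: batch gradient of the squared error
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        # updated weights are re-quantized to the b-bit grid, so any
        # step smaller than one quantization level has no effect
        W2 = quantize(W2 - eta * H.T @ dY, bits)
        W1 = quantize(W1 - eta * X.T @ dH, bits)

    return float(np.mean((Y - T) ** 2))

for bits in (8, 14, 22):
    for eta in (0.1, 2.0):
        mse = train_xor(bits, eta)
        print(f"bits={bits:2d}  eta={eta:3.1f}  final MSE={mse:.4f}")

Running the sketch at different bit widths and learning rates gives a feel for why too coarse a weight grid can stall learning and why the choice of η interacts with the available resolution.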
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/1405473