
Improving CNN Runtime Robustness Against Soft Errors by Dropout Layer Optimization / Sierra, Robert Limas; Esposito, Giuseppe; Guerrero-Balaguera, Juan-David; Rodriguez Condia, Josie E.; Reorda, Matteo Sonza. - (2025), pp. 1-6. (Paper presented at the 2025 IEEE 26th Latin American Test Symposium (LATS), held in San Andres Islas (COL), 11-14 March 2025) [10.1109/lats65346.2025.10963963].

Improving CNN Runtime Robustness Against Soft Errors by Dropout Layer Optimization

Sierra, Robert Limas; Esposito, Giuseppe; Guerrero-Balaguera, Juan-David; Rodriguez Condia, Josie E.; Reorda, Matteo Sonza
2025

Abstract

Convolutional Neural Networks (CNNs) have shown exceptional effectiveness in complex and data-intensive domains such as image and video processing, conversational systems, and healthcare. Moreover, sectors like High-Performance Computing and safety-critical applications, including automotive, aerospace, and autonomous robotics, impose stringent requirements on energy efficiency, performance, and robustness. However, modern semiconductor technologies are increasingly vulnerable to faults, which can degrade CNN performance and potentially result in catastrophic failures. This work explores the impact of regularization techniques (specifically, the dropout layer) on enhancing the inference robustness of CNN models against soft errors. We analyzed the impact of soft errors on five widely adopted CNN architectures, each trained with ten different dropout rates. Our experimental results reveal that optimizing the dropout rate during training can improve the in-field robustness of CNN models by up to 12% compared to baseline configurations under soft error conditions. Additionally, fine-tuning this architectural parameter can lead to accuracy improvements of up to 10%.
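
The record does not include the full text, but the methodology summarized in the abstract (training each CNN with a sweep of dropout rates and evaluating inference behavior under soft errors) can be illustrated with a minimal sketch, assuming PyTorch. SmallCNN, flip_random_weight_bit, the single-bit weight corruption, and the random-data evaluation loop below are illustrative assumptions, not the paper's actual architectures, datasets, or fault-injection framework.

# Minimal sketch (assumption: PyTorch; architectures and fault-injection
# setup are hypothetical, not taken from the paper).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallCNN(nn.Module):
    """Toy CNN with a configurable dropout rate (hypothetical architecture)."""

    def __init__(self, dropout_rate: float, num_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.dropout = nn.Dropout(p=dropout_rate)  # the parameter swept in the study
        self.fc = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)   # 32x32 -> 16x16
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)   # 16x16 -> 8x8
        x = self.dropout(torch.flatten(x, 1))        # dropout is active only during training
        return self.fc(x)


def flip_random_weight_bit(model: nn.Module) -> None:
    """Inject a single bit flip into one randomly chosen weight (crude soft-error proxy)."""
    params = [p for p in model.parameters() if p.dtype == torch.float32]
    param = params[torch.randint(len(params), (1,)).item()]
    flat = param.data.view(-1)
    idx = torch.randint(flat.numel(), (1,)).item()
    bit = torch.randint(31, (1,)).item()  # bits 0-30; sign bit skipped to keep the int32 mask simple
    word = flat[idx:idx + 1].clone().view(torch.int32)
    flat[idx:idx + 1] = (word ^ (1 << bit)).view(torch.float32)


# Sweep ten dropout rates, mirroring the study's per-rate training setup.
# Random tensors stand in for a real dataset; training is omitted.
images, labels = torch.randn(64, 3, 32, 32), torch.randint(10, (64,))
for rate in [i / 10 for i in range(10)]:              # 0.0, 0.1, ..., 0.9
    model = SmallCNN(dropout_rate=rate).eval()        # (training step would go here)
    with torch.no_grad():
        clean = (model(images).argmax(1) == labels).float().mean().item()
        flip_random_weight_bit(model)
        faulty = (model(images).argmax(1) == labels).float().mean().item()
    print(f"p={rate:.1f}  clean={clean:.3f}  faulty={faulty:.3f}")

In the actual study, the comparison would be made on trained models over a statistically significant number of fault-injection campaigns; the loop above only shows where the dropout rate enters the experiment.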
ISBN: 978-1-6654-7763-5

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2999648