Non-Euclidean Contractivity of Recurrent Neural Networks / Davydov, Alexander; Proskurnikov, Anton V.; Bullo, Francesco. - ELECTRONIC. - (2022), pp. 1527-1534. (Paper presented at the 2022 American Control Conference (ACC), held in Atlanta, GA, USA, 08-10 June 2022) [10.23919/ACC53348.2022.9867357].

Non-Euclidean Contractivity of Recurrent Neural Networks

Proskurnikov, Anton V.; Bullo, Francesco
2022

Abstract

Critical questions in dynamical neuroscience and machine learning concern the stability, robustness, and computational efficiency of recurrent neural networks; these properties can be established simultaneously via a contraction analysis. This paper develops a comprehensive contraction theory for recurrent neural networks. First, for non-Euclidean ℓ1/ℓ∞ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights and closed-form worst-case expressions over certain matrix polytopes. Second, for locally Lipschitz maps (e.g., those arising as activation functions), we show that their one-sided Lipschitz constant equals the essential supremum of the logarithmic norm of their Jacobian. Third and finally, we apply these general results to classes of recurrent neural circuits, including Hopfield, firing-rate, Persidskii, Lur'e, and other models. For each model, we compute the optimal contraction rate and the corresponding weighted non-Euclidean norm via a linear program or, in some special cases, via a Hurwitz condition on the Metzler majorant of the synaptic matrix. Our non-Euclidean analysis also establishes absolute, connective, and total contraction properties.
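
As a concrete illustration of the abstract's last point, here is a minimal NumPy sketch (our own, not taken from the paper; the helper names metzler_majorant, weighted_linf_lognorm, and optimal_weights are hypothetical). It uses the standard facts that the weighted ℓ∞ logarithmic norm satisfies μ_{∞,η}(A) = max_i (⌈A⌉η)_i / η_i, where ⌈A⌉ is the Metzler majorant, and that, assuming the majorant is irreducible, the infimum over positive diagonal weights η equals the spectral abscissa of ⌈A⌉ and is attained at its right Perron eigenvector; a Hurwitz majorant then certifies a positive contraction rate.

```python
import numpy as np

def metzler_majorant(A):
    """Metzler majorant: keep the diagonal, take absolute values off-diagonal."""
    M = np.abs(A)
    np.fill_diagonal(M, np.diag(A))
    return M

def weighted_linf_lognorm(A, eta):
    """Weighted l-infinity log norm induced by ||x|| = max_i |x_i| / eta_i:
    mu_{inf,eta}(A) = max_i (a_ii + sum_{j != i} |a_ij| eta_j / eta_i)."""
    M = metzler_majorant(A)
    return np.max(M @ eta / eta)

def optimal_weights(A):
    """Assuming the Metzler majorant is irreducible, the infimum of the
    weighted l-infinity log norm over positive diagonal weights equals the
    spectral abscissa of the majorant, attained at its Perron eigenvector."""
    M = metzler_majorant(A)
    eigvals, eigvecs = np.linalg.eig(M)
    k = np.argmax(eigvals.real)           # Perron root = spectral abscissa
    eta = np.abs(eigvecs[:, k].real)      # Perron eigenvector, entrywise positive
    return eigvals[k].real, eta

# Example: a matrix whose Metzler majorant is Hurwitz (row-diagonally dominant
# with negative diagonal), hence contraction holds in the weighted norm.
A = np.array([[-2.0,  0.5, -0.3],
              [ 0.4, -3.0,  1.0],
              [-0.2,  0.6, -1.5]])
mu_opt, eta = optimal_weights(A)
print("optimal contraction rate (= -mu):", -mu_opt)
print("weights eta:", eta)
print("check: mu_{inf,eta}(A) =", weighted_linf_lognorm(A, eta))
```

In the paper's general setting the optimal rate and weights are obtained via a linear program (with bisection, which the quasiconvexity result justifies); the eigenvector shortcut above covers only the special irreducible-majorant case mentioned in the abstract.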
ISBN: 978-1-6654-5196-3
Files in this item:

FINAL_Non-Euclidean_Contractivity.pdf
  Access: restricted
  Type: 2a Post-print editorial version / Version of Record
  License: Non-public - private/restricted access
  Size: 373.27 kB
  Format: Adobe PDF

main.pdf
  Access: open access
  Type: 2. Post-print / Author's Accepted Manuscript
  License: Public - All rights reserved
  Size: 434.56 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11583/2970907