Bridging the Gap Between Artificial Neural Networks and Kernel Regressions for Vector-Valued Problems in Microwave Applications / Soleimani, Nastaran; Trinchero, Riccardo; Canavero, Flavio G. - In: IEEE TRANSACTIONS ON MICROWAVE THEORY AND TECHNIQUES. - ISSN 0018-9480. - Print. - 71:6 (2023), pp. 2319-2332. [DOI: 10.1109/TMTT.2022.3232895]
Bridging the Gap Between Artificial Neural Networks and Kernel Regressions for Vector-Valued Problems in Microwave Applications
Nastaran Soleimani; Riccardo Trinchero; Flavio G. Canavero
2023
Abstract
Thanks to their convex formulation, kernel regressions have shown improved accuracy with respect to artificial neural network (ANN) structures in regression problems where only a reduced set of training samples is available. However, despite these advantages, kernel regressions are inherently less flexible than ANN structures, since their implementations are usually limited to scalar-output regression problems. This article presents a vector-valued (multioutput) formulation of the kernel ridge regression (KRR) aimed at bridging the gap between multioutput ANN structures and scalar kernel-based approaches. The proposed vector-valued KRR relies on a generalized definition of the reproducing kernel Hilbert space (RKHS) and on a new multioutput kernel structure. The mathematical background of the proposed vector-valued formulation is extensively discussed, together with different matrix kernel functions and training schemes. Moreover, a compression strategy based on the Nyström approximation is presented to reduce the computational complexity of the model training. The effectiveness and the performance of the proposed vector-valued KRR are discussed on an illustrative example consisting of a high-speed link and on the optimization of a Doherty amplifier.
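As a rough illustrative sketch only (not the authors' implementation), a vector-valued KRR can be written with a separable matrix kernel of the form K(x, x') · B, where K is a scalar kernel and B an output-mixing matrix. The minimal version below assumes an RBF kernel and B = I (independent outputs sharing one Gram matrix), so a single regularized linear solve fits all outputs; the function names and the `gamma`/`lam` hyperparameters are hypothetical choices for the sketch.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Scalar RBF Gram matrix between two sample sets (rows are samples)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_vector_krr(X, Y, lam=1e-3, gamma=1.0):
    """Vector-valued KRR with the separable kernel K(x, x') * I:
    all output components share the Gram matrix, so one ridge-regularized
    solve returns the dual coefficients for every output at once."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), Y)  # (n_samples, n_outputs)
    return alpha

def predict_vector_krr(X_train, alpha, X_new, gamma=1.0):
    # Prediction is a kernel expansion over the training samples
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

A non-identity B would couple the outputs through the matrix kernel; the Nyström compression discussed in the article would instead replace the full Gram matrix with a low-rank approximation built from a subset of the training samples.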
File | Size | Format
---|---|---
Bridging_the_Gap_Between_Artificial_Neural_Networks_and_Kernel_Regressions_for_Vector-Valued_Problems_in_Microwave_Applications.pdf (open access; Type: 2a Post-print editorial version / Version of Record; License: Creative Commons) | 1.64 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2974826