Cirrincione, Giansalvo; Randazzo, Vincenzo; Pasero, Eros Gian Alessandro. Growing Curvilinear Component Analysis (GCCA) for Dimensionality Reduction of Nonstationary Data. In: Esposito, A.; Faundez-Zanuy, M.; Morabito, F.C.; Pasero, E. (eds.), Multidisciplinary Approaches to Neural Computing (Smart Innovation, Systems and Technologies). Springer International Publishing, 2017, pp. 151-160. ISBN 978-3-319-56904-8. DOI: 10.1007/978-3-319-56904-8_15
Growing Curvilinear Component Analysis (GCCA) for Dimensionality Reduction of Nonstationary Data
Cirrincione, Giansalvo; Randazzo, Vincenzo; Pasero, Eros Gian Alessandro
2017
Abstract
Dealing with time-varying high-dimensional data is a major problem for real-time pattern recognition. Only linear projections, such as principal component analysis, are used in real time, while nonlinear techniques require the whole database (offline), and their incremental variants do not work properly. The onCCA neural network addresses this problem: it is incremental and performs data quantization and projection simultaneously by using Curvilinear Component Analysis (CCA), a distance-preserving reduction technique. However, onCCA requires an initial architecture, provided by a small offline CCA. This paper presents a variant of onCCA, called growing CCA (GCCA), which has a self-organized incremental architecture adapting to the nonstationary data distribution. This is achieved by introducing the ideas of "seeds", pairs of neurons which colonize the input domain, and "bridges", a different kind of edge in the manifold graph that signals the data nonstationarity. Some examples from artificial problems and a real application are given.
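For readers unfamiliar with CCA, the distance-preserving criterion it minimizes is recalled below in the standard Demartines-Hérault formulation; the notation is the commonly used one and is not taken from this record.

$$
E \;=\; \frac{1}{2}\sum_{i}\sum_{j\neq i}\bigl(X_{ij}-Y_{ij}\bigr)^{2}\,F_{\lambda}\!\bigl(Y_{ij}\bigr),
\qquad
X_{ij}=\lVert\mathbf{x}_i-\mathbf{x}_j\rVert,\quad
Y_{ij}=\lVert\mathbf{y}_i-\mathbf{y}_j\rVert,
$$

where $X_{ij}$ and $Y_{ij}$ are the pairwise distances in the input and projected spaces, respectively, and $F_{\lambda}$ is a bounded, decreasing weighting function of the output distance (for example, a step function equal to 1 for $Y_{ij}\le\lambda$ and 0 otherwise), so that local distances are preserved preferentially.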
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| bookExtracted_Growing Curvilinear Component Analysis (GCCA) for Dimensionality Reduction of Nonstationary Data.pdf | Not available (request a copy) | 2a. Post-print editorial version / Version of Record | Not public (private/restricted access) | 254.66 kB | Adobe PDF |
| WIRN2016_gCCAv.11 - UPLOADED.pdf | Open access | 2. Post-print / Author's Accepted Manuscript | Public - All rights reserved | 624.56 kB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/2679577