Analysis of Hebbian models with lateral weight connections
dc.contributor.author | Berzal, José Andrés | |
dc.contributor.author | Zufiria, Pedro | |
dc.date.accessioned | 2024-02-09T08:14:15Z | |
dc.date.available | 2024-02-09T08:14:15Z | |
dc.date.issued | 2007 | |
dc.description | The Deterministic Discrete Time (DDT) system approach has been employed here for analytical studies of Hebbian neural networks with lateral weights. Among them, the modified linearized Rubner model has proven to be the most analytically tractable. The analysis proves that such models correctly implement Principal Component Analysis (PCA) of the input data, providing the eigenvalues of the autocorrelation matrix in an ordered way. Simulations show that the learning rates must be sized according to the spectrum of the input covariance matrix in order to preserve stability and good convergence properties. Among all models, the modified linearized Rubner model also presents slightly better convergence performance in the simulated case. | es |
dc.description.abstract | In this paper, the behavior of some Hebbian artificial neural networks with lateral weights is analyzed. Hebbian neural networks are employed in communications and signal processing applications for implementing on-line Principal Component Analysis (PCA). Different improvements over the original Oja model have been developed in the last two decades. Among them, models with lateral weights have been designed to directly provide the eigenvectors of the correlation matrix [1,5,6,9]. The behavior of Hebbian models has traditionally been studied by resorting to an associated continuous-time formulation under some questionable assumptions which are not guaranteed in real implementations. In this paper we employ the alternative deterministic discrete-time (DDT) formulation, which characterizes the average evolution of these nets and captures the influence of the time evolution of the learning gains [12]. The dynamic behavior of some of these Hebbian models is analytically characterized in this context, and several simulations complement this comparative study. | es |
dc.identifier.doi | 10.1007/978-3-540-73007-1_6 | es |
dc.identifier.isbn | 978-3-540-73006-4 | |
dc.identifier.isbn | 978-3-540-73007-1 | |
dc.identifier.uri | https://hdl.handle.net/10115/30140 | |
dc.language.iso | eng | es |
dc.publisher | Springer, Berlin, Heidelberg | es |
dc.rights.accessRights | info:eu-repo/semantics/restrictedAccess | es |
dc.subject | Neural Networks | es |
dc.subject | Hebbian Neural Network | es |
dc.subject | Deterministic Discrete Time formulation | es |
dc.subject | Artificial Intelligence Algorithms | es |
dc.subject | Lateral and Direct Weight | es |
dc.subject | Convergence Performance | es |
dc.subject | Learning Algorithm | es |
dc.title | Analysis of hebbian models with lateral weight connections | es |
dc.type | info:eu-repo/semantics/bookPart | es |
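The abstract describes on-line PCA via Hebbian learning, building on the original Oja model. As a minimal illustration of that base technique (not the lateral-weight models analyzed in the paper), the sketch below runs the classic single-neuron Oja rule on synthetic data; the data, learning rate, and all variable names are illustrative assumptions, and the abstract's point about sizing the learning rate to the input spectrum applies to `eta` here.

```python
import numpy as np

# Illustrative sketch of on-line PCA with Oja's rule, the base model the
# paper builds on. The lateral-weight variants studied in the paper extend
# this idea to extract several eigenvectors directly, in order.
rng = np.random.default_rng(0)

# Synthetic zero-mean data whose covariance has a clearly dominant direction.
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])
L = np.linalg.cholesky(C)
X = rng.standard_normal((5000, 2)) @ L.T

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
eta = 0.01  # learning rate; per the abstract, it must match the input spectrum

for x in X:
    y = w @ x                   # neuron output
    w += eta * y * (x - y * w)  # Oja's rule: Hebbian term plus normalization

# Compare the learned weight vector with the principal eigenvector
# of the sample covariance matrix.
evals, evecs = np.linalg.eigh(np.cov(X.T))
v1 = evecs[:, np.argmax(evals)]
alignment = abs(w @ v1) / np.linalg.norm(w)
print(f"alignment with top eigenvector: {alignment:.4f}")
```

With a well-separated spectrum and a small enough learning rate, `w` converges to (plus or minus) the leading eigenvector with unit norm, which is the on-line PCA behavior the DDT analysis in the paper characterizes.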
Files
Original bundle
1 - 1 of 1
No thumbnail available
- Name:
- zufiria_berzal.pdf
- Size:
- 278.3 KB
- Format:
- Adobe Portable Document Format
- Description:
- Main article
License bundle
1 - 1 of 1
No thumbnail available
- Name:
- license.txt
- Size:
- 2.67 KB
- Format:
- Item-specific license agreed upon to submission
- Description: