Convergence analysis of a linearized Rubner network with modified lateral weight behavior
Abstract
In this work, a novel Hebbian neural network architecture for Principal Component Analysis is presented. The proposed network is obtained by linearizing and modifying the standard Rubner model. The new anti-Hebbian connections and learning laws define a partially decoupled network structure. This specific connectivity among the neurons allows a stability analysis of the whole network without the need to assume a priori a time-scale separation between the neuron dynamics.
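To make the setting concrete, the following is a minimal sketch of the standard Rubner-Tavan network on which the chapter builds: Hebbian feedforward weights combined with hierarchical anti-Hebbian lateral connections that decorrelate the outputs. The learning rates, toy data, and Oja-style normalization are assumptions for illustration; the chapter's specific linearization and modified lateral learning law are not reproduced here.

```python
# Illustrative sketch of a standard Rubner-Tavan Hebbian/anti-Hebbian PCA
# network. This is NOT the modified model of the chapter, whose lateral
# learning law is not detailed in the abstract; rates and data are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 500 samples of a zero-mean, correlated 5-dimensional signal,
# so it has a few dominant principal directions.
A = rng.normal(size=(5, 5))
X = rng.normal(size=(500, 5)) @ A
X -= X.mean(axis=0)

n_in, n_out = X.shape[1], 3                       # extract 3 principal components
W = rng.normal(scale=0.1, size=(n_out, n_in))     # Hebbian feedforward weights
U = np.zeros((n_out, n_out))                      # anti-Hebbian lateral weights (strictly lower-triangular)
eta, mu = 0.01, 0.05                              # assumed learning rates

for epoch in range(50):
    for x in X:
        # Hierarchical lateral structure: neuron i only sees outputs of neurons j < i.
        y = np.zeros(n_out)
        for i in range(n_out):
            y[i] = W[i] @ x + U[i, :i] @ y[:i]
        # Hebbian update with Oja-style normalization of each row of W.
        W += eta * np.outer(y, x)
        W /= np.linalg.norm(W, axis=1, keepdims=True)
        # Anti-Hebbian update decorrelates the outputs and drives U toward zero.
        for i in range(n_out):
            U[i, :i] -= mu * y[i] * y[:i]

# The rows of W should approximately span the leading principal subspace;
# W @ W.T is close to the identity once the outputs are decorrelated.
print(np.round(W @ W.T, 2))
```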
Description
In this work, a novel Rubner-type neural network architecture is presented. The proposed modifications of the anti-Hebbian connections and learning laws lead to a partially decoupled structure whose stability analysis can be performed without a time-scale hypothesis.