Authors: Berzal, José Andrés; Zufiria, Pedro
Date available: 2024-02-09
Date issued: 2006
ISBN: 978-84-611-190
URI: https://hdl.handle.net/10115/30242
Description: In this work a novel Rubner-type neural network architecture is presented. The proposed modifications of the anti-Hebbian connections and learning laws lead to a partially decoupled structure whose stability analysis can be performed without the need for a time-scale hypothesis.
Abstract: In this work a novel Hebbian neural network architecture for Principal Component Analysis is presented. The proposed network is obtained via a linearization and modification of the standard Rubner model. The new anti-Hebbian connections and learning laws define a partially decoupled network structure. This specific connectivity among the neurons allows for a stability analysis of the whole network without the need to assume a priori a time-scale hypothesis for the neurons' dynamics.
Language: eng
Subjects: Neural Networks; Hebbian and Rubner Neural Networks; Stability analysis; Principal Component Analysis; Artificial Intelligence Algorithms
Title: Convergence analysis of a linearized Rubner network with modified lateral weight behavior
Type: info:eu-repo/semantics/bookPart
Access rights: info:eu-repo/semantics/restrictedAccess