Local analysis of a new Rubner-type neural network via a DDT formulation
Date
2005
Authors
Journal title
Journal ISSN
Volume title
Publisher
IEEE
Abstract
In this paper, the behavior of the Rubner Hebbian artificial neural network is analyzed. Hebbian neural networks are employed in communications and signal processing applications, among others, due to their capability to implement principal component analysis (PCA). Different improvements over the original model due to Oja have been developed in the last two decades. Among them, the Sanger and Rubner models were designed to directly provide the eigenvectors of the correlation matrix. The behavior of these models has traditionally been studied through a continuous-time formulation whose validity is justified by analytical procedures that presume, among other hypotheses, a specific asymptotic behavior of the learning gain. In practical applications, these assumptions cannot be guaranteed. This paper addresses the study of a deterministic discrete-time (DDT) formulation of the Rubner net, which characterizes the average evolution of the net, preserving the discrete-time form of the original network and capturing the influence of the learning gain. In this way, the dynamical behavior of the Rubner model is analyzed in this more realistic context. The results thoroughly characterize the relationship between the learning gain and the eigenvalue structure of the correlation matrix.
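To illustrate the kind of formulation the abstract refers to, the following is a minimal sketch of a DDT iteration for a single Oja neuron: the stochastic input samples are replaced by their average effect through the correlation matrix `R`, while the discrete-time structure and the learning gain `eta` are preserved. The matrix `R`, the gain value, and the iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Assumed input-data correlation matrix (illustrative values).
R = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# DDT form of Oja's rule: w(k+1) = w(k) + eta * (R w(k) - (w(k)^T R w(k)) w(k)).
# The learning gain eta stays explicit, which is the point of the DDT analysis.
eta = 0.05
w = np.array([1.0, 0.2])
for _ in range(500):
    Rw = R @ w
    w = w + eta * (Rw - (w @ Rw) * w)

# For a sufficiently small gain, w aligns with the principal eigenvector
# of R and its norm approaches 1.
vals, vecs = np.linalg.eigh(R)
v1 = vecs[:, -1]
alignment = abs(w @ v1) / np.linalg.norm(w)
```

Too large a gain (relative to the largest eigenvalue of `R`) destabilizes this iteration, which is the kind of gain/eigenvalue interplay the paper characterizes.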
Description
The dynamic behavior of the newly proposed linearized Rubner net has been analyzed. The DDT model successfully captures the influence of the Hebbian and anti-Hebbian learning rates. Different cases of relative rate intensity have been addressed. The analysis concludes that common learning rates (one for the lateral anti-Hebbian weights and another for the direct Hebbian weights) cannot be employed. It has also been shown that the time-scale hypothesis requires both the Hebbian and anti-Hebbian learning rates to grow as the neuron index decreases, with the required values modulated by a factor depending on the eigenvalues of the input-data autocorrelation matrix.
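A rough sketch of what a Rubner-type DDT iteration with distinct per-neuron rates can look like is given below. It is not the paper's linearized model: the matrix `R`, the initial weights, and the rate values `eta`/`mu` (chosen larger for lower neuron indices, in the spirit of the time-scale conclusion above) are all illustrative assumptions.

```python
import numpy as np

# Assumed input-data autocorrelation matrix with well-separated eigenvalues.
R = np.diag([4.0, 2.0, 0.5])

n_out = 2
W = np.array([[1.0, 0.3, 0.1],
              [0.2, 1.0, 0.3]])     # direct Hebbian weights
U = np.zeros((n_out, n_out))        # lateral anti-Hebbian weights (j < i)

# Distinct per-neuron rates, larger for lower neuron indices
# (illustrative values, not taken from the paper).
eta = np.array([0.05, 0.02])        # Hebbian rates
mu = np.array([0.10, 0.05])         # anti-Hebbian rates

for _ in range(2000):
    # Effective projection vectors: y_i = c_i^T x, including lateral feedback.
    c = np.zeros((n_out, 3))
    for i in range(n_out):
        c[i] = W[i] + U[i, :i] @ c[:i]
    # Averaged (DDT) updates: E[y_i x] = R c_i and E[y_i y_j] = c_i^T R c_j.
    for i in range(n_out):
        Rci = R @ c[i]
        W[i] = W[i] + eta[i] * (Rci - (c[i] @ Rci) * W[i])  # Hebbian (Oja-style)
        U[i, :i] = U[i, :i] - mu[i] * (c[:i] @ Rci)         # anti-Hebbian decorrelation
```

With `R` diagonal, the rows of `W` should settle near the first two coordinate axes, i.e. the leading eigenvectors in decreasing eigenvalue order; the lateral anti-Hebbian term is what removes the first neuron's component from the second.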