Abstract

In this paper we consider replacing the output layer of Multi-Layer Perceptrons (MLPs) with local schemes for classification problems. To open the way to LMS-trainable models, and to subsequent adaptive schemes, we apply a trainable version of the classical k-Nearest Neighbour (kNN) classifier, named the kNN-Learning Vector Classifier. We derive the training formulas for the whole resulting structure and apply it to several classification benchmark problems. The experimental results give evidence of a nearly systematic advantage of our proposal over MLPs, as well as of its competitive performance with respect to Modular Neural Networks (MNNs), whose philosophy is similar to that of our approach.
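The idea of a trainable nearest-prototype output stage on top of an MLP hidden layer can be sketched as follows. This is only an illustrative approximation, not the paper's exact kNN-Learning Vector Classifier: the hidden layer here is a fixed random projection standing in for a trained MLP, and the prototypes are updated with a standard LVQ1-style rule (attract the nearest prototype on a correct match, repel it on a mismatch).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two well-separated Gaussian clouds in 2-D.
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Fixed random hidden layer (a stand-in for a trained MLP hidden layer).
W = rng.normal(size=(2, 8))

def hidden(x):
    return np.tanh(np.atleast_2d(x) @ W)

H = hidden(X)

# One prototype per class in hidden space, initialised at the class means,
# then refined with LVQ1 updates (an assumed simplification of the paper's
# trainable kNN output stage).
labels = np.array([0, 1])
protos = np.array([H[y == c].mean(axis=0) for c in labels])
lr = 0.05
for epoch in range(20):
    for h, t in zip(H, y):
        j = int(np.argmin(np.linalg.norm(protos - h, axis=1)))  # nearest prototype
        sign = 1.0 if labels[j] == t else -1.0                  # attract or repel
        protos[j] += sign * lr * (h - protos[j])

def predict(x):
    h = hidden(x)
    d = np.linalg.norm(h[:, None, :] - protos[None, :, :], axis=2)
    return labels[np.argmin(d, axis=1)]

acc = float((predict(X) == y).mean())
```

With clearly separated clusters such as these, the nearest-prototype stage classifies the training set almost perfectly; the point of the trainable variant is that the same prototype updates can be driven by an LMS-style error signal through the whole structure.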