Paper
21 March 2001
Cross-validation in fuzzy ARTMAP neural networks for large sample classification problems
Michael Georgiopoulos, Anna Koufakou, Georgios C. Anagnostopoulos, Takis Kasparis
Abstract
In this paper we examine the issue of overtraining in Fuzzy ARTMAP. Overtraining in Fuzzy ARTMAP manifests itself in two ways: (a) it degrades the generalization performance of Fuzzy ARTMAP as training progresses, and (b) it creates unnecessarily large Fuzzy ARTMAP neural network architectures. In this work we demonstrate that overtraining occurs in Fuzzy ARTMAP, and we propose a well-established remedy: cross-validation. In our experiments we compare the performance of Fuzzy ARTMAP trained (i) until the completion of training, (ii) for one epoch, and (iii) until its performance on a validation set is maximized. The experiments were performed on artificial and real databases. The conclusion drawn from these experiments is that cross-validation is a useful procedure for Fuzzy ARTMAP, because it produces smaller Fuzzy ARTMAP architectures with improved generalization performance. The trade-off is that cross-validation introduces additional computational complexity into the training phase of Fuzzy ARTMAP.
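The sketch below is not the authors' implementation; it is a minimal, illustrative Python rendering of the validation-based stopping scheme the abstract describes, built around a simplified Fuzzy ARTMAP classifier (complement coding, choice function, vigilance test, match tracking, fast learning). The class and parameter names (FuzzyARTMAP, rho, alpha, patience, train_with_validation) are assumptions introduced here for illustration.

```python
# A minimal sketch of training Fuzzy ARTMAP epoch by epoch and keeping the
# network whose accuracy on a held-out validation set is highest, in the
# spirit of the cross-validation scheme described in the abstract.
import copy
import numpy as np


class FuzzyARTMAP:
    """Simplified Fuzzy ARTMAP for classification (fast learning, beta = 1)."""

    def __init__(self, rho=0.75, alpha=0.001):
        self.rho = rho          # baseline vigilance
        self.alpha = alpha      # choice parameter
        self.w = []             # category weight vectors (complement-coded)
        self.labels = []        # class label attached to each category

    @staticmethod
    def _complement_code(x):
        # Inputs are assumed to lie in [0, 1]^d; complement coding doubles the dimension.
        return np.concatenate([x, 1.0 - x])

    def _train_pattern(self, x, y):
        I = self._complement_code(x)
        rho = self.rho                                    # match tracking starts at baseline vigilance
        if self.w:
            W = np.array(self.w)
            match = np.minimum(I, W).sum(axis=1)          # |I ^ w_j|
            T = match / (self.alpha + W.sum(axis=1))      # choice function
            for j in np.argsort(-T):                      # search categories by choice value
                if match[j] / I.sum() < rho:
                    continue                              # fails the vigilance test
                if self.labels[j] == y:
                    self.w[j] = np.minimum(I, self.w[j])  # fast learning: w <- I ^ w
                    return
                rho = match[j] / I.sum() + 1e-6           # match tracking: raise vigilance
        # No suitable category found: commit a new one.
        self.w.append(I.copy())
        self.labels.append(y)

    def train_epoch(self, X, Y):
        for x, y in zip(X, Y):
            self._train_pattern(x, y)

    def predict(self, X):
        W = np.array(self.w)
        out = []
        for x in X:
            I = self._complement_code(x)
            T = np.minimum(I, W).sum(axis=1) / (self.alpha + W.sum(axis=1))
            out.append(self.labels[int(np.argmax(T))])
        return np.array(out)


def train_with_validation(X_tr, Y_tr, X_val, Y_val, max_epochs=50, patience=3):
    """Train epoch by epoch; keep the network with the best validation accuracy."""
    net = FuzzyARTMAP()
    best_net, best_acc, stale = None, -1.0, 0
    for _ in range(max_epochs):
        net.train_epoch(X_tr, Y_tr)
        acc = float(np.mean(net.predict(X_val) == Y_val))
        if acc > best_acc:
            best_net, best_acc, stale = copy.deepcopy(net), acc, 0
        else:
            stale += 1
            if stale >= patience:   # stop once validation accuracy stops improving
                break
    return best_net, best_acc


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((600, 2))
    Y = (X[:, 0] + X[:, 1] > 1.0).astype(int)     # simple synthetic two-class problem
    net, acc = train_with_validation(X[:400], Y[:400], X[400:], Y[400:])
    print(f"categories: {len(net.w)}, validation accuracy: {acc:.3f}")
```

Stopping at the epoch with the best validation accuracy, rather than training to completion, is what keeps the committed-category count (and hence the architecture size) small in this setup.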
© (2001) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Michael Georgiopoulos, Anna Koufakou, Georgios C. Anagnostopoulos, and Takis Kasparis "Cross-validation in fuzzy ARTMAP neural networks for large sample classification problems", Proc. SPIE 4390, Applications and Science of Computational Intelligence IV, (21 March 2001); https://doi.org/10.1117/12.421155
KEYWORDS
Fuzzy logic
Databases
Neural networks
Chromium
Error control coding
Machine learning
Statistical modeling