Speeding up the Self-Organizing Feature Map Using Dynamic Subset Selection
- Source :
- Neural Processing Letters, 22:17-32
- Publication Year :
- 2005
- Publisher :
- Springer Science and Business Media LLC
Abstract
- An active learning algorithm is devised for training Self-Organizing Feature Maps on large data sets. Active learning algorithms recognize that not all exemplars are created equal. Thus, the concepts of exemplar age and difficulty are used to filter the original data set so that training epochs are conducted over only a small subset of it. The ensuing Hierarchical Dynamic Subset Selection algorithm introduces definitions of exemplar difficulty suited to an unsupervised learning context, and therefore appropriate Self-Organizing Map (SOM) stopping criteria. The algorithm is benchmarked on several real-world data sets with training-set exemplar counts in the region of 30--500 thousand. Cluster accuracy is demonstrated to be at least as good as that of the original SOM algorithm, while requiring a fraction of the computational overhead.
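- The abstract describes the approach only at a high level. A minimal sketch of the general idea follows, assuming quantization error as the per-exemplar "difficulty" measure and epochs-since-last-selection as "age"; the function name, parameters, and selection rule are illustrative assumptions, not the paper's exact hierarchical algorithm.

```python
import numpy as np

def train_som_dss(data, grid=(5, 5), epochs=10, subset_frac=0.2,
                  lr=0.5, sigma=1.5, seed=0):
    """Sketch: SOM training where each epoch visits only a subset of
    exemplars, chosen with probability proportional to difficulty * age."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    rows, cols = grid
    # Codebook: one weight vector per map node, random-initialized.
    weights = rng.random((rows * cols, d))
    # Node grid coordinates, used for the neighbourhood function.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)],
                      dtype=float)
    # Per-exemplar difficulty (last quantization error) and age
    # (epochs since last selection); start uniform.
    difficulty = np.ones(n)
    age = np.ones(n)
    k = max(1, int(subset_frac * n))
    for epoch in range(epochs):
        # Selection weight combines difficulty and age (assumed form).
        p = difficulty * age
        subset = rng.choice(n, size=k, replace=False, p=p / p.sum())
        age += 1          # every exemplar ages ...
        age[subset] = 1   # ... except the freshly selected ones
        decay = 1.0 - epoch / epochs  # linear learning-rate/radius decay
        for i in rng.permutation(subset):
            x = data[i]
            dists = np.linalg.norm(weights - x, axis=1)
            bmu = int(np.argmin(dists))
            difficulty[i] = dists[bmu]  # quantization error = difficulty
            # Gaussian neighbourhood update around the best-matching unit.
            h = np.exp(-np.sum((coords - coords[bmu]) ** 2, axis=1)
                       / (2 * (sigma * decay + 1e-9) ** 2))
            weights += (lr * decay) * h[:, None] * (x - weights)
    return weights
```

Because each epoch touches only `subset_frac` of the data, the per-epoch cost drops proportionally, which is the source of the speed-up the abstract reports.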
- Subjects :
- Artificial neural network
  Computer Networks and Communications
  Computer science
  Active learning (machine learning)
  General Neuroscience
  Machine learning
  Information extraction
  Pattern recognition
  Artificial Intelligence
  Feature (machine learning)
  Unsupervised learning
  Selection algorithm
  Software
Details
- ISSN :
- 1573-773X and 1370-4621
- Volume :
- 22
- Database :
- OpenAIRE
- Journal :
- Neural Processing Letters
- Accession number :
- edsair.doi...........a06e70d646b65ffb8b04c385f56e69f0
- Full Text :
- https://doi.org/10.1007/s11063-004-7775-6