[Untitled]
- Source:
- Neural Processing Letters, 10:151-159
- Publication Year:
- 1999
- Publisher:
- Springer Science and Business Media LLC, 1999.
Abstract
- The Self-Organizing Map (SOM) and Learning Vector Quantization (LVQ) algorithms are constructed in this work for variable-length and warped feature sequences. The novelty is to associate an entire feature vector sequence, instead of a single feature vector, as a model with each SOM node. Dynamic time warping is used to obtain time-normalized distances between sequences of different lengths. Starting from random initialization, ordered feature sequence maps then ensue, and Learning Vector Quantization can be used to fine-tune the prototype sequences for optimal class separation. The resulting SOM models, the prototype sequences, can then be used for both the recognition and the synthesis of patterns. Good results have been obtained in speaker-independent speech recognition.
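The matching step the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the Euclidean local distance, and the length normalization are assumptions; the paper's exact warping constraints may differ.

```python
import math

def dtw_distance(seq_a, seq_b):
    # Standard dynamic time warping between two feature-vector sequences
    # (lists of equal-dimension tuples), returning a time-normalized
    # distance so sequences of different lengths are comparable.
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])  # local Euclidean distance
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m] / (n + m)  # normalize by combined sequence length

def best_matching_node(x, prototypes):
    # Winner search of a sequence SOM: the node whose prototype sequence
    # has the smallest DTW distance to the input sequence x.
    dists = [dtw_distance(x, p) for p in prototypes]
    k = min(range(len(dists)), key=dists.__getitem__)
    return k, dists[k]
```

With each SOM node holding a prototype *sequence* rather than a single vector, training and recognition both reduce to this DTW-based winner search; LVQ fine-tuning then moves the winning prototype toward or away from the input depending on its class label.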
- Subjects:
- Self-organizing map
Dynamic time warping
Sequence
Learning vector quantization
Computer Networks and Communications
General Neuroscience
Feature vector
Computing Methodologies: Image Processing and Computer Vision
Vector quantization
Initialization
Pattern recognition
Computer Science: Sound
Artificial Intelligence
Feature (machine learning)
Artificial intelligence
Software
Mathematics
Details
- ISSN:
- 1370-4621
- Volume:
- 10
- Database:
- OpenAIRE
- Journal:
- Neural Processing Letters
- Accession number:
- edsair.doi...........74328d30864d804cacf9889997d7571d