
Self-organizing maps and learning vector quantization for feature sequences

Authors :
Teuvo Kohonen
Panu Somervuo
Source :
Neural Processing Letters. 10:151-159
Publication Year :
1999
Publisher :
Springer Science and Business Media LLC, 1999.

Abstract

The Self-Organizing Map (SOM) and Learning Vector Quantization (LVQ) algorithms are constructed in this work for variable-length and warped feature sequences. The novelty is to associate an entire feature vector sequence, instead of a single feature vector, as a model with each SOM node. Dynamic time warping is used to obtain time-normalized distances between sequences of different lengths. Starting with random initialization, ordered feature sequence maps then ensue, and Learning Vector Quantization can be used to fine-tune the prototype sequences for optimal class separation. The resulting SOM models, the prototype sequences, can then be used for the recognition as well as the synthesis of patterns. Good results have been obtained in speaker-independent speech recognition.
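The core matching step described in the abstract — comparing a variable-length input sequence to prototype sequences via a time-normalized dynamic time warping (DTW) distance, then selecting the best-matching node — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the normalization by the summed sequence lengths and the symmetric step pattern are common DTW choices assumed here.

```python
import numpy as np

def dtw_distance(a, b):
    """Time-normalized DTW distance between two feature sequences,
    given as 2-D arrays of shape (time, feature_dim); the sequences
    may have different lengths."""
    n, m = len(a), len(b)
    # D[i, j] = minimal accumulated cost aligning a[:i] with b[:j]
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    # Normalize by a path-length bound so sequences of different
    # lengths are comparable (one common convention; an assumption here).
    return D[n, m] / (n + m)

def best_matching_node(x, prototypes):
    """Index of, and distance to, the prototype sequence (SOM node
    model) closest to input sequence x under the DTW distance."""
    dists = [dtw_distance(x, p) for p in prototypes]
    i = int(np.argmin(dists))
    return i, dists[i]
```

In a full sequence-SOM, the winner and its map neighbors would then be adapted toward the input along the warping path; the sketch above covers only the distance computation and winner search.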

Details

ISSN :
1370-4621
Volume :
10
Journal :
Neural Processing Letters