Improved Vector Quantization Approach for Discrete HMM Speech Recognition System
Mohamed Debyeche¹, Jean-Paul Haton², and Amrane Houacine¹
¹Faculty of Electronics and Computer Sciences, USTHB, Algeria
²LORIA/INRIA-Lorraine, France
Abstract: This paper presents an improved Vector Quantization (VQ) approach for discrete Hidden Markov Models (HMMs). The improved approach performs an optimal distribution of VQ codebook components over the HMM states. This technique, which we call Distributed Vector Quantization (DVQ) of hidden Markov models, unifies the acoustic micro-structure and the phonetic macro-structure during the estimation of HMM parameters. The DVQ technique is implemented in two variants: the first uses the K-means algorithm (K-means-DVQ) to optimize the VQ, while the second exploits the classification capability of Neural Networks (NN-DVQ) for the same purpose. The proposed variants are compared with the HMM-based baseline system in experiments on the recognition of specific Arabic consonants. The results show that the distributed vector quantization technique increases the performance of the discrete HMM system while maintaining the decoding speed of the models.
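To make the codebook-construction step concrete, the sketch below shows plain K-means vector quantization used as a discrete HMM front end: a codebook is trained from acoustic feature vectors, and each frame is then mapped to the index of its nearest codeword. This is only an illustration of standard VQ under assumed names and parameters; it does not implement the paper's DVQ distribution of codebook components over HMM states.

```python
# Minimal K-means VQ sketch for a discrete HMM front end.
# Names, sizes, and the synthetic data are illustrative assumptions,
# not taken from the paper; the DVQ state-distribution step is not shown.
import numpy as np

def train_codebook(features, codebook_size=64, n_iter=20, seed=0):
    """Build a VQ codebook from acoustic feature vectors with plain K-means."""
    rng = np.random.default_rng(seed)
    # Initialise centroids from randomly chosen training vectors.
    codebook = features[rng.choice(len(features), codebook_size, replace=False)]
    for _ in range(n_iter):
        # Assign every frame to its nearest centroid (Euclidean distance).
        dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Re-estimate each centroid as the mean of its assigned frames.
        for k in range(codebook_size):
            members = features[labels == k]
            if len(members) > 0:
                codebook[k] = members.mean(axis=0)
    return codebook

def quantize(features, codebook):
    """Map each frame to its nearest codeword index, giving the discrete
    observation sequence consumed by a discrete HMM."""
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Usage: 1000 frames of 12-dimensional cepstral-like features (synthetic).
frames = np.random.randn(1000, 12)
cb = train_codebook(frames, codebook_size=64)
symbols = quantize(frames, cb)  # discrete symbols for HMM training/decoding
```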
Keywords: Arabic language, hidden Markov model, vector quantization, neural network, speech recognition.
Received February 7, 2006; accepted April 26, 2006