Accession Number:
AD1038537
Title:
Investigation of Back-off Based Interpolation Between Recurrent Neural Network and N-gram Language Models (Author's Manuscript)
Corporate Author:
University of Cambridge Cambridge United Kingdom
Report Date:
2016-02-11
Abstract:
Recurrent neural network language models (RNNLMs) have become an increasingly popular choice for speech and language processing tasks, including automatic speech recognition (ASR). As the generalization patterns of RNNLMs and n-gram LMs are inherently different, RNNLMs are usually combined with n-gram LMs via fixed-weight linear interpolation in state-of-the-art ASR systems. However, previous work does not fully exploit the difference in modelling power between RNNLMs and n-gram LMs as the n-gram level changes. In order to fully exploit the detailed n-gram level complementary attributes of the two LMs, a back-off based compact representation of n-gram dependent interpolation weights is proposed in this paper. This approach allows weight parameters to be robustly estimated on limited data. Experimental results are reported on three tasks with varying amounts of training data. Small and consistent improvements in both perplexity and WER were obtained using the proposed interpolation approach over the baseline fixed-weight linear interpolation.
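The contrast the abstract draws can be sketched in code: standard fixed-weight linear interpolation uses one global weight, while the proposed approach ties the weight to how far the n-gram LM had to back off for the current context. The probabilities, the per-level weights, and the back-off lookup below are illustrative stand-ins, not the paper's actual models or estimation procedure.

```python
# Minimal sketch: fixed-weight vs. back-off-dependent linear
# interpolation between an RNNLM and an n-gram LM. All numbers
# here are hypothetical, chosen only to illustrate the mechanism.

def interpolate_fixed(p_rnn, p_ngram, lam=0.5):
    """Baseline: a single global interpolation weight."""
    return lam * p_rnn + (1.0 - lam) * p_ngram

def interpolate_backoff(p_rnn, p_ngram, backoff_level, weights):
    """Weight chosen per back-off level of the n-gram LM.

    `weights` maps the n-gram order actually matched for this context
    (e.g. 4 = full 4-gram hit, 1 = backed off to unigram) to an RNNLM
    weight, so contexts the n-gram LM models poorly can lean more on
    the RNNLM. Sharing one weight per level keeps the parameter set
    compact enough to estimate robustly on limited data.
    """
    lam = weights[backoff_level]
    return lam * p_rnn + (1.0 - lam) * p_ngram

# Hypothetical per-level weights: trust the RNNLM more when the
# n-gram LM had to back off to a shorter context.
weights = {4: 0.4, 3: 0.5, 2: 0.6, 1: 0.7}

p_fixed = interpolate_fixed(0.02, 0.01)               # 0.5*0.02 + 0.5*0.01 = 0.015
p_boff = interpolate_backoff(0.02, 0.01, 1, weights)  # 0.7*0.02 + 0.3*0.01 = 0.017
```

For a word whose history only matched at the unigram level, the back-off-dependent scheme shifts mass toward the RNNLM prediction, which is the n-gram-level complementarity the paper exploits.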
Descriptive Note:
Journal Article
Supplementary Note:
Presented at the 2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU), held in Scottsdale, AZ on 12-17 December 2015.
Pages:
6
Distribution Statement:
Approved for Public Release.
File Size:
0.19MB