Improving Hybrid CTC/Attention End-to-End Speech Recognition with Pretrained Acoustic and Language Models

Bibliographic Details
Published in: 2021 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU), pp. 76-82
Main Authors: Deng, Keqi; Cao, Songjun; Zhang, Yike; Ma, Long
Format: Conference Proceeding
Language: English
Published: IEEE, 13.12.2021
Summary: Recently, self-supervised pretraining has achieved impressive results in end-to-end (E2E) automatic speech recognition (ASR). However, the dominant sequence-to-sequence (S2S) E2E model still struggles to fully exploit self-supervised pretraining because its decoder is conditioned on acoustic representations and therefore cannot be pretrained separately. In this paper, we propose a pretrained Transformer (Preformer) S2S ASR architecture based on hybrid CTC/attention E2E models to fully utilize pretrained acoustic models (AMs) and language models (LMs). In our framework, the encoder is initialized with a pretrained AM (wav2vec2.0), and the Preformer leverages CTC as an auxiliary task during both training and inference. Furthermore, we design a one-cross decoder (OCD) that relaxes the dependence on acoustic representations so that it can be initialized with a pretrained LM (DistilGPT2). Experiments are conducted on the AISHELL-1 corpus and achieve a 4.6% character error rate (CER) on the test set. Compared with our vanilla hybrid CTC/attention Transformer baseline, the proposed CTC/attention-based Preformer yields a 27% relative CER reduction. To the best of our knowledge, this is the first work to utilize both a pretrained AM and a pretrained LM in an S2S ASR system.
DOI: 10.1109/ASRU51503.2021.9688009
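
To make the architecture described in the summary concrete, below is a minimal PyTorch sketch of the core idea, not the authors' implementation. The checkpoint names ("facebook/wav2vec2-base", "distilgpt2"), the single cross-attention fusion standing in for the one-cross decoder, the CTC weight of 0.3, and the shared output vocabulary across both branches are all illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, Wav2Vec2Model


class PreformerSketch(nn.Module):
    """Hybrid CTC/attention model with pretrained AM and LM (illustrative)."""

    def __init__(self, vocab_size: int, ctc_weight: float = 0.3):
        super().__init__()
        # Encoder initialized from a pretrained acoustic model (wav2vec2.0).
        self.encoder = Wav2Vec2Model.from_pretrained("facebook/wav2vec2-base")
        enc_dim = self.encoder.config.hidden_size
        # Auxiliary CTC branch over the encoder outputs.
        self.ctc_head = nn.Linear(enc_dim, vocab_size)
        self.ctc_loss = nn.CTCLoss(blank=0, zero_infinity=True)
        # Decoder body initialized from a pretrained LM (DistilGPT2).
        self.lm = GPT2LMHeadModel.from_pretrained("distilgpt2")
        dec_dim = self.lm.config.n_embd
        # One cross-attention step fusing text states with acoustic states.
        # The paper's OCD relaxes the decoder's dependence on acoustics so
        # the LM weights remain usable; a single nn.MultiheadAttention layer
        # stands in for that mechanism here (an approximation, not the OCD).
        self.enc_proj = nn.Linear(enc_dim, dec_dim)
        self.cross_attn = nn.MultiheadAttention(dec_dim, num_heads=4,
                                                batch_first=True)
        # Output head over the task vocabulary (a simplification; the LM's
        # own tokenizer and head could be retained instead).
        self.out = nn.Linear(dec_dim, vocab_size)
        self.ctc_weight = ctc_weight

    def forward(self, waveform, input_ids, targets, target_lengths):
        # waveform: (B, T_audio) raw audio, assumed unpadded for brevity.
        # input_ids: (B, T_text) decoder inputs; targets: (B, T_text) labels
        # padded with 0 (0 doubles as the CTC blank and the CE ignore index).
        enc = self.encoder(waveform).last_hidden_state          # (B, T', D_e)
        # CTC branch: log-probs shaped (T', B, V) as nn.CTCLoss expects.
        log_probs = self.ctc_head(enc).log_softmax(-1).transpose(0, 1)
        input_lengths = torch.full((enc.size(0),), enc.size(1),
                                   dtype=torch.long)
        loss_ctc = self.ctc_loss(log_probs, targets, input_lengths,
                                 target_lengths)
        # Attention branch: LM hidden states attend once to the audio.
        dec = self.lm.transformer(input_ids).last_hidden_state  # (B, T, D_d)
        mem = self.enc_proj(enc)
        fused, _ = self.cross_attn(dec, mem, mem)
        logits = self.out(dec + fused)                          # (B, T, V)
        loss_attn = nn.functional.cross_entropy(
            logits.transpose(1, 2), targets, ignore_index=0)
        # Multi-task objective used in hybrid CTC/attention training:
        # L = lambda * L_CTC + (1 - lambda) * L_attention.
        return self.ctc_weight * loss_ctc + (1 - self.ctc_weight) * loss_attn
```

The sketch covers only the training objective; the paper additionally uses the CTC branch during inference (joint CTC/attention decoding), which would require a separate beam-search routine combining both scores.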