Restricted Boltzmann machines for vector representation of speech in speaker recognition
Published in | Computer Speech & Language, Vol. 47, pp. 16-29
---|---
Main Authors | 
Format | Journal Article
Language | English
Published | Elsevier Ltd, 01.01.2018
Summary:
• An efficient low-dimensional vector representation of speech based on GMM and RBM, referred to as GMM–RBM vectors, is proposed.
• A Universal RBM (URBM) is trained to learn the total speaker and session variability among background GMM supervectors.
• A variant of Rectified Linear Units (ReLU), referred to as variable ReLU (VReLU), is proposed to train the URBM efficiently.
• The URBM is then used to transform unseen supervectors to the proposed GMM–RBM vectors.
Over the last few years, i-vectors have been the state-of-the-art technique in speaker recognition. Recent advances in Deep Learning (DL) technology have improved the quality of i-vectors, but the DL techniques in use are computationally expensive and need phonetically labeled background data. The aim of this work is to develop an efficient alternative vector representation of speech that keeps the computational cost as low as possible and avoids phonetic labels, which are not always accessible. The proposed vectors are based on both Gaussian Mixture Models (GMM) and Restricted Boltzmann Machines (RBM) and are referred to as GMM–RBM vectors. The role of the RBM is to learn the total speaker and session variability among background GMM supervectors. This RBM, referred to as the Universal RBM (URBM), is then used to transform unseen supervectors into the proposed low-dimensional vectors. The use of different activation functions for training the URBM and of different transformation functions for extracting the proposed vectors is investigated. Finally, a variant of Rectified Linear Units (ReLU), referred to as variable ReLU (VReLU), is proposed. Experiments on the core test condition 5 of NIST SRE 2010 show that results comparable to those of conventional i-vectors are achieved with a clearly lower computational load in the vector extraction process.
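The abstract states that the trained URBM transforms unseen GMM supervectors into low-dimensional GMM–RBM vectors, but it does not give the transform's exact form. Below is a minimal NumPy sketch under the usual RBM reading of that step, h = f(Ws + b). The VReLU definition (a ReLU with a randomly drawn per-unit threshold during training), the dimensions, and all function and variable names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def vrelu(x, sigma=1.0, training=True, rng=rng):
    """Assumed form of variable ReLU (VReLU): a ReLU whose cut-off is
    drawn at random per unit during training; plain ReLU otherwise."""
    if training:
        theta = rng.normal(0.0, sigma, size=x.shape)  # random thresholds (assumption)
        return np.where(x > theta, x, 0.0)
    return np.maximum(x, 0.0)

def extract_gmm_rbm_vector(supervector, W, b, training=False):
    """Project a GMM supervector into the GMM-RBM vector space using the
    URBM's visible-to-hidden transform h = f(W s + b)."""
    return vrelu(W @ supervector + b, training=training)

# Toy usage: compress a 1000-dim supervector to a 400-dim GMM-RBM vector.
D, H = 1000, 400
W = rng.normal(0.0, 0.01, size=(H, D))  # URBM weights (would come from CD training)
b = np.zeros(H)                         # hidden-unit biases
s = rng.normal(size=D)                  # an unseen background GMM supervector
v = extract_gmm_rbm_vector(s, W, b)
print(v.shape)  # (400,)
```

Because extraction here is a single affine projection plus an element-wise nonlinearity, it is cheap relative to i-vector extraction, which is consistent with the lower computational load the abstract reports.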
ISSN: 0885-2308, 1095-8363
DOI: 10.1016/j.csl.2017.06.007