Track and Noise Separation Based on the Universal Codebook and Enhanced Speech Recognition Using Hybrid Deep Learning Method

Bibliographic Details
Published in IEEE Access, Vol. 11, pp. 120707-120720
Main Authors Kumer, S. V. Aswin, Gogu, Lakshmi Bharath, Mohan, E., Maloji, Suman, Natarajan, Balaji, Sambasivam, G., Tyagi, Vaibhav Bhushan
Format Journal Article
Language English
Published Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023

More Information
Summary:Deep learning, a branch of machine learning, is widely used today to achieve accurate voice and speech recognition by building robust algorithms from training data. Machine learning algorithms can also separate noise from the original speech and separate individual tracks within an audio signal. In this paper, the implementation targets voice assistants: it separates the tracks and the noise from multiple original audio sources playing simultaneously, using speech enhancement and a universal codebook. To this end, a hybrid deep learning algorithm has been developed, training data sets have been created, and speech recognition accuracy has been achieved across a variety of voice assistants. Voice assistants often capture speech mixed with noise and music, which can cause malfunctions in the devices those assistants control. The proposed hybrid deep learning model combines generative adversarial networks from deep learning with the blind source separation method from the multi-channel model.
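The blind source separation step mentioned in the summary can be illustrated with a minimal FastICA sketch in NumPy. This is a generic illustration of the technique, not the authors' implementation; the toy sources, the mixing matrix, and the function names are all assumptions chosen for the example.

```python
import numpy as np

def whiten(X):
    # Center and whiten the observations (rows = channels, columns = samples).
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    return (E @ np.diag(1.0 / np.sqrt(d)) @ E.T) @ X

def fast_ica(X, n_iter=200, tol=1e-8, seed=0):
    # Symmetric FastICA with the tanh nonlinearity (a standard choice).
    Z = whiten(X)
    n, m = Z.shape
    W = np.random.default_rng(seed).standard_normal((n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        # Fixed-point update: E[g(Wz) z^T] - diag(E[g'(Wz)]) W
        W_new = (G @ Z.T) / m - np.diag((1.0 - G**2).mean(axis=1)) @ W
        # Symmetric decorrelation: project W back onto orthogonal matrices.
        U, _, Vt = np.linalg.svd(W_new)
        W_new = U @ Vt
        converged = np.max(np.abs(np.abs(np.diag(W_new @ W.T)) - 1.0)) < tol
        W = W_new
        if converged:
            break
    return W @ Z  # estimated sources (up to permutation, sign, and scale)

# Two toy "tracks": a sine tone and a sawtooth, mixed into two channels.
t = np.linspace(0, 1, 4000)
s1 = np.sin(2 * np.pi * 5 * t)
s2 = 2 * ((3 * t) % 1) - 1                  # sawtooth wave
S = np.vstack([s1, s2])
A = np.array([[0.6, 0.4], [0.45, 0.55]])    # unknown mixing matrix
X = A @ S                                   # observed two-channel mixture
S_hat = fast_ica(X)
```

Each row of `S_hat` should correlate strongly (up to sign and order) with one of the original tracks, which is the core ambiguity of any blind separation method; the paper's hybrid model pairs this idea with a GAN-based enhancement stage rather than a plain ICA.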
ISSN:2169-3536
DOI:10.1109/ACCESS.2023.3328208