Spectral subtraction-based speech enhancement for cochlear implant patients in background noise


Bibliographic Details
Published in: The Journal of the Acoustical Society of America, Vol. 117, No. 3, pp. 1001–1004
Main Authors: Yang, Li-Ping; Fu, Qian-Jie
Format: Journal Article
Language: English
Published: Woodbury, NY: Acoustical Society of America, 01.03.2005
American Institute of Physics
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/1.1852873


More Information
Summary: A single-channel speech enhancement algorithm utilizing speech pause detection and nonlinear spectral subtraction is proposed for cochlear implant patients in the present study. The spectral subtraction algorithm estimates the short-time spectral magnitude of speech by subtracting the estimated noise spectral magnitude from the noisy speech spectral magnitude. The artifacts produced by spectral subtraction (such as “musical noise”) were significantly reduced by combining a variance-reduced gain function with spectral flooring. Sentence recognition by seven cochlear implant subjects was tested under different noisy listening conditions (speech-shaped noise and 6-talker speech babble at +9, +6, +3, and 0 dB SNR) with and without the speech enhancement algorithm. For speech-shaped noise, performance for all subjects at all SNRs was significantly improved by the speech enhancement algorithm; for speech babble, performance was only modestly improved. The results suggest that the proposed speech enhancement algorithm may be beneficial for implant users in noisy listening conditions.
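The subtraction-and-flooring step described in the summary can be sketched as follows. This is a generic illustration of magnitude spectral subtraction with a spectral floor, not the authors' exact variance-reduced gain function; the function name and the `alpha` (over-subtraction) and `beta` (floor) parameters are illustrative assumptions.

```python
import numpy as np

def spectral_subtract(noisy_mag, noise_mag, alpha=2.0, beta=0.02):
    """Magnitude spectral subtraction with spectral flooring (sketch).

    noisy_mag -- short-time spectral magnitude of the noisy speech
    noise_mag -- estimated noise spectral magnitude (e.g. averaged over
                 frames flagged as speech pauses by a pause detector)
    alpha     -- over-subtraction factor (assumed parameter)
    beta      -- spectral floor: bins that would go negative or near zero
                 are held at beta * noise_mag instead of being clipped to
                 zero, which is what suppresses "musical noise" artifacts
    """
    diff = noisy_mag - alpha * noise_mag
    return np.maximum(diff, beta * noise_mag)
```

In a full enhancement chain, the floored magnitude would be recombined with the phase of the noisy signal and resynthesized via an overlap-add inverse STFT; only the magnitude-domain step is shown here.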