Spatial attentional behavior analysis based on cognitive style during speech-in-noise task

Bibliographic Details
Published in: The Journal of the Acoustical Society of America, Vol. 153, No. 3, p. A336
Main Authors: Takeuchi, Akira; Shim, Hwan; Choi, Inyong; Kim, Sungyoung
Format: Journal Article
Language: English
Published: 01.03.2023

Summary: We designed a spatial selective attention task to investigate a listener's speech understanding ability under various masking-noise conditions. The masking noise stream, a mixture of music and unintelligible speech, was played from the front-right and back directions. The target speech stream, which began with the auditory cue "Ready" followed by color and number information (for example, "Ready Blue Five"), was played from the front-left direction. A listener's attentional ability was evaluated through the accuracy of color and number selection under each combination of SNR and spatial masker location. Based on a pre-experimental analysis, we formed two subject groups with distinct cognitive styles: independent-analytic and interdependent-holistic. An analysis of variance (ANOVA) revealed a significant interaction between group and SNR. Moreover, the independent-analytic group maintained their attention regardless of the masker's spatial location, whereas the interdependent-holistic group scored poorly when the masker was positioned to the rear. This discrepancy between the two groups grew significant as the task became more difficult at lower SNRs. These findings support the view that individual differences associated with subconscious cognitive constructs influence speech-in-noise understanding.
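The group × SNR interaction reported above can be illustrated with a minimal two-way ANOVA sketch. The data below are entirely hypothetical (the study's cell means, sample sizes, and SNR levels are not given in the abstract); the example only shows how an interaction term is computed when one group's accuracy degrades faster at low SNR.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layout (not the study's actual data): accuracy scores for
# 2 cognitive-style groups x 3 SNR levels, n listeners per cell.
groups, snrs, n = 2, 3, 20
# Assumed cell means: the second (interdependent-holistic) row degrades
# faster as SNR drops, which is what produces a group x SNR interaction.
cell_means = np.array([[0.90, 0.85, 0.80],
                       [0.90, 0.75, 0.55]])
data = cell_means[:, :, None] + rng.normal(0.0, 0.05, (groups, snrs, n))

grand = data.mean()
gm = data.mean(axis=(1, 2))   # per-group means
sm = data.mean(axis=(0, 2))   # per-SNR means
cm = data.mean(axis=2)        # per-cell means

# Sums of squares for the two main effects, the interaction, and error.
ss_group = snrs * n * ((gm - grand) ** 2).sum()
ss_snr = groups * n * ((sm - grand) ** 2).sum()
ss_inter = n * ((cm - gm[:, None] - sm[None, :] + grand) ** 2).sum()
ss_err = ((data - cm[:, :, None]) ** 2).sum()

df_inter = (groups - 1) * (snrs - 1)       # 2
df_err = groups * snrs * (n - 1)           # 114
f_inter = (ss_inter / df_inter) / (ss_err / df_err)
print(f"F(group x SNR) with df=({df_inter}, {df_err}): {f_inter:.1f}")
```

With these assumed means the interaction F statistic is large, mirroring the qualitative pattern the abstract describes; the real study's effect sizes and SNR conditions are not reproduced here.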
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/10.0019065