Neural encoding of melodic expectations in music across EEG frequency bands

Bibliographic Details
Published in: The European Journal of Neuroscience, Vol. 60, No. 11, pp. 6734–6749
Main Authors: Galeano‐Otálvaro, Juan‐Daniel; Martorell, Jordi; Meyer, Lars; Titone, Lorenzo
Format: Journal Article
Language: English
Published: France: Wiley Subscription Services, Inc. (John Wiley and Sons Inc), 01.12.2024
Summary: The human brain tracks regularities in the environment and extrapolates these to predict future events. Prior work on music cognition suggests that low‐frequency (1–8 Hz) brain activity encodes melodic predictions beyond the stimulus acoustics. Building on this work, we aimed to disentangle the frequency‐specific neural dynamics linked to melodic prediction uncertainty (modelled as entropy) and prediction error (modelled as surprisal) for temporal (note onset) and content (note pitch) information. Using multivariate temporal response function (mTRF) models, we re‐analysed the electroencephalogram (EEG) of 20 subjects (10 musicians) who listened to Western tonal music. Our results show that melodic expectation metrics improve EEG reconstruction accuracy in all frequency bands below the gamma range (< 30 Hz). Crucially, entropy contributed more strongly than surprisal to the enhancement in reconstruction accuracy in all frequency bands. Additionally, we found that the encoding of temporal, but not content, information metrics was not limited to low frequencies but extended to higher frequencies (> 8 Hz). An analysis of the TRF weights revealed that the temporal predictability of a note (entropy of note onset) may be encoded in delta‐ (1–4 Hz) and beta‐band (12–30 Hz) brain activity prior to the stimulus, suggesting that these frequency bands are associated with temporal predictions. Strikingly, we also found that melodic expectations selectively enhanced EEG reconstruction accuracy in the beta band for musicians and in the alpha band (8–12 Hz) for non‐musicians, suggesting that musical expertise influences the neural dynamics underlying predictive processing in music cognition.

Graphical abstract: We re‐analysed the band‐passed EEG (top right) of subjects who listened to music using multivariate temporal response functions (mTRF). Melodic expectation metrics beyond acoustics (top left) enhanced EEG reconstruction accuracy in all frequency bands below the gamma band (< 30 Hz). Entropy, not surprisal, metrics drove the effect in all bands (bottom left). The enhancement was stronger in the beta band (12–30 Hz) for musicians and in the alpha band (8–12 Hz) for non‐musicians (bottom right).
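The mTRF approach summarised above maps time‐lagged stimulus features (e.g., acoustics plus entropy and surprisal regressors) onto the EEG and scores the model by how well it reconstructs the recorded signal. The following is a minimal sketch of that idea using ridge regression in NumPy; it is not the authors' pipeline (which used cross‐validated mTRF models on band‐passed EEG), and the function names, lag convention, and regularisation value here are illustrative assumptions.

```python
import numpy as np

def lagged_design(stim, lags):
    """Build a time-lagged design matrix: one column per (feature, lag).

    stim: (n_times, n_features) stimulus feature matrix.
    lags: list of non-negative integer lags in samples; lag k means the
          feature at time t - k is used to predict the EEG at time t.
    """
    n, f = stim.shape
    X = np.zeros((n, f * len(lags)))
    for j, lag in enumerate(lags):
        shifted = np.roll(stim, lag, axis=0)
        shifted[:lag] = 0.0  # zero out samples wrapped from the end
        X[:, j * f:(j + 1) * f] = shifted
    return X

def trf_fit(stim, eeg, lags, lam=1.0):
    """Ridge-regression TRF weights mapping lagged features to EEG channels."""
    X = lagged_design(stim, lags)
    XtX = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ eeg)

def reconstruction_accuracy(stim, eeg, weights, lags):
    """Pearson correlation per channel between predicted and actual EEG."""
    pred = lagged_design(stim, lags) @ weights
    pz = (pred - pred.mean(axis=0)) / pred.std(axis=0)
    ez = (eeg - eeg.mean(axis=0)) / eeg.std(axis=0)
    return (pz * ez).mean(axis=0)
```

In this framing, the paper's key comparison amounts to fitting one model with acoustic features only and another with acoustics plus expectation metrics, then testing whether the added regressors raise the per‐band reconstruction accuracy.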
Bibliography: Edited by Edmund Lalor
ISSN: 0953-816X
EISSN: 1460-9568
DOI: 10.1111/ejn.16581