Increased Connectivity among Sensory and Motor Regions during Visual and Audiovisual Speech Perception
In everyday conversation, we usually process the talker's face as well as the sound of the talker's voice. Access to visual speech information is particularly useful when the auditory signal is degraded. Here, we used fMRI to monitor brain activity while adult humans (n = 60) were presented...
Saved in:
Published in | The Journal of Neuroscience Vol. 42; no. 3; pp. 435-442 |
Main Authors | Peelle, Jonathan E; Spehar, Brent; Jones, Michael S; McConkey, Sarah; Myerson, Joel; Hale, Sandra; Sommers, Mitchell S; Tye-Murray, Nancy |
Format | Journal Article |
Language | English |
Published | United States: Society for Neuroscience, 19.01.2022 |
Subjects | |
Online Access | Get full text |
Abstract | In everyday conversation, we usually process the talker's face as well as the sound of the talker's voice. Access to visual speech information is particularly useful when the auditory signal is degraded. Here, we used fMRI to monitor brain activity while adult humans (n
= 60) were presented with visual-only, auditory-only, and audiovisual words. The audiovisual words were presented in quiet and in several signal-to-noise ratios. As expected, audiovisual speech perception recruited both auditory and visual cortex, with some evidence for increased recruitment of premotor cortex in some conditions (including in substantial background noise). We then investigated neural connectivity using psychophysiological interaction analysis with seed regions in both primary auditory cortex and primary visual cortex. Connectivity between auditory and visual cortices was stronger in audiovisual conditions than in unimodal conditions, including a wide network of regions in posterior temporal cortex and prefrontal cortex. In addition to whole-brain analyses, we also conducted a region-of-interest analysis on the left posterior superior temporal sulcus (pSTS), implicated in many previous studies of audiovisual speech perception. We found evidence for both activity and effective connectivity in pSTS for visual-only and audiovisual speech, although these were not significant in whole-brain analyses. Together, our results suggest a prominent role for cross-region synchronization in understanding both visual-only and audiovisual speech that complements activity in integrative brain regions like pSTS.
SIGNIFICANCE STATEMENT In everyday conversation, we usually process the talker's face as well as the sound of the talker's voice. Access to visual speech information is particularly useful when the auditory signal is hard to understand (e.g., background noise). Prior work has suggested that specialized regions of the brain may play a critical role in integrating information from visual and auditory speech. Here, we show that a complementary mechanism relying on synchronized brain activity among sensory and motor regions may also play a critical role. These findings encourage reconceptualizing audiovisual integration in the context of coordinated network activity. |
Author | Peelle, Jonathan E; Myerson, Joel; Jones, Michael S; McConkey, Sarah; Tye-Murray, Nancy; Hale, Sandra; Sommers, Mitchell S; Spehar, Brent |
Author_xml | 1. Peelle, Jonathan E (ORCID 0000-0001-9194-854X; jpeelle@wustl.edu), Department of Otolaryngology, Washington University in St. Louis, St. Louis, Missouri 63110; 2. Spehar, Brent, Department of Otolaryngology, Washington University in St. Louis, St. Louis, Missouri 63110; 3. Jones, Michael S, Department of Otolaryngology, Washington University in St. Louis, St. Louis, Missouri 63110; 4. McConkey, Sarah, Department of Otolaryngology, Washington University in St. Louis, St. Louis, Missouri 63110; 5. Myerson, Joel, Department of Psychological and Brain Sciences, Washington University in St. Louis, St. Louis, Missouri 63130; 6. Hale, Sandra, Department of Psychological and Brain Sciences, Washington University in St. Louis, St. Louis, Missouri 63130; 7. Sommers, Mitchell S, Department of Psychological and Brain Sciences, Washington University in St. Louis, St. Louis, Missouri 63130; 8. Tye-Murray, Nancy, Department of Otolaryngology, Washington University in St. Louis, St. Louis, Missouri 63110 |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/34815317 (View this record in MEDLINE/PubMed) |
CitedBy_id | 10.1126/sciadv.adg7056; 10.1121/10.0015262; 10.1093/cercor/bhac277; 10.3390/brainsci13081126 |
ContentType | Journal Article |
Copyright | Copyright © 2022 the authors. Copyright Society for Neuroscience Jan 19, 2022 Copyright © 2022 the authors 2022 |
DOI | 10.1523/JNEUROSCI.0114-21.2021 |
Discipline | Anatomy & Physiology |
EISSN | 1529-2401 |
EndPage | 442 |
ExternalDocumentID | 10_1523_JNEUROSCI_0114_21_2021 34815317 |
Genre | Journal Article Research Support, N.I.H., Extramural |
GrantInformation | NIDCD NIH HHS: R01 DC016594; NIA NIH HHS: R56 AG018029 |
ISSN | 0270-6474 |
IsDoiOpenAccess | false |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 3 |
Keywords | language; speech; speechreading; audiovisual integration; lipreading |
Language | English |
License | Copyright © 2022 the authors. SfN exclusive license. |
LinkModel | DirectLink |
Notes | Author contributions: J.E.P., B.S., J.M., S.H., M.S.S., and N.T.-M. designed research; J.E.P., B.S., M.S.J., and S.M. performed research; J.E.P. and M.S.J. analyzed data; J.E.P., B.S., M.S.J., S.M., J.M., S.H., M.S.S., and N.T.-M. wrote the paper.
ORCID | 0000-0001-9194-854X |
OpenAccessLink | https://www.jneurosci.org/content/jneuro/42/3/435.full.pdf |
PMID | 34815317 |
PQID | 2626030138 |
PQPubID | 2049535 |
PageCount | 8 |
ParticipantIDs | pubmedcentral_primary_oai_pubmedcentral_nih_gov_8802926 proquest_miscellaneous_2601994647 proquest_journals_2626030138 crossref_primary_10_1523_JNEUROSCI_0114_21_2021 pubmed_primary_34815317 |
PublicationCentury | 2000 |
PublicationDate | 2022-01-19 |
PublicationPlace | United States (Baltimore) |
PublicationTitle | The Journal of neuroscience |
PublicationTitleAlternate | J Neurosci |
PublicationYear | 2022 |
Publisher | Society for Neuroscience |
SourceID | pubmedcentral proquest crossref pubmed |
SourceType | Open Access Repository Aggregation Database Index Database |
StartPage | 435 |
SubjectTerms | Adult Aged Aged, 80 and over Auditory Cortex - diagnostic imaging Auditory Cortex - physiology Background noise Brain Brain mapping Cortex (auditory) Cortex (premotor) Cortex (somatosensory) Cortex (temporal) Female Functional magnetic resonance imaging Hearing Humans Language Lipreading Magnetic Resonance Imaging Male Middle Aged Nerve Net - diagnostic imaging Nerve Net - physiology Neural networks Noise Perception Prefrontal cortex Sensorimotor integration Sensory integration Speech Speech perception Speech Perception - physiology Superior temporal sulcus Synchronism Synchronization Visual cortex Visual Cortex - diagnostic imaging Visual Cortex - physiology Visual perception Visual Perception - physiology Visual signals Young Adult |
Title | Increased Connectivity among Sensory and Motor Regions during Visual and Audiovisual Speech Perception |
URI | https://www.ncbi.nlm.nih.gov/pubmed/34815317 https://www.proquest.com/docview/2626030138 https://search.proquest.com/docview/2601994647 https://pubmed.ncbi.nlm.nih.gov/PMC8802926 |
Volume | 42 |