Attention to audiovisual speech shapes neural processing through feedback-feedforward loops between different nodes of the speech network

dc.contributor: Aalto-yliopisto [fi]
dc.contributor: Aalto University [en]
dc.contributor.author: Wikman, Patrik [en_US]
dc.contributor.author: Salmela, Viljami [en_US]
dc.contributor.author: Sjöblom, Eetu [en_US]
dc.contributor.author: Leminen, Miika [en_US]
dc.contributor.author: Laine, Matti [en_US]
dc.contributor.author: Alho, Kimmo [en_US]
dc.contributor.department: Department of Neuroscience and Biomedical Engineering [en]
dc.contributor.organization: University of Helsinki [en_US]
dc.contributor.organization: Åbo Akademi University [en_US]
dc.date.accessioned: 2024-03-27T07:57:35Z
dc.date.available: 2024-03-27T07:57:35Z
dc.date.issued: 2024-03 [en_US]
dc.description: Publisher Copyright: © 2024 Wikman et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
dc.description.abstract: Selective attention-related top-down modulation plays a significant role in separating relevant speech from irrelevant background speech when vocal attributes separating concurrent speakers are small and continuously evolving. Electrophysiological studies have shown that such top-down modulation enhances neural tracking of attended speech. Yet, the specific cortical regions involved remain unclear due to the limited spatial resolution of most electrophysiological techniques. To overcome such limitations, we collected both electroencephalography (EEG) (high temporal resolution) and functional magnetic resonance imaging (fMRI) (high spatial resolution) data while human participants selectively attended to speakers in audiovisual scenes containing overlapping cocktail party speech. To utilise the advantages of the respective techniques, we analysed neural tracking of speech using the EEG data and performed representational dissimilarity-based EEG-fMRI fusion. We observed that attention enhanced neural tracking and modulated EEG correlates throughout the latencies studied. Further, attention-related enhancement of neural tracking fluctuated in predictable temporal profiles. We discuss how such temporal dynamics could arise from a combination of interactions between attention and prediction as well as plastic properties of the auditory cortex. EEG-fMRI fusion revealed attention-related iterative feedforward-feedback loops between hierarchically organised nodes of the ventral auditory object-related processing stream. Our findings support models where attention facilitates dynamic neural changes in the auditory cortex, ultimately aiding discrimination of relevant sounds from irrelevant ones while conserving neural resources. [en]
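For readers unfamiliar with the representational dissimilarity-based EEG-fMRI fusion mentioned in the abstract, the sketch below illustrates the general idea: time-resolved representational dissimilarity matrices (RDMs) computed from EEG are correlated with an RDM from one fMRI region of interest, yielding a time course of when that region's representational geometry appears in the EEG signal. This is a minimal illustrative sketch on simulated data with hypothetical dimensions; it is not the authors' analysis code, and the published pipeline may differ in preprocessing, distance metric, regions, and statistics.

```python
# Minimal sketch of RSA-based EEG-fMRI fusion on simulated data.
# All shapes, condition counts, and metric choices are hypothetical.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_conditions = 8    # e.g., attended/ignored speech conditions (hypothetical)
n_channels = 64     # EEG channels
n_timepoints = 200  # EEG samples around stimulus onset
n_voxels = 500      # voxels in one fMRI region of interest

# Simulated stand-ins for condition-averaged EEG and fMRI ROI patterns.
eeg = rng.standard_normal((n_conditions, n_channels, n_timepoints))
fmri_roi = rng.standard_normal((n_conditions, n_voxels))

def rdm(patterns):
    """Pairwise correlation distances (1 - Pearson r) between condition patterns."""
    return pdist(patterns, metric="correlation")

# One RDM per EEG time point, one RDM for the fMRI ROI.
eeg_rdms = np.stack([rdm(eeg[:, :, t]) for t in range(n_timepoints)])
fmri_rdm = rdm(fmri_roi)

# Fusion: Spearman correlation between the ROI RDM and each time-resolved
# EEG RDM gives the region's representational time course.
fusion_timecourse = np.array(
    [spearmanr(eeg_rdms[t], fmri_rdm)[0] for t in range(n_timepoints)]
)
print(fusion_timecourse.shape)  # (n_timepoints,)
```

Repeating this per region (and per attention condition) is what allows feedforward versus feedback dynamics between nodes of the speech network to be inferred from the relative timing of the regions' fusion time courses.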
dc.description.version: Peer reviewed [en]
dc.format.extent: 27
dc.format.mimetype: application/pdf [en_US]
dc.identifier.citation: Wikman, P., Salmela, V., Sjöblom, E., Leminen, M., Laine, M. & Alho, K. 2024, 'Attention to audiovisual speech shapes neural processing through feedback-feedforward loops between different nodes of the speech network', PLoS Biology, vol. 22, no. 3, e3002534, pp. 1-27. https://doi.org/10.1371/journal.pbio.3002534 [en]
dc.identifier.doi: 10.1371/journal.pbio.3002534 [en_US]
dc.identifier.issn: 1544-9173
dc.identifier.issn: 1545-7885
dc.identifier.other: PURE UUID: 727abf3e-e172-48c3-a626-2f6aa22f2607 [en_US]
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/727abf3e-e172-48c3-a626-2f6aa22f2607 [en_US]
dc.identifier.other: PURE LINK: http://www.scopus.com/inward/record.url?scp=85187512911&partnerID=8YFLogxK [en_US]
dc.identifier.other: PURE FILEURL: https://research.aalto.fi/files/142136090/Attention_to_audiovisual_speech_shapes_neural_processing_through_feedback-feedforward_loops_between_different_nodes_of_the_speech_network.pdf [en_US]
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/127282
dc.identifier.urn: URN:NBN:fi:aalto-202403272915
dc.language.iso: en [en]
dc.publisher: Public Library of Science
dc.relation.ispartofseries: PLoS Biology
dc.relation.ispartofseries: Volume 22, issue 3, pp. 1-27
dc.rights: openAccess [en]
dc.title: Attention to audiovisual speech shapes neural processing through feedback-feedforward loops between different nodes of the speech network [en]
dc.type: A1 Alkuperäisartikkeli tieteellisessä aikakauslehdessä (A1 original article in a scientific journal) [fi]
dc.type.version: publishedVersion