Gaze-direction-based MEG averaging during audiovisual speech perception
Access rights
© 2010 Frontiers Media SA. This document is protected by copyright and was first published by Frontiers. All rights reserved. It is reproduced with permission.
Final published version
School of Science
A1 Original article in a scientific journal
Unless otherwise stated, all rights belong to the author. You may download, display and print this publication for your own personal use. Commercial use is prohibited.
Date
2010
Language
en
Pages
7
Series
Frontiers in Human Neuroscience
Abstract
To take a step towards real-life-like experimental setups, we simultaneously recorded magnetoencephalographic (MEG) signals and the subjects’ gaze direction during audiovisual speech perception. The stimuli were utterances of /apa/ dubbed onto two side-by-side female faces articulating /apa/ (congruent) and /aka/ (incongruent) in synchrony, repeated once every 3 s. Subjects (N = 10) were free to decide which face they viewed, and responses were averaged into two categories according to the gaze direction. The right-hemisphere 100-ms response to the onset of the second vowel (N100m’) was a fifth smaller to incongruent than to congruent stimuli. The results demonstrate the feasibility of realistic viewing conditions with gaze-based averaging of MEG signals.
Keywords
auditory cortex, eye tracking, human, magnetoencephalography, McGurk illusion
Citation
Hirvenkari, Lotta & Jousmäki, Veikko & Lamminmäki, Satu & Saarinen, Veli-Matti & Sams, Mikko E. & Hari, Riitta. 2010. Gaze-direction-based MEG averaging during audiovisual speech perception. Frontiers in Human Neuroscience. 7. ISSN 1662-5161 (printed). DOI: 10.3389/fnhum.2010.00017.