Gaze-direction-based MEG averaging during audiovisual speech perception

A1 Original article in a scientific journal
Date
2010-03-08
Language
en
Pages
1-7 (7 pages)
Series
FRONTIERS IN HUMAN NEUROSCIENCE, Volume 4
Abstract
To take a step towards real-life-like experimental setups, we simultaneously recorded magnetoencephalographic (MEG) signals and subject’s gaze direction during audiovisual speech perception. The stimuli were utterances of /apa/ dubbed onto two side-by-side female faces articulating /apa/ (congruent) and /aka/ (incongruent) in synchrony, repeated once every 3 s. Subjects (N = 10) were free to decide which face they viewed, and responses were averaged to two categories according to the gaze direction. The right-hemisphere 100-ms response to the onset of the second vowel (N100m’) was a fifth smaller to incongruent than congruent stimuli. The results demonstrate the feasibility of realistic viewing conditions with gaze-based averaging of MEG signals.
Keywords
auditory cortex, eye tracking, human, magnetoencephalography, McGurk illusion
Citation
Hirvenkari, L., Jousmäki, V., Lamminmäki, S., Saarinen, V.-M., Sams, M. E. & Hari, R. 2010, 'Gaze-direction-based MEG averaging during audiovisual speech perception', FRONTIERS IN HUMAN NEUROSCIENCE, vol. 4, 17, pp. 1-7. https://doi.org/10.3389/fnhum.2010.00017