Browsing by Author "Jaaskelainen, Iiro P."
Now showing 1 - 4 of 4
- Differential brain mechanisms during reading human vs. machine translated fiction and news texts
A1 Original article in a scientific journal (2019-09-13)
Lin, Fa Hsuan; Liu, Yun Fei; Lee, Hsin Ju; Chang, Claire H.C.; Jaaskelainen, Iiro P.; Yeh, Jyh Neng; Kuo, Wen Jui
Few neuroimaging studies on reading comprehension have been conducted under natural reading settings. In this study, we showed texts presented in a natural way during functional MRI (fMRI) measurements to reveal brain areas sensitive to reading comprehension. The paradigm independently manipulated two holistic features of article style: text genre and translation style, a qualitative index of how typical the word choices and arrangements are in daily use of the language. Specifically, articles from The New York Times (news) and Reader's Digest (fiction), translated from English to Mandarin Chinese either by human experts or by machine (Google Translate), were used to investigate the correlation of brain activity across participants during article reading. We found that the bi-hemispheric visual cortex, precuneus, and occipito-parietal junction showed significantly correlated hemodynamics across participants regardless of translation style and article genre. Compared to machine translation, reading human expert translation elicited more reliable fMRI signals across participants at the precuneus, potentially because narrative representations and contents can be coherently presented over tens of seconds. We also found significantly stronger inter-subject correlated fMRI signals at the temporal poles and fusiform gyri in fiction reading than in news reading. This may be attributed to more stable empathy processing across participants in fiction reading. The degree of stability of brain responses across subjects at extra-linguistic areas was correlated with subjective ratings of text fluency. The functional connectivity between these areas was modulated by text genre and translation style. Taken together, our imaging results suggest stable and selective neural substrates associated with comprehending holistic features of written narratives.
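The inter-subject correlation approach referred to in this abstract can be illustrated with a minimal leave-one-out sketch: for each voxel, correlate one subject's time course with the average time course of the remaining subjects. This is a generic illustration of the technique, not the authors' actual pipeline; the array shapes, variable names, and synthetic data are assumptions.

```python
import numpy as np

def leave_one_out_isc(data):
    """Voxelwise leave-one-out inter-subject correlation (ISC).

    data: array of shape (n_subjects, n_timepoints, n_voxels) holding
    preprocessed fMRI time courses aligned to the same stimulus.
    Returns an (n_subjects, n_voxels) array of Pearson correlations between
    each subject and the mean of all other subjects.
    """
    n_subjects, n_time, n_voxels = data.shape
    isc = np.zeros((n_subjects, n_voxels))
    for s in range(n_subjects):
        left_out = data[s]                                    # (time, voxels)
        others = data[np.arange(n_subjects) != s].mean(axis=0)
        # z-score over time so the mean of the product equals the Pearson r
        a = (left_out - left_out.mean(0)) / left_out.std(0)
        b = (others - others.mean(0)) / others.std(0)
        isc[s] = (a * b).mean(axis=0)
    return isc

# Example with synthetic data: 10 subjects, 200 time points, 500 voxels
rng = np.random.default_rng(0)
shared = rng.standard_normal((200, 500))               # stimulus-driven signal
data = shared + rng.standard_normal((10, 200, 500))    # plus subject-specific noise
print(leave_one_out_isc(data).mean(axis=0).shape)      # (500,) mean ISC per voxel
```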
- Functional MRI of the vocalization-processing network in the macaque brain
A1 Original article in a scientific journal (2015-04-01)
Ortiz-Rios, Michael; Kusmierek, Pawel; DeWitt, Iain; Archakov, Denis; Azevedo, Frederico A.C.; Sams, Mikko; Jaaskelainen, Iiro P.; Keliris, Georgios A.; Rauschecker, Josef P.
Using functional magnetic resonance imaging in awake behaving monkeys, we investigated how species-specific vocalizations are represented in auditory and auditory-related regions of the macaque brain. We found clusters of active voxels along the ascending auditory pathway that responded to various types of complex sounds: inferior colliculus (IC), medial geniculate nucleus (MGN), auditory core, belt, and parabelt cortex, and other parts of the superior temporal gyrus (STG) and sulcus (STS). Regions sensitive to monkey calls were most prevalent in the anterior STG, but some clusters were also found in frontal and parietal cortex on the basis of comparisons between responses to calls and environmental sounds. Surprisingly, we found that spectrotemporal control sounds derived from the monkey calls ("scrambled calls") also activated the parietal and frontal regions. Taken together, our results demonstrate that species-specific vocalizations in rhesus monkeys preferentially activate the auditory ventral stream, in particular areas of the antero-lateral belt and parabelt.
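As a rough illustration of the kind of condition contrast mentioned above (responses to calls vs. environmental sounds), a voxelwise paired comparison across runs can be sketched as follows. This is a generic example, not the authors' statistical model; the variable names, data shapes, and threshold are assumptions.

```python
import numpy as np
from scipy import stats

# Hypothetical per-run response estimates (e.g., beta values), shape (n_runs, n_voxels)
rng = np.random.default_rng(1)
calls_resp = rng.standard_normal((12, 1000)) + 0.3   # responses to monkey calls
env_resp = rng.standard_normal((12, 1000))           # responses to environmental sounds

# Voxelwise paired t-test: where are responses to calls reliably larger?
t_vals, p_vals = stats.ttest_rel(calls_resp, env_resp, axis=0)
call_selective = (t_vals > 0) & (p_vals < 0.001)      # uncorrected threshold, for illustration only
print(f"{call_selective.sum()} of {call_selective.size} voxels pass the contrast")
```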
- Mental Action Simulation Synchronizes Action-Observation Circuits across Individuals
A1 Original article in a scientific journal (2014-01-15)
Nummenmaa, Lauri; Smirnov, Dmitry; Lahnakoski, Juha M.; Glerean, Enrico; Jaaskelainen, Iiro P.; Sams, Mikko; Hari, Riitta
A frontoparietal action–observation network (AON) has been proposed to support understanding others' actions and goals. We show that the AON “ticks together” in human subjects who are sharing a third person's feelings. During functional magnetic resonance imaging, 20 volunteers watched movies depicting boxing matches either passively or while simulating a prespecified boxer's feelings. Instantaneous intersubject phase synchronization (ISPS) was computed to derive multisubject voxelwise similarity of hemodynamic activity and inter-area functional connectivity. During passive viewing, subjects' brain activity was synchronized in sensory projection and posterior temporal cortices. Simulation induced a widespread increase of ISPS in the AON (premotor, posterior parietal, and superior temporal cortices), primary and secondary somatosensory cortices, and the dorsal attention circuits (frontal eye fields, intraparietal sulcus). Moreover, interconnectivity of these regions strengthened during simulation. We propose that sharing a third person's feelings synchronizes the observer's own brain mechanisms supporting sensations and motor planning, thereby likely promoting mutual understanding.
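A minimal sketch of the instantaneous intersubject phase-synchronization (ISPS) idea described above: narrow-band-filtered time courses are converted to instantaneous phase via the Hilbert transform, and synchronization at each time point is the resultant length of the subjects' phase vectors. This is an illustrative approximation under assumed data shapes, sampling rate, and filter band; it is not the published analysis code.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def isps(data, fs=0.5, band=(0.04, 0.07)):
    """Instantaneous intersubject phase synchronization for one voxel/region.

    data: (n_subjects, n_timepoints) fMRI time courses aligned to the stimulus.
    fs: sampling rate in Hz (e.g., 1/TR); band: narrow frequency band in Hz (assumed values).
    Returns an (n_timepoints,) time series in [0, 1], where 1 means all
    subjects share the same instantaneous phase.
    """
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, data, axis=1)          # narrow-band filter per subject
    phases = np.angle(hilbert(filtered, axis=1))     # instantaneous phase via Hilbert transform
    # Resultant vector length across subjects at each time point
    return np.abs(np.exp(1j * phases).mean(axis=0))

# Example: 20 subjects, 300 volumes sampled every 2 s (fs = 0.5 Hz)
rng = np.random.default_rng(2)
data = rng.standard_normal((20, 300))
print(isps(data).shape)  # (300,)
```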
- Two-Stage Processing of Sounds Explains Behavioral Performance Variations due to Changes in Stimulus Contrast and Selective Attention: An MEG Study
A1 Original article in a scientific journal (2012)
Kauramaki, Jaakko; Jaaskelainen, Iiro P.; Hanninen, Jarno L.; Auranen, Toni; Nummenmaa, Aapo; Lampinen, Jouko; Sams, Mikko
Selectively attending to task-relevant sounds whilst ignoring background noise is one of the most amazing feats performed by the human brain. Here, we studied the underlying neural mechanisms by recording magnetoencephalographic (MEG) responses of 14 healthy human subjects while they performed a near-threshold auditory discrimination task vs. a visual control task of similar difficulty. The auditory stimuli consisted of notch-filtered continuous noise masker sounds and of 1020-Hz target tones occasionally replacing 1000-Hz standard tones of 300-ms duration that were embedded at the center of the notches, the widths of which were parametrically varied. As a control for masker effects, tone-evoked responses were additionally recorded without the masker sound. Selective attention to tones significantly increased the amplitude of the onset M100 response at 100 ms to the standard tones during presence of the masker sounds, especially with notches narrower than the critical band. Further, attention modulated the sustained response most clearly in the 300–400 ms time range from sound onset, with narrower notches than in the case of the M100, thus selectively reducing the masker-induced suppression of the tone-evoked response. Our results show evidence of a multiple-stage filtering mechanism of sensory input in the human auditory cortex: 1) an early (100 ms) stage bilaterally in posterior parts of the secondary auditory areas, and 2) adaptive filtering of attended sounds from the task-irrelevant background masker at a longer latency (300 ms) in more medial auditory cortical regions, predominantly in the left hemisphere, enhancing processing of near-threshold sounds.
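A simple sketch of how a notch-filtered noise masker with an embedded tone, of the kind described above, could be constructed. The sampling rate, filter order, notch width, tone level, and timing are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 44100                       # sampling rate in Hz (assumed)
dur_masker, dur_tone = 2.0, 0.3  # masker and tone durations in seconds
center_freq = 1000.0             # standard-tone frequency from the abstract
notch_width = 200.0              # notch width in Hz (illustrative value)

# Continuous noise masker with a spectral notch around the tone frequency:
# band-stop filter white noise between the notch edges.
rng = np.random.default_rng(3)
noise = rng.standard_normal(int(fs * dur_masker))
low = (center_freq - notch_width / 2) / (fs / 2)
high = (center_freq + notch_width / 2) / (fs / 2)
b, a = butter(4, [low, high], btype="bandstop")
masker = filtfilt(b, a, noise)

# 300-ms standard tone embedded at the center of the notch (i.e., at 1000 Hz)
t = np.arange(int(fs * dur_tone)) / fs
tone = 0.1 * np.sin(2 * np.pi * center_freq * t)

# Add the tone to the masker starting at 1.0 s
stimulus = masker.copy()
start = int(fs * 1.0)
stimulus[start:start + tone.size] += tone
```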