Meaning is extracted from sensory inputs through dynamic transformations of information. Representational Similarity Analysis (RSA) of source-localised MEG signals promises to uncover these representational transformations over time. RSA determines what information is represented in distributed activity patterns. Its core principle is that similar stimuli, for example objects with a similar shape, produce similar activity patterns in a region that represents this information. By analysing the similarity of neural activity, and how it relates to the similarity of stimulus properties, we can uncover what information is coded in neural signals. Here we show the utility of RSA for source-localised MEG signals. Drawing on two examples, we show how RSA applied to MEG can reveal the representational transformations underlying object recognition and speech comprehension. First, we show that alpha-band oscillatory spatio-temporal patterns in early visual cortex represent low-level visual properties of objects, while object category information is subsequently represented in IT cortex. Further, we show that oscillatory phase signals carry more information than power. Second, using single spoken words and a searchlight analysis of source-localised MEG signals, we show that lexical and semantic competition engage posterior middle temporal and inferior frontal regions during early spoken input, when word identity remains ambiguous. As the speech input unfolds and the word becomes uniquely identifiable, semantic effects emerge in middle temporal and angular gyri. These studies highlight how RSA applied to source-localised MEG data can reveal the dynamic representational transformations through which we extract meaning from our senses.
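The RSA logic outlined above can be sketched in a few lines of Python. This is an illustrative toy example on simulated data, not code from the study: a representational dissimilarity matrix (RDM) is computed for a set of stimulus "model" patterns and for simulated "neural" patterns, and the two RDMs are compared by rank correlation. All variable names and parameter choices here are hypothetical.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Condensed representational dissimilarity matrix:
    pairwise correlation distance (1 - Pearson r) between
    rows of a (conditions x features) pattern matrix."""
    return pdist(patterns, metric="correlation")

rng = np.random.default_rng(0)

# Hypothetical stimulus set: two categories of four stimuli each,
# built from two prototypes so within-category pairs are similar.
n_per_cat, n_feat = 4, 50
proto_a, proto_b = rng.normal(size=(2, n_feat))
stimuli = np.vstack([
    proto_a + 0.3 * rng.normal(size=(n_per_cat, n_feat)),
    proto_b + 0.3 * rng.normal(size=(n_per_cat, n_feat)),
])
model_rdm = rdm(stimuli)

# Simulated "neural" activity: a noisy linear transform of the same
# stimuli, so the neural patterns inherit the category geometry.
neural = stimuli @ rng.normal(size=(n_feat, 30)) \
    + 0.5 * rng.normal(size=(2 * n_per_cat, 30))
neural_rdm = rdm(neural)

# RSA comparison: Spearman correlation between the two RDMs
# (rank-based, since only the ordinal structure of dissimilarities
# is assumed to be shared between model and brain).
rho, p = spearmanr(model_rdm, neural_rdm)
print(f"model-neural RDM correlation: rho = {rho:.2f}")
```

Because the simulated neural patterns are a linear transform of the stimuli, the two RDMs share their category structure and the correlation comes out clearly positive; with real MEG data the same comparison would be repeated per source location and time window (the searchlight analysis mentioned above).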
Publication status: Published - 2017
Event: Annual Meeting of the Cognitive Neuroscience Society, San Francisco, United States
Duration: 25 Mar 2017 → 28 Mar 2017