
Alona Fyshe on decoding word meaning from brain images during language production; primer by Rohan Saha

Rohan Saha
University of Alberta
Primer: Machines read, humans read: parallels between computer and human representations of meaning

Computational linguists build models of language meaning by processing huge bodies of text, often scraped from the internet. These learned models typically represent the meaning of a word as a point in high-dimensional space. When people read, their brains produce a representation of meaning that can also be thought of as a point in a high-dimensional space defined by neuronal firing patterns. Using brain imaging, we can record these representations (albeit in a very lossy way) and compare human representations to those learned by a computer. Here I will describe the framework we use to make these comparisons (often called decoding), which allows us to search for neural patterns correlated with the dimensions of word meaning. Using an example case study, I will demonstrate the utility of this framework and present evidence that these neural patterns repeat during a phrase-reading paradigm.
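The decoding framework described above can be sketched in a few lines: learn a linear map from brain images to text-derived word vectors, then test whether held-out predictions land closer to the correct word vectors than to a foil. The sketch below uses synthetic data throughout; the specific choices (ridge regression, a leave-two-out split, the 2-vs-2 distance test) are common in this literature but are assumptions for illustration, not details taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins (assumptions, not real data):
# X: brain images, one row per word (n_words x n_voxels)
# Y: text-derived word vectors (n_words x n_dims)
n_words, n_voxels, n_dims = 60, 50, 10
W_true = rng.normal(size=(n_voxels, n_dims))
X = rng.normal(size=(n_words, n_voxels))
Y = X @ W_true + 0.1 * rng.normal(size=(n_words, n_dims))

def ridge_fit(X, Y, lam=1.0):
    """Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

def two_vs_two(pred_a, pred_b, true_a, true_b):
    """2-vs-2 test: the correct pairing should have the smaller total distance."""
    correct = np.linalg.norm(pred_a - true_a) + np.linalg.norm(pred_b - true_b)
    swapped = np.linalg.norm(pred_a - true_b) + np.linalg.norm(pred_b - true_a)
    return correct < swapped

# Leave-two-out: hold out two words, train the decoder on the rest.
hits, trials = 0, 0
for i in range(0, n_words - 1, 2):
    test = [i, i + 1]
    train = [j for j in range(n_words) if j not in test]
    W = ridge_fit(X[train], Y[train])
    pred = X[test] @ W
    hits += two_vs_two(pred[0], pred[1], Y[i], Y[i + 1])
    trials += 1

print(f"2-vs-2 accuracy: {hits / trials:.2f}")  # chance level is 0.50
```

Accuracy well above 0.50 is the evidence that the brain images carry information correlated with the dimensions of the word vectors; with real (noisy, lossy) imaging data the margin over chance is far smaller than in this noiseless toy setup.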

Alona Fyshe
Departments of Computing Science and Psychology, University of Alberta; Canadian Institute for Advanced Research
Meeting: Decoding word meaning from brain images collected during language production

Models of language meaning have been used to explore the human brain's representation of word meaning while listening to speech or reading. Though these language models are trained only on large collections of text, and know nothing about the brain, they seem to represent information in a way that mirrors the language-perceiving brain. In this talk I will describe our work, in which we used a decoding approach to detect the meaning of a word or phrase not during perception, but in preparation for language production (speaking). We found that decoding word meaning is possible pre-utterance, and discovered interesting connections between the representations of words in isolation and words in a phrase.