Whether listening to music or enjoying the latest audio book, we've likely all taken a moment to close our eyes, relax and let our ears do all the work. Behind the scenes, however, a wild orchestra of activity is taking place, with our brains engaged in giving instant meaning to everything we're hearing.
For the longest time, neuroscientists suspected that the brain restricted word interpretation to the same specific regions associated with speech. When a group of researchers at the University of California, Berkeley, went for a closer look with an MRI machine, they were stunned to discover the brain awash in activity across nearly its entire neural landscape. Their study, published this week in the journal Nature, sheds promising new light on how our brains handle word interpretation.
"Our goal was to build a giant atlas that shows how one specific aspect of language is represented in the brain, in this case semantics, or the meanings of words," Jack Gallant, a neuroscientist at the University of California, Berkeley, told the Guardian.
To figure out how the brain maps the meaning of words, the researchers placed participants in an MRI machine and had them listen to two hours' worth of stories from "The Moth Radio Hour." Blood flow to more than 50,000 different parts of the brain was recorded, with the imaging data later matched against time-coded transcriptions of the stories. The results were then translated into a kind of word cloud "where the words are arranged on the flattened cortices of the left and right hemispheres of the brain."
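The matching step described above — relating time-coded word features to per-voxel blood-flow measurements — is the kind of analysis typically implemented as a regularized linear regression fit for every voxel at once. The sketch below is a minimal illustration on synthetic data, not the study's actual pipeline: the feature dimensionality, voxel count, noise level, and ridge penalty are all assumptions made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 600 time points of a 50-dimensional word-feature
# vector (e.g. a semantic embedding of the word heard at each moment),
# and recorded responses for 1,000 voxels. All sizes are illustrative.
n_time, n_features, n_voxels = 600, 50, 1000
X = rng.standard_normal((n_time, n_features))            # stimulus features over time
true_W = rng.standard_normal((n_features, n_voxels))     # hidden per-voxel "semantic tuning"
Y = X @ true_W + 0.5 * rng.standard_normal((n_time, n_voxels))  # noisy BOLD-like responses

# Ridge regression, solved in closed form for all voxels simultaneously:
#   W = (X^T X + alpha * I)^-1  X^T Y
alpha = 10.0
W = np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ Y)

# Each column of W is one voxel's estimated tuning to the word features.
# Correlating predicted and measured responses scores the model per voxel.
Y_pred = X @ W
r = np.array([np.corrcoef(Y[:, v], Y_pred[:, v])[0, 1] for v in range(n_voxels)])
print(f"median per-voxel correlation: {np.median(r):.2f}")
```

In a real analysis the correlation would be computed on held-out stories rather than the training data, and the map of well-predicted voxels is what gets flattened and labeled to produce the atlas.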
As the above video explains, the atlas reveals new insights into how our brains associate words with related concepts. For instance, the word "star" might light up a small area of the brain relating to an object in space or, depending on meaning, another area relating to a famous individual. While the study group included only seven English-speaking individuals, the researchers were surprised to discover that the layout of the word cloud was fairly similar across all of them.
"Our semantic models are good at predicting responses to language in several big swaths of cortex," lead study author Alexander Huth said in a statement. "But we also get the fine-grained information that tells us what kind of information is represented in each brain area. That's why these maps are so exciting and hold so much potential."
Could advancements in this area one day create a machine with the sole function of reading our minds? The researchers say the potential is there, with the possibility of such a device helping victims of stroke or brain damage more easily convey their inner thoughts.
"This discovery paves the way for brain-machine interfaces that can interpret the meaning of what people want to express," Huth added. "Imagine a brain-machine interface that doesn't just figure out what sounds you want to make, but what you want to say."
Those interested in exploring more about our brain's semantic atlas can check out this beautiful, interactive model online. While this 3-D scan comes from only one individual, the researchers hope to add data from the remaining participants shortly; they also hope to perform future studies that take into account various cultures and languages.
"Future studies that include subjects from more diverse backgrounds will be needed to determine how much of this organizational consistency reflects innate brain structure versus experience," they added.