Neuroscientists get a glimpse into the workings of the baby brain

In adults, certain regions of the brain’s visual cortex respond preferentially to specific types of input, such as faces or objects — but how and when those preferences arise has long puzzled neuroscientists.

One way to help answer that question is to study the brains of very young infants and compare them to adult brains. However, scanning the brains of awake babies in an MRI machine has proven difficult.

Now, neuroscientists at MIT have overcome that obstacle, adapting their MRI scanner to make it easier to scan infants’ brains as the babies watch movies featuring different types of visual input. Using these data, the team found that in some ways, the organization of infants’ brains is surprisingly similar to that of adults. Specifically, brain regions that respond to faces in adults do the same in babies, as do regions that respond to scenes.

“It suggests that there’s a stronger biological predisposition than I would have guessed for specific cortical regions to end up with specific functions,” says Rebecca Saxe, a professor of brain and cognitive sciences and member of MIT’s McGovern Institute for Brain Research.

Saxe is the senior author of the study, which appears in the Jan. 10 issue of Nature Communications. The paper’s lead author is former MIT graduate student Ben Deen, who is now a postdoc at Rockefeller University.

MRI adaptations

Functional MRI (magnetic resonance imaging) is the go-to technique for studying brain function in adults. However, very few researchers have taken on the challenge of trying to scan babies’ brains, especially while they are awake.

“Babies and MRI machines have very different needs,” Saxe points out. “Babies would like to do activities for two or three minutes and then move on. They would like to be sitting in a comfortable position, and in charge of what they’re looking at.”

On the other hand, “MRI machines would like to be loud and dark and have a person show up on schedule, stay still for the entire time, pay attention to one thing for two hours, and follow instructions closely,” she says.

To make the setup more comfortable for babies, the researchers made several modifications to the MRI machine and to their usual experimental protocols. First, they built a special coil (part of the MRI scanner that acts as a radio antenna) that allows the baby to recline in a seat similar to a car seat. A mirror in front of the baby’s face allows him or her to watch videos, and there is space in the machine for a parent or one of the researchers to sit with the baby.

The researchers also made the scanner much less noisy than a typical MRI machine. “It’s quieter than a loud restaurant,” Saxe says. “The baby can hear their parent talking over the sound of the scanner.”

Once the babies, who were 4 to 6 months old, were in the scanner, the researchers played the movies continuously while scanning the babies’ brains. However, they only used data from the time periods when the babies were actively watching the movies. From 26 hours of scanning 17 babies, the researchers obtained four hours of usable data from nine babies.

“The sheer tenacity of this work is truly amazing,” says Charles Nelson, a professor of pediatrics at Boston Children’s Hospital, who was not involved in the research. “The fact that they pulled this off is incredibly novel.”

Obtaining these data allowed the MIT team to study how infants’ brains respond to specific types of sensory input, and to compare their responses with those of adults.

“The big-picture question is, how does the adult brain come to have the structure and function that you see in adulthood? How does it get like that?” Saxe says. “A lot of the answer to that question will depend on having the tools to be able to see the baby brain in action. The more we can see, the more we can ask that kind of question.”

Distinct preferences

The researchers showed the babies videos of either smiling children or outdoor scenes such as a suburban street seen from a moving car. Distinguishing social scenes from the physical environment is one of the main high-level divisions that our brains make when interpreting the world.

“The questions we’re asking are about how you understand and organize your world, with vision as the main modality for getting you into these very different mindsets,” Saxe says. “In adults, there are brain regions that prefer to look at faces and socially relevant things, and brain regions that prefer to look at environments and objects.”

The scans revealed that many regions of the babies’ visual cortex showed the same preferences for scenes or faces seen in adult brains. This suggests that these preferences form within the first few months of life and refutes the hypothesis that it takes years of experience interpreting the world for the brain to develop the responses that it shows in adulthood.

The researchers also found some differences in the way that babies’ brains respond to visual stimuli. One is that infants do not appear to have the “highly selective” regions found in the adult brain, which prefer features such as human faces over any other kind of input, including human bodies or the faces of other animals. The babies’ responses also differed somewhat from adults’ when they viewed examples from four different categories: not just faces and scenes, but also bodies and objects.

“We believe that the adult-like organization of infant visual cortex provides a scaffolding that guides the subsequent refinement of responses via experience, ultimately leading to the strongly specialized regions observed in adults,” Deen says.

Saxe and colleagues now hope to try to scan more babies between the ages of 3 and 8 months so they can get a better idea of how these vision-processing regions change over the first several months of life. They also hope to study even younger babies to help them discover when these distinctive brain responses first appear.

Young brains can take on new functions

In 2011, MIT neuroscientist Rebecca Saxe and colleagues reported that in blind adults, brain regions normally dedicated to vision processing instead participate in language tasks such as speech and comprehension. Now, in a study of blind children, Saxe’s lab has found that this transformation occurs very early in life, before the age of 4.

The study, appearing in the Journal of Neuroscience, suggests that the brains of young children are highly plastic, meaning that regions usually specialized for one task can adapt to new and very different roles. The findings also help to define the extent to which this type of remodeling is possible.

“In some circumstances, patches of cortex appear to take on other roles than the ones that they most typically have,” says Saxe, a professor of cognitive neuroscience and an associate member of MIT’s McGovern Institute for Brain Research. “One question that arises from that is, ‘What is the range of possible differences between what a cortical region typically does and what it could possibly do?’”

The paper’s lead author is Marina Bedny, a former MIT postdoc who is now an assistant professor at Johns Hopkins University. MIT graduate student Hilary Richardson is also an author of the paper.

Brain reorganization

The brain’s cortex, which carries out high-level functions such as thought, sensory processing, and initiation of movement, is made of sheets of neurons, each dedicated to a certain role. Within the visual system, located primarily in the occipital lobe, most neurons are tuned to respond only to a very specific aspect of visual input, such as brightness, orientation, or location in the field of view.

“There’s this big fundamental question, which is, ‘How did that organization get there, and to what degree can it be changed?’” Saxe says.

One possibility is that neurons in each patch of cortex have evolved to carry out specific roles, and can do nothing else. At the other extreme is the possibility that any patch of cortex can be recruited to perform any kind of computational task.

“The reality is somewhere in between those two,” Saxe says.

To study the extent to which cortex can change its function, scientists have focused on the visual cortex because they can learn a great deal about it by studying people who were born blind.

A landmark 1996 study of blind people found that their visual regions could participate in a nonvisual task — reading Braille. Some scientists theorized that perhaps the visual cortex is recruited for reading Braille because like vision, it requires discriminating very fine-grained patterns.

However, in their 2011 study, Saxe and Bedny found that the visual cortex of blind adults also responds to spoken language. “That was weird, because processing auditory language doesn’t require the kind of fine-grained spatial discrimination that Braille does,” Saxe says.

She and Bedny hypothesized that auditory language processing may develop in the occipital cortex by piggybacking onto the Braille-reading function. To test that idea, they began studying congenitally blind children, including some who had not learned Braille yet. They reasoned that if their hypothesis were correct, the occipital lobe would be gradually recruited for language processing as the children learned Braille.

However, they found that this was not the case. Instead, children as young as 4 already have language-related activity in the occipital lobe.

“The response of occipital cortex to language is not affected by Braille acquisition,” Saxe says. “It happens before Braille and it doesn’t increase with Braille.”

Language-related occipital activity was similar among all of the 19 blind children, who ranged in age from 4 to 17, suggesting that the entire process of occipital recruitment for language processing takes place before the age of 4, Saxe says. Bedny and Saxe have previously shown that this transition occurs only in people blind from birth, suggesting that there is an early critical period after which the cortex loses much of its plasticity.

The new study represents a huge step forward in understanding how the occipital cortex can take on new functions, says Ione Fine, an associate professor of psychology at the University of Washington.

“One thing that has been missing is an understanding of the developmental timeline,” says Fine, who was not involved in the research. “The insight here is that you get plasticity for language separate from plasticity for Braille and separate from plasticity for auditory processing.”

Language skills

The findings raise the question of how the extra language-processing centers in the occipital lobe affect language skills.

“This is a question we’ve always wondered about,” Saxe says. “Does it mean you’re better at those functions because you have more of your cortex doing it? Does it mean you’re more resilient in those functions because now you have more redundancy in your mechanism for doing it? You could even imagine the opposite: Maybe you’re less good at those functions because they’re distributed in an inefficient or atypical way.”

There are hints that the occipital lobe’s contribution to language-related functions “takes the pressure off the frontal cortex,” where language processing normally occurs, Saxe says. Other researchers have shown that suppressing left frontal cortex activity with transcranial magnetic stimulation interferes with language function in sighted people, but not in the congenitally blind.

This leads to the intriguing prediction that a congenitally blind person who suffers a stroke in the left frontal cortex may retain much more language ability than a sighted person would, Saxe says, although that hypothesis has not been tested.

Saxe’s lab is now studying children under 4 to try to learn more about how cortical functions develop early in life, while Bedny is investigating whether the occipital lobe participates in functions other than language in congenitally blind people.

When good people do bad things

When people get together in groups, unusual things can happen — both good and bad. Groups create important social institutions that an individual could not achieve alone, but there can be a darker side to such alliances: Belonging to a group makes people more likely to harm others outside the group.

“Although humans exhibit strong preferences for equity and moral prohibitions against harm in many contexts, people’s priorities change when there is an ‘us’ and a ‘them,’” says Rebecca Saxe, an associate professor of cognitive neuroscience at MIT. “A group of people will often engage in actions that are contrary to the private moral standards of each individual in that group, sweeping otherwise decent individuals into ‘mobs’ that commit looting, vandalism, even physical brutality.”

Several factors play into this transformation. When people are in a group, they feel more anonymous, and less likely to be caught doing something wrong. They may also feel a diminished sense of personal responsibility for collective actions.

Saxe and colleagues recently studied a third factor that cognitive scientists believe may be involved in this group dynamic: the hypothesis that when people are in groups, they “lose touch” with their own morals and beliefs, and become more likely to do things that they would normally believe are wrong.

In a study that recently went online in the journal NeuroImage, the researchers measured brain activity in a part of the brain involved in thinking about oneself. They found that in some people, this activity was reduced when the subjects participated in a competition as part of a group, compared with when they competed as individuals. Those people were more likely to harm their competitors than people who did not exhibit this decreased brain activity.

“This process alone does not account for intergroup conflict: Groups also promote anonymity, diminish personal responsibility, and encourage reframing harmful actions as ‘necessary for the greater good.’ Still, these results suggest that at least in some cases, explicitly reflecting on one’s own personal moral standards may help to attenuate the influence of ‘mob mentality,’” says Mina Cikara, a former MIT postdoc and lead author of the NeuroImage paper.

Group dynamics

Cikara, who is now an assistant professor at Carnegie Mellon University, started this research project after experiencing the consequences of a “mob mentality”: During a visit to Yankee Stadium, her husband was ceaselessly heckled by Yankees fans for wearing a Red Sox cap. “What I decided to do was take the hat from him, thinking I would be a lesser target by virtue of the fact that I was a woman,” Cikara says. “I was so wrong. I have never been called names like that in my entire life.”

The harassment, which continued throughout the trip back to Manhattan, provoked a strong reaction in Cikara, who isn’t even a Red Sox fan.

“It was a really amazing experience because what I realized was I had gone from being an individual to being seen as a member of ‘Red Sox Nation.’ And the way that people responded to me, and the way I felt myself responding back, had changed, by virtue of this visual cue — the baseball hat,” she says. “Once you start feeling attacked on behalf of your group, however arbitrary, it changes your psychology.”

Cikara, then a third-year graduate student at Princeton University, started to investigate the neural mechanisms behind the group dynamics that produce bad behavior. In the new study, done at MIT, Cikara, Saxe (who is also an associate member of MIT’s McGovern Institute for Brain Research), former Harvard University graduate student Anna Jenkins, and former MIT lab manager Nicholas Dufour focused on a part of the brain called the medial prefrontal cortex. When someone is reflecting on himself or herself, this part of the brain lights up in functional magnetic resonance imaging (fMRI) brain scans.

A couple of weeks before the study participants came in for the experiment, the researchers surveyed each of them about their social-media habits, as well as their moral beliefs and behavior. This allowed the researchers to create individualized statements for each subject that were true for that person — for example, “I have stolen food from shared refrigerators” or “I always apologize after bumping into someone.”

When the subjects arrived at the lab, their brains were scanned as they played a game once on their own and once as part of a team. In the game, their task was to press a button whenever they saw a statement related to social media, such as “I have more than 600 Facebook friends.”

The subjects also saw their personalized moral statements mixed in with sentences about social media. Brain scans revealed that when subjects were playing for themselves, the medial prefrontal cortex lit up much more when they read moral statements about themselves than statements about others, consistent with previous findings. However, during the team competition, some people showed a much smaller difference in medial prefrontal cortex activation when they saw the moral statements about themselves compared to those about other people.

Those people also turned out to be much more likely to harm members of the competing group during a task performed after the game. Each subject was asked to select photos that would appear with the published study, choosing from four photos each of two teammates and two members of the opposing team. The subjects with suppressed medial prefrontal cortex activity chose the least flattering photos of the opposing team members, but not of their own teammates.

“This is a nice way of using neuroimaging to try to get insight into something that behaviorally has been really hard to explore,” says David Rand, an assistant professor of psychology at Yale University who was not involved in the research. “It’s been hard to get a direct handle on the extent to which people within a group are tapping into their own understanding of things versus the group’s understanding.”

Getting lost

The researchers also found that after the game, people with reduced medial prefrontal cortex activity had more difficulty remembering the moral statements they had seen during the game.

“If you need to encode something with regard to the self and that ability is somehow undermined when you’re competing with a group, then you should have poor memory associated with that reduction in medial prefrontal cortex signal, and that’s exactly what we see,” Cikara says.

Cikara hopes to follow up on these findings to investigate what makes some people more likely to become “lost” in a group than others. She is also interested in studying whether people are slower to recognize themselves or pick themselves out of a photo lineup after being absorbed in a group activity.

The research was funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development, the Air Force Office of Scientific Research, and the Packard Foundation.

Thinking about others is not child’s play

When you try to read other people’s thoughts, or guess why they are behaving a certain way, you employ a skill known as theory of mind. This skill, as measured by false-belief tests, takes time to develop: In children, it doesn’t start appearing until the age of 4 or 5.

Several years ago, MIT neuroscientist Rebecca Saxe showed that in adults, theory of mind is seated in a specific brain region known as the right temporo-parietal junction (TPJ). Saxe and colleagues at MIT have now shown how brain activity in the TPJ changes as children learn to reason about others’ thoughts and feelings.

The findings suggest that the right TPJ becomes more specific to theory of mind as children age, taking on adult patterns of activity over time. The researchers also showed that the more selectively the right TPJ is activated when children listen to stories about other people’s thoughts, the better those children perform in tasks that require theory of mind.

The paper, published in the July 31 online edition of the journal Child Development, lays the groundwork for exploring theory-of-mind impairments in autistic children, says Hyowon Gweon, a graduate student in Saxe’s lab and lead author of the paper.

“Given that we know this is what typically developing kids show, the next question to ask is how it compares to autistic children who exhibit marked impairments in their ability to think about other people’s minds,” Gweon says. “Do they show differences from typically developing kids in their neural activity?”

Saxe, an associate professor of brain and cognitive sciences and associate member of MIT’s McGovern Institute for Brain Research, is senior author of the Child Development paper. Other authors are Marina Bedny, a postdoc in Saxe’s lab, and David Dodell-Feder, a graduate student at Harvard University.

Tracking theory of mind

The classic test for theory of mind is the false-belief test, sometimes called the Sally-Anne test. Experimenters often use dolls or puppets to perform a short skit: Sally takes a marble and hides it in her basket, then leaves the room. Anne then removes the marble and puts it in her own box. When Sally returns, the child watching the skit is asked: Where will Sally look for her marble?

Children with well-developed theory of mind realize that Sally will look where she thinks the marble is: her own basket. However, before children develop this skill, they don’t realize that Sally’s beliefs may not correspond to reality. Therefore, they believe she will look for the marble where it actually is, in Anne’s box.

Previous studies have shown that children start making accurate predictions in the false-belief test around age 4 — but this happens much later, if ever, in autistic children.

In this study, the researchers used functional magnetic resonance imaging (fMRI) to look for a link between the development of theory of mind and changes in neural activity in the TPJ. They studied 20 children, ranging from 5 to 11 years old.

Each child participated in two sets of experiments. First, the child was scanned in the MRI machine as he or she listened to different types of stories. One type focused on people’s mental states, another also focused on people but only on their physical appearances or actions, and a third type of story focused on physical objects.

The researchers measured activity across the brain as the children listened to the different stories. By subtracting the neural activity evoked by stories about physical objects from the activity evoked by stories about people’s mental states, the researchers could determine which brain regions respond selectively when interpreting people’s mental states.
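The subtraction logic described above can be sketched in a few lines of code. All numbers and the threshold below are invented for illustration and are not data from the study:

```python
import numpy as np

# Each entry represents one "voxel's" mean response to a story condition
# (hypothetical values, for illustration only).
mental = np.array([2.1, 0.3, 1.8, 0.2, 0.1])    # mental-state stories
physical = np.array([0.4, 0.2, 0.3, 0.3, 0.0])  # physical-object stories

# Subtracting one condition from the other isolates voxels that respond
# more strongly to mental-state content than to physical content.
contrast = mental - physical

# Voxels whose difference exceeds an arbitrary threshold are treated as
# "selective" for mental-state stories in this sketch.
selective = contrast > 1.0
print(selective.tolist())  # → [True, False, True, False, False]
```

In this toy example, only the first and third voxels respond much more to mental-state stories than to physical ones, so only they count as selective; real fMRI analyses use statistical contrasts rather than a fixed cutoff, but the underlying comparison is the same.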

In younger children, both the left and right TPJ were active in response to stories about people’s mental states, but they were also active when the children listened to stories about people’s appearances or actions. However, in older children, both regions became more specifically tuned to interpreting people’s thoughts and emotions, and were no longer responsive to people’s appearances or actions.

For the second task, done outside of the scanner, the researchers gave children tests similar to the classic Sally-Anne test, as well as harder questions that required making moral judgments, to measure their theory-of-mind abilities. They found that the degree to which activity in the right TPJ was specific to others’ mental states correlated with the children’s performance in theory-of-mind tasks.

Kristin Lagattuta, an associate professor of psychology at the University of California at Davis, says the paper makes an important contribution to understanding how theory of mind develops in older children. “Getting more insight into the neural basis of the behavioral development we’re seeing at these ages is exciting,” says Lagattuta, who was not involved in the research.

In an ongoing study of autistic children undergoing the same type of tests, the researchers hope to learn more about the neural basis of the theory-of-mind impairments seen in autistic children.

“So little is known about differences in neural mechanisms that contribute to these kinds of impairments,” Gweon says. “Understanding the developmental changes in brain regions related to theory of mind is going to be critical to think of measures that can help them in the real world.”

The research was funded by the Ellison Medical Foundation, the Packard Foundation, the John Merck Scholars Program, a National Science Foundation Career Award and an Ewha 21st Century Scholarship.