Computational model mimics humans’ ability to predict emotions

When interacting with another person, you likely spend part of your time trying to anticipate how they will feel about what you’re saying or doing. This task requires a cognitive skill called theory of mind, which helps us to infer other people’s beliefs, desires, intentions, and emotions.

MIT neuroscientists have now designed a computational model that can predict other people’s emotions — including joy, gratitude, confusion, regret, and embarrassment — approximating human observers’ social intelligence. The model was designed to predict the emotions of people involved in a situation based on the prisoner’s dilemma, a classic game theory scenario in which two people must decide whether to cooperate with their partner or betray them.

To build the model, the researchers incorporated several factors that have been hypothesized to influence people’s emotional reactions, including that person’s desires, their expectations in a particular situation, and whether anyone was watching their actions.

“These are very common, basic intuitions, and what we said is, we can take that very basic grammar and make a model that will learn to predict emotions from those features,” says Rebecca Saxe, the John W. Jarve Professor of Brain and Cognitive Sciences, a member of MIT’s McGovern Institute for Brain Research, and the senior author of the study.

Sean Dae Houlihan PhD ’22, a postdoc at the Neukom Institute for Computational Science at Dartmouth College, is the lead author of the paper, which appears today in Philosophical Transactions A. Other authors include Max Kleiman-Weiner PhD ’18, a postdoc at MIT and Harvard University; Luke Hewitt PhD ’22, a visiting scholar at Stanford University; and Joshua Tenenbaum, a professor of computational cognitive science at MIT and a member of the Center for Brains, Minds, and Machines and MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).

Predicting emotions

While a great deal of research has gone into training computer models to infer someone’s emotional state based on their facial expression, that is not the most important aspect of human emotional intelligence, Saxe says. Much more important is the ability to predict someone’s emotional response to events before they occur.

“The most important thing about what it is to understand other people’s emotions is to anticipate what other people will feel before the thing has happened,” she says. “If all of our emotional intelligence was reactive, that would be a catastrophe.”

To try to model how human observers make these predictions, the researchers used scenarios taken from a British game show called “Golden Balls.” On the show, contestants are paired up, with a pot of $100,000 at stake. After negotiating with their partner, each contestant secretly decides whether to split the pot or try to steal it. If both decide to split, they each receive $50,000. If one splits and one steals, the stealer gets the entire pot. If both try to steal, no one gets anything.
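
Those rules amount to a simple payoff function. The minimal Python sketch below (using the $100,000 pot described above) encodes them directly; it is only an illustration of the game’s structure, not part of the researchers’ model.

```python
def payoff(choice_a: str, choice_b: str, pot: int = 100_000) -> tuple[int, int]:
    """Return (player_a, player_b) winnings for one split-or-steal round."""
    if choice_a == "split" and choice_b == "split":
        return pot // 2, pot // 2          # both split: the pot is shared
    if choice_a == "steal" and choice_b == "split":
        return pot, 0                      # the lone stealer takes everything
    if choice_a == "split" and choice_b == "steal":
        return 0, pot
    return 0, 0                            # both steal: no one gets anything


print(payoff("split", "split"))  # (50000, 50000)
print(payoff("steal", "split"))  # (100000, 0)
print(payoff("steal", "steal"))  # (0, 0)
```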

Depending on the outcome, contestants may experience a range of emotions — joy and relief if both contestants split, surprise and fury if one’s opponent steals the pot, and perhaps guilt mingled with excitement if one successfully steals.

To create a computational model that can predict these emotions, the researchers designed three separate modules. The first module is trained to infer a person’s preferences and beliefs based on their action, through a process called inverse planning.

“This is an idea that says if you see just a little bit of somebody’s behavior, you can probabilistically infer things about what they wanted and expected in that situation,” Saxe says.

Using this approach, the first module can predict contestants’ motivations based on their actions in the game. For example, if someone decides to split in an attempt to share the pot, it can be inferred that they also expected the other person to split. If someone decides to steal, they may have expected the other person to steal, and didn’t want to be cheated. Or, they may have expected the other person to split and decided to try to take advantage of them.

The model can also integrate knowledge about specific players, such as the contestant’s occupation, to help it infer the players’ most likely motivation.

The second module compares the outcome of the game with what each player wanted and expected to happen. Then, a third module predicts what emotions the contestants may be feeling, based on the outcome and what was known about their expectations. This third module was trained to predict emotions based on predictions from human observers about how contestants would feel after a particular outcome. The authors emphasize that this is a model of human social intelligence, designed to mimic how observers causally reason about each other’s emotions, not a model of how people actually feel.
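
As a rough illustration of how such a pipeline could be organized (a toy stand-in, not the published model), the Python sketch below strings the three modules together: a Bayesian inverse-planning step that infers a player’s likely desire and expectation from their action, an appraisal step that compares the actual outcome with those inferences, and a final mapping from the appraisal to emotion ratings. Every name, prior, and weight here is an invented placeholder for quantities the real model learns from data.

```python
# Illustrative three-module pipeline (toy stand-in, not the published model).

PRIORS = {  # hypothetical prior over (desire, probability partner splits)
    ("share fairly", 0.8): 0.4,
    ("avoid being cheated", 0.3): 0.3,
    ("exploit partner", 0.8): 0.3,
}

def likelihood(action, desire, expects_split):
    """P(action | mental state): a hand-made stand-in for a planning model."""
    if desire == "share fairly":
        return 0.9 if action == "split" else 0.1
    if desire == "avoid being cheated":
        # stealing looks safer the less you trust your partner to split
        return (1 - expects_split) if action == "steal" else expects_split
    return 0.85 if action == "steal" else 0.15  # "exploit partner"

def inverse_planning(action):
    """Module 1: posterior over mental states given the observed action."""
    post = {state: p * likelihood(action, *state) for state, p in PRIORS.items()}
    z = sum(post.values())
    return {state: p / z for state, p in post.items()}

def appraise(outcome, posterior):
    """Module 2: compare what happened with what was wanted and expected."""
    got_money = 1.0 if outcome["payoff"] > 0 else 0.0
    was_fair = 1.0 if outcome["both_split"] else 0.0
    exploited = 1.0 if outcome["payoff"] > 0 and not outcome["both_split"] else 0.0
    partner_split = 1.0 if outcome["partner_split"] else 0.0
    surprise = sum(p * abs(expects - partner_split)
                   for (_, expects), p in posterior.items())
    return {"got_money": got_money, "fair": was_fair,
            "exploited": exploited, "surprise": surprise}

def predict_emotions(features):
    """Module 3: stand-in for a mapping trained on human observers' judgments."""
    return {
        "joy": features["got_money"] * (0.5 + 0.5 * features["fair"]),
        "guilt": 0.7 * features["exploited"],
        "fury": features["surprise"] * (1.0 - features["got_money"]),
    }

# Example: the observer saw this player split, but the partner stole the pot.
outcome = {"payoff": 0, "both_split": False, "partner_split": False}
print(predict_emotions(appraise(outcome, inverse_planning("split"))))
```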

“From the data, the model learns that what it means, for example, to feel a lot of joy in this situation, is to get what you wanted, to do it by being fair, and to do it without taking advantage,” Saxe says.

Core intuitions

Once the three modules were up and running, the researchers used them on a new dataset from the game show to determine how the model’s emotion predictions compared with the predictions made by human observers. The model performed much better at that task than any previous model of emotion prediction.

The model’s success stems from its incorporation of key factors that the human brain also uses when predicting how someone else will react to a given situation, Saxe says. Those include computations of how a person will evaluate and emotionally react to a situation, based on their desires and expectations, which relate to not only material gain but also how they are viewed by others.

“Our model has those core intuitions, that the mental states underlying emotion are about what you wanted, what you expected, what happened, and who saw. And what people want is not just stuff. They don’t just want money; they want to be fair, but also not to be the sucker, not to be cheated,” she says.

“The researchers have helped build a deeper understanding of how emotions contribute to determining our actions; and then, by flipping their model around, they explain how we can use people’s actions to infer their underlying emotions. This line of work helps us see emotions not just as ‘feelings’ but as playing a crucial, and subtle, role in human social behavior,” says Nick Chater, a professor of behavioral science at the University of Warwick, who was not involved in the study.

In future work, the researchers hope to adapt the model so that it can perform more general predictions based on situations other than the game-show scenario used in this study. They are also working on creating models that can predict what happened in the game based solely on the expression on the faces of the contestants after the results were announced.

The research was funded by the McGovern Institute; the Paul E. and Lilah Newton Brain Science Award; the Center for Brains, Minds, and Machines; the MIT-IBM Watson AI Lab; and the Multidisciplinary University Research Initiative.

Tackling the MIT campus’s top energy consumers, building by building

When staff in MIT’s Department of Facilities visualized energy use and carbon-associated emissions by campus buildings, Building 46 always stood out: its energy intensity accounted for 8 percent of MIT’s total campus energy use. This high energy draw was not surprising, as the building is home to the Brain and Cognitive Sciences Complex and a large amount of lab space, but it also made the building a perfect candidate for an energy performance audit to seek out potential energy-saving opportunities.

This audit revealed that several energy efficiency updates to the building’s mechanical systems, including optimization of room-by-room ventilation rates, could yield an estimated 35 percent reduction in energy use, which would in turn lower MIT’s total greenhouse gas emissions by an estimated 2 percent, driving toward the Institute’s goals of net-zero emissions by 2026 and elimination of direct campus emissions by 2050.

Building energy efficiency projects are not new for MIT. Since 2010, MIT has been engaged in a partnership agreement with utility company Eversource establishing the Efficiency Forward program, empowering MIT to invest in more than 300 energy conservation projects to date and lowering energy consumption on campus for a total calculated savings of approximately 70 million kilowatt hours and 4.2 million therms. But at 418,000 gross square feet, Building 46 is the site of the first energy efficiency project of its size on the campus.

“We’ve never tackled a whole building like this — it’s the first capital project that is technically an energy project,” explains Siobhan Carr, energy efficiency program manager, who was part of the team overseeing the energy audit and lab ventilation performance assessment in the building. “That gives you an idea of the magnitude and complexity of this.”

The project started with the full building energy assessment and lab ventilation risk audit. “We had a team go through every corner of the building and look at every possible opportunity to save energy,” explains Jessica Parks, senior project manager for systems performance and turnover in campus construction. “One of the biggest issues we saw was that there’s a lot of dry lab spaces which are basically offices, but they’re all getting the same ventilation as if they were a high-intensity lab.” Higher ventilation and more frequent air exchange rates draw more energy. By optimizing for the required ventilation rates, there was an opportunity to save energy in nearly every space in the building.

In addition to the optimized ventilation, the project team will convert fume hoods from constant volume to variable volume and install equipment to help the building systems run more efficiently. The team also identified opportunities to work with labs to implement programs such as fume hood hibernation and unoccupied setbacks for temperature and ventilation. As different spaces in the building have varying needs, the energy retrofit will touch all 1,254 spaces in the building — one by one — to implement the different energy measures to reach that estimated 35 percent reduction in energy use.

Although time-consuming and complex, this room-by-room approach has a big benefit in that it has allowed research to continue in the space largely uninterrupted. With a few exceptions, the occupants of Building 46, which include the Department of Brain and Cognitive Sciences, The McGovern Institute for Brain Research, and The Picower Institute for Learning and Memory, have remained in place for the duration of the project. Partners in the MIT Environment, Health and Safety Office are instrumental to this balance of renovations and keeping the building operational during the optimization efforts and are one of several teams across MIT contributing to building efficiency efforts.

The completion date of the building efficiency project is set for 2024, but Carr says that some of the impact of this ongoing work may soon be seen. “We should start to see savings as we move through the building, and we expect to fully realize all of our projected savings a year after completion,” she says, noting that the length of time is required for a year-over-year perspective to see the full reduction in energy use.

The impact of the project goes far beyond the footprint of Building 46 as it offers insights and spurred actions for future projects — including buildings 76 and 68, the number two and three top energy users on campus. Both buildings recently underwent their own energy audits and lab ventilation performance assessments. The energy efficiency team is now crafting a plan for full-building approaches, much like Building 46. “To date, 46 has presented many learning opportunities, such as how to touch every space in a building while research continues, as well as how to overcome challenges encountered when working on existing systems,” explains Parks. “The good news is that we have developed solutions for those challenges and the teams have been proactively implementing those lessons in our other projects.”

Communication has proven to be another key for these large projects where occupants see the work happening and often play a role in answering questions about their unique space. “People are really engaged, they ask questions about the work, and we ask them about the space they’re in every day,” says Parks. “The Building 46 occupants have been wonderful partners as we worked in all of their spaces, which is paving the way for a successful project.”

The release of Fast Forward in 2021 has also made communications easier, notes Carr, who says the plan helps to frame these projects as part of the big picture — not just a construction interruption. “Fast Forward has brought a visibility into what we’re doing within [MIT] Facilities on these buildings,” she says. “It brings more eyes and ears, and people understand that these projects are happening throughout campus and not just in their own space — we’re all working to reduce energy and to reduce greenhouse gas across campus.”

The Energy Efficiency team will continue to apply that big-picture approach as ongoing building efficiency projects on campus are assessed to reach toward a 10 to 15 percent reduction in energy use and corresponding emissions over the next several years.

Scientists discover how mutations in a language gene produce speech deficits

Mutations of a gene called Foxp2 have been linked to a type of speech disorder called apraxia that makes it difficult to produce sequences of sound. A new study from MIT and National Yang Ming Chiao Tung University sheds light on how this gene controls the ability to produce speech.

In a study of mice, the researchers found that mutations in Foxp2 disrupt the formation of dendrites and neuronal synapses in the brain’s striatum, which plays important roles in the control of movement. Mice with these mutations also showed impairments in their ability to produce the high-frequency sounds that they use to communicate with other mice.

Those malfunctions arise because Foxp2 mutations prevent the proper assembly of motor proteins, which move molecules within cells, the researchers found.

“These mice have abnormal vocalizations, and in the striatum there are many cellular abnormalities,” says Ann Graybiel, an MIT Institute Professor, a member of MIT’s McGovern Institute for Brain Research, and an author of the paper. “This was an exciting finding. Who would have thought that a speech problem might come from little motors inside cells?”

Fu-Chin Liu PhD ’91, a professor at National Yang Ming Chiao Tung University in Taiwan, is the senior author of the study, which appears today in the journal Brain. Liu and Graybiel also worked together on a 2016 study of the potential link between Foxp2 and autism spectrum disorder. The lead authors of the new Brain paper are Hsiao-Ying Kuo and Shih-Yun Chen of National Yang Ming Chiao Tung University.

Speech control

Children with Foxp2-associated apraxia tend to begin speaking later than other children, and their speech is often difficult to understand. The disorder is believed to arise from impairments in brain regions, such as the striatum, that control the movements of the lips, mouth, and tongue. Foxp2 is also expressed in the brains of songbirds such as zebra finches and is critical to those birds’ ability to learn songs.

Foxp2 encodes a transcription factor, meaning that it can control the expression of many other target genes. Many species express Foxp2, but humans have a special form of the gene. In a 2014 study, Graybiel and colleagues found evidence that the human form of Foxp2, when expressed in mice, accelerated the animals’ switch from declarative to procedural types of learning.

In that study, the researchers showed that mice engineered to express the human version of Foxp2, which differs from the mouse version by only two DNA base pairs, were much better at learning mazes and performing other tasks that require turning repeated actions into behavioral routines. Mice with human-like Foxp2 also had longer dendrites — the slender extensions that help neurons form synapses — in the striatum, which is involved in habit formation as well as motor control.

In the new study, the researchers wanted to explore how the Foxp2 mutation that has been linked with apraxia affects speech production, using ultrasonic vocalizations in mice as a proxy for speech. Many rodents and other animals such as bats produce these vocalizations to communicate with each other.

While previous studies, including the work by Liu and Graybiel in 2016, had suggested that Foxp2 affects dendrite growth and synapse formation, the mechanism for how that occurs was not known. In the new study, led by Liu, the researchers investigated one proposed mechanism, which is that Foxp2 affects motor proteins.

One of these molecular motors is the dynein protein complex, a large cluster of proteins that is responsible for shuttling molecules along microtubule scaffolds within cells.

“All kinds of molecules get shunted around to different places in our cells, and that’s certainly true of neurons,” Graybiel says. “There’s an army of tiny molecules that move molecules around in the cytoplasm or put them into the membrane. In a neuron, they may send molecules from the cell body all the way down the axons.”

A delicate balance

The dynein complex is made up of several other proteins. The most important of these is a protein called dynactin1, which interacts with microtubules, enabling the dynein motor to move along microtubules. In the new study, the researchers found that dynactin1 is one of the major targets of the Foxp2 transcription factor.

The researchers focused on the striatum, one of the regions where Foxp2 is most often found, and showed that the mutated version of Foxp2 is unable to suppress dynactin1 production. Without that brake in place, cells generate too much dynactin1, upsetting the delicate balance between dynein and dynactin1 and preventing the dynein motor from moving along microtubules.

Those motors are needed to shuttle molecules that are necessary for dendrite growth and synapse formation on dendrites. With those molecules stranded in the cell body, neurons are unable to form synapses to generate the proper electrophysiological signals they need to make speech production possible.

Mice with the mutated version of Foxp2 had abnormal ultrasonic vocalizations, which typically have a frequency of around 22 to 50 kilohertz. The researchers showed that they could reverse these vocalization impairments and the deficits in the molecular motor activity, dendritic growth, and electrophysiological activity by turning down the gene that encodes dynactin1.

Mutations of Foxp2 can also contribute to autism spectrum disorders and Huntington’s disease, through mechanisms that Liu and Graybiel previously studied in their 2016 paper and that many other research groups are now exploring. Liu’s lab is also investigating the potential role of abnormal Foxp2 expression in the subthalamic nucleus of the brain as a possible factor in Parkinson’s disease.

The research was funded by the Ministry of Science and Technology of Taiwan, the Ministry of Education of Taiwan, the U.S. National Institute of Mental Health, the Saks Kavanaugh Foundation, the Kristin R. Pressman and Jessica J. Pourian ’13 Fund, and Stephen and Anne Kott.

PhD student Wei-Chen Wang is moved to help people heal

This story originally appeared in the Spring 2023 issue of Spectrum.

___

When he turned his ankle five years ago as an undergraduate playing pickup basketball at the University of Illinois, Wei-Chen (Eric) Wang SM ’22 knew his life would change in certain ways. For one thing, Wang, then a computer science major, wouldn’t be playing basketball anytime soon. He also assumed, correctly, that he might require physical therapy (PT).

What he did not foresee was that this minor injury would influence his career trajectory. While lying on the PT bench, Wang began to wonder: “Can I replicate what the therapist is doing using a robot?” It was an idle thought at the time. Today, however, his research involves robots and movement, closely related to what had seemed a passing fancy.

Wang continued his focus on computer science as an MIT graduate student, receiving his master’s in 2022 before deciding to pursue work of a more applied nature. He met Nidhi Seethapathi, who had joined MIT’s faculty a few months earlier as an assistant professor in electrical engineering and computer science and in brain and cognitive sciences, and was intrigued by the notion of creating robots that could illuminate the key principles of movement—knowledge that might someday help people regain the ability to move comfortably after suffering from injury, stroke, or disease.

As the first PhD student in Seethapathi’s group and a MathWorks Fellow, Wang is charged with building machine learning-based models that can accurately predict and reproduce human movements. He will then use computer-simulated environments to visualize and evaluate the performance of these models.

To begin, he needs to gather data about specific human movements. One potential data collection method involves placing sensors or markers on different parts of the body to pinpoint their precise positions at any given moment. He can then try to predict those positions at future moments, as dictated by the equations of motion in physics.

The other method relies on computer vision-powered software that can automatically convert video footage to motion data. Wang prefers the latter approach, which he considers more natural. “We just look at what humans are doing and try to learn from that directly,” he explains. That’s also where machine learning comes in. “We use machine-learning tools to extract data from the video, and those data become the input to our model,” he adds. The model, in this case, is just another term for the robot brain.
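
A highly simplified sketch of that video-based pipeline is below. Here `estimate_keypoints` is a hypothetical placeholder for whatever pose-estimation tool extracts joint positions from each frame, and the “model” is just a linear next-step predictor fit to the extracted trajectories, standing in for the machine-learning models the group builds.

```python
import numpy as np

def estimate_keypoints(frame) -> np.ndarray:
    """Placeholder for a pose-estimation model: returns (num_joints, 2) coordinates."""
    raise NotImplementedError("stand-in for a computer-vision pose estimator")

def video_to_trajectories(frames) -> np.ndarray:
    """Stack per-frame keypoints into a (num_frames, num_joints * 2) motion matrix."""
    return np.stack([estimate_keypoints(f).ravel() for f in frames])

def fit_next_step_predictor(X: np.ndarray) -> np.ndarray:
    """Fit W so that X[t+1] ≈ X[t] @ W (least squares); a toy stand-in for a
    learned dynamics model that predicts how the movement will continue."""
    past, future = X[:-1], X[1:]
    W, *_ = np.linalg.lstsq(past, future, rcond=None)
    return W

def rollout(W: np.ndarray, x0: np.ndarray, steps: int) -> np.ndarray:
    """Predict future poses by repeatedly applying the learned one-step map."""
    preds = [x0]
    for _ in range(steps):
        preds.append(preds[-1] @ W)
    return np.stack(preds)

# Usage with synthetic trajectories (since estimate_keypoints is only a placeholder).
X = np.cumsum(np.random.default_rng(0).normal(size=(200, 34)), axis=0)
W = fit_next_step_predictor(X)
future = rollout(W, X[-1], steps=10)
print(future.shape)  # (11, 34): the current pose plus ten predicted poses
```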

The near-term goal is not to make robots more natural, Wang notes. “We’re using [simulated] robots to understand how humans are moving and eventually to explain any kind of movement—or at least that’s the hope. That said, based on the general principles we’re able to abstract, we might someday build robots that can move more naturally.”

Wang is also collaborating on a project headed by postdoctoral fellow Antoine De Comité that focuses on robotic retrieval of objects—the movements required to remove books from a library shelf, for example, or to grab a drink from a refrigerator. While robots routinely excel at tasks such as grasping an object on a tabletop, performing naturalistic movements in three dimensions remains challenging.

Wang describes a video shown by a Stanford University scientist in which a robot destroyed a refrigerator while attempting to extract a beer. He and De Comité hope for better results with robots that have undergone reinforcement learning—an approach, often paired with deep learning, in which desired motions are rewarded or reinforced while unwanted motions are discouraged.
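
In reinforcement-learning terms, that preference is expressed through the reward signal. A toy reward function for the retrieval task might look like the sketch below (purely illustrative names and values, not the actual training setup):

```python
def retrieval_reward(reached_object: bool, grasped: bool,
                     collision_force: float, dropped: bool) -> float:
    """Toy reward: encourage reaching and grasping, penalize damage and drops."""
    reward = 0.0
    if reached_object:
        reward += 1.0                      # desired motion: hand reaches the object
    if grasped:
        reward += 5.0                      # desired motion: a successful grasp
    reward -= 10.0 * collision_force       # unwanted motion: hitting the shelf or fridge
    if dropped:
        reward -= 5.0                      # unwanted outcome: losing the object on the way out
    return reward

print(retrieval_reward(True, True, collision_force=0.0, dropped=False))   # 6.0
print(retrieval_reward(True, False, collision_force=0.8, dropped=False))  # -7.0
```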

If they succeed in designing a robot that can safely retrieve a beer, Wang says, then more important and delicate tasks could be within reach. Someday, a robot at PT might guide a patient through knee exercises or apply ultrasound to an arthritic elbow.

Francesca Riccio-Ackerman works to improve access to prosthetics

This story originally appeared in the Spring 2023 issue of Spectrum.

___

In Sierra Leone, war and illness have left up to 40,000 people requiring orthotics and prosthetics services, but there is a profound lack of access to specialized care, says Francesca Riccio-Ackerman, a biomedical engineer and PhD student studying health equity and health systems. There is just one fully certified prosthetist available for the thousands of patients in the African nation who are living with amputation, she notes. The ideal number is one for every 250, according to the World Health Organization and the International Society of Orthotics and Prosthetics.

The data point is significant for Riccio-Ackerman, who conducts research in the MIT Media Lab’s Biomechatronics Group and in the K. Lisa Yang Center for Bionics, both of which aim to improve translation of assistive technologies to people with disabilities. “We’re really focused on improving and augmenting human mobility,” she says. For Riccio-Ackerman, part of the quest to improve human mobility means ensuring that the people who need access to prosthetic care can get it—for the duration of their lives.

In September 2021, the Yang Center provided funding for Riccio-Ackerman to travel to Sierra Leone, where she witnessed the lingering physical effects of a brutal decade-long civil war that ended in 2002. Prosthetic and orthotic care in the country, where a vast number of patients are also disabled by untreated polio or diabetes, has become more elusive, she says, as global media attention on the war’s aftermath has subsided. “People with amputation need low-level, consistent care for years. There really needs to be a long-term investment in improving this.”

Through the Yang Center and supported by a fellowship from the new MIT Morningside Academy for Design, Riccio-Ackerman is designing and building a sustainable care and delivery model in Sierra Leone that aims to multiply the production of prosthetic limbs and strengthen the country’s prosthetic sector. “[We’re working] to improve access to orthotic and prosthetic services,” she says.

She is also helping to establish a supply chain for prosthetic limb and orthotic brace parts and equipping clinics with machines and infrastructure to serve more patients. In January 2023, her team launched a four-year collaboration with the Sierra Leone Ministry of Health and Sanitation. One of the goals of the joint effort is to enable Sierra Leoneans to obtain professional prosthetics training, so they can care for their own community without leaving home.

From engineering to economics

Riccio-Ackerman was drawn to issues around human mobility after witnessing her aunt suffer from rheumatoid arthritis. “My aunt was young, but she looked like she was 80 or 90. She was sick, in pain, in a wheelchair — a young spirit in an old body,” she says.

As a biomedical engineering undergraduate student at Florida International University, Riccio-Ackerman worked on clinical trials for neural-enabled myoelectric arms controlled by nerves in the body. She says that the technology was thrilling yet heartbreaking. She would often have to explain to patients who participated in testing that they couldn’t take the devices home and that they may never be covered by insurance.

Riccio-Ackerman began asking questions: “What factors determine who gets an amputation? Why are we making devices that are so expensive and inaccessible?” This sense of injustice inspired her to pivot away from device design and toward a master’s degree in health economics and policy at the SDA Bocconi School of Management in Milan.

She began work as a research specialist with Hugh Herr SM ’93, professor of media arts and sciences at the MIT Media Lab and codirector of the Yang Center, helping to study communities that were medically neglected in prosthetic care. “I knew that the devices weren’t getting to the people who need them, and I didn’t know if the best way to solve it was through engineering,” Riccio-Ackerman explains.

While Riccio-Ackerman’s PhD should be finished within three years, she’s only at the beginning of her health care equity work. “We’re forging ahead in Sierra Leone and thinking about translating our strategy and methodologies to other communities around the globe that could benefit,” she says. “We hope to be able to do this in many, many countries in the future.”

Bionics researchers develop technologies to ease pain and transcend human limitations

This story originally appeared in the Spring 2023 issue of Spectrum.

___

In early December 2022, a middle-aged woman from California arrived at Boston’s Brigham and Women’s Hospital for the amputation of her right leg below the knee following an accident. This was no ordinary procedure. At the end of her remaining leg, surgeons attached a titanium fixture through which they threaded eight thin, electrically conductive wires. These flexible leads, implanted on her leg muscles, would, in the coming months, connect to a robotic, battery-powered prosthetic ankle and foot.

The goal of this unprecedented surgery, driven by MIT researchers from the K. Lisa Yang Center for Bionics at MIT, was the restoration of near-natural function to the patient, enabling her to sense and control the position and motion of her ankle and foot—even with her eyes closed.

In the K. Lisa Yang Center for Bionics, codirector Hugh Herr SM ’93 and graduate student Christopher Shallal are working to return mobility to people disabled by disease or physical trauma. Photo: Tony Luong

“The brain knows exactly how to control the limb, and it doesn’t matter whether it is flesh and bone or made of titanium, silicon, and carbon composite,” says Hugh Herr SM ’93, professor of media arts and sciences, head of the MIT Media Lab’s Biomechatronics Group, codirector of the Yang Center, and an associate member of MIT’s McGovern Institute for Brain Research.

For Herr, in attendance during that long day, the surgery represented a critical milestone in a decades-long mission to develop technologies returning mobility to people disabled by disease or physical trauma. His research combines a dizzying range of disciplines—electrical, mechanical, tissue, and biomedical engineering, as well as neuroscience and robotics—and has yielded pathbreaking results. Herr’s more than 100 patents include a computer-controlled knee and powered ankle-foot prosthesis and have enabled thousands of people around the world to live more on their own terms, including Herr.

Surmounting catastrophe

For much of Herr’s life, “go” meant “up.”

“Starting when I was eight, I developed an extraordinary passion, an absolute obsession, for climbing; it’s all I thought about in life,” says Herr. He aspired “to be the best climber in the world,” a goal he nearly achieved in his teenage years, enthralled by the “purity” of ascending mountains ropeless and solo in record times, by “a vertical dance, a balance between physicality and mind control.”

McGovern Institute Associate Investigator Hugh Herr. Photo: Jimmy Day / MIT Media Lab

At 17, Herr became disoriented while climbing New Hampshire’s Mt. Washington during a blizzard. Days in the cold permanently damaged his legs, which had to be amputated below his knees. His rescue cost another man’s life, and Herr was despondent, disappointed in himself, and fearful for his future.

Then, following months of rehabilitation, he felt compelled to test himself. His first weekend home, when he couldn’t walk without canes and crutches, he headed back to the mountains. “I hobbled to the base of this vertical cliff and started ascending,” he recalls. “It brought me joy to realize that I was still me, the same person.”

But he also recognized that as a person with amputated limbs, he faced severe disadvantages. “Society doesn’t look kindly on people with unusual bodies; we are viewed as crippled and weak, and that did not sit well with me.” Unable to tolerate both the new physical and social constraints on his life, Herr determined to view his disability not as a loss but as an opportunity. “I think the rage was the catapult that led me to do something that was without precedent,” he says.

Lifelike limb

On hand in the surgical theater in December was a member of Herr’s Biomechatronics Group for whom the bionic limb procedure also held special resonance. Christopher Shallal, a second-year graduate student in the Harvard-MIT Health Sciences and Technology program who received bilateral lower limb amputations at birth, worked alongside surgeon Matthew Carty testing the electric leads before implantation in the patient. Shallal found this, his first direct involvement with a reconstruction surgery, deeply fulfilling.

“Ever since I was a kid, I’ve wanted to do medicine plus engineering,” says Shallal. “I’m really excited to work on this bionic limb reconstruction, which will probably be one of the most advanced systems yet in terms of neural interfacing and control, with a far greater range of motion possible.”

Herr and Shallal are working on a next-generation, biomimetic limb with implanted sensors that can relay signals between the external prosthesis and muscles in the remaining limb. Photo: Tony Luong

Like other Herr lab designs, the new prosthesis features onboard, battery-powered propulsion, microprocessors, and tunable actuators. But this next-generation, biomimetic limb represents a major leap forward, replacing electrodes sited on a patient’s skin, subject to sweat and other environmental threats, with implanted sensors that can relay signals between the external prosthesis and muscles in the remaining limb.

This system takes advantage of a breakthrough technique invented several years ago by the Herr lab called CMI (for cutaneous mechanoneural interface), which constructs muscle-skin-nerve bundles at the amputation site. Muscle actuators controlled by computers on board the external prosthesis apply forces on skin cells implanted within the amputated residuum when a person with amputation touches an object with their prosthesis.

With CMI and electric leads connecting the prosthesis to these muscle actuators within the residual limb, the researchers hypothesize that a person with an amputation will be able to “feel” their prosthetic leg step onto the ground. This sensory capability is the holy grail for persons with major limb loss. After recovery from her surgery, the woman from California will be wearing Herr’s latest state-of-the-art prosthetic system in the lab.

‘Tinkering’ with the body

Not all artificial limbs emulate those that humans are born with. “You can make them however you want, swapping them in and out depending on what you want to do, and they can take you anywhere,” Herr says. Committed to extreme climbing even after his accident, Herr came up with special limbs that became a commercial hit early in his career. His designs made it possible for someone with amputated legs to run and dance.

But he also knew the day-to-day discomfort of navigating on flatter earth with most prostheses. He won his first patent during his senior year of college for a fluid-controlled socket attachment designed to reduce the pain of walking. Growing up in a Mennonite family skilled in handcrafting things they needed, and in a larger community that was disdainful of technology, Herr says he had “difficulty trusting machines.” Yet by the time he began his master’s program at MIT, intent on liberating persons with limb amputation to live more fully in the world, he had embraced the tools of science and engineering as the means to this end.

For Shallal, Herr was an early icon, and his inventions and climbing exploits served as inspiration. “I’d known about Hugh since middle school; he was famous among those with amputations,” he says. “As a kid, I liked tinkering with things, and I kind of saw my body as a canvas, a place where I could explore different boundaries and expand possibilities for myself and others with amputations.” In school, Shallal sometimes encountered resistance to his prostheses. “People would say I couldn’t do certain things, like running and playing different sports, and I found these barriers frustrating,” he says. “I did things in my own way and didn’t want people to pity me.”

In fact, Shallal felt he could do some things better than his peers. In high school, he used a 3-D printer to make a mobile phone charger case he could plug into his prosthesis. “As a kid, I would wear long pants to hide my legs, but as the technology got cooler, I started wearing shorts,” he says. “I got comfortable and liked kind of showing off my legs.”

Global impact

December’s surgery was the first phase in the bionic limb project. Shallal will be following up with the patient over many months, ensuring that the connections between her limb and implanted sensors function and provide appropriate sensorimotor data for the built-in processor. Research on this and other patients to determine the impact of these limbs on gait and ease of managing slopes, for instance, will form the basis for Shallal’s dissertation.

“After graduation, I’d be really interested in translating technology out of the lab, maybe doing a startup related to neural interfacing technology,” he says. “I watched Inspector Gadget on television when I was a kid. Making the tool you need at the time you need it to fix problems would be my dream.”

Herr will be overseeing Shallal’s work, as well as a suite of research efforts propelled by other graduate students, postdocs, and research scientists that together promise to strengthen the technology behind this generation of biomimetic prostheses.

One example: devising an innovative method for measuring muscle length and velocity with tiny implanted magnets. In work published in November 2022, researchers including Herr; project lead Cameron Taylor SM ’16, PhD ’20, a research associate in the Biomechatronics Group; and Brown University partners demonstrated that this new tool, magnetomicrometry, yields the kind of high-resolution data necessary for even more precise bionic limb control. The Herr lab awaits FDA approval on human implantation of the magnetic beads.

These intertwined initiatives are central to the ambitious mission of the K. Lisa Yang Center for Bionics, established with a $24 million gift from Yang in 2021 to tackle transformative bionic interventions to address an extensive range of human limitations.

Herr is committed to making the broadest possible impact with his technologies. “Shoes and braces hurt, so my group is developing the science of comfort—designing mechanical parts that attach to the body and transfer loads without causing pain.” These inventions may prove useful not just to people living with amputation but to patients suffering from arthritis or other diseases affecting muscles, joints, and bones, whether in lower limbs or arms and hands.

The Yang Center aims to make prosthetic and orthotic devices more accessible globally, so Herr’s group is ramping up services in Sierra Leone, where civil war left tens of thousands missing limbs after devastating machete attacks. “We’re educating clinicians, helping with supply chain infrastructure, introducing novel assistive technology, and developing mobile delivery platforms,” he says.

In the end, says Herr, “I want to be in the business of designing not more and more powerful tools but designing new bodies.” Herr uses himself as an example: “I walk on two very powerful robots, but they’re not linked to my skeleton, or to my brain, so when I walk it feels like I’m on powerful machines that are not me. What I want is such a marriage between human physiology and electromechanics that a person feels at one with the synthetic, designed content of their body.”

Modeling the marvelous journey from A to B

This story originally appeared in the Spring 2023 issue of Spectrum.

___

Nidhi Seethapathi was first drawn to using powerful yet simple models to understand elaborate patterns when she learned about Newton’s laws of motion as a high school student in India. She was fascinated by the idea that wonderfully complex behaviors can arise from a set of objects that follow a few elementary rules.

Now an assistant professor at MIT, Seethapathi seeks to capture the intricacies of movement in the real world, using computational modeling as well as input from theory and experimentation. “[Theoretical physicist and Nobel laureate] Richard Feynman ’39 once said, ‘What I cannot create, I do not understand,’” Seethapathi says. “In that same spirit, the way I try to understand movement is by building models that move the way we do.”

Models of locomotion in the real world

Seethapathi—who holds a shared faculty position between the Department of Brain and Cognitive Sciences and the Department of Electrical Engineering and Computer Science’s Faculty of Artificial Intelligence + Decision-Making, which is housed in the Schwarzman College of Computing and the School of Engineering—recalls a moment during her undergraduate years studying mechanical engineering in Mumbai when a professor asked students to pick an aspect of movement to examine in detail. While most of her peers chose to analyze machines, Seethapathi selected the human hand. She was astounded by its versatility, she says, and by the number of variables, referred to by scientists as “degrees of freedom,” that are needed to characterize routine manual tasks. The assignment made her realize that she wanted to explore the diverse ways in which the entire human body can move.

Also an investigator at the McGovern Institute for Brain Research, Seethapathi pursued graduate research at The Ohio State University Movement Lab, where her goal was to identify the key elements of human locomotion. At that time, most people in the field were analyzing simple movements, she says, “but I was interested in broadening the scope of my models to include real-world behavior. Given that movement is so ubiquitous, I wondered: What can this model say about everyday life?”

After earning her PhD from Ohio State in 2018, Seethapathi continued this line of research as a postdoctoral fellow at the University of Pennsylvania. New computer vision tools to track human movement from video footage had just entered the scene, and during her time at UPenn, Seethapathi sought to expand her skillset to include computer vision and applications to movement rehabilitation.

At MIT, Seethapathi continues to extend the range of her studies of human movement, looking at how locomotion can evolve as people grow and age, and how it can adapt to anatomical changes and even adjust to shifts in weather, which can alter ground conditions. Her investigations now encompass other species as part of an effort to determine how creatures with different morphologies and habitats regulate their movements.

The models Seethapathi and her team create make predictions about human movements that can later be verified or refuted by empirical tests. While relatively simple experiments can be carried out on treadmills, her group is developing measurement systems incorporating wearable sensors and video-based sensing to measure movement data that have traditionally been hard to obtain outside the laboratory.

Although Seethapathi says she is primarily driven to uncover the fundamental principles that govern movement behavior, she believes her work also has practical applications.

“When people are treated for a movement disorder, the goal is to impact their movements in the real world,” she says. “We can use our predictive models to see how a particular intervention will affect a person’s trajectory. The hope is that our models can help put the individual on the right track to recovery as early as possible.”

Eight from MIT elected to American Academy of Arts and Sciences for 2023

Eight MIT faculty members are among more than 250 leaders from academia, the arts, industry, public policy, and research elected to the American Academy of Arts and Sciences, the academy announced April 19.

One of the nation’s most prestigious honorary societies, the academy is also a leading center for independent policy research. Members contribute to academy publications, as well as studies of science and technology policy, energy and global security, social policy and American institutions, the humanities and culture, and education.

Those elected from MIT in 2023 are:

  • Arnaud Costinot, professor of economics;
  • James J. DiCarlo, Peter de Florez Professor of Brain and Cognitive Sciences, director of the MIT Quest for Intelligence, and McGovern Institute Investigator;
  • Piotr Indyk, the Thomas D. and Virginia W. Cabot Professor of Electrical Engineering and Computer Science;
  • Senthil Todadri, professor of physics;
  • Evelyn N. Wang, Ford Professor of Engineering (on leave) and director of the Department of Energy’s Advanced Research Projects Agency-Energy;
  • Boleslaw Wyslouch, professor of physics and director of the Laboratory for Nuclear Science and Bates Research and Engineering Center;
  • Yukiko Yamashita, professor of biology and core member of the Whitehead Institute; and
  • Wei Zhang, professor of mathematics.

“With the election of these members, the academy is honoring excellence, innovation, and leadership and recognizing a broad array of stellar accomplishments. We hope every new member celebrates this achievement and joins our work advancing the common good,” says David W. Oxtoby, president of the academy.

Since its founding in 1780, the academy has elected leading thinkers from each generation, including George Washington and Benjamin Franklin in the 18th century, Maria Mitchell and Daniel Webster in the 19th century, and Toni Morrison and Albert Einstein in the 20th century. The current membership includes more than 250 Nobel and Pulitzer Prize winners.

Real-time feedback helps adolescents with depression quiet the mind

Real-time feedback about brain activity can help adolescents with depression or anxiety quiet their minds, according to a new study from MIT scientists. The researchers, led by McGovern research affiliate Susan Whitfield-Gabrieli, have used functional magnetic resonance imaging (fMRI) to show patients what’s happening in their brain as they practice mindfulness inside the scanner and to encourage them to focus on the present. They report in the journal Molecular Psychiatry that doing so settles down neural networks that are associated with symptoms of depression.

McGovern research affiliate Susan Whitfield-Gabrieli in the Martinos Imaging Center.

“We know this mindfulness meditation is really good for kids and teens, and we think this real-time fMRI neurofeedback is really a way to engage them and provide a visual representation of how they’re doing,” says Whitfield-Gabrieli. “And once we train people how to do mindfulness meditation, they can do it on their own at any time, wherever they are.”

The approach could be a valuable tool to alleviate or prevent depression in young people, which has been on the rise in recent years and escalated alarmingly during the Covid-19 pandemic. “This has gone from bad to catastrophic, in my perspective,” Whitfield-Gabrieli says. “We have to think out of the box and come up with some really innovative ways to help.”

Default mode network

Mindfulness meditation, in which practitioners focus their awareness on the present moment, can modulate activity within the brain’s default mode network, which is so named because it is most active when a person is not focused on any particular task. Two hubs within the default mode network, the medial prefrontal cortex and the posterior cingulate cortex, are of particular interest to Whitfield-Gabrieli and her colleagues, due to a potential role in the symptoms of depression and anxiety.

“These two core hubs are very engaged when we’re thinking about the past or the future and we’re not really engaged in the present moment,” she explains. “If we’re in a healthy state of mind, we may be reminiscing about the past or planning for the future. But if we’re depressed, that reminiscing may turn into rumination or obsessively rehashing the past. If we’re particularly anxious, we may be obsessively worrying about the future.”

Whitfield-Gabrieli explains that these key hubs are often hyperconnected in people with anxiety and depression. The more tightly correlated the activity of the two regions is, the worse a person’s symptoms are likely to be. Mindfulness, she says, can help interrupt that hyperconnectivity.
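
That connectivity is commonly quantified as the correlation between the two hubs’ activity over time. A minimal sketch, assuming `mpfc` and `pcc` are fMRI time series (one value per scan volume) already extracted for the two regions:

```python
import numpy as np

def functional_connectivity(mpfc: np.ndarray, pcc: np.ndarray) -> float:
    """Pearson correlation between two regions' fMRI time series.
    Values near +1 indicate the hyperconnectivity described above."""
    return float(np.corrcoef(mpfc, pcc)[0, 1])

# Illustrative example with synthetic, strongly coupled signals.
rng = np.random.default_rng(0)
shared = rng.normal(size=300)                   # common fluctuation
mpfc = shared + 0.3 * rng.normal(size=300)      # medial prefrontal cortex signal
pcc = shared + 0.3 * rng.normal(size=300)       # posterior cingulate cortex signal
print(round(functional_connectivity(mpfc, pcc), 2))  # ~0.9: tightly correlated hubs
```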

“Mindfulness really helps to focus on the now, which just precludes all of this mind wandering and repetitive negative thinking,” she explains. In fact, she and her colleagues have found that mindfulness practice can reduce stress and improve attention in children. But she acknowledges that it can be difficult to engage young people and help them focus on the practice.

Tuning the mind

To help people visualize the benefits of their mindfulness practice, the researchers developed a game that can be played while an MRI scanner tracks a person’s brain activity. On a screen inside the scanner, the participant sees a ball and two circles. The circle at the top of the screen represents a desirable state in which the activity of the brain’s default mode network has been reduced, and the activity of a network the brain uses to focus on attention-demanding tasks—the frontal parietal network—has increased. An initial fMRI scan identifies these networks in each individual’s brain, creating a customized mental map on which the game is based.

As the person practices mindfulness meditation, which they learn prior to entering the scanner, the default mode network in the brain quiets while the frontal parietal network activates. When the scanner detects this change, the ball moves and eventually enters its target. With an initial success, the target shrinks, encouraging even more focus. When the participant’s mind wanders from their task, the default mode network activation increases (relative to the frontal parietal network) and the ball moves down toward the second circle, which represents an undesirable state. “Basically, they’re just moving this ball with their brain,” Whitfield-Gabrieli says. “They’re training their brain to tune their mind. And they love it.”
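
Conceptually, each scanner update boils down to comparing the two networks’ activity and nudging the ball, with the target shrinking after each success. The sketch below captures only that game logic, with illustrative names and step sizes; real-time fMRI processing is far more involved.

```python
def update_ball(ball_y, dmn_activity, fpn_activity, step=0.2):
    """Move the ball up when the frontal parietal network dominates the default
    mode network (focused on the present), down when the mind wanders."""
    ball_y += step if fpn_activity > dmn_activity else -step
    return max(0.0, min(1.0, ball_y))  # keep the ball on screen

def update_target(target_radius, ball_y, goal_y=1.0, shrink=0.8):
    """Shrink the target after a success, asking for even steadier focus."""
    if abs(ball_y - goal_y) < target_radius:
        target_radius *= shrink
    return target_radius

ball, radius = 0.5, 0.2
for dmn, fpn in [(0.2, 0.6), (0.2, 0.6), (0.2, 0.6), (0.7, 0.2)]:  # simulated readouts
    ball = update_ball(ball, dmn, fpn)
    radius = update_target(radius, ball)
    print(f"ball={ball:.2f}  target radius={radius:.3f}")
```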

Nine individuals between the ages of 17 and 19 with a history of major depression or anxiety disorders tried this new approach to mindfulness training, and for each of them, Whitfield-Gabrieli’s team saw a reduction in connectivity within the default mode network. Now they are working to determine whether an electroencephalogram, in which brain activity is measured with noninvasive electrodes, can be used to provide similar neurofeedback during mindfulness training—an approach that could be more accessible for broad clinical use.

Whitfield-Gabrieli notes that hyperconnectivity in the default mode network is also associated with psychosis, and she and her team have found that mindfulness meditation with real-time fMRI feedback can help reduce symptoms in adults with schizophrenia. Future studies are planned to investigate how the method impacts teens’ ability to establish a mindfulness practice and its potential effects on depression symptoms.

Bacterial injection system delivers proteins in mice and human cells

Researchers at the McGovern Institute and the Broad Institute of MIT and Harvard have harnessed a natural bacterial system to develop a new protein delivery approach that works in human cells and animals. The technology, described today in Nature, can be programmed to deliver a variety of proteins, including ones for gene editing, to different cell types. The system could potentially be a safe and efficient way to deliver gene therapies and cancer therapies.

Led by McGovern Institute investigator and Broad Institute core member Feng Zhang, the team took advantage of a tiny syringe-like injection structure, produced by a bacterium, that naturally binds to insect cells and injects a protein payload into them. The researchers used the artificial intelligence tool AlphaFold to engineer these syringe structures to deliver a range of useful proteins to both human cells and cells in live mice.

“This is a really beautiful example of how protein engineering can alter the biological activity of a natural system,” said Joseph Kreitz, the study’s first author and a graduate student in Zhang’s lab. “I think it substantiates protein engineering as a useful tool in bioengineering and the development of new therapeutic systems.”

“Delivery of therapeutic molecules is a major bottleneck for medicine, and we will need a deep bench of options to get these powerful new therapies into the right cells in the body,” added Zhang. “By learning from how nature transports proteins, we were able to develop a new platform that can help address this gap.”

Zhang is senior author on the study and is also the James and Patricia Poitras Professor of Neuroscience at MIT and an investigator at the Howard Hughes Medical Institute.

Injection via contraction

Graduate student Joseph Kreitz holds a 3D printed bacteriophage. Photo: Steph Stevens

Symbiotic bacteria use the roughly 100-nanometer-long syringe-like machines to inject proteins into host cells to help adjust the biology of their surroundings and enhance their survival. These machines, called extracellular contractile injection systems (eCISs), consist of a rigid tube inside a sheath that contracts, driving a spike on the end of the tube through the cell membrane. This forces protein cargo inside the tube to enter the cell.

On the outside of one end of the eCIS are tail fibers that recognize specific receptors on the cell surface and latch on. Previous research has shown that eCISs can naturally target insect and mouse cells, but Kreitz thought it might be possible to modify them to deliver proteins to human cells by reengineering the tail fibers to bind to different receptors.

Using AlphaFold, which predicts a protein’s structure from its amino acid sequence, the researchers redesigned tail fibers of an eCIS produced by Photorhabdus bacteria to bind to human cells. By reengineering another part of the complex, the scientists tricked the syringe into delivering a protein of their choosing, in some cases with remarkably high efficiency.

The team made eCISs that targeted cancer cells expressing the EGF receptor and showed that they killed almost 100 percent of the cells, but did not affect cells without the receptor. Though efficiency depends in part on the receptor the system is designed to target, Kreitz says that the findings demonstrate the promise of the system with thoughtful engineering.

Photorhabdus virulence cassettes (green) binding to insect cells (blue) prior to injection of payload proteins. Image: Joseph Kreitz | McGovern Institute, Broad Institute

The researchers also used an eCIS to deliver proteins to the brain in live mice — where it didn’t provoke a detectable immune response, suggesting that eCISs could one day be used to safely deliver gene therapies to humans.

Packaging proteins

Kreitz says the eCIS system is versatile, and the team has already used it to deliver a range of cargos including base editor proteins (which can make single-letter changes to DNA), proteins that are toxic to cancer cells, and Cas9, a large DNA-cutting enzyme used in many gene editing systems.

Cancer cells killed by programmed Photorhabdus virulence cassettes (PVCs), imaged with a scanning electron microscope. Image: Joseph Kreitz | McGovern Institute, Broad Institute

In the future, Kreitz says researchers could engineer other components of the eCIS system to tune other properties, or to deliver other cargos such as DNA or RNA. He also wants to better understand the function of these systems in nature.

“We and others have shown that this type of system is incredibly diverse across the biosphere, but they are not very well characterized,” Kreitz said. “And we believe this type of system plays really important roles in biology that are yet to be explored.”

This work was supported in part by the National Institutes of Health, Howard Hughes Medical Institute, Poitras Center for Psychiatric Disorders Research at MIT, Hock E. Tan and K. Lisa Yang Center for Autism Research at MIT, K. Lisa Yang and Hock E. Tan Molecular Therapeutics Center at MIT, K. Lisa Yang Brain-Body Center at MIT, Broad Institute Programmable Therapeutics Gift Donors, The Pershing Square Foundation, William Ackman, Neri Oxman, J. and P. Poitras, Kenneth C. Griffin, BT Charitable Foundation, the Asness Family Foundation, the Phillips family, D. Cheng, and R. Metcalfe.