The ways we move


McGovern researchers are studying how our bodies and minds work together to control movement.

Dancer and architecture student Gus Solomons '61 photographed by MIT professor of electrical engineering Harold Edgerton with stroboscopic camera, February 1960. © 2010 MIT. Courtesy of MIT Museum.

This story originally appeared in the Winter 2023 issue of BrainScan.

Many people barely consider how their bodies move — at least not until movement becomes more difficult due to injury or disease. But the McGovern scientists who are working to understand human movement and restore it after it has been lost know that the way we move is an engineering marvel.
Muscles, bones, brain, and nerves work together to navigate and interact with an ever-changing environment, making constant but often imperceptible adjustments to carry out our goals. It’s an efficient and highly adaptable system, and the way it’s put together is not at all intuitive, says Hugh Herr, a new associate investigator at the Institute.

That’s why Herr, who also co-directs MIT’s new K. Lisa Yang Center for Bionics, looks to biology to guide the development of artificial limbs that aim to give people the same agency, control, and comfort as natural limbs. McGovern Associate Investigator Nidhi Seethapathi, who like Herr joined the Institute in September, is also interested in understanding human movement in all its complexity. She is coming at the problem from a different direction, using computational modeling to predict how and why we move the way we do.

Moving through change

The computational models that Seethapathi builds in her lab aim to predict how humans will move under different conditions. If a person is placed in an unfamiliar environment and asked to navigate a course under time pressure, what path will they take? How will they move their limbs, and what forces will they exert? How will their movements change as they become more comfortable on the terrain?

McGovern Associate Investigator Nidhi Seethapathi with lab members (from left to right) Inseung Kang, Nikasha Patel, Antoine De Comite, Eric Wang, and Crista Falk. Photo: Steph Stevens

Seethapathi uses the principles of robotics to build models that answer these questions, then tests them by placing real people in the same scenarios and monitoring their movements. So far, that has mostly meant inviting study subjects to her lab, but as she expands her models to predict more complex movements, she will begin monitoring people’s activity in the real world, over longer time periods than laboratory experiments typically allow.

Seethapathi’s hope is that her findings will inform the way doctors, therapists, and engineers help patients regain control over their movements after an injury or stroke, or learn to live with movement disorders like Parkinson’s disease. To make a real difference, she stresses, it’s important to bring studies of human movement out of the lab, where subjects are often limited to simple tasks like walking on a treadmill, into more natural settings. “When we’re talking about doing physical therapy, neuromotor rehabilitation, robotic exoskeletons — any way of helping people move better — we want to do it in the real world, for everyday, complex tasks,” she says.

“When we’re talking about helping people move better — we want to do it in the real world, for everyday, complex tasks,” says Seethapathi.

Seethapathi’s work is already revealing how the brain directs movement in the face of competing priorities. For example, she has found that when people are given a time constraint for traveling a particular distance, they walk faster than their usual, comfortable pace — so much so that they often expend more energy than necessary and arrive at their destination a bit early. Her models suggest that people pick up their pace more than they need to because humans’ internal estimations of time are imprecise.

Her team is also learning how movements change as a person becomes familiar with an environment or task. With enough practice, she says, people settle on an efficient way to move. “If you’re walking in a straight line for a very long time, then you seem to pick the movement that is optimal for that long-distance walk,” she explains. But in the real world, things are always changing — both in the body and in the environment. So Seethapathi models how people behave when they must move in a new way or navigate a new environment. “In these kinds of conditions, people eventually wind up on an energy-optimal solution,” she says. “But initially, they pick something that prevents them from falling down.”

To capture the complexity of human movement, Seethapathi and her team are devising new tools that will let them monitor people’s movements outside the lab. They are also drawing on data from other fields, from architecture to physical therapy, and even from studies of other animals. “If I have general principles, they should be able to tell me how modifications in the body or in how the brain is connected to the body would lead to different movements,” she says. “I’m really excited about generalizing these principles across timescales and species.”

Building new bodies

In Herr’s lab, a deepening understanding of human movement is helping drive the development of increasingly sophisticated artificial limbs and other wearable robots. The team designs devices that interface directly with a user’s nervous system, so they are not only guided by the brain’s motor control systems, but also send information back to the brain.

Herr, a double amputee with two artificial legs of his own, says prosthetic devices are getting better at replicating natural movements, guided by signals from the brain. Mimicking the design and neural signals found in biology can even give those devices much of the extraordinary adaptability of natural human movement. As an example, Herr notes that his legs effortlessly navigate varied terrain. “There’s adaptive, stabilizing features, and the machine doesn’t have to detect every pothole and pebble and banana peel on the ground, because the morphology and the nervous system control is so inherently adaptive,” he says.

McGovern Associate Investigator Hugh Herr at work in the K. Lisa Yang Center for Bionics at MIT. Photo: Jimmy Day/Media Lab

But, he notes, the field of bionics is in its infancy, and there’s lots of room for improvement. “It’s only a matter of time before a robotic knee, for example, can be as good as the biological knee or better,” he says. “But the problem is the human attached to that knee won’t feel it’s their knee until they can feel it, and until their central nervous system has complete agency over that knee. So if you want to actually build new bodies and not just more and more powerful tools for humans, you have to link to the brain bidirectionally.”

Herr’s team has found that surgically restoring natural connections between pairs of muscles that normally work in opposition to move a limb, such as the arm’s biceps and triceps, gives the central nervous system signals about how that limb is moving, even when a natural limb is gone. The idea takes a cue from the work of McGovern Emeritus Investigator Emilio Bizzi, who found that the coordinated activation of groups of muscles by the nervous system, called muscle synergies, is important for motor control.

“It’s only a matter of time before a robotic knee can be as good as the biological knee or better,” says Herr.

“When a person thinks and moves their phantom limb, those muscle pairings move dynamically, so they feel, in a natural way, the limb moving — even though the limb is not there,” Herr explains. He adds that when those proprioceptive signals communicate instead how an artificial limb is moving, a person experiences “great agency and ownership” of that limb. Now, his group is working to develop sensors that detect and relay information usually processed by sensory neurons in the skin, so prosthetic devices can also perceive pressure and touch.

At the same time, they’re working to improve the mechanical interface between wearable robots and the body to optimize comfort and fit — whether that’s by using detailed anatomical imaging to guide the design of an individual’s device or by engineering devices that integrate directly with a person’s skeleton. There’s no “average” human, Herr says, and effective technologies must meet individual needs, not just for fit, but also for function. He also stresses the importance of planning for cost-effective mass production, because the need for these technologies is so great.

“The amount of human suffering caused by the lack of technology to address disability is really beyond comprehension,” he says. He expects tremendous progress in the growing field of bionics in the coming decades, but he’s impatient. “I think in 50 years, when scientists look back to this era, it’ll be laughable,” he says. “I’m always anxiously wanting to be in the future.”