Setting carbon management in stone

Keeping global temperatures within limits deemed safe by the Intergovernmental Panel on Climate Change means doing more than slashing carbon emissions. It means reversing them.

“If we want to be anywhere near those limits [of 1.5 or 2 C], then we have to be carbon neutral by 2050, and then carbon negative after that,” says Matěj Peč, a geoscientist and the Victor P. Starr Career Development Assistant Professor in the Department of Earth, Atmospheric, and Planetary Sciences (EAPS).

Going negative will require finding ways to radically increase the world’s capacity to capture carbon from the atmosphere and put it somewhere it will not leak back out. Carbon capture and storage projects already suck in tens of millions of metric tons of carbon each year. But putting a dent in emissions will mean capturing many billions of metric tons more. Today, people emit around 40 billion tons of carbon each year globally, mainly by burning fossil fuels.

Because of the need for new ideas when it comes to carbon storage, Peč has created a proposal for the MIT Climate Grand Challenges competition — a bold and sweeping effort by the Institute to support paradigm-shifting research and innovation to address the climate crisis. Called the Advanced Carbon Mineralization Initiative, his team’s proposal aims to bring geologists, chemists, and biologists together to make permanently storing carbon underground workable under different geological conditions. That means finding ways to speed up the process by which carbon pumped underground is turned into rock, or mineralized.

“That’s what the geology has to offer,” says Peč, who is a lead on the project, along with Ed Boyden, the Y. Eva Tan Professor in Neurotechnology and Howard Hughes Medical Institute investigator at the McGovern Institute for Brain Research, and Yogesh Surendranath, the Paul M. Cook Career Development Associate Professor of Chemistry. “You look for the places where you can safely and permanently store these huge volumes of CO2.”

Peč’s proposal is one of 27 finalists selected from a pool of almost 100 Climate Grand Challenge proposals submitted by collaborators from across the Institute. Each finalist team received $100,000 to further develop their research proposals. A subset of finalists will be announced in April, making up a portfolio of multiyear “flagship” projects receiving additional funding and support.

Building industries capable of going carbon negative presents huge technological, economic, environmental, and political challenges. For one, it’s expensive and energy-intensive to capture carbon from the air with existing technologies, which are “hellishly complicated,” says Peč. Much of the carbon capture underway today focuses on more concentrated sources like coal- or gas-burning power plants.

It’s also difficult to find geologically suitable sites for storage. To keep captured carbon in the ground, it must either be trapped in airtight reservoirs or turned to stone.

One of the best places for carbon capture and storage (CCS) is Iceland, where a number of CCS projects are up and running. The island’s volcanic geology helps speed up the mineralization process, as carbon pumped underground interacts with basalt rock at high temperatures. In that ideal setting, says Peč, 95 percent of carbon injected underground is mineralized after just two years — a geological flash.
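For a back-of-the-envelope sense of that pace, mineralization can be treated as a simple first-order process; this is an illustrative assumption for the sketch below, not a model used by the project, and the only input taken from the article is the 95 percent-in-two-years figure.

```python
import math

# First-order mineralization model (an illustrative assumption):
# fraction mineralized after t years is 1 - exp(-k * t).
def rate_constant(fraction_mineralized, years):
    """Solve for the rate constant k given an observed mineralized fraction."""
    return -math.log(1 - fraction_mineralized) / years

def fraction_after(k, years):
    """Fraction mineralized after a given number of years at rate k."""
    return 1 - math.exp(-k * years)

# Iceland figure from the article: ~95% mineralized after 2 years.
k_iceland = rate_constant(0.95, 2.0)   # ~1.5 per year

print(round(k_iceland, 2), round(fraction_after(k_iceland, 1.0), 2))
```

Under that assumption, the implied rate constant is about 1.5 per year, meaning roughly three-quarters of the injected carbon would already be rock after a single year; slower sites would have much smaller constants.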

But Iceland’s geology is unusual. Elsewhere, deeper drilling is needed to reach suitable rocks at suitable temperatures, which adds cost to already expensive projects. Further, says Peč, there’s not a complete understanding of how different factors influence the speed of mineralization.

Peč’s Climate Grand Challenge proposal would study how carbon mineralizes under different conditions, as well as explore ways to make mineralization happen more rapidly by mixing the carbon dioxide with different fluids before injecting it underground. Another idea — and the reason why there are biologists on the team — is to learn from various organisms adept at turning carbon into calcite shells, the same stuff that makes up limestone.

Two other carbon management proposals, led by EAPS Cecil and Ida Green Professor Bradford Hager, were also selected as Climate Grand Challenge finalists. They focus on both the technologies necessary for capturing and storing gigatons of carbon and the logistical challenges involved in such an enormous undertaking.

That involves everything from choosing suitable storage sites to navigating regulatory and environmental issues to bringing disparate technologies together to improve the whole pipeline. The proposals emphasize CCS systems that can be powered by renewable sources and can respond dynamically to the needs of different hard-to-decarbonize industries, like concrete and steel production.

“We need to have an industry that is on the scale of the current oil industry that will not be doing anything but pumping CO2 into storage reservoirs,” says Peč.

For a problem that involves capturing enormous amounts of gases from the atmosphere and storing them underground, it’s no surprise EAPS researchers are so involved. The Earth sciences have “everything” to offer, says Peč, including the good news that the Earth has more than enough places where carbon might be stored.

“Basically, the Earth is really, really large,” says Peč. “The reasonably accessible places, which are close to the continents, store somewhere on the order of tens of thousands to hundreds of thousands of gigatons of carbon. That’s orders of magnitude more than we need to put back in.”

Q&A: Climate Grand Challenges finalists on accelerating reductions in global greenhouse gas emissions

This is the second article in a four-part interview series highlighting the work of the 27 MIT Climate Grand Challenges finalists, which received a total of $2.7 million in startup funding to advance their projects. In April, the Institute will name a subset of the finalists as multiyear flagship projects.

Last month, the Intergovernmental Panel on Climate Change (IPCC), an expert body of the United Nations representing 195 governments, released its latest scientific report on the growing threats posed by climate change, and called for drastic reductions in greenhouse gas emissions to avert the most catastrophic outcomes for humanity and natural ecosystems.

Bringing the global economy to net-zero carbon dioxide emissions by midcentury is complex and demands new ideas and novel approaches. The first-ever MIT Climate Grand Challenges competition focuses on four problem areas, including removing greenhouse gases from the atmosphere and identifying effective, economic solutions for managing and storing these gases. The other Climate Grand Challenges research themes address using data and science to forecast climate-related risk, decarbonizing complex industries and processes, and building equity and fairness into climate solutions.

In the following conversations prepared for MIT News, faculty from three of the teams working to solve “Removing, managing, and storing greenhouse gases” explain how they are drawing upon geological, biological, chemical, and oceanic processes to develop game-changing techniques for carbon removal, management, and storage. Their responses have been edited for length and clarity.

Directed evolution of biological carbon fixation

Agricultural demand is estimated to increase by 50 percent in the coming decades, while climate change is simultaneously projected to drastically reduce crop yield and predictability, requiring a dramatic acceleration of land clearing. Without immediate intervention, this will have dire impacts on wild habitat, rob hundreds of millions of subsistence farmers of their livelihoods, and create hundreds of gigatons of new emissions. Matthew Shoulders, associate professor in the Department of Chemistry, talks about the working group he is leading in partnership with Ed Boyden, the Y. Eva Tan Professor in Neurotechnology at MIT, an investigator at the Howard Hughes Medical Institute and the McGovern Institute for Brain Research, that aims to massively reduce carbon emissions from agriculture by relieving core biochemical bottlenecks in the photosynthetic process using the most sophisticated synthetic biology available to science.

Q: Describe the two pathways you have identified for improving agricultural productivity and climate resiliency.

A: First, cyanobacteria grow millions of times faster than plants and dozens of times faster than microalgae. Engineering these cyanobacteria as a source of key food products using synthetic biology will enable food production using less land, in a fundamentally more climate-resilient manner. Second, carbon fixation, or the process by which carbon dioxide is incorporated into organic compounds, is the rate-limiting step of photosynthesis and becomes even less efficient under rising temperatures. Enhancements to Rubisco, the enzyme mediating this central process, will both improve crop yields and provide climate resilience to crops needed by 2050. Our team, led by Robbie Wilson and Max Schubert, has created new directed evolution methods tailored for both strategies, and we have already uncovered promising early results. Applying directed evolution to photosynthesis, carbon fixation, and food production has the potential to usher in a second green revolution.

Q: What partners will you need to accelerate the development of your solutions?

A: We have already partnered with leading agriculture institutes with deep experience in plant transformation and field trial capacity, enabling the integration of our improved carbon-dioxide-fixing enzymes into a wide range of crop plants. At the deployment stage, we will be positioned to partner with multiple industry groups to achieve improved agriculture at scale. Partnerships with major seed companies around the world will be key to leverage distribution channels in manufacturing supply chains and networks of farmers, agronomists, and licensed retailers. Support from local governments will also be critical where subsidies for seeds are necessary for farmers to earn a living, such as smallholder and subsistence farming communities. Additionally, our research provides an accessible platform that is capable of enabling and enhancing carbon dioxide sequestration in diverse organisms, extending our sphere of partnership to a wide range of companies interested in industrial microbial applications, including algal and cyanobacterial, and in carbon capture and storage.

Strategies to reduce atmospheric methane

One of the most potent greenhouse gases, methane is emitted by a range of human activities and natural processes that include agriculture and waste management, fossil fuel production, and changing land use practices — with no single dominant source. Together with a diverse group of faculty and researchers from the schools of Humanities, Arts, and Social Sciences; Architecture and Planning; Engineering; and Science; plus the MIT Schwarzman College of Computing, Desiree Plata, associate professor in the Department of Civil and Environmental Engineering, is spearheading the MIT Methane Network, an integrated approach to formulating scalable new technologies, business models, and policy solutions for driving down levels of atmospheric methane.

Q: What is the problem you are trying to solve and why is it a “grand challenge”?

A: Removing methane from the atmosphere, or stopping it from getting there in the first place, could change the rates of global warming in our lifetimes, saving as much as half a degree of warming by 2050. Methane sources are distributed in space and time and tend to be very dilute, making the removal of methane a challenge that pushes the boundaries of contemporary science and engineering capabilities. Because the primary sources of atmospheric methane are linked to our economy and culture — from clearing wetlands for cultivation to natural gas extraction and dairy and meat production — the social and economic implications of a fundamentally changed methane management system are far-reaching. Nevertheless, these problems are tractable and could significantly reduce the effects of climate change in the near term.

Q: What is known about the rapid rise in atmospheric methane and what questions remain unanswered?

A: Tracking atmospheric methane is a challenge in and of itself, but it has become clear that emissions are large, accelerated by human activity, and cause damage right away. While some progress has been made in satellite-based measurements of methane emissions, there is a need to translate that data into actionable solutions. Several key questions remain around improving sensor accuracy and sensor network design to optimize placement, improve response time, and stop leaks with autonomous controls on the ground. Additional questions involve deploying low-level methane oxidation systems and novel catalytic materials at coal mines, dairy barns, and other enriched sources; evaluating the policy strategies and the socioeconomic impacts of new technologies with an eye toward decarbonization pathways; and scaling technology with viable business models that stimulate the economy while reducing greenhouse gas emissions.

Deploying versatile carbon capture technologies and storage at scale

There is growing consensus that simply capturing current carbon dioxide emissions is no longer sufficient — it is equally important to target distributed sources such as the oceans and air where carbon dioxide has accumulated from past emissions. Betar Gallant, the American Bureau of Shipping Career Development Associate Professor of Mechanical Engineering, discusses her work with Bradford Hager, the Cecil and Ida Green Professor of Earth Sciences in the Department of Earth, Atmospheric and Planetary Sciences, and T. Alan Hatton, the Ralph Landau Professor of Chemical Engineering and director of the School of Chemical Engineering Practice, to dramatically advance the portfolio of technologies available for carbon capture and permanent storage at scale. (A team led by Assistant Professor Matěj Peč of EAPS is also addressing carbon capture and storage.)

Q: Carbon capture and storage processes have been around for several decades. What advances are you seeking to make through this project?

A: Today’s capture paradigms are costly, inefficient, and complex. We seek to address this challenge by developing a new generation of capture technologies that operate using renewable energy inputs, are sufficiently versatile to accommodate emerging industrial demands, are adaptive and responsive to varied societal needs, and can be readily deployed to a wider landscape.

New approaches will require the redesign of the entire capture process, necessitating basic science and engineering efforts that are broadly interdisciplinary in nature. At the same time, incumbent technologies have been optimized largely for integration with coal- or natural gas-burning power plants. Future applications must shift away from legacy emitters in the power sector towards hard-to-mitigate sectors such as cement, iron and steel, chemical, and hydrogen production. It will become equally important to develop and optimize systems targeted for much lower concentrations of carbon dioxide, such as in oceans or air. Our effort will expand basic science studies as well as studies of the human impacts of storage, including how public engagement and education can alter attitudes toward greater acceptance of geologic storage of carbon dioxide.

Q: What are the expected impacts of your proposed solution, both positive and negative?

A: Renewable energy cannot be deployed rapidly enough everywhere, nor can it supplant all emissions sources, nor can it account for past emissions. Carbon capture and storage (CCS) provides a demonstrated method to address emissions that will undoubtedly occur before the transition to low-carbon energy is completed. CCS can succeed even if other strategies fail. It also allows for developing nations, which may need to adopt renewables over longer timescales, to see equitable economic development while avoiding the most harmful climate impacts. And, CCS enables the future viability of many core industries and transportation modes, many of which do not have clear alternatives before 2050, let alone 2040 or 2030.

The perceived risks of potential leakage and earthquakes associated with geologic storage can be minimized by choosing suitable geologic formations for storage. Despite CCS providing a well-understood pathway for removing enough of the carbon dioxide already emitted into the atmosphere, some environmentalists vigorously oppose it, fearing that CCS rewards oil companies and disincentivizes the transition away from fossil fuels. We believe that it is more important to keep in mind the necessity of meeting key climate targets for the sake of the planet, and welcome those who can help.

An optimized solution for face recognition

The human brain seems to care a lot about faces. It has dedicated a specific area to identifying them, and the neurons there are so good at their job that most of us can readily recognize thousands of individuals. With artificial intelligence, computers can now recognize faces with a similar efficiency — and neuroscientists at MIT’s McGovern Institute have found that a computational network trained to identify faces and other objects discovers a surprisingly brain-like strategy to sort them all out.

The finding, reported March 16, 2022, in Science Advances, suggests that the millions of years of evolution that have shaped circuits in the human brain have optimized our system for facial recognition.

“The human brain’s solution is to segregate the processing of faces from the processing of objects,” explains Katharina Dobs, who led the study as a postdoctoral researcher in McGovern investigator Nancy Kanwisher’s lab. The artificial network that she trained did the same. “And that’s the same solution that we hypothesize any system that’s trained to recognize faces and to categorize objects would find,” she adds.

“These two completely different systems have figured out what a—if not the—good solution is. And that feels very profound,” says Kanwisher.

Functionally specific brain regions

More than twenty years ago, Kanwisher’s team discovered a small spot in the brain’s temporal lobe that responds specifically to faces. This region, which they named the fusiform face area, is one of many brain regions Kanwisher and others have found that are dedicated to specific tasks, such as the detection of written words, the perception of vocal songs, and understanding language.

Kanwisher says that as she has explored how the human brain is organized, she has always been curious about the reasons for that organization. Does the brain really need special machinery for facial recognition and other functions? “‘Why questions’ are very difficult in science,” she says. But with a sophisticated type of machine learning called a deep neural network, her team could at least find out how a different system would handle a similar task.

Dobs, who is now a research group leader at Justus Liebig University Giessen in Germany, assembled hundreds of thousands of images with which to train a deep neural network in face and object recognition. The collection included the faces of more than 1,700 different people and hundreds of different kinds of objects, from chairs to cheeseburgers. All of these were presented to the network, with no clues about which was which. “We never told the system that some of those are faces, and some of those are objects. So it’s basically just one big task,” Dobs says. “It needs to recognize a face identity, as well as a bike or a pen.”

Visualization of the preferred stimulus for example face-ranked filters. While filters in early layers (e.g., Conv5) were maximally activated by simple features, filters responded to features that appear somewhat like face parts (e.g., nose and eyes) in mid-level layers (e.g., Conv9) and appear to represent faces in a more holistic manner in late convolutional layers. Image: Kanwisher lab
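The “one big task” setup can be sketched in miniature: a single classifier trained over one label space that mixes face identities with object categories, with nothing marking which classes are faces. The toy numpy sketch below uses synthetic feature clusters in place of photographs and a plain softmax layer in place of the study’s deep network; all sizes and numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# One shared label space: classes 0-2 stand in for "face identities",
# classes 3-5 for "object kinds". The model is never told which is which.
n_classes, dim, per_class = 6, 20, 50
centers = rng.normal(size=(n_classes, dim))
X = np.vstack([c + 0.3 * rng.normal(size=(per_class, dim)) for c in centers])
y = np.repeat(np.arange(n_classes), per_class)

# A single softmax layer trained by gradient descent on cross-entropy loss.
W = np.zeros((dim, n_classes))
for _ in range(500):
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = X.T @ (p - np.eye(n_classes)[y]) / len(X)
    W -= 0.2 * grad

# One set of weights handles faces and objects alike.
acc = (np.argmax(X @ W, axis=1) == y).mean()
print(acc)
```

The point of the sketch is only the framing: a single objective over a mixed label space, so any face/object specialization that emerges in a deeper network is discovered, not built in.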

As the program learned to identify the objects and faces, it organized itself into an information-processing network that included units specifically dedicated to face recognition. As in the brain, this specialization occurred during the later stages of image processing. In both the brain and the artificial network, early steps in facial recognition involve more general vision processing machinery, and final stages rely on face-dedicated components.

It’s not known how face-processing machinery arises in a developing brain, but based on their findings, Kanwisher and Dobs say networks don’t necessarily require an innate face-processing mechanism to acquire that specialization. “We didn’t build anything face-ish into our network,” Kanwisher says. “The networks managed to segregate themselves without being given a face-specific nudge.”

Kanwisher says it was thrilling to see the deep neural network segregate itself into separate parts for face and object recognition. “That’s what we’ve been looking at in the brain for twenty-some years,” she says. “Why do we have a separate system for face recognition in the brain? This tells me it is because that is what an optimized solution looks like.”

Now, she is eager to use deep neural nets to ask similar questions about why other brain functions are organized the way they are. “We have a new way to ask why the brain is organized the way it is,” she says. “How much of the structure we see in human brains will arise spontaneously by training networks to do comparable tasks?”

New MRI probe can reveal more of the brain’s inner workings

Using a novel probe for functional magnetic resonance imaging (fMRI), MIT biological engineers have devised a way to monitor individual populations of neurons and reveal how they interact with each other.

Similar to how the gears of a clock interact in specific ways to turn the clock’s hands, different parts of the brain interact to perform a variety of tasks, such as generating behavior or interpreting the world around us. The new MRI probe could potentially allow scientists to map those networks of interactions.

“With regular fMRI, we see the action of all the gears at once. But with our new technique, we can pick up individual gears that are defined by their relationship to the other gears, and that’s critical for building up a picture of the mechanism of the brain,” says Alan Jasanoff, an MIT professor of biological engineering, brain and cognitive sciences, and nuclear science and engineering.

Using this technique, which involves genetically targeting the MRI probe to specific populations of cells in animal models, the researchers were able to identify neural populations involved in a circuit that responds to rewarding stimuli. The new MRI probe could also enable studies of many other brain circuits, the researchers say.

Jasanoff, who is also an associate investigator at the McGovern Institute, is the senior author of the study, which appears today in Nature Neuroscience. The lead authors of the paper are recent MIT PhD recipient Souparno Ghosh and former MIT research scientist Nan Li.

Tracing connections

Traditional fMRI imaging measures changes to blood flow in the brain as a proxy for neural activity. When neurons receive signals from other neurons, this triggers an influx of calcium, which causes a diffusible gas called nitric oxide to be released. Nitric oxide acts in part as a vasodilator that increases blood flow to the area.

Imaging calcium directly can offer a more precise picture of brain activity, but that type of imaging usually requires fluorescent chemicals and invasive procedures. The MIT team wanted to develop a method that could work across the brain without that type of invasiveness.

“If we want to figure out how brain-wide networks of cells and brain-wide mechanisms function, we need something that can be detected deep in tissue and preferably across the entire brain at once,” Jasanoff says. “The way that we chose to do that in this study was to essentially hijack the molecular basis of fMRI itself.”

The researchers created a genetic probe, delivered by viruses, that codes for a protein that sends out a signal whenever the neuron is active. This protein, which the researchers called NOSTIC (nitric oxide synthase for targeting image contrast), is an engineered form of an enzyme called nitric oxide synthase. The NOSTIC protein can detect elevated calcium levels that arise during neural activity; it then generates nitric oxide, leading to an artificial fMRI signal that arises only from cells that contain NOSTIC.

The probe is delivered by a virus that is injected into a particular site, after which it travels along axons of neurons that connect to that site. That way, the researchers can label every neural population that feeds into a particular location.

“When we use this virus to deliver our probe in this way, it causes the probe to be expressed in the cells that provide input to the location where we put the virus,” Jasanoff says. “Then, by performing functional imaging of those cells, we can start to measure what makes input to that region take place, or what types of input arrive at that region.”

Turning the gears

In the new study, the researchers used their probe to label populations of neurons that project to the striatum, a region that is involved in planning movement and responding to reward. In rats, they were able to determine which neural populations send input to the striatum during or immediately following a rewarding stimulus — in this case, deep brain stimulation of the lateral hypothalamus, a brain center that is involved in appetite and motivation, among other functions.

One question that researchers have had about deep brain stimulation of the lateral hypothalamus is how wide-ranging the effects are. In this study, the MIT team showed that several neural populations, located in regions including the motor cortex and the entorhinal cortex, which is involved in memory, send input into the striatum following deep brain stimulation.

“It’s not simply input from the site of the deep brain stimulation or from the cells that carry dopamine. There are these other components, both distally and locally, that shape the response, and we can put our finger on them because of the use of this probe,” Jasanoff says.

During these experiments, neurons also generate regular fMRI signals, so in order to distinguish the signals that are coming specifically from the genetically altered neurons, the researchers perform each experiment twice: once with the probe on, and once following treatment with a drug that inhibits the probe. By measuring the difference in fMRI activity between these two conditions, they can determine how much activity is present in probe-containing cells specifically.
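The two-scan subtraction can be illustrated with made-up numbers: simulate one voxel time course with the probe active, another with the probe inhibited, and subtract. The amplitudes and noise levels below are invented for illustration; only the subtraction logic mirrors the approach described.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy voxel time course, 100 time points. Illustrative values only.
t = np.arange(100)
background = 1.0 + 0.05 * rng.normal(size=t.size)   # probe-independent fMRI signal
probe_signal = 0.5 * (t > 50)                        # activity in probe-labeled cells

scan_probe_on = background + probe_signal                 # probe active
scan_probe_off = 1.0 + 0.05 * rng.normal(size=t.size)     # probe inhibited by the drug

# Subtracting the probe-inhibited scan isolates probe-specific activity.
difference = scan_probe_on - scan_probe_off
early_mean = difference[:40].mean()   # before the event: should sit near zero
late_mean = difference[60:].mean()    # after the event: near the 0.5 probe amplitude
print(round(early_mean, 2), round(late_mean, 2))
```

In the real experiments the two conditions come from separate scanning sessions, so the probe-independent signal does not cancel sample by sample, but its average contribution does, leaving the probe-specific component.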

The researchers now hope to use this approach, which they call hemogenetics, to study other networks in the brain, beginning with an effort to identify some of the regions that receive input from the striatum following deep brain stimulation.

“One of the things that’s exciting about the approach that we’re introducing is that you can imagine applying the same tool at many sites in the brain and piecing together a network of interlocking gears, which consist of these input and output relationships,” Jasanoff says. “This can lead to a broad perspective on how the brain works as an integrated whole, at the level of neural populations.”

The research was funded by the National Institutes of Health and the MIT Simons Center for the Social Brain.

School of Engineering welcomes new faculty

The School of Engineering is welcoming 17 new faculty members to its departments, institutes, labs, and centers. With research and teaching activities ranging from the development of robotics and machine learning technologies to modeling the impact of elevated carbon dioxide levels on vegetation, they are poised to make significant contributions in new directions across the school and to a wide range of research efforts around the Institute.

“I am delighted to welcome our wonderful new faculty,” says Anantha Chandrakasan, dean of the MIT School of Engineering and Vannevar Bush Professor of Electrical Engineering and Computer Science. “Their impact as talented educators, researchers, collaborators, and mentors will be felt across the School of Engineering and beyond as they strengthen our engineering community.”

Among the new faculty members are four from the Department of Electrical Engineering and Computer Science (EECS), which reports jointly to the School of Engineering and the MIT Stephen A. Schwarzman College of Computing.

Iwnetim “Tim” Abate will join the Department of Materials Science and Engineering in July 2023. He is currently both a Miller and Presidential Postdoctoral Fellow at the University of California at Berkeley. He received his MS and PhD in materials science and engineering from Stanford University and his BS in physics from Minnesota State University at Moorhead. He also has research experience in industry (IBM) and at national labs (Los Alamos and SLAC National Accelerator Laboratories). Utilizing computational and experimental approaches in tandem, his research program at MIT will focus on the intersection of material chemistry, electrochemistry, and condensed matter physics to develop solutions for climate change and smart agriculture, including next-generation battery and sensor devices. Abate is also a co-founder and president of a nonprofit organization, SciFro Inc., working on empowering African youth and underrepresented minorities in the United States to solve local problems through scientific research and innovation. He will continue working on expanding the vision and impact of SciFro with the MIT community. Abate received the Dan Cubicciotti Award of the Electrochemical Society, the EDGE and DARE graduate fellowships, the United Technologies Research Center fellowship, the John Stevens Jr. Memorial Award and the Justice, Equity, Diversity and Inclusion Graduation Award from Stanford University. He will hold the Toyota Career Development Professorship at MIT.

Kaitlyn Becker will join the Department of Mechanical Engineering as an assistant professor in August 2022. Becker received her PhD in materials science and mechanical engineering from Harvard University in 2021 and previously worked in industry as a manufacturing engineer at Cameron Health and a senior engineer for Nano Terra, Inc. She is a postdoc at the Harvard University School of Engineering and Applied Sciences and is also currently a senior glassblowing instructor in the Department of Materials Science and Engineering at MIT. Becker works on adaptive soft robots for grasping and manipulation of delicate structures from the desktop to the deep sea. Her research focuses on novel soft robotic platforms, adding functionality through innovations at the intersection of design and fabrication. She has developed novel fabrication methodologies and mechanical programming methods for large integrated arrays of soft actuators capable of collective manipulation and locomotion, and demonstrated integration of microfluidic circuits to control arrays of multichannel, two-degrees-of-freedom soft actuators. Becker received the National Science Foundation Graduate Research Fellowship in 2015, the Microsoft Graduate Women’s Scholarship in 2015, the Winston Chen Graduate Fellowship in 2015, and the Courtlandt S. Gross Memorial Scholarship in 2014.

Brandon J. DeKosky joined the Department of Chemical Engineering as an assistant professor in a newly introduced joint faculty position between the department and the Ragon Institute of MGH, MIT, and Harvard in September 2021. He received his BS in chemical engineering from the University of Kansas and his PhD in chemical engineering from the University of Texas at Austin. He then did postdoctoral research at the Vaccine Research Center of the National Institute of Allergy and Infectious Diseases. In 2017, Brandon launched his independent academic career as an assistant professor at the University of Kansas in a joint position with the Department of Chemical Engineering and the Department of Pharmaceutical Chemistry. He was also a member of the bioengineering graduate program. His research program focuses on developing and applying a suite of new high-throughput experimental and computational platforms for molecular analysis of adaptive immune responses, to accelerate precision drug discovery. He has received several notable recognitions, which include receipt of the NIH K99 Path to Independence and NIH DP5 Early Independence awards, the Cellular and Molecular Bioengineering Rising Star Award from the Biomedical Engineering Society, and the Career Development Award from the Congressionally Directed Medical Research Program’s Peer Reviewed Cancer Research Program.

Mohsen Ghaffari will join the Department of Electrical Engineering and Computer Science in April 2022. He received his BS from the Sharif University of Technology, and his MS and PhD in EECS from MIT. His research focuses on distributed and parallel algorithms for large graphs. Ghaffari received the ACM Doctoral Dissertation Honorable Mention Award, the ACM-EATCS Principles of Distributed Computing Doctoral Dissertation Award, and the George M. Sprowls Award for Best Computer Science PhD thesis at MIT. Before coming to MIT, he was on the faculty at ETH Zurich, where he received a prestigious European Research Council Starting Grant.

Aristide Gumyusenge joined the Department of Materials Science and Engineering in January. He comes to MIT from Stanford University, where he was a postdoc working with Professor Zhenan Bao and Professor Alberto Salleo. He received a BS in chemistry from Wofford College in 2015 and a PhD in chemistry from Purdue University in 2019. His research background and interests are in semiconducting polymers, their processing and characterization, and their unique role in the future of electronics. In particular, he has tackled longstanding challenges in the operational stability of semiconducting polymers under extreme heat and has pioneered high-temperature plastic electronics. He has been selected as a PMSE Future Faculty Scholar (2021), a GLAM Postdoctoral Fellow (2020-22), and an MRS Arthur Nowick and Graduate Student Gold Awardee (2019), among other recognitions. At MIT, he will lead the Laboratory of Organic Materials for Smart Electronics (OMSE Lab). Through polymer design, novel processing strategies, and large-area manufacturing of electronic devices, he is interested in relating molecular design to device performance, especially for transistor devices able to mimic and interface with biological systems. He will hold the Merton C. Flemings Career Development Professorship.

Mina Konakovic Lukovic will join the Department of Electrical Engineering and Computer Science as an assistant professor in July 2022. She received her BS and MS from the University of Belgrade, Faculty of Mathematics. She earned her PhD in 2019 in the School of Computer and Communication Sciences at the Swiss Federal Institute of Technology Lausanne, advised by Professor Mark Pauly. Currently a Schmidt Science Postdoctoral Fellow in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), she has been mentored by Professor Wojciech Matusik. Her research focuses on computer graphics, computational fabrication, 3D geometry processing, and machine learning, including architectural geometry and the design of programmable materials. She received the ACM SIGGRAPH Outstanding Doctoral Dissertation Honorable Mention, the Eurographics PhD Award, and was recently awarded the 2021 SIAM Activity Group on Geometric Design Early Career Prize.

Darcy McRose will join the Department of Civil and Environmental Engineering as an assistant professor in August 2022. She completed a BS in Earth systems at Stanford and a PhD in geosciences at Princeton University. Darcy is currently conducting postdoctoral work at Caltech, where she is mentored by Professor Dianne Newman in the divisions of Biology and Biological Engineering and Geological and Planetary Sciences. Her research program focuses on microbe-environment interactions and their effects on biogeochemical cycles, and incorporates techniques ranging from microbial physiology and genetics to geochemistry. A particular emphasis for this work is the production and use of secondary metabolites and small molecules in soils and sediments. McRose received the Caltech BBE Division postdoctoral fellowship in 2019 and is currently a Simons Foundation Marine Microbial Ecology postdoctoral fellow as well as a L’Oréal USA for Women in Science fellow.

Qin (Maggie) Qi joined the Department of Chemical Engineering as an assistant professor in January 2022. She received two BS degrees, in chemical engineering and in operations research, from Cornell University before moving on to Stanford for her PhD. She then took on a postdoc position at the Harvard University School of Engineering and Applied Sciences and the Wyss Institute. Maggie’s proposed research combines extensive theoretical and computational work on predictive models that guide experimental design. She seeks to investigate particle-cell biomechanics and function for better targeted cell-based therapies. She also plans to design microphysiological systems that elucidate hydrodynamics in complex organs, including delivery of drugs to the eye, and to examine ionic liquids as complex fluids for biomaterial design. She aims to push the boundaries of fluid mechanics, transport phenomena, and soft matter for human health and to innovate precision health care solutions. Maggie received the T.S. Lo Graduate Fellowship and the Stanford Graduate Fellowship in Science and Engineering. Among her accomplishments, Maggie was a participant in the inaugural class of the MIT Rising Stars in ChemE Program in 2018.

Manish Raghavan will join the MIT Sloan School of Management and the Department of Electrical Engineering and Computer Science as an assistant professor in September 2022. He shares a joint appointment with the MIT Schwarzman College of Computing. He received a bachelor’s degree in electrical engineering and computer science from the University of California at Berkeley, and a PhD in computer science from Cornell University. Prior to joining MIT, he was a postdoc at the Harvard Center for Research on Computation and Society. His research interests lie in the application of computational techniques to domains of social concern, including algorithmic fairness and behavioral economics, with a particular focus on the use of algorithmic tools in the hiring pipeline. He is also a member of Cornell’s Artificial Intelligence, Policy, and Practice initiative and Mechanism Design for Social Good.

Ritu Raman joined the Department of Mechanical Engineering as an assistant professor and Brit and Alex d’Arbeloff Career Development Chair in August 2021. Raman received her PhD in mechanical engineering from the University of Illinois at Urbana-Champaign as an NSF Graduate Research Fellow in 2016 and completed a postdoctoral fellowship with Professor Robert Langer at MIT, funded by a NASEM Ford Foundation Fellowship and a L’Oréal USA For Women in Science Fellowship. Raman’s lab designs adaptive living materials powered by assemblies of living cells for applications ranging from medicine to machines. Currently, she is focused on using biological materials and engineering tools to build living neuromuscular tissues. Her goal is to help restore mobility to those who have lost it after disease or trauma and to deploy biological actuators as functional components in machines. Raman published the book Biofabrication with MIT Press in September 2021. She was named to the MIT Technology Review “35 Innovators Under 35” class of 2019 and the Forbes “30 Under 30” class of 2018, and has received numerous awards, including being named a National Academy of Sciences Kavli Frontiers of Science Fellow in 2020 and receiving the Science and Sartorius Prize for Regenerative Medicine and Cell Therapy in 2019. Ritu has championed many initiatives to empower women in science, including serving as an AAAS IF/THEN ambassador and founding the Women in Innovation and STEM Database at MIT (WISDM).

Nidhi Seethapathi joined the Department of Brain and Cognitive Sciences and the Department of Electrical Engineering and Computer Science in January 2022. She shares a joint appointment with the MIT Schwarzman College of Computing. She received a bachelor’s degree in mechanical engineering from Veermata Jijabai Technological Institute and a PhD from Ohio State University, where she worked in the Movement Lab. Her research interests include building computational predictive models of human movement with applications to autonomous and robot-aided neuromotor rehabilitation. In her work, she uses a combination of tools and approaches from dynamics, control theory, and machine learning. During her PhD, she was a Schlumberger Foundation Faculty for the Future Fellow. She then worked as a postdoc in the Kording Lab at the University of Pennsylvania, developing data-driven tools for autonomous neuromotor rehabilitation in collaboration with the Rehabilitation Robotics Lab.

Vincent Sitzmann will join the Department of Electrical Engineering and Computer Science as an assistant professor in July 2022. He earned his BS from the Technical University of Munich in 2015, his MS from Stanford in 2017, and his PhD from Stanford in 2020. At MIT, he will be the principal investigator of the Scene Representation Group, where he will lead research at the intersection of machine learning, graphics, neural rendering, and computer vision to build algorithms that learn to reconstruct, understand, and interact with 3D environments from incomplete observations the way humans can. Currently, Vincent is a postdoc at the MIT Computer Science and Artificial Intelligence Laboratory with Josh Tenenbaum, Bill Freeman, and Fredo Durand. Along with multiple scholarships and fellowships, he has been recognized with the NeurIPS Honorable Mention: Outstanding New Directions in 2019.

Tess Smidt joined the Department of Electrical Engineering and Computer Science as an assistant professor in September 2021. She earned her SB in physics from MIT in 2012 and her PhD in physics from the University of California at Berkeley in 2018. She is the principal investigator of the Atomic Architects group at the Research Laboratory of Electronics, where she works at the intersection of physics, geometry, and machine learning to design algorithms that aid in the understanding and design of physical systems. Her research focuses on machine learning that incorporates physical and geometric constraints, with applications to materials design. Prior to joining the MIT EECS faculty, she was the 2018 Alvarez Postdoctoral Fellow in Computing Sciences at Lawrence Berkeley National Laboratory and a software engineering intern on the Google Accelerated Sciences team, where she developed Euclidean symmetry equivariant neural networks which naturally handle 3D geometry and geometric tensor data.

Loza Tadesse will join the Department of Mechanical Engineering as an assistant professor in July 2023. She received her PhD in bioengineering from Stanford University in 2021 and previously was a medical student at St. Paul Hospital Millennium Medical College in Ethiopia. She is currently a postdoc at the University of California at Berkeley. Tadesse’s past research combines Raman spectroscopy and machine learning to develop a rapid, all-optical, and label-free bacterial diagnostic and antibiotic susceptibility testing system that aims to circumvent the time-consuming culturing step in “gold standard” methods. She aims to establish a research program that develops next-generation point-of-care diagnostic devices using spectroscopy, optics, and machine learning tools for application in resource-limited clinical settings such as developing nations, military sites, and space exploration. Tadesse was listed as a 2022 Forbes “30 Under 30” in health care and has received many awards, including the Biomedical Engineering Society (BMES) Career Development Award, the Stanford DARE Fellowship, and a $200,000 Gates Foundation “Call to Action” grant for SciFro Inc., an educational nonprofit in Ethiopia that she co-founded.

César Terrer joined the Department of Civil and Environmental Engineering as an assistant professor in July 2021. He obtained his PhD in ecosystem ecology and climate change from Imperial College London, where he started working at the interface between experiments and models to better understand the effects of elevated carbon dioxide on vegetation. His research has advanced understanding of the effects of carbon dioxide in terrestrial ecosystems, the role of soil nutrients in a climate change context, and plant-soil interactions. Synthesizing observational data from carbon dioxide experiments and satellites through meta-analysis and machine learning, César has found that microbial interactions between plants and soils play a major role in the carbon cycle at a global scale, affecting the speed of global warming.

Haruko Wainwright joined the Department of Nuclear Science and Engineering as an assistant professor in January 2021. She received her BEng in engineering physics from Kyoto University, Japan, in 2003, her MS in nuclear engineering in 2006, her MA in statistics in 2010, and her PhD in nuclear engineering in 2010, all from the University of California at Berkeley. Before joining MIT, she was a staff scientist in the Earth and Environmental Sciences Area at Lawrence Berkeley National Laboratory and an adjunct professor in nuclear engineering at UC Berkeley. Her research focuses on environmental modeling and monitoring technologies, with a particular emphasis on nuclear waste and nuclear-related contamination. She has been developing Bayesian methods for multi-type multiscale data integration and model-data integration. She leads and co-leads multiple interdisciplinary projects, including the U.S. Department of Energy’s Advanced Long-term Environmental Monitoring Systems (ALTEMIS) project and the Artificial Intelligence for Earth System Predictability (AI4ESP) initiative.

Martin Wainwright will join the Department of Electrical Engineering and Computer Science in July 2022. He received a bachelor’s degree in mathematics from the University of Waterloo, Canada, and a PhD in EECS from MIT. Prior to joining MIT, he was the Chancellor’s Professor at the University of California at Berkeley, with a joint appointment between the Department of Statistics and the Department of EECS. His research interests include high-dimensional statistics, statistical machine learning, information theory, and optimization theory. Among other awards, he has received the COPSS Presidents’ Award (2014) from the Committee of Presidents of Statistical Societies, the David Blackwell Lectureship (2017) and the Medallion Lectureship (2013) from the Institute of Mathematical Statistics, and Best Paper awards from the IEEE Signal Processing Society and the IEEE Information Theory Society. He was a Section Lecturer at the International Congress of Mathematicians in 2014.


Augmented: The journey of Hugh Herr

Augmented is a Nova PBS documentary that premiered in February 2022, featuring Hugh Herr, the co-director of the K. Lisa Yang Center for Bionics at MIT.

Follow the dramatic personal journey of Hugh Herr, a biophysicist working to create brain-controlled robotic limbs. At age 17, Herr’s legs were amputated after a climbing accident. Frustrated by the crude prosthetic limbs he was given, Herr set out to remedy their design, leading him to a career as an inventor of innovative prosthetic devices. Now, Herr is teaming up with an injured climber and a surgeon at a leading Boston hospital to test a new approach to surgical amputation that allows prosthetic limbs to move and feel like the real thing. Herr’s journey is a powerful tale of innovation and the inspiring story of a personal tragedy transformed into a lifelong quest to help others.

Read more at PBS.org.

David Ginty named winner of 2022 Scolnick Prize

Harvard neurobiologist David Ginty, winner of the 2022 Scolnick Prize.

The McGovern Institute for Brain Research announced today that Harvard neurobiologist David D. Ginty has been selected for the 2022 Edward M. Scolnick Prize in Neuroscience. Ginty, who is the Edward R. and Anne G. Lefler Professor of Neurobiology at Harvard Medical School, is being recognized for his fundamental discoveries into the neural mechanisms underlying the sense of touch. The Scolnick Prize is awarded annually by the McGovern Institute for outstanding advances in neuroscience.

“David Ginty has made seminal contributions in basic research that also have important translational implications,” says Robert Desimone, McGovern Institute Director and chair of the selection committee. “His rigorous research has led us to understand how the peripheral nervous system encodes the overall perception of touch, and how molecular mechanisms underlying this can fail in disease states.”

Ginty obtained his PhD in 1989 with Edward Seidel, where he studied cell proliferation factors. He went on to a postdoctoral fellowship researching nerve growth factor with John Wagner at the Dana-Farber Cancer Institute and, upon Wagner’s departure to Cornell, transferred to Michael Greenberg’s lab at Harvard Medical School. There, he dissected intracellular signaling pathways for neuronal growth factors and neurotransmitters and developed key antibody reagents to detect activated forms of transcription factors. These antibody tools are now used by labs around the world in the research of neuronal plasticity and brain disorders, including Alzheimer’s disease and schizophrenia.

In 1995, Ginty started his own laboratory at Johns Hopkins University with a focus on the development and functional organization of the peripheral nervous system. Ginty’s group created and applied the latest genetic engineering techniques in mice to uncover how the peripheral nervous system develops and is organized at the molecular, cellular and circuit levels to perceive touch. Most notably, using gene targeting combined with electrophysiological, behavioral and anatomical analyses, the Ginty lab untangled properties and functions of the different types of touch neurons, termed low- and high-threshold mechanoreceptors, that convey distinct aspects of stimulus information from the skin to the central nervous system. Ginty and colleagues also discovered organizational principles of spinal cord and brainstem circuits dedicated to touch information processing, and that integration of signals from the different mechanoreceptor types begins within spinal cord networks before signal transmission to the brain.

In 2013, Ginty joined the faculty of Harvard Medical School where his team applied their genetic tools and techniques to probe the neural basis of touch sensitivity disorders. They discovered properties and functions of peripheral sensory neurons, spinal cord circuits, and ascending pathways that transmit noxious, painful stimuli from the periphery to the brain. They also asked whether abnormalities in peripheral nervous system function lead to touch over-reactivity in cases of autism or in neuropathic pain caused by nerve injury, chemotherapy, or diabetes, where even a soft touch can be aversive or painful. His team found that sensory abnormalities observed in several mouse models of autism spectrum disorder could be traced to peripheral mechanosensory neurons. They also found that reducing the activity of peripheral sensory neurons prevented tactile over-reactivity in these models and even, in some cases, lessened anxiety and abnormal social behaviors. These findings provided a plausible explanation for how sensory dysfunction may contribute to physiological and cognitive impairments in autism. Importantly, this laid the groundwork for a new approach and initiative to identify new potential therapies for disorders of touch and pain.

Ginty was named a Howard Hughes Medical Institute Investigator in 2000 and was elected to the American Academy of Arts and Sciences in 2015 and the National Academy of Sciences in 2017. He shared Columbia University’s Alden W. Spencer Prize with Ardem Patapoutian in 2017 and was awarded the Society for Neuroscience Julius Axelrod Prize in 2021. Ginty is also known for exceptional mentorship. He directed the neuroscience graduate program at Johns Hopkins from 2006 to 2013 and now serves as the associate director of Harvard’s neurobiology graduate program.

The McGovern Institute will award the Scolnick Prize to Ginty on Wednesday, June 1, 2022. At 4:00 pm he will deliver a lecture entitled “The sensory neurons of touch: beauty is skin deep,” to be followed by a reception at the McGovern Institute, 43 Vassar Street (building 46, room 3002) in Cambridge. The event is free and open to the public; registration is required.

Singing in the brain


For the first time, MIT neuroscientists have identified a population of neurons in the human brain that lights up when we hear singing, but not other types of music.

These neurons, found in the auditory cortex, appear to respond to the specific combination of voice and music, but not to either regular speech or instrumental music. Exactly what they are doing is unknown and will require more work to uncover, the researchers say.

“The work provides evidence for relatively fine-grained segregation of function within the auditory cortex, in a way that aligns with an intuitive distinction within music,” says Sam Norman-Haignere, a former MIT postdoc who is now an assistant professor of neuroscience at the University of Rochester Medical Center.

The work builds on a 2015 study in which the same research team used functional magnetic resonance imaging (fMRI) to identify a population of neurons in the brain’s auditory cortex that responds specifically to music. In the new work, the researchers used recordings of electrical activity taken at the surface of the brain, which gave them much more precise information than fMRI.

“There’s one population of neurons that responds to singing, and then very nearby is another population of neurons that responds broadly to lots of music. At the scale of fMRI, they’re so close that you can’t disentangle them, but with intracranial recordings, we get additional resolution, and that’s what we believe allowed us to pick them apart,” says Norman-Haignere.

Norman-Haignere is the lead author of the study, which appears today in the journal Current Biology. Josh McDermott, an associate professor of brain and cognitive sciences, and Nancy Kanwisher, the Walter A. Rosenblith Professor of Cognitive Neuroscience, both members of MIT’s McGovern Institute for Brain Research and Center for Brains, Minds and Machines (CBMM), are the senior authors of the study.

Neural recordings

In their 2015 study, the researchers used fMRI to scan the brains of participants as they listened to a collection of 165 sounds, including different types of speech and music, as well as everyday sounds such as finger tapping or a dog barking. For that study, the researchers devised a novel method of analyzing the fMRI data, which allowed them to identify six neural populations with different response patterns, including the music-selective population and another population that responds selectively to speech.

In the new study, the researchers hoped to obtain higher-resolution data using a technique known as electrocorticography (ECoG), which allows electrical activity to be recorded by electrodes placed inside the skull. This offers a much more precise picture of electrical activity in the brain compared to fMRI, which measures blood flow in the brain as a proxy for neural activity.

“With most of the methods in human cognitive neuroscience, you can’t see the neural representations,” Kanwisher says. “Most of the kind of data we can collect can tell us that here’s a piece of brain that does something, but that’s pretty limited. We want to know what’s represented in there.”

Electrocorticography cannot typically be performed in humans because it is an invasive procedure, but it is often used to monitor patients with epilepsy who are about to undergo surgery to treat their seizures. Patients are monitored over several days so that doctors can determine where their seizures are originating before operating. During that time, if patients agree, they can participate in studies that involve measuring their brain activity while performing certain tasks. For this study, the MIT team was able to gather data from 15 participants over several years.

For those participants, the researchers played the same set of 165 sounds that they used in the earlier fMRI study. The location of each patient’s electrodes was determined by their surgeons, so some electrodes did not pick up any responses to auditory input, but many did. Using a novel statistical analysis that they developed, the researchers were able to infer the types of neural populations that produced the signals recorded by each electrode.

“When we applied this method to this data set, this neural response pattern popped out that only responded to singing,” Norman-Haignere says. “This was a finding we really didn’t expect, so it very much justifies the whole point of the approach, which is to reveal potentially novel things you might not think to look for.”

That song-specific population of neurons had very weak responses to either speech or instrumental music, and therefore is distinct from the music- and speech-selective populations identified in their 2015 study.

Music in the brain

In the second part of their study, the researchers devised a mathematical method to combine the data from the intracranial recordings with the fMRI data from their 2015 study. Because fMRI can cover a much larger portion of the brain, this allowed them to determine more precisely the locations of the neural populations that respond to singing.

“This way of combining ECoG and fMRI is a significant methodological advance,” McDermott says. “A lot of people have been doing ECoG over the past 10 or 15 years, but it’s always been limited by this issue of the sparsity of the recordings. Sam is really the first person who figured out how to combine the improved resolution of the electrode recordings with fMRI data to get better localization of the overall responses.”

The song-specific hotspot that they found is located at the top of the temporal lobe, near regions that are selective for language and music. That location suggests that the song-specific population may be responding to features such as the perceived pitch, or the interaction between words and perceived pitch, before sending information to other parts of the brain for further processing, the researchers say.

The researchers now hope to learn more about what aspects of singing drive the responses of these neurons. They are also working with MIT Professor Rebecca Saxe’s lab to study whether infants have music-selective areas, in hopes of learning more about when and how these brain regions develop.

The research was funded by the National Institutes of Health, the U.S. Army Research Office, the National Science Foundation, the NSF Science and Technology Center for Brains, Minds, and Machines, the Fondazione Neurone, the Howard Hughes Medical Institute, and the Kristin R. Pressman and Jessica J. Pourian ’13 Fund at MIT.

On a mission to alleviate chronic pain

About 50 million Americans suffer from chronic pain, which interferes with their daily life, social interactions, and ability to work. MIT Professor Fan Wang wants to develop new ways to help relieve that pain, by studying and potentially modifying the brain’s own pain control mechanisms.

Her recent work has identified an “off switch” for pain, located in the brain’s amygdala. She hopes that finding ways to control this switch could lead to new treatments for chronic pain.

“Chronic pain is a major societal issue,” Wang says. “By studying pain-suppression neurons in the brain’s central amygdala, I hope to create a new therapeutic approach for alleviating pain.”

Wang, who joined the MIT faculty in January 2021, is also the leader of a new initiative at the McGovern Institute for Brain Research that is studying drug addiction, with the goal of developing more effective treatments for addiction.

“Opioid prescription for chronic pain is a major contributor to the opioid epidemic. With the Covid pandemic, I think addiction and overdose are becoming worse. People are more anxious, and they seek drugs to alleviate such mental pain,” Wang says. “As scientists, it’s our duty to tackle this problem.”

Sensory circuits

Wang, who grew up in Beijing, describes herself as “a nerdy child” who loved books and math. In high school, she took part in science competitions, then went on to study biology at Tsinghua University. She arrived in the United States in 1993 to begin her PhD at Columbia University. There, she worked on tracing the connection patterns of olfactory receptor neurons in the lab of Richard Axel, who later won the Nobel Prize for his discoveries of odorant receptors and how the olfactory system is organized.

After finishing her PhD, Wang decided to switch gears. As a postdoc at the University of California at San Francisco and then Stanford University, she began studying how the brain perceives touch.

In 2003, Wang joined the faculty at Duke University School of Medicine. There, she began developing techniques to study the brain circuits that underlie the sense of touch, tracing circuits that carry sensory information from the whiskers of mice to the brain. She also studied how the brain integrates movements of touch organs with signals of sensory stimuli to generate perception (such as using stretching movements to sense elasticity).

As she pursued her sensory perception studies, Wang became interested in studying pain perception, but she felt she needed to develop new techniques to tackle it. While at Duke, she invented a technique called CANE (capturing activated neural ensembles), which can identify networks of neurons that are activated by a particular stimulus.

Using this approach in mice, she identified neurons that become active in response to pain, but so many neurons across the brain were activated that it didn’t offer much useful information. As a way to indirectly get at how the brain controls pain, she decided to use CANE to explore the effects of drugs used for general anesthesia. During general anesthesia, drugs render a patient unconscious, but Wang hypothesized that the drugs might also shut off pain perception.

“At that time, it was just a wild idea,” Wang recalls. “I thought there may be other mechanisms — that instead of just a loss of consciousness, anesthetics may do something to the brain that actually turns pain off.”

Support for the existence of an “off switch” for pain came from the observation that wounded soldiers on a battlefield can continue to fight, essentially blocking out pain despite their injuries.

In a study of mice treated with anesthesia drugs, Wang discovered that the brain does have this kind of switch, in an unexpected location: the amygdala, which is involved in regulating emotion. She showed that this cluster of neurons can turn off pain when activated, and when it is suppressed, mice become highly sensitive to ordinary gentle touch.

“There’s a baseline level of activity that makes the animals feel normal, and when you activate these neurons, they’ll feel less pain. When you silence them, they’ll feel more pain,” Wang says.

Turning off pain

That finding, which Wang reported in 2020, raised the possibility of somehow modulating that switch in humans to try to treat chronic pain. This is a long-term goal of Wang’s, but more work is required to achieve it, she says. Currently her lab is working on analyzing the RNA expression patterns of the neurons in the cluster she identified. They also are measuring the neurons’ electrical activity and how they interact with other neurons in the brain, in hopes of identifying circuits that could be targeted to tamp down the perception of pain.

One way of modulating these circuits could be to use deep brain stimulation, which involves implanting electrodes in certain areas of the brain. Focused ultrasound, which is still in early stages of development and does not require surgery, could be a less invasive alternative.

Another approach Wang is interested in exploring is pairing brain stimulation with a context such as looking at a smartphone app. This kind of pairing could help train the brain to shut off pain using the app, without the need for the original stimulation (deep brain stimulation or ultrasound).

“Maybe you don’t need to constantly stimulate the brain. You may just need to reactivate it with a context,” Wang says. “After a while you would probably need to be restimulated, or reconditioned, but at least you have a longer window where you don’t need to go to the hospital for stimulation, and you just need to use a context.”

Wang, who was drawn to MIT in part by its focus on fostering interdisciplinary collaborations, is now working with several other McGovern Institute members who are taking different angles to try to figure out how the brain generates the state of craving that occurs in drug addiction, including opioid addiction.

“We’re going to focus on trying to understand this craving state: how it’s created in the brain and how can we sort of erase that trace in the brain, or at least control it. And then you can neuromodulate it in real time, for example, and give people a chance to get back their control,” she says.

Dendrites may help neurons perform complicated calculations

Within the human brain, neurons perform complex calculations on information they receive. Researchers at MIT have now demonstrated how dendrites — branch-like extensions that protrude from neurons — help to perform those computations.

The researchers found that within a single neuron, different types of dendrites receive input from distinct parts of the brain, and process it in different ways. These differences may help neurons to integrate a variety of inputs and generate an appropriate response, the researchers say.

In the neurons that the researchers examined in this study, it appears that this dendritic processing helps cells to take in visual information and combine it with motor feedback, in a circuit that is involved in navigation and planning movement.

“Our hypothesis is that these neurons have the ability to pick out specific features and landmarks in the visual environment, and combine them with information about running speed, where I’m going, and when I’m going to start, to move toward a goal position,” says Mark Harnett, an associate professor of brain and cognitive sciences, a member of MIT’s McGovern Institute for Brain Research, and the senior author of the study.

Mathieu Lafourcade, a former MIT postdoc, is the lead author of the paper, which appears today in Neuron.

Complex calculations

Any given neuron can have dozens of dendrites, which receive synaptic input from other neurons. Neuroscientists have hypothesized that these dendrites can act as compartments that perform their own computations on incoming information before sending the results to the body of the neuron, which integrates all these signals to generate an output.

Previous research has shown that dendrites can amplify incoming signals using specialized proteins called NMDA receptors. These receptors are voltage-sensitive, so their activation depends on the activity of other receptors called AMPA receptors: when a dendrite receives many incoming signals through AMPA receptors at the same time, the resulting depolarization crosses the threshold needed to activate nearby NMDA receptors, creating an extra burst of current.

This phenomenon, known as supralinearity, is believed to help neurons distinguish between inputs that arrive close together or farther apart in time or space, Harnett says.
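The AMPA/NMDA interplay described above can be caricatured in a few lines of code. This is a toy illustration, not a model from the study: the function names, thresholds, and current values are all invented for the sketch. The point is only the shape of the response, which sums linearly until enough simultaneous input arrives to recruit an NMDA-like extra current.

```python
# Toy sketch of supralinear dendritic integration (all numbers invented).
# Below a coincidence threshold, input sums linearly through AMPA-like
# receptors; above it, an NMDA-like current adds a disproportionate boost.

def dendrite_response(n_inputs, ampa_unit=1.0, nmda_threshold=5,
                      nmda_boost=3.0):
    """Return the toy dendritic response to n simultaneous inputs."""
    ampa = n_inputs * ampa_unit          # linear AMPA summation
    if n_inputs >= nmda_threshold:       # enough depolarization to unblock
        # NMDA-like receptors add an extra, supralinear burst of current
        return ampa + nmda_boost * (n_inputs - nmda_threshold + 1)
    return ampa                          # below threshold: purely linear

for n in (2, 4, 6, 8):
    print(n, dendrite_response(n))
```

Doubling the input from 3 to 6 coincident signals more than doubles the response in this sketch, which is the defining signature of supralinearity.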

In the new study, the MIT researchers wanted to determine whether different types of inputs are targeted specifically to different types of dendrites, and if so, how that would affect the computations performed by those neurons. They focused on a population of neurons called pyramidal cells, the principal output neurons of the cortex, which have several different types of dendrites. Basal dendrites extend below the body of the neuron, apical oblique dendrites extend from a trunk that travels up from the body, and tuft dendrites are located at the top of the trunk.

Harnett and his colleagues chose a part of the brain called the retrosplenial cortex (RSC) for their studies because it is a good model for association cortex — the type of brain cortex used for complex functions such as planning, communication, and social cognition. The RSC integrates information from many parts of the brain to guide navigation, and pyramidal neurons play a key role in that function.

In a study of mice, the researchers first showed that three different types of input come into pyramidal neurons of the RSC: from the visual cortex into basal dendrites, from the motor cortex into apical oblique dendrites, and from the lateral nuclei of the thalamus, a visual processing area, into tuft dendrites.

“Until now, there hasn’t been much mapping of what inputs are going to those dendrites,” Harnett says. “We found that there are some sophisticated wiring rules here, with different inputs going to different dendrites.”

A range of responses

The researchers then measured electrical activity in each of those compartments. They expected that NMDA receptors would show supralinear activity, because this behavior has been demonstrated before in dendrites of pyramidal neurons in both the primary sensory cortex and the hippocampus.

In the basal dendrites, the researchers saw just what they expected: Input coming from the visual cortex provoked supralinear electrical spikes, generated by NMDA receptors. However, just 50 microns away, in the apical oblique dendrites of the same cells, the researchers found no signs of supralinear activity. Instead, input to those dendrites drove a steady linear response. Those dendrites also have a much lower density of NMDA receptors.

“That was shocking, because no one’s ever reported that before,” Harnett says. “What that means is the apical obliques don’t care about the pattern of input. Inputs can be separated in time, or together in time, and it doesn’t matter. It’s just a linear integrator that’s telling the cell how much input it’s getting, without doing any computation on it.”
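The contrast Harnett describes — a compartment that only reports how much input it gets versus one that cares about the input pattern — can be sketched with two toy integrators. This is an assumption-laden illustration, not the paper's model; the thresholds and boost values are invented, and input is simplified to counts per time bin.

```python
# Toy contrast between two dendritic compartments (invented numbers).
# A linear integrator (apical-oblique-like) reports only total input;
# a supralinear one (basal-like) responds extra when inputs coincide.

def linear_dendrite(inputs):
    # Sums input regardless of timing pattern: no computation on it.
    return sum(inputs)

def supralinear_dendrite(inputs, threshold=4, boost=2.0):
    # Coincident input beyond a threshold triggers an NMDA-spike-like
    # extra current, so clustered input is amplified.
    total = sum(inputs)
    coincident = max(inputs)      # largest single-time-bin input
    if coincident >= threshold:
        total += boost * coincident
    return total

clustered = [6, 0, 0]   # same total input, arriving together
dispersed = [2, 2, 2]   # same total input, spread over time

print(linear_dendrite(clustered), linear_dendrite(dispersed))  # equal
print(supralinear_dendrite(clustered), supralinear_dendrite(dispersed))
```

The linear compartment cannot tell the two input patterns apart, while the supralinear one responds much more strongly to the clustered pattern — a toy version of the distinction the researchers measured.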

Those linear inputs likely represent information such as running speed or destination, Harnett says, while the visual information coming into the basal dendrites represents landmarks or other features of the environment. The supralinearity of the basal dendrites allows them to perform more sophisticated types of computation on that visual input, which the researchers hypothesize allows the RSC to flexibly adapt to changes in the visual environment.

In the tuft dendrites, which receive input from the thalamus, it appears that NMDA spikes can be generated, but not very easily. Like the apical oblique dendrites, the tuft dendrites have a low density of NMDA receptors. Harnett’s lab is now studying what happens in all of these different types of dendrites as mice perform navigation tasks.

The research was funded by a Boehringer Ingelheim Fonds PhD Fellowship, the National Institutes of Health, the James W. and Patricia T. Poitras Fund, the Klingenstein-Simons Fellowship Program, a Vallee Scholar Award, and a McKnight Scholar Award.