Scans suggest how the mind solves ethical dilemmas

Brain region balances competing interests in moral judgments

Deciding whether to kill one person to save five is a true brain teaser. A study in the March 26 Journal of Neuroscience describes the neural tug-of-war that results in a moral decision.

Cognitive neuroscientists Amitai Shenhav of Princeton University and Joshua Greene of Harvard University asked 35 people to weigh in on 48 wrenching scenarios while undergoing functional MRI brain scans. The researchers used scenarios akin to the famous trolley problem: The hypothetical dilemma forces a person to decide whether to push an innocent man to his death to stop a runaway trolley from killing five people.

This type of moral quandary evokes competing motivations: the urge to save the greatest number of people and the desire to avoid emotionally repellent behavior. In their experiment, Shenhav and Greene separated these considerations by asking people either to judge how emotionally wrenching a certain behavior would be or to weigh only the greater good.

In one such dilemma, a live grenade sails into a cafe where 10 people sit. Participants were told that they could ignore the grenade, leaving the 10 people to die, or grab the grenade and throw it onto the patio, saving the 10 people but killing a lone diner outside. When study participants considered only emotions (“Which do you feel worse about doing?”), activity increased in the left amygdala, one of a pair of almond-shaped structures deep in the brain. The more emotionally repellent an action was to participants, the more activity they had in the left amygdala, Shenhav and Greene found. The experiment couldn’t pinpoint where utilitarian “greater good” considerations get made.

When people were asked to make a moral decision between actions considering all aspects of the dilemma, a different brain area seemed to step in. Activity in the ventromedial prefrontal cortex (vmPFC), a patch of tissue near the front of the brain, was greater when people were asked to make an overall choice than when they considered only emotions, the researchers found. After being lobbied by other brain regions, the vmPFC ultimately makes the call, the researchers suggest.

“It’s some of the best support we’ve seen so far for the theory that the vmPFC is integrating emotional assessments from the amygdala,” says cognitive neuroscientist Molly Crockett of University College London.

Earlier work implicated the vmPFC as a final arbiter in other types of decisions. This brain region seems to be involved in the choice to eat healthful or sinfully delicious food, for instance.

Crockett cautions that many of the moral dilemmas used in these kinds of experiments fall outside the realm of possibility for most people. “We’re interested in moral decision making, but we’re studying really unrealistic situations involving pushing people off bridges,” she says. “It’s not clear whether these neural mechanisms apply to real moral decisions.”

Understanding how morality arises in the brain might have legal implications. Lawyers have argued that defendants’ culpability depends on certain neural traits, for instance. And a 2011 study found that psychopaths have weak connections between the amygdala and the vmPFC (SN Online: 11/30/11). Knowing which brain areas guide moral decisions might lead to a better understanding of what behavior to expect when those regions are damaged, Shenhav says. 


Fuzzy morals

To study how the brain makes moral decisions, scientists presented people undergoing brain scans with scenarios that forced them to weigh personal emotions against the greater good.

Laura Sanders is the neuroscience writer for Science News. She holds a Ph.D. in molecular biology from the University of Southern California.