You are standing on a bridge overlooking railway tracks when you spot a runaway trolley bearing down on five railway workers. Throwing a switch will divert the trolley away from the group but doom a lone worker on a different set of tracks. Most people have little difficulty throwing the switch, sacrificing one person to save five. But what if the only way to save the five were to throw the man standing next to you onto the tracks to block the trolley? That version usually elicits spasms of guilt, even though the outcomes of the two scenarios are equivalent.
Such moral dilemmas, and the paradoxes they create, have been studied for centuries. The Morality Lab at Boston College applies modern behavioral and neuroimaging methods to identify the psychological and brain bases of moral judgments. This work is conducted with typical participants as well as those with brain lesions who have selective cognitive deficits.
“Morality fascinates people; everyone has opinions about moral problems and their own answers to moral questions,” said Liane Young, PhD, a professor of psychology and neuroscience at Boston College and director of the Morality Lab. “People can experience moral disagreements not only with others, including close friends and family, but within themselves as well, in the case of moral dilemmas.”
A major clue regarding what underlies moral judgments comes from Young’s discovery more than a decade ago that certain brain regions involved in “theory of mind” are recruited for people’s assessments of moral right and wrong. (Theory of mind, which develops at about age 4 or 5, is the ability to discern agents’ mental states, including their intentions, desires, motives, and beliefs.) “The key thing that differentiates our lab from others studying morality is the focus on the role of theory of mind,” said postdoctoral researcher Justin Martin, PhD. “All of our research is infused with this interest in how we understand what people are thinking, what they’re trying to do, what they tried to do in the past, and how that influences our understanding of people as they are now.”
Besides Young, the lab currently includes three postdoctoral researchers, three graduate students, three full-time staff, and six undergraduate research assistants. Together, they construct and validate vignettes featuring moral scenarios, create online surveys, and collect and analyze data—with the goal of uncovering the origins of moral reasoning. In addition, the group has expanded beyond what is traditionally considered moral cognition into social psychology, exploring questions such as how people respond when others violate social norms, how people think about obligations to those close to them versus strangers, and how people evaluate others’ virtuous behaviors.
Young studied philosophy as an undergraduate at Harvard University. There, she worked with philosopher Frances Kamm, PhD, on moral dilemmas such as the trolley problem described earlier. “I was drawn to philosophy because I wanted to understand the origins of morality and the existence of objective moral truth, independent of human psychology,” Young said.
This interest in moral dilemmas and moral truth later guided her graduate research in the lab of Marc Hauser, PhD, formerly a professor of psychology at Harvard. At the time, a debate was raging among moral psychologists and neuroscientists about whether people assess the morality of others’ behavior based on their emotional response to the behavior or based on their rational consideration of factors like the intentions underlying the behavior.
To explore the role of these processes in moral judgment, Young and her colleagues asked what would happen to moral judgments if emotions were removed from the picture, or at least dramatically blunted. To do this, they turned to the brain’s ventromedial prefrontal cortex (VMPC). Individuals with focal lesions to the VMPC are known to have reduced social emotions, such as compassion, shame, and guilt—emotions often implicated in moral judgments. Young and her colleagues asked participants with focal VMPC lesions and control participants to choose between two courses of action in hypothetical scenarios, such as whether or not to smother their own child to prevent harm to others. Participants with VMPC lesions made the emotionally aversive, utilitarian choice (sacrificing their child to save the group) more often than control participants did (Nature, Vol. 446, No. 7138, 2007). This suggests that at least some moral judgments are driven by emotions; when those emotions are lacking, as in patients with damage to the VMPC, a different pattern of moral judgments emerges.
To understand the role of intentions in moral judgments, Young teamed up with Rebecca Saxe, PhD, of MIT, with whom she would go on to do a postdoctoral fellowship. Using fMRI, the group discovered that activity in the right temporoparietal junction (rTPJ)—a brain region linked to theory of mind through its involvement in representing and reasoning about people’s thoughts, beliefs, and intentions—was greatest when participants evaluated situations in which intentions, but not outcomes, were harmful, such as a botched poisoning (PNAS, Vol. 104, No. 20, 2007). Furthermore, temporarily interfering with rTPJ activity using transcranial magnetic stimulation led people to downplay intentions and judge cases like attempted poisoning less harshly because, ultimately, no harmful outcome occurred (PNAS, Vol. 107, No. 15, 2010). These findings demonstrated that intentions are a crucial component of moral judgments and hinted at the neural mechanisms involved.
In light of these studies showing a neural basis of morality, Young and her colleagues became curious whether moral judgments are universal and whether certain neuropathologies give rise to divergent moral reasoning. In one study performed in Hauser’s lab, Young presented moral dilemmas to more than 200,000 people from 120 countries and found that judgments were similar across gender, age, education level, and cultural background (Mind & Language, Vol. 22, No. 1, 2007). In other studies seeking exceptions to a universal moral code, she found that individuals with psychopathy show increased activity in the dorsolateral prefrontal cortex during moral decision-making (Glenn, A. L., et al., Molecular Psychiatry, Vol. 14, No. 10, 2009), and people with autism tend to make more outcome-based moral judgments than others (PNAS, Vol. 108, No. 7, 2011), as do incarcerated terrorists (Nature Human Behaviour, Vol. 1, 2017).
Following her fellowship with Saxe, Young launched the Morality Lab at Boston College, where she was hired as a faculty member in 2011. As the lab has grown, Young’s research topics have become increasingly guided by the interests and expertise of her graduate students and postdocs. “Liane’s able to make connections between people and ideas that you wouldn’t necessarily think are connected,” said Martin. “That ability to see different people’s interests, bring them together in conversation, and create real collaborations is so impressive.”
Among Young’s first graduate students was Laura Niemi, PhD, now an assistant professor of psychology at Cornell University. Niemi and Young found that the more strongly people endorse “binding” values such as loyalty, obedience, and purity (focused on building and maintaining groups)—as opposed to “individualizing” values such as caring and fairness (which apply universally to all individuals regardless of group membership)—the more likely they are to blame the victim of a crime (Personality and Social Psychology Bulletin, Vol. 42, No. 9, 2016). They also discovered that people’s moral judgments could be nudged by making either the perpetrator or the victim the subject of sentences describing sexual assault (“Mary was forced by Dave” versus “Dave forced Mary”). Focusing attention on the perpetrator, by using their name as the subject, reduced ratings of victim blame and victim responsibility and references to victims’ actions, whereas focusing on victims led to greater victim blaming. “More information about the victim actually made the victim appear more like an agent and therefore more morally responsible for what happened to them,” Young said.
Language is not the only way to shift people’s moral views of a situation. How scenarios are juxtaposed with one another, and whether they are imbued with information about social relationships, can sway moral judgments too, as shown recently in Boston College graduate student and lab member Ryan McManus’s study of people’s sense of moral obligation. Participants rated a hypothetical person who helped a stranger as more morally good than another person who helped a relative. However, when they heard about a hypothetical person who chose between helping either a stranger or a relative, participants rated the person as more morally good if they chose to help the relative—the opposite of the first result. In a third scenario, in which the hypothetical person held a position of authority and was obligated to allocate resources impartially, judgments reversed once more: participants thought it was morally better to help a stranger over a family member (Psychological Science, Vol. 31, No. 3, 2020). “The conclusion here is that moral psychology is highly flexible and sensitive to a host of factors that we might not even have introspective access to,” McManus said.
Another graduate student, Minjae Kim, is moving the lab further toward topics in social psychology. “Even though it’s the Morality Lab,” Kim said, “there’s room for a much more expansive look at how we learn about other people, rather than just judging the morality of their behavior.”
In a recent theoretical paper, Kim proposed that the tendency to selectively excuse a friend’s bad behavior, compared with a stranger’s, may arise from having stronger prior beliefs about the friend’s good character, not just from a motivation to view close others in a positive light (Trends in Cognitive Sciences, Vol. 24, No. 2, 2020). Kim further proposed that when strong prior beliefs are violated, brain regions related to theory of mind are recruited to a greater degree, supporting the generation of alternative explanations for apparently bad behavior. An fMRI study revealed brain activation patterns consistent with this prediction (Cerebral Cortex, Vol. 31, No. 2, 2021).
Martin is also curious about how people respond to others’ bad behavior. “Do we end the relationship or just punish them?” he asked. In adults, decisions about whether to maintain or sever a relationship with a partner can be influenced by how intentional the bad behavior is perceived to be, whereas punishment decisions are influenced by both intent and outcome. Preliminary results, not yet published in a peer-reviewed journal, indicate that decisions about ending a relationship based on a partner’s behavior are sensitive to intentions starting at age 5, whereas punishment decisions are influenced by intentions only beginning around age 7 or 8.
While Martin explores reactions to poor behavior, another postdoc, Gordon Kraft-Todd, PhD, scrutinizes people’s views of virtuous behavior, revealing a phenomenon he calls “differential virtue discounting.” The idea is that whether virtuous behaviors are seen by others can affect how they are judged. For example, people judge generosity performed in private as more morally good than generosity performed in public. Preliminary results, however, suggest the phenomenon looks very different for different virtues: for impartiality, which is akin to fairness or justice, this sort of “virtue discounting” does not appear. Kraft-Todd and Young plan to determine whether discounting exists for other virtues, such as loyalty and gratitude. They are also planning a study on how virtue discounting plays out in applied areas such as education and environmental activism.
“Moral psychologists are uniquely positioned not only to answer basic science questions about how the mind and brain make moral decisions, how people act, how people evaluate others’ actions, but also to apply this knowledge about human moral psychology to benefit real people in the real world,” Young said.
In the past few years, members of Liane Young’s Morality Lab at Boston College have been seeking ways to apply their research to help guide individuals and organizations in making decisions based on sound moral reasoning.
Postdoctoral fellow Gordon Kraft-Todd, PhD, for example, has brought Young and other lab members into the work of the Applied Cooperation Team, currently housed at MIT, a project he has been a part of since he was a Yale graduate student. The team performs field experiments in partnership with nonprofits, businesses, and government agencies using social incentives and appeals to people’s sense of morality to encourage people to contribute to public welfare (e.g., charitable giving, installing solar panels, and medication adherence for communicable diseases). One project is a public service campaign promoting smarter recycling.
In another effort to apply the lab’s research to real-world problems, graduate student Isaac Handley-Miner is launching a project on how people infer motivations underlying different sources of information, and how these inferences influence their trust in the information. He is aiming to investigate these questions in the context of combating misinformation and enhancing vaccine trust among Latinx populations in Boston.
Meanwhile, Young and her lab manager, Aditi Kodipady, recently taught social psychology concepts to members of the Boston College Innocence Program—a group that works to prevent and overturn wrongful convictions—and to a standing committee of the Massachusetts Supreme Judicial Court, helping them develop juror instructions that guard against implicit bias. The instructions are aimed at helping jurors consciously avoid relying on unsupported assumptions about defendants when rendering their decisions.
“Our hope with this work is that some of the knowledge that we gain and theories that we advance can be leveraged to enhance moral reasoning, prosocial action, and social relationships,” Young said.