How Showing Special Kindness To Some Can Have Moral Consequences

STEVE INSKEEP, HOST:

When we talk about discrimination, we typically think of people denied something because of who they are, some defining trait. What we can overlook is a different kind of discrimination, which comes from love. NPR's Shankar Vedantam asks why good deeds, those we do on behalf of our spouses or our neighbors, can sometimes lead to injustice.

(SOUNDBITE OF ARCHIVED NPR BROADCAST)

SHANKAR VEDANTAM, BYLINE: Thought experiments in philosophy usually don't go viral, but there's one that has.

JOSHUA GREENE: The trolley problem has become a kind of meme.

VEDANTAM: This is Joshua Greene.

GREENE: I'm a professor in the psychology department at Harvard University. And I study moral decision-making and high-level cognition.

VEDANTAM: Greene has researched how people respond to the trolley problem. Here's the classic version of the thought experiment.

GREENE: The trolley is headed towards five people. But you can hit a switch that will turn it onto a side track where it'll only kill one person. The question is, is it OK to hit the switch to minimize the loss of life?

VEDANTAM: Many people find this moral dilemma painful but will tell you, yes. Killing one stranger is the right thing to do if it will save five. There's a version of the trolley problem that's even more painful. It forces us to reckon with a bias we usually don't see as a problem: the bias to care about the people we love above all others.

In this version of the thought experiment, five strangers will be saved if you sacrifice someone you know and love, maybe even your own child. Hannah Groch-Begley and Dylan Matthews have discussed this moral dilemma in their relationship. They've come to very different conclusions.

HANNAH GROCH-BEGLEY: I would save the child. I would easily kill five people on behalf of my own child. I know that. And I'm not a mother. I just know that that's the kind of mother I would be.

VEDANTAM: And Matthews?

DYLAN MATTHEWS: I would kill the kid. Yeah. I wouldn't even question it. Like, it's five against one. I would probably kill myself after out of grief. And so it would be five against two. But, yeah, those people are all the heroes of their own stories. And they all have loved ones who love them as much as I love the kid. And it seems obscene to me to treat my attachment as paramount above their attachments and their lives.

VEDANTAM: The differences in the couple's moral intuitions became clear even on their very first date in 2015. That night at a bar in Washington, D.C., their flirting took the form of a debate over moral philosophy. Matthews explained that he believed in maximizing happiness in the world. That requires treating everyone's happiness as equally important, whether that person happens to live on your block or on the other side of the world.

GROCH-BEGLEY: Which I thought was wild and insane. And I was raised to really think about the person who's in front of you, your family members, to think about your friends and to think about, how can I put them first? How can I always be there for them?

VEDANTAM: Their differences in opinion became an energizing force until about six months into their relationship.

GROCH-BEGLEY: I think that it became more difficult for me when it became more concrete.

VEDANTAM: Concrete because Matthews had decided to donate one of his kidneys to a stranger. His logic was straightforward.

MATTHEWS: There are people who are dying. And one of them needs a kidney. And this will go to one of them.

VEDANTAM: To Matthews, not giving away his kidney was like hoarding food he'd never eat while others were hungry. Groch-Begley saw it very differently.

GROCH-BEGLEY: There were moments when I was very scared that this person who I was starting to love was going to potentially put themselves in a great deal of danger. It is a very safe procedure. But I'm not a doctor. So you tell me, oh, you're going to remove a major organ - elective surgery, like, I don't understand why you would do that.

VEDANTAM: Matthews went ahead with the surgery. Afterward, the National Kidney Registry honored him with a trophy.

MATTHEWS: It's hefty. It's one of the better obelisks I've received in my life.

GROCH-BEGLEY: (Laughter).

VEDANTAM: For Groch-Begley and Matthews, this may have started as an intellectual debate. But their different moral frameworks guide important decisions, like where to donate their money or whether to give up a kidney. And those decisions affect the lives of other people. All of us grapple with similar moral questions whether we recognize it or not.

Very often, like Groch-Begley, we prioritize caring for the people we know. Other times, like Matthews, we try to be impartial. Joshua Greene, the psychologist, compares these different types of moral reasoning to settings on a camera. Our default response, to prioritize the people we know well, is like a camera's automatic settings.

GREENE: So if I want to take a picture of a mountain from a mile away, then I put it in landscape setting and click, point and shoot. And I've got a pretty good picture of a mountain from far away. Occasionally, I get ambitious and will want to do something fancy and maybe have something slightly out of focus and off to the side. And who knows, right? And so there, you'd put the camera in manual mode.

VEDANTAM: Each of these modes has benefits. The automatic mode, both on a camera and in our minds, is efficient.

GREENE: When it comes to the ethics of everyday life, the basic things that everybody should know not to do - the lying, stealing, cheating kinds of things - we want to have automatic settings that just say, nope, nope, nope, nope, nope, can't do that.

VEDANTAM: These moral instincts make it possible for us to cooperate with the people around us.

GREENE: But then, sometimes life is complicated. And sometimes there are difficult tradeoffs. And then that's when you want to shift into manual mode.

VEDANTAM: The manual mode's advantage is that it's flexible. It helps when you have to weigh harming someone you know versus helping several people you don't. We need both modes. But Greene points out that our fast, intuitive moral system, a system that favors people who are near us and look like us, is not always well-suited for our modern world.

Matthews sees the downsides of our automatic mode all around us. He sees it in cases where men were aware of sexual harassment but didn't say anything because the harasser was their friend. He sees it in family members who think they'll be supported regardless of what they do.

MATTHEWS: I think the sum total of all of our small partialities has added up to a lot of injustices in a lot of cases.

VEDANTAM: Both of our moral systems produce immense good. And both have serious shortcomings. Like a skilled photographer who changes her settings depending on the photo she wants to take, we would be wise to choose our moral settings deliberately.

Shankar Vedantam, NPR News.

(SOUNDBITE OF IL:LO'S "YBBS")

INSKEEP: You can hear Shankar tackle other moral questions on his podcast, which explores so much that we do in automatic mode. It's called Hidden Brain. Transcript provided by NPR, Copyright NPR.

Shankar Vedantam is the host and creator of Hidden Brain. The Hidden Brain podcast receives more than three million downloads per week. The Hidden Brain radio show is distributed by NPR and featured on nearly 400 public radio stations around the United States.