Crowdsourcing an Ethical Dilemma


Stalin said, "A single death is a tragedy; a million deaths is a statistic." So what about one hundred deaths? What about five?

We tested this experimentally, asking workers on Amazon Mechanical Turk to answer three versions of the classic philosophical conundrum, the Trolley Problem, in which you must decide whether to kill one person so that several others may live.

There is no clear consensus on what to do; in psychological experiments, subjects disagree. But how does our decision change with the number of people who will die? We varied the number of people saved between 1 and 1000 to see whether that changed subjects' ethical calculus.
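The post doesn't spell out how those values were drawn; as a rough illustration only, one way to assign each task a random number of people saved, roughly uniform on a log scale (which matches the log-scale plot below), might look like this in Python:

```python
import math
import random

def sample_people_saved(low=1, high=1000):
    """Hypothetical: draw a 'number of people saved' value roughly
    uniformly on a log scale between low and high."""
    return round(math.exp(random.uniform(math.log(low), math.log(high))))

# Each hypothetical task gets one random value, used in all three scenarios.
print([sample_people_saved() for _ in range(10)])
```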

Here are the frequencies of responses for three different scenarios (plotted on a log scale with a loess fit):
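The plot itself isn't reproduced here, but as a minimal sketch of that kind of chart (made-up data, not the actual responses), a LOWESS fit on a log-scaled axis can be drawn with statsmodels and matplotlib:

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.nonparametric.smoothers_lowess import lowess

# Hypothetical data: number of people saved and a binary yes/no response.
rng = np.random.default_rng(0)
n_saved = np.exp(rng.uniform(np.log(1), np.log(1000), size=500))
p_yes = 1 / (1 + np.exp(-(np.log(n_saved) - np.log(5))))  # made-up trend
answered_yes = rng.random(500) < p_yes

# LOWESS (loess-style) fit of the yes-frequency against log(number saved).
smoothed = lowess(answered_yes.astype(float), np.log(n_saved), frac=0.5)

plt.scatter(n_saved, answered_yes, s=5, alpha=0.3, label="responses")
plt.plot(np.exp(smoothed[:, 0]), smoothed[:, 1], color="red", label="loess fit")
plt.xscale("log")
plt.xlabel("number of people saved")
plt.ylabel("P(yes)")
plt.legend()
plt.show()
```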

Here are the sample scenario descriptions (with five people):

Scenario A

A trolley is running out of control down a track. In its path are 5 people who have been tied to the track. Fortunately, you can flip a switch, which will lead the trolley down a different track to safety. Unfortunately, there is a single person tied to that track. Should you flip the switch?

Scenario B

As before, a trolley is hurtling down a track towards five people. You are on a bridge under which it will pass, and you can stop it by dropping a heavy weight in front of it. As it happens, there is a very fat man next to you: your only way to stop the trolley is to push him over the bridge and onto the track, killing him to save five. Should you proceed?

Scenario C

A brilliant transplant surgeon has five patients, each in need of a different organ, each of whom will die without that organ. Unfortunately, there are no organs available to perform any of these five transplant operations. A healthy young traveler, just passing through the city the doctor works in, comes in for a routine checkup. In the course of doing the checkup, the doctor discovers that his organs are compatible with all five of his dying patients. Suppose further that if the young man were to disappear, no one would suspect the doctor. Should the doctor sacrifice the man to save his other patients?


Each Turker was asked all three questions for a single random value of the number of people saved, but Turkers could answer the questions multiple times if they wanted. Rational decision making would show monotonicity: if you would switch the track to save 10 people, you would also switch it to save 30. Human beings are not always rational in this way, as the chart below shows.
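Checking monotonicity for one worker is straightforward; here is a hedged sketch (the data layout is assumed, not taken from the original analysis):

```python
def is_monotone(responses):
    """responses: list of (n_saved, answered_yes) pairs for one worker
    and one scenario. Monotone means: once the worker says 'yes' at some
    n, they say 'yes' at every larger n they were asked about."""
    responses = sorted(responses)                # sort by number saved
    said_yes = False
    for n_saved, answered_yes in responses:
        if said_yes and not answered_yes:
            return False                         # 'yes' at a smaller n, 'no' at a larger n
        said_yes = said_yes or answered_yes
    return True

# Example: yes to save 30 but no to save 100 -> not monotone.
print(is_monotone([(10, False), (30, True), (100, False)]))  # False
```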

Each line in the following plot represents a single Turker (so denser lines mean a Turker answered the question more times). The horizontal axis is the number of people who would be saved by answering "yes". Red dots mark where the Turker responded "no"; blue dots mark where the Turker responded "yes". Turkers are sorted by the number of people at which they are first willing to answer "yes".
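A sketch of how such a plot could be assembled (again with a hypothetical data layout, not the original code): compute each Turker's smallest "yes" value and use it to order the rows.

```python
import math
from collections import defaultdict

import matplotlib.pyplot as plt

# Hypothetical records: (worker_id, n_saved, answered_yes) for one scenario.
records = [
    ("w1", 5, False), ("w1", 50, True), ("w1", 500, True),
    ("w2", 2, True),  ("w2", 20, True),
    ("w3", 10, False), ("w3", 100, False), ("w3", 1000, True),
]

by_worker = defaultdict(list)
for worker, n_saved, yes in records:
    by_worker[worker].append((n_saved, yes))

def first_yes(answers):
    """Smallest number saved at which the worker answered 'yes' (inf if never)."""
    yes_values = [n for n, yes in answers if yes]
    return min(yes_values) if yes_values else math.inf

# Sort workers by where they first switch to "yes".
order = sorted(by_worker, key=lambda w: first_yes(by_worker[w]))

for row, worker in enumerate(order):
    for n_saved, yes in by_worker[worker]:
        plt.scatter(n_saved, row, color="blue" if yes else "red", s=15)
plt.xscale("log")
plt.xlabel("number of people saved")
plt.ylabel("worker (sorted by first 'yes')")
plt.show()
```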

-Lukas and Brendan

Original idea and post by Brendan at http://anyall.org/blog/2008/01/moral-psychology-on-amazon-mechanical-turk/

Trolley image from http://www.unc.edu/~prinz/pictures/