Examples of moral dilemmas (Haidt et al., 1993)

A family's dog was killed by a car in front of their house. They had heard that dog meat was delicious, so they cut up the dog's body and ate it.

A brother and sister like to kiss each other on the mouth. When nobody is around, they find a secret hiding place and kiss each other on the mouth, passionately.

Types of judgment (Turiel, Nucci)

Moral - universal, legitimately regulated
 Contingently moral (Nucci)
 Moralistic vs. moral (Baron) - e.g., cloning
 Objective? (moral realism)

Conventional - changeable
 Social norms (Bicchieri) - cost to follow

Personal - matters of taste

Another dilemma

A kind of flu can be fatal to children under 3. A vaccine for this kind of flu has been developed and extensively tested. The vaccine prevents the flu, but it sometimes causes side effects that can be fatal.

A. If you had a child under 3, should you vaccinate your child?

B. Do you think that you would vaccinate your child?

C. Consider two situations:
1. You vaccinate and your child dies from the vaccine.
2. You do not vaccinate and your child dies from the flu.
In which of these situations would you feel guiltier about your decision?

Another one

Imagine that you and three others are held prisoner in the Middle East. All four of you are innocent. Your captors are planning to murder two of the other prisoners. You do not know which two. They will spare these two if you will murder the single remaining prisoner. (You believe them when they say this.) Nobody except you and your captors will know about your decision.

A. Should you kill the one to save the two?

B. Do you think that you would kill the one to save the two?

C. Would you feel guiltier about your decision if you:
1. committed the murder;
2. did not commit the murder.

Functions of morality

1. Punishment, praise, blame

2. Moral education, advice, exhortation

3. Self-control, conscience, guilt feelings

Let us take #2 as primary.


Jeremy Bentham
J. S. Mill
Henry Sidgwick
R. M. Hare
Peter Singer

Argument for utilitarianism

What moral norms should we endorse?

Moral heuristics based on incomplete understanding

Problems from Wertheimer (1945)
[Figures: the parallelogram problems, versions A and B]

Another bias: indirectness (Royzman & Baron)

A new viral disease is spreading rapidly in a region of Africa. Left alone, it will kill 100,000 people out of 1,000,000 in the region. X, a public health official, has two ways to prevent this. Both will stop the spread of the virus and prevent all these deaths:

A. Give all 1,000,000 a shot that makes them immune to the first disease. The shot will also cause, as a side effect, a second disease that will kill 100 people.

B. Give all 1,000,000 a shot that gives them another disease, which is incompatible with the first disease. The second disease will kill 100 people.

Physical contact (Cushman, Young & Hauser)

Which is worse?

A. Push a man off a footbridge and into the path of a train, causing him to be hit by the train, thereby slowing it and saving five people ahead on the tracks.

B. Pull a lever that redirects a trolley onto a side track in order to save five people ahead on the main track, if, as a side effect, pulling the lever drops a man off a footbridge and into the path of the trolley on the side track, where he will be hit.

Protected values

Some principles of utilitarianism

Normative model for all decisions that affect others: "not making" a decision is itself a choice, namely of the default

Better vs. worse - no natural concept of duty

Utility must be measurable and interpersonally comparable

Incentive, deterrence, responsibility (e.g., insanity defense)

Emotions, evil desires

Rights derivable but not fundamental (e.g., property)

Tastes: free market

The need for information

Utilitarianism and EU

Suppose each person is faced with a gamble, like a vaccination decision: option A carries probability P of a bad outcome, and option B carries probability Q. Suppose each person's utilities are such that (A,P) > (B,Q), i.e., everyone prefers A.

Now suppose that someone judges that, from a "social" perspective, it is better for 1,000,000*Q people to have outcome B than for 1,000,000*P people to have outcome A.

If we make decisions according to the social perspective, we make everyone worse off.

The fact that this can happen is relevant to normative theory. The fact that it is unlikely is not an excuse.
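The argument above can be sketched numerically. The probabilities, utilities, and the "act weight" below are all assumptions chosen for illustration (the act weight stands in for an omission bias in the hypothetical social judgment); the point is only that a social judgment can reverse a choice that every individual prefers ex ante.

```python
# Ex-ante argument, with assumed numbers.
# Option A (e.g., vaccinate): bad outcome with probability P.
# Option B (e.g., do not vaccinate): bad outcome with probability Q.
# Utility 0 for the bad outcome, 1 otherwise (an assumed utility scale).

P = 0.0001        # assumed chance of a fatal side effect under A
Q = 0.0010        # assumed chance of dying from the flu under B
N = 1_000_000     # population size

# Each person's expected utility: everyone prefers A, since P < Q.
eu_A = (1 - P) * 1 + P * 0
eu_B = (1 - Q) * 1 + Q * 0
assert eu_A > eu_B

# Expected bad outcomes from the "social" perspective.
deaths_A = N * P  # 100 deaths caused by the act
deaths_B = N * Q  # 1,000 deaths from inaction

# A social judge who weights deaths caused by action more heavily
# (a hypothetical bias factor) may still judge B better:
act_weight = 20
social_choice = "B" if deaths_A * act_weight > deaths_B else "A"

# Following the social judgment imposes B, so every person ends up
# with the lower expected utility eu_B: everyone is worse off.
print(social_choice, eu_A, eu_B)
```

With these numbers the weighted social judgment picks B even though each person's expected utility under B is strictly lower, which is the reversal the argument turns on.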

We can always take an ex-ante perspective: veil of ignorance.