03x04 - The Stanford Prison Experiment
Posted: 10/31/21 03:30
One of the most infamous
psychological studies
ever conducted
was the Stanford Prison
Experiment.
It's mentioned in almost every
intro to psychology textbook.
They tend to focus on
how unethical it was,
and are less critical
of its supposed conclusion.
August 14th, 1971.
Palo Alto, California.
Twelve young men are rounded
up from their homes by police,
placed under arrest,
and brought to
a makeshift prison
in the basement
of Stanford University.
It all begins as a study on
the psychology of prison life,
led by Stanford psychology
professor Dr. Philip Zimbardo.
24 volunteers--
12 guards
and 12 prisoners--
have agreed to spend
the next two weeks
recreating life
in a correctional facility.
[guard]
The prisoners are booked
and stripped nude.
No longer individuals,
they're forced to wear smocks,
stocking caps and shackles,
identified only by
their prisoner numbers.
The guards quickly adapt
to their new profession.
Given anonymity by
their mirrored sunglasses,
some of them start to control
the meager food rations
and restrict prisoners'
bathroom use.
And, as tensions rise,
so do their cruel methods.
Within just six days
of the planned two-week study,
conditions are so bad
that the entire operation
is shut down.
[man]
Goddamn it...
The study makes international
headlines.
Zimbardo's fame skyrockets,
and his conclusions are taught
to students worldwide,
used as a defense
in criminal trials
and are even submitted
to Congress
to explain the abuses
inflicted at Abu Ghraib.
The study brings up
a question
just as important then
as it is today:
is evil caused
by the environment,
or the personalities in it?
Zimbardo's shocking conclusion
is that when people
feel anonymous
and have power over
depersonalized others,
they can easily become evil.
And it occurs more often
than we'd like to admit.
But while it's true that people
were mean to each other
during the Stanford
Prison Experiment,
what if what truly caused
that behavior
wasn't what we've always
been told?
The Stanford Prison Experiment
has always had
its controversies.
But a wave of recent
revelations
has pushed it back
into the spotlight
47 years later.
Today, I'm going to speak
with journalist Ben Blum,
whose recent writings
have brought criticism
of the experiment
to a larger audience
than ever before.
How did you get involved in
the Stanford Prison Experiment
in the first place?
Well, my involvement
was quite personal.
Like everyone,
I had kind of absorbed
the basic lesson
of the experiment
through the cultural ether.
And then my cousin Alex
was arrested for bank robbery.
This was a team of mostly
military guys with AK-47s.
Alex was the driver.
He was a 19-year-old
U.S. Army Ranger.
And it was a superior of his
on the Rangers
that organized and led
the bank robbery.
Alex thought the whole thing
was a training exercise.
He was just so brainwashed
in this intense Ranger training
that when a superior proposed
this bank robbery,
he took it as just one more kind
of tactical thought experiment.
Then Dr. Philip Zimbardo
participated
in his legal defense.
Zimbardo submitted a letter
to the court,
advocating leniency
in sentencing on the grounds
that Alex, my cousin,
had been so transformed
by the social environment
of the Ranger battalion
that he participated
in the bank robbery
without exercising
his own free will.
Well, how did that affect
Alex's sentencing?
He received an extraordinarily
lenient sentence of 16 months.
So Zimbardo was a family hero.
But over time, Alex
finally admitted to me,
"You know what, I knew this was
a bank robbery by the end,
and I just didn't have the moral
courage to back out."
Oh, wow.
Alex, myself and our
whole family
came to view
the Zimbardo argument
as a way to shirk personal
culpability,
and to put all the blame
on the situation.
So you start looking
at the Stanford Prison
Experiment in particular.
You reached out to Dr. Zimbardo
himself,
as well as some of those
who participated.
What did you learn?
I learned,
to my deep surprise,
that quite a number
of the participants
had stories of their experience
that completely contradicted
the official narrative.
Which is, look,
these regular people,
good people,
came together,
and because of the situation,
became evil.
[Ben]
Right.
Zimbardo has claimed
that the guards
were put in the situation,
and then the kind of hidden
wellspring of sadism
that apparently lies
in all of us
unfolded organically.
[Zimbardo]
There was an orientation meeting
for the guards.
They had been told
quite explicitly
to oppress the prisoners.
That falls under the heading
of what psychologists call
demand characteristics.
Experimental subjects
tend to be motivated
to give experimenters
what they want.
[Michael]
Demand characteristics occur
whenever participants
being studied
act differently
than they normally would
because they've guessed
what hypothesis is being tested
and feel that a certain kind
of behavior is being demanded.
There was a recording
of a guard being explicitly
corrected for not being tough enough.
So a conclusion
you could make
from the Stanford
Prison Experiment
is that when you tell people
to be cruel,
they'll do it if you tell them
it's for a greater good,
like science.
-Right.
-Who would have thought?
I think the study still stands
as a fascinating spur
to further, more careful research,
as a demonstration that should
make anyone curious
as to how such extreme behavior
could arise
in such a short time.
The experiment could still
be useful,
but it might need to be
reinterpreted.
Its data might lead
to different conclusions
than the one that we've been
telling for so many decades.
Right.
The flaws in the experiment
that Ben and other critics
bring up
call into question large
portions of the narrative
surrounding the study.
So I want to hear from someone
who was actually there.
Dave Eshelman, the study's
most infamous guard,
agreed to tell me
his side of the story.
It's really an honor
to meet you.
You're a living, walking piece
of psychology history.
I'm never recognized in the
street or anything like that,
although I still get
some hate mail.
-Are you serious?
-Yeah, absolutely.
Well, what do you say to them
when they react that way?
I say, well, there's probably
a lot about that
that didn't happen quite the way
it's been portrayed.
Well, Dave,
before we go too far,
I'd like to watch the footage
we have here
so we can kind of talk about
what we see.
[Dave]
That's me there, by the way.
-[Michael] Look at that look.
-[Dave] Mm-hmm.
So how did you get involved with
the Stanford Prison Experiment?
My father was a professor
at Stanford,
and I was home for summer,
looking for a summer job.
So I'm looking
through the want ads.
$15 a day.
You know,
in 1971 that wasn't bad.
The way it was introduced
to the guards,
the whole concept
of this experiment,
we were never led to believe
that we were part
of the experiment.
We were led to believe
that our job
was to get results
from the prisoners,
that they were the ones
the researchers
were really studying.
The researchers
were behind the wall.
And we all knew
they were filming.
And we could often hear
the researchers
commenting on the action
from the other side of the wall.
You know, like,
"Oh, gosh, did you see that?
Here. Make sure you get
a close-up of that."
Okay? So if they want to show
that prison is a bad experience,
I'm going to make it bad.
But how did you feel
doing stuff like that?
Didn't you feel bad?
I don't know if this
is a revelation to you,
but 18-year-old boys are not
the most sensitive creatures.
-Sure.
-My agenda was to be
the worst guard
I could possibly be.
-And it's pretty serious.
-Mm-hmm.
This is my favorite part
of all the footage we have
-from the experiment.
-Mm-hmm.
It's you and a prisoner
confronting each other
after the experiment.
I remember the guy saying,
"I hate you, man."
-Yeah.
-"I hate you."
Each day I said, well,
what can we do to ramp up
what we did yesterday?
How can we build on that?
Why did you want
to ramp things up?
Two reasons, I think.
One was because
I really believed
I was helping the researchers
with some better understanding
of human behavior.
On the other hand,
it was personally
interesting to me.
You know, I cannot say that I
did not enjoy what I was doing.
Maybe, you know,
having so much power
over these poor,
defenseless prisoners,
you know, maybe you kind of
get off on that a little bit.
You weren't entirely following
a script from a director.
Right.
But you also felt like
Zimbardo wanted something
from you.
-Yes.
-And you gave that to him.
I believe I did.
I think I decided
I was going to do a better job
than anybody there
of delivering
what he wanted.
But does that excuse me
from what I was doing?
Certainly it started out
with me playing a role.
So the question is, was there
a point where I stopped acting
and I started living,
so to speak?
The standard narrative is that
Dave Eshelman did what he did
because when people
are given power,
it's easier than we think
for abuse to happen.
That may be true,
but how predisposed
to aggression was Dave?
I mean, he signed up
to something called
a "prison study," after all.
Also, his feeling
that cruelty was encouraged
and helped the experiment
may have affected his behavior.
What I'd like to see is,
in the absence
of outside influence,
can anonymity, power,
and depersonalization alone
lead to evil?
To answer that question,
I'd like to design
a demonstration of my own.
So I'm meeting
with Dr. Jared Bartels
of William Jewell College,
a psychologist who has written
extensively
about the Stanford
Prison Experiment
and how it is taught.
I would love to do the Stanford
Prison Experiment again.
You could probably make it
more ethical,
but still find the same
conclusions.
That's my hypothesis.
I absolutely think
it's worthwhile.
It's important.
It's interesting.
Probably the best approach
is to eliminate as best as possible
the demand characteristics
by eliminating that
prisoner/guard dynamic.
Why do we even need to call one
group "guards"
and one "prisoners"?
There's a lot of expectations
around those roles.
Oh, I'm a guard?
-I guess I should act like a
guard.
-Yeah, you're right.
The cover story is really
important,
and you want to hide the true
purpose of the experiment.
Another piece of this
is the role of personality
and personality traits.
So the original ad
in the Stanford study
asked for participants
for a study of prison life.
You know, that's going to draw
certain people
that were more kind of disposed
to aggression.
[Michael]
Because they saw the word
"prison" and thought,
-"I want to be a part of that."
-Exactly.
So when you get a group
of kind of authoritarian-minded
individuals together,
not surprisingly
they're going to create
an authoritarian regime
and environment.
So, for whatever it is that
we're going to do,
we should evaluate
the personalities
of the individuals.
Right.
So how do we give people
every opportunity
to be as evil as they can?
I think you have
to have those elements
that were assumed
to be influential
in the Stanford study.
What are those elements?
You have to have
the depersonalization.
You have to have anonymity.
You have to have some power
differences.
Can we elicit
some surprising behaviors
in just a number of hours?
If you kind of come back
to the Stanford study,
there wasn't anything dramatic
that happened
-in the first day of the study.
-Yeah.
It was the second day
of the study
when the guards started to
assert their authority.
That came about because
of prisoners testing
and challenging the guards'
authority.
[Michael]
Yeah, and that led to fear.
That, like, wait a second,
these prisoners need to be
-put more in check.
-Yeah. Yeah.
So I think you still need
that provocation.
Yeah.
Something that is frustrating.
Something that's going
to increase
the participants' arousal.
Right. All right, so, Jared,
would you like
to spend some time now
brainstorming a new design
that peeks into the same
questions?
-Absolutely.
-Awesome.
[Michael]
Jared and I sat down
with the Mind Field crew
to begin the planning process.
Will a person,
without any expectations
or pushes in a certain direction,
still be abusive or not?
For this demonstration,
we want to eliminate
all outside variables
and really isolate
the three core elements
of the Stanford Prison
Experiment.
The first element
is anonymity.
Subjects need to believe
that no matter how they behave,
no one will know
it was them.
This is where people will be
coming in in the morning.
This way, everyone's going to be
staggered when they come in.
That's important,
because we don't want them
to ever meet their teammates
face-to-face.
The original experiment
gave guards anonymity
by providing mirrored
sunglasses and uniforms.
But we're taking it
much further.
Our study will take place
in a room that is pitch-black.
[Jared]
They'll be taken into this room.
[Michael]
Ah. I would love to see how dark
this room is going to be
tomorrow.
[man]
Yeah, absolutely.
-You ready?
-I'm ready.
-Oh, yeah.
-[man] Right?
[Michael]
This is uncomfortable.
Despite the darkness,
we will be able
to see everything,
thanks to infrared cameras.
The second element
is depersonalization.
From the moment
the subjects arrive,
they will only be identified
by number, not name.
[woman]
So, come on in.
To eliminate the demand
characteristics,
we don't want our subjects
to know what we're studying.
Follow the sound of my voice,
if you can.
All they'll be told
is that we are studying
how they solve puzzles
in the dark.
There is another team
in a different location
-who is also solving a puzzle.
-Okay.
Because the words
"guard" and "prisoner"
suggest certain
expected behaviors,
we've done away with them
and will simply give
our participants an unseen,
distantly located
opposing team.
We will measure
the cruelty predicted
by the standard narrative
of the Stanford
Prison Experiment
by giving our participants
a way to exercise
the third element: power.
What I'm going to show you next
is the system
by which you can send them
a loud noise.
-Okay.
-So if you want to...
We've armed the teams
with a "distractor button"
that they can press to blast
an extremely loud,
jarring noise
into the other team's room.
Everyone will have
a volume dial
that ranges from level 1 to 12,
and they'll be told
that anything below a 7
should be safe
for the other team's hearing.
And each person
has their own control.
Okay.
So they can't see
what you're doing.
-You can't see
what they're doing.
-Okay.
The intensity level
they select,
as well as the frequency with
which they push the button,
will be our indicator
of how aggressive
the participants become
in this situation.
Is it-- is it pretty,
like, terrible to hear?
Well, I'll give you
a demonstration.
Hey, Derek, could you play
level 3 for me?
[loud, discordant horn]
So that's a 3.
It's pretty...
-it's pretty loud.
-Yeah.
Perfect.
Participants will be told
that when they
or a member of their team
pushes a distractor button,
the volume played
in the opponent's room
will be determined by
the highest level selected
on any of their
teammates' dials.
This is to increase the feeling
of diffused responsibility.
The question is,
will any of these participants
take advantage of these factors
and act sadistically?
Of course, we would never
want anyone
to actually be harmed
in our experiments,
so the other team?
They don't exist.
Instead, Jared and I
will be the ones
occasionally blasting
the group with noise
at a safe level,
no higher than a 3.
To see just how powerful
the situation can be,
we selected participants
who would not be predisposed
to sadism.
We screened
our participants
using the "Big Five
Personality Scale"
and "The Personality
Assessment Inventory,"
and picked those who scored
the highest
in "moral" categories,
like honesty
and conscientiousness.
It looks like,
you know,
they should be able
to see each other.
But it's pitch-dark.
There are puzzle pieces
on the table in front of you.
Thank you, and once I leave
the room you may begin.
Okay, here we go.
[man 1]
[man 2]
[man 1]
I definitely don't think
they're conscious
of the control panel
at this point.
-No.
-They're trying to get focused
on the task here.
[man 1]
[man 2]
[man 2]
[laughter]
[man 2]
We picked people
who were most likely
to have these kinds
of personalities.
[man 1]
[laughs]
[woman]
-Oh.
-She wants...
[woman]
All right.
[all]
[man 1]
-[high-pitched squeal]
-[woman] Did somebody do it
already?
-I did.
-Yeah.
-Okay.
-We should retaliate.
-Yeah, retaliate now.
[loud, discordant horn]
[all laugh]
[horn blares]
[laughter]
[Michael]
Now, they're not retaliating
against that most recent buzz.
Shall we try again?
[loud, discordant horn]
Despite the factors making it
easy for them to do so,
this team doesn't appear
to be turning evil.
Now they are, like,
just deal with it.
Just ignore it and keep
working together.
They're not interested
in retaliating.
[discordant horn blares]
Over the course
of the two-hour study,
we blasted them with noise
23 times.
[woman laughs]
But they only pushed the button
six times,
and never above a level 5.
They didn't seem
to abuse their power.
Puzzle pieces down.
What would happen
if we introduced
demand characteristics
that encouraged them
to act aggressively?
Your team has been
randomly assigned
an experimental condition.
Although the other team
will continue working
on a puzzle,
your team will not.
Your only task is to operate
the distractors.
Also, the other team's buttons
have been disconnected
without their knowledge.
You will not hear any sounds
if they buzz back at you.
We introduce
the social roles,
where there's a little bit
of power differential.
We're kind of mimicking the
Stanford-like variables here.
[Michael]
By now saying that the buzzer
is their "task,"
the participants may feel
a greater license
to use it liberally.
This is similar to how instructing
prison guards
in the original experiment
to act tough
may have encouraged
more use of force.
[man 3]
[woman]
[man 1]
Even though they were
given instructions
to distract the other team,
these participants instead
just started chatting
with one another.
They know that they can be
distracting now,
but they're not pushing the
button.
No.
[man 2]
Oh. Okay.
[woman]
A couple of threes.
[high-pitched squeal]
Over the course of ten minutes,
this group only pushed
the button three times.
Why do you think
they're so uninterested
in blasting
the other team?
Because we have individuals
who have been selected, really,
with that predisposition,
right?
These are individuals
who shouldn't be interested
in retaliating.
It was time to debrief
the participants
on what we were
actually studying.
[Michael]
I'm going to turn the lights on.
Here I am. I'm Michael,
and this is Jared.
We're going to debrief you on
what was really happening today.
There are no other people.
You are the only four here at
this moment.
There was never another team
doing anything.
[man 1]
This is a study related to
the Stanford Prison Experiment.
[man 1]
The standard narrative
we hear about that experiment
is that people
just become cruel.
So, yeah, we're trying to see if
we get the nicest people we can,
and we give them complete
anonymity
and the ability to be cruel,
but never encourage them to,
will they still do it?
And you guys didn't.
Did you have any suspicions
about what we were studying
or what was going on?
Right, but I think
that's good.
We just want to make sure
you don't think
that what we're really
looking at
is how high you turn
your own dial.
That's really
what we're looking at.
It was time to bring in our
second group of participants,
who, like the first group,
were screened to be individuals
with high morality
characteristics.
Anything up to 7
should be safe.
[laughs]
Yeah.
[woman]
So once I leave,
you can go ahead
and get started.
[woman 1]
[laughs]
Oh...
[high-pitched squeal]
Right off the bat she went to 7
and pushed the button.
Yeah.
[loud, discordant horn]
[high-pitched squeal]
[Michael]
Number two's pushing it at a 3.
[discordant horn blares]
[woman 1]
Okay, here comes number two.
[high-pitched squeal]
Number two is still
at a volume 3.
[Michael] This team seemed
more willing to retaliate.
Let's see what will happen
if we continue buzzing them.
Will they escalate
their behaviors?
Derek, let's blast them again.
Number 3.
[loud horn]
Okay, let's...
All right, so two just pushed
at a 3.
But she's not touching the dial.
[Jared]
She's not.
[loud, discordant horn]
[woman 2]
It's just annoying.
[blaring horn]
[high-pitched squeal]
[all laugh]
It was clear
that participant number two
was really the only one
hitting the distractor button,
but it appeared that she only
did it in retaliation
to our buzzes.
So we decided to see
what would happen
if we laid off.
[man 1]
It's been probably
four or five minutes,
and we have not blasted them
with the noise,
and they haven't
played one either.
I have a feeling like if we
never played a noise in their
room,
they would never touch
the distractor button.
[Jared]
Probably not at this point.
In the end, we buzzed
this group a total of 44 times,
and they buzzed us 38 times,
37 of which came
from number two
but always in retaliation,
and never above a 5.
All right, guys.
Puzzle pieces down.
The situational factors
did not seem to be sufficient
to make this group sadistic.
It was time
for phase 2.
[woman 1]
Yeah.
-Oh, she...
-[high-pitch squeal]
It looks like it's at 7.
-Wow.
-Yeah, she's--
She's going nuts.
At a 7.
So number three believes
there is no other team.
That might explain why she was
just going nuts on the button,
because she doesn't feel bad
about it.
[buttons clicking]
Okay, they're all pushing
the button a lot more.
And they were told
this time
that it was their
only task.
[buttons clicking]
[all laugh]
What a difference
this has made.
Just like in the Stanford
Prison Experiment.
If you tell people
that they have a certain task
to do, they'll do it,
even if it's going to mean
that they've been broken.
The thing is, they never hit
upon what we really cared about,
which is turning the dial
into an unsafe level.
Yeah.
[buttons click]
[Michael] Hello, everyone.
I'm going to turn the lights on
in this room.
[woman 1]
Okay.
-And slowly...
-Ah, it hurts.
...you can look.
So, hello.
-I'm Michael,
and this is Jared.
-Hi.
I'll give you time
to adjust your eyes.
Today, you've been part
of a study where all we wanted
was to see what would happen
when we put people in a room
and gave them that feeling
of anonymity
that comes from, well,
if I crank my dial up
really high,
no one will know
it's me.
So you have this opportunity
to be cruel.
I thought
I went nuts.
Like, when the other person
was pressing--
Sure, but that's--
that's just in-kind retribution.
As it turns out,
so far,
everyone stays in that
"7 or under" range.
-Yeah.
-This final phase was us
trying to ramp up
the demand characteristics.
And I believe number one, right,
you did say at one point,
"You've broken me.
I did it, fine."
So I loved that phrase,
because it says
"I didn't want to do this,
but I'm doing it because I
believe it was expected of me."
[all]
Thank you. Thanks.
[Michael] After dismissing
our participants,
Jared and I sat down
to discuss our results.
Really fascinating.
We brought in people who had
very different personalities
than those Zimbardo chose.
We put them in a situation that
did not demand things from them.
And they behaved according
to that personality.
I think we have some intriguing
support for the idea
that it's more than just
the situation.
We really saw personality
kind of shine through.
For the most part,
they seemed to be aware
-of where that line is...
-Yeah.
...that they shouldn't cross,
and they didn't.
None of them did.
It was now time to speak
with the man himself,
Dr. Philip Zimbardo,
who I worked with
on last season's episode,
"How to Make a Hero."
Okay. Lisa, Bear,
you guys ready?
For years, Dr. Zimbardo
has responded to criticisms
of his famous study,
always maintaining
that they aren't valid.
I asked him about
whether his study
is better seen
as one on the power
of demands from authority,
but he wasn't receptive
to that idea.
I then told him about the study
we ran to get his reaction.
I wanted to know what the
sufficient conditions might be
to make anyone
do something evil.
And we struggled
to get that to happen.
We couldn't get anyone
to be cruel.
Just giving them anonymity,
and a dehumanized other,
and the power
to hurt that other,
they didn't take
advantage of it.
Well, I mean,
maybe the problem was,
here's a case where,
by picking people
who were extremely
conscientious,
extremely mindful,
by selecting people
who are high on compassion,
high on mindfulness,
you broke the power
of the situation.
In the Stanford
Prison Experiment,
we had, I presume,
a relatively normal
distribution.
We gave them
six personality scales,
and we picked people who,
on the scales,
were mostly
in the mid-range.
In that situation,
some people behave cruelly,
evilly.
Not everybody, but more
of the guards than not.
So, again, I think that
your study is a demonstration
of one way in which personality
dominates situation.
-Ah.
-Where the personalities are--
so I would say
it's a positive result.
The personalities
are special.
Where does this balance lie
between the personal,
the disposition,
the personality,
and the situation,
the environment?
No, that's the big--
that's the ultimate question.
Where is, you know,
how much of one
and how much of the other...?
Right.
Zimbardo insists
that demand characteristics
played little role
in his subjects' behavior.
Critics like Ben Blum
say they played a big role,
that what happened
was what was asked for.
If that's true,
then the Stanford
Prison Experiment,
like the classic Milgram study,
still has an important lesson.
People are quick to be cruel
if an authority figure suggests
that doing so
will serve a greater cause.
In our test, we made sure that
such influences didn't exist.
And not one participant
acted maliciously.
Personality rose above
the situation.
Learning how that happens
is vital
if we want to improve conditions
where power is involved.
So it's great that this debate
is still ongoing.
And look, questioning methods
and interpretations
is not a personal attack.
It's how we improve
our confidence in what we know.
And that's how science works.
So stay curious,
never stop asking questions,
and, as always,
thanks for watching.
Hey, Mind Field.
Michael Stevens here.
There is so much more
to satisfy your hunger
for psychological knowledge
right on this show.
Click below to check out
more episodes.
psychological studies
ever conducted
was the Stanford Prison
Experiment.
It's mentioned in almost every
intro to psychology textbook.
They tend to focus on
how unethical it was,
and are less critical
of its supposed conclusion.
August 14th, 1971.
Palo Alto, California.
Twelve young men are rounded
up from their homes by police,
placed under arrest,
and brought to
a makeshift prison
in the basement
of Stanford University.
It all begins as a study on
the psychology of prison life,
led by Stanford psychology
professor Dr. Philip Zimbardo.
24 volunteers--
12 guards
and 12 prisoners.
--have agreed to spend
the next two weeks
recreating life
in a correctional facility.
[guard]
The prisoners are booked
and stripped nude.
They're no longer
individuals,
forced to wear smocks,
stocking caps and shackles.
Identified only by
their prisoner numbers.
The guards quickly adapt
to their new profession.
Given anonymity by
their mirrored sunglasses,
some of them start to control
the meager food rations,
restrict prisoners'
bathroom use.
And, as tensions rise,
so do their cruel methods.
Within just six days
of the planned two-week study,
conditions are so bad
that the entire operation
is shut down.
[man]
g*dd*mn it...
The study makes international
headlines.
Zimbardo's fame skyrockets,
and his conclusions are taught
to students worldwide,
used as a defense
in criminal trials
and are even submitted
to Congress
to explain the abuses
inflicted at Abu Ghraib.
The study brings up
a question
just as important then
as it is today:
is evil caused
by the environment,
or the personalities in it?
Zimbardo's shocking conclusion
is that when people
feel anonymous
and have power over
depersonalized others,
they can easily become evil.
And it occurs more often
than we'd like to admit.
But while it's true that people
were mean to each other
during the Stanford
Prison Experiment,
what if what truly caused
that behavior
wasn't what we've always
been told?
The Stanford Prison Experiment
has always had
its controversies.
But a wave of recent
revelations
have pushed it back
into the spotlight
47 years later.
Today, I'm going to speak
with journalist Ben Blum,
whose recent writings
have brought criticism
of the experiment
to a larger audience
than ever before.
How did you get involved in
the Stanford Prison Experiment
in the first place?
Well, my involvement
was quite personal.
Like everyone,
I had kind of absorbed
the basic lesson
of the experiment
through the cultural ether.
And then my cousin Alex
was arrested for bank robbery.
This was a team of mostly
m*llitary guys with AK-47s.
Alex was the driver.
He was a 19-year-old
U.S. Army Ranger.
And it was a superior of his
on the Rangers
that organized and led
the bank robbery.
Alex thought the whole thing
was a training exercise.
He was just so brainwashed
in this intense Ranger training
that when a superior proposed
this bank robbery,
he took it as just one more kind
of tactical thought experiment.
Then Dr. Philip Zimbardo
participated
in his legal defense.
Zimbardo submits a letter
to the court,
advocating leniency
in sentencing on the grounds
that Alex, my cousin,
had been so transformed
by the social environment
of the Ranger battalion
that he participated
in the bank robbery
without exercising
his own free will.
Well, how did that affect
Alex's sentencing?
He received an extraordinarily
lenient sentence of 16 months.
So Zimbardo was a family hero.
But over time, Alex,
finally he did admit to me,
you know what, I knew this was
a bank robbery by the end,
and I just didn't have the moral
courage to back out.
Oh, wow.
Alex, myself and our
whole family
came to view
the Zimbardo argument
as a way to shirk personal
culpability,
and to put all the blame
on the situation.
So you start looking
at the Stanford Prison
Experiment in particular.
You reached out to Dr. Zimbardo
himself,
as well as some of those
who participated.
What did you learn?
I learned,
to my deep surprise,
that quite a number
of the participants
had stories of their experience
that completely contradicted
the official narrative.
Which is, look,
these regular people,
good people,
came together,
and because of the situation,
became evil.
[Ben]
Right.
Zimbardo has claimed
that the guards
were put in the situation,
and then the kind of hidden
wellspring of sadism
that apparently lies
in all of us
unfolded organically.
[Zimbardo]
There was an orientation meeting
for the guards.
They had been told
quite explicitly
to oppress the prisoners.
That falls under the heading
of what psychologists call
demand characteristics.
Experimental subjects
tend to be motivated
to give experimenters
what they want.
[Michael]
Demand characteristics occur
whenever participants
being studied
act differently
than they normally would
because they've guessed
what hypothesis is being tested
and feel that a certain kind
of behavior is being demanded.
There was a recording
of explicitly correcting a guard
who wasn't being tough enough.
So a conclusion
you could make
from the Stanford
Prison Experiment
is that when you tell people
to be cruel,
they'll do it if you tell them
it's for a greater good,
like science.
-Right.
-Who would have thought?
I think the study stands still
as a fascinating spur
to further more careful research
as a demonstration that should
make anyone curious
as to how such extreme behavior
could arise
in such a short time.
The experiment could still
be useful,
but it might need to be
reinterpreted.
Its data might lead
to different conclusions
than the one that we've been
telling for so many decades.
Right.
The flaws in the experiment
that Ben and other critics
bring up
call into question large
portions of the narrative
surrounding the study.
So I want to hear from someone
who was actually there.
Dave Eshelman, the study's
most infamous guard,
agreed to tell me
his side of the story.
It's really an honor
to meet you.
You're a living, walking piece
of psychology history.
I'm never recognized in the
street or anything like that,
although I still get
some hate mail.
-Are you serious?
-Yeah, absolutely.
Well, what do you say to them
when they react that way?
I say, well, there's probably
a lot about that
that didn't happen quite the way
it's been portrayed.
Well, Dave,
before we go too far,
I'd like to watch the footage
we have here
so we can kind of talk about
what we see.
[Dave]
That's me there, by the way.
-[Michael] Look at that look.
-[Dave] Mm-hmm.
So how did you get involved with
the Stanford Prison Experiment?
My father was a professor
at Stanford,
and I was home for summer,
looking for a summer job.
So I'm looking
through the want ads.
$15 a day.
You know,
in 1971 that wasn't bad.
The way it was introduced
to the guards,
the whole concept
of this experiment,
we were never led to believe
that we were part
of the experiment.
We were led to believe
that our job
was to get results
from the prisoners,
that they were the ones
the researchers
were really studying.
The researchers
were behind the wall.
And we all knew
they were filming.
And we could often hear
the researchers
commenting on the action
from the other side of the wall.
You know, like,
"Oh, gosh, did you see that?
Here. Make sure you get
a close-up of that."
Okay? So if they want to show
that prison is a bad experience,
I'm going to make it bad.
But how did you feel
doing stuff like that?
Didn't you feel bad?
I don't know if this
is a revelation to you,
but 18-year-old boys are not
the most sensitive creatures.
-Sure.
-My agenda was to be
the worst guard
I could possibly be.
-And it's pretty serious.
-Mm-hmm.
This is my favorite part
of all the footage we have
-from the experiment.
-Mm-hmm.
It's you and a prisoner
confronting each other
after the experiment.
I remember the guy saying,
"I hate you, man."
-Yeah.
-"I hate you."
Each day I said, well,
what can we do to ramp up
what we did yesterday?
How can we build on that?
Why did you want
to ramp things up?
Two reasons, I think.
One was because
I really believed
I was helping the researchers
with some better understanding
of human behavior.
On the other hand,
it was personally
interesting to me.
You know, I cannot say that I
did not enjoy what I was doing.
Maybe, you know,
having so much power
over these poor,
defenseless prisoners,
you know, maybe you kind of
get off on that a little bit.
You weren't entirely following
a script from a director.
Right.
But you also felt like
Zimbardo wanted something
from you.
-Yes.
-And you gave that to him.
I believe I did.
I think I decided
I was going to do a better job
than anybody there
of delivering
what he wanted.
But does that excuse me
from what I was doing?
Certainly it started out
with me playing a role.
So the question is, was there
a point where I stopped acting
and I started living,
so to speak?
The standard narrative is that
Dave Eshelman did what he did
because when people
are given power,
it's easier than we think
for abuse to happen.
That may be true,
but how predisposed
to aggression was Dave?
I mean, he signed up
for something called
a "prison study," after all.
Also, his feeling
that cruelty was encouraged
and helped the experiment
may have affected his behavior.
What I'd like to see is,
in the absence
of outside influence,
can anonymity, power,
and depersonalization alone
lead to evil?
To answer that question,
I'd like to design
a demonstration of my own.
So I'm meeting
with Dr. Jared Bartels
of William Jewell College,
a psychologist who has written
extensively
about the Stanford
Prison Experiment
and how it is taught.
I would love to do the Stanford
Prison Experiment again.
You could probably make it
more ethical,
but still reach the same
conclusions.
That's my hypothesis.
I absolutely think
it's worthwhile.
It's important.
It's interesting.
Probably the best approach
is to eliminate as best as possible
the demand characteristics
by eliminating that
prisoner/guard dynamic.
Why do we even need to call one
group "guards"
and one "prisoners"?
There's a lot of expectations
around those roles.
Oh, I'm a guard?
-I guess I should act like a
guard.
-Yeah, you're right.
The cover story is really
important,
and you want to hide the true
purpose of the experiment.
Another piece of this
is the role of personality
and personality traits.
So the original ad
in the Stanford study
asked for participants
for a study of prison life.
You know, that's going to draw
certain people
that were more kind of disposed
to aggression.
[Michael]
Because they saw the word
"prison" and thought,
-"I want to be a part of that."
-Exactly.
So when you get a group
of kind of authoritarian-minded
individuals together,
not surprisingly
they're going to create
an authoritarian regime
and environment.
So, for whatever it is that
we're going to do,
we should evaluate
the personalities
of the individuals.
Right.
So how do we give people
every opportunity
to be as evil as they can?
I think you have
to have those elements
that were assumed
to be influential
in the Stanford study.
What are those elements?
You have to have
the depersonalization.
You have to have anonymity.
You have to have some power
differences.
Can we elicit
some surprising behaviors
in just a number of hours?
If you kind of come back
to the Stanford study,
there wasn't anything dramatic
that happened
-in the first day of the study.
-Yeah.
It was the second day
of the study
when the guards started to
assert their authority.
That came about because
of prisoners testing
and challenging the guards'
authority.
[Michael]
Yeah, and that led to fear.
That, like, wait a second,
these prisoners need to be
-put more in check.
-Yeah. Yeah.
So I think you still need
that provocation.
Yeah.
Something that is frustrating.
Something that's going
to increase
the participants' arousal.
Right. All right, so, Jared,
would you like
to spend some time now
brainstorming a new design
that peeks into the same
questions?
-Absolutely.
-Awesome.
[Michael]
Jared and I sat down
with the Mind Field crew
to begin the planning process.
Will a person,
without any expectations
or pushes in a certain direction,
still be abusive or not?
For this demonstration,
we want to eliminate
all outside variables
and really isolate
the three core elements
of the Stanford Prison
Experiment.
The first element
is anonymity.
Subjects need to believe
that no matter how they behave,
no one will know
it was them.
This is where people will be
coming in in the morning.
This way, everyone's going to be
staggered when they come in.
That's important,
because we don't want them
to ever meet their teammates
face-to-face.
The original experiment
gave guards anonymity
by providing mirrored
sunglasses and uniforms.
But we're taking it
much further.
Our study will take place
in a room that is pitch-black.
[Jared]
They'll be taken into this room.
[Michael]
Ah. I would love to see how dark
this room is going to be
tomorrow.
[man]
Yeah, absolutely.
-You ready?
-I'm ready.
-Oh, yeah.
-[man] Right?
[Michael]
This is uncomfortable.
Despite the darkness,
we will be able
to see everything,
thanks to infrared cameras.
The second element
is depersonalization.
From the moment
the subjects arrive,
they will only be identified
by number, not name.
[woman]
So, come on in.
To eliminate the demand
characteristics,
we don't want our subjects
to know what we're studying.
Follow the sound of my voice,
if you can.
All they'll be told
is that we are studying
how they solve puzzles
in the dark.
There is another team
in a different location,
-who is also solving a puzzle.
-Okay.
Because the words
"guard" and "prisoner"
suggest certain
expected behaviors,
we've done away with them
and will simply give
our participants an unseen,
distantly located
opposing team.
We will measure
the cruelty predicted
by the standard narrative
of the Stanford
Prison Experiment
by giving our participants
a way to exercise
the third element: power.
What I'm going to show you next
is the system
by which you can send them
a loud noise.
-Okay.
-So if you want to...
We've armed the teams
with a "distractor button"
that they can press to blast
an extremely loud,
jarring noise
into the other team's room.
Everyone will have
a volume dial
that ranges from level 1 to 12,
and they'll be told
that anything below a 7
should be safe
for the other team's hearing.
And each person
has their own control.
Okay.
So they can't see
what you're doing.
-You can't see
what they're doing.
-Okay.
The intensity level
they select,
as well as the frequency with
which they push the button,
will be our indicator
of how aggressive
the participants become
in this situation.
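The two indicators the narration names -- how often a team presses the button and how loud they set it -- can be summarized in a short sketch. This is illustrative only: the show doesn't give a scoring formula, and the function name and sample levels here are assumptions, not data from the episode.

```python
# Illustrative sketch (not the show's actual analysis): summarize a team's
# button presses by count, peak dial level, and average dial level.

def aggression_summary(presses):
    """presses: list of dial levels (1-12), one entry per button press."""
    if not presses:
        return {"count": 0, "max_level": 0, "mean_level": 0.0}
    return {
        "count": len(presses),
        "max_level": max(presses),
        "mean_level": sum(presses) / len(presses),
    }

# Hypothetical press log consistent with the first group's behavior
# (six presses, never above level 5).
group1 = [3, 2, 5, 4, 3, 2]
summary = aggression_summary(group1)
print(summary["count"])      # 6
print(summary["max_level"])  # 5
```

A higher press count or peak level would count as more aggressive behavior under this reading; the episode reports only counts and peak levels, so those are the fields that matter.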
Is it-- is it pretty,
like, terrible to hear?
Well, I'll give you
a demonstration.
Hey, Derek, could you play
level 3 for me?
[loud, discordant horn]
So that's a 3.
It's pretty...
-it's pretty loud.
-Yeah.
Perfect.
Participants will be told
that when they
or a member of their team
pushes a distractor button,
the volume played
in the opponent's room
will be determined by
the highest level selected
on any of their
teammates' dials.
This is to increase the feeling
of diffused responsibility.
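The rule just described -- any press delivers the noise at the highest dial level set anywhere on the team -- is a simple max over the teammates' dials. A minimal sketch, assuming the "below 7 is safe" threshold from the briefing (all names here are illustrative, not from the actual study):

```python
# Sketch of the distractor-button rule: each teammate sets a dial (1-12),
# and any button press plays the noise at the HIGHEST level currently
# selected on the team, diffusing responsibility across members.

SAFE_MAX = 6  # "anything below a 7 should be safe"

def blast_level(dials):
    """Volume delivered to the other room when anyone presses."""
    return max(dials)

def is_safe(level):
    return level <= SAFE_MAX

team_dials = [2, 5, 3, 1]
level = blast_level(team_dials)
print(level)           # 5
print(is_safe(level))  # True
```

Because the delivered volume is the team maximum, one person turning their dial to 12 makes every teammate's press unsafe, which is exactly the diffusion of responsibility the design is probing.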
The question is,
will any of these participants
take advantage of these factors
and act sadistically?
Of course, we would never
want anyone
to actually be harmed
in our experiments,
so the other team?
They don't exist.
Instead, Jared and I
will be the ones
occasionally blasting
the group with noise
at a safe level,
no higher than a 3.
To see just how powerful
the situation can be,
we selected participants
who would not be predisposed
to sadism.
We screened
our participants
using the "Big 5
Personality Scale"
and "The Personality
Assessment Inventory,"
and picked those who scored
the highest
in "moral" categories,
like honesty
and conscientiousness.
It looks like,
you know,
they should be able
to see each other.
But it's pitch-dark.
There are puzzle pieces
on the table in front of you.
Thank you, and once I leave
the room you may begin.
Okay, here we go.
I definitely don't think
they're conscious
of the control panel
at this point.
-No.
-They're trying to get focused
on the task here.
[laughter]
We picked people
who were most likely
to have these kinds
of personalities.
[laughs]
[woman]
-Oh.
-She wants...
[woman]
All right.
-[high-pitched squeal]
-[woman] Did somebody do it
already?
-I did.
-Yeah.
-Okay.
-We should retaliate.
-Yeah, retaliate now.
[loud, discordant horn]
[all laugh]
[horn blares]
[laughter]
[Michael]
Now, they're not retaliating
against that most recent buzz.
Shall we try again?
[loud, discordant horn]
Despite the factors making it
easy for them to do so,
this team doesn't appear
to be turning evil.
Now they are, like,
just deal with it.
Just ignore it and keep
working together.
They're not interested
in retaliating.
[discordant horn blares]
Over the course
of the two-hour study,
we blasted them with noise
23 times.
[woman laughs]
But they only pushed the button
six times,
and never above a level 5.
They didn't seem
to abuse their power.
Puzzle pieces down.
What would happen
if we introduced
demand characteristics
that encouraged them
to act aggressively?
Your team has been
randomly assigned
an experimental condition.
Although the other team
will continue working
on a puzzle,
your team will not.
Your only task is to operate
the distractors.
Also, the other team's buttons
have been disconnected
without their knowledge.
You will not hear any sounds
if they buzz back at you.
We introduce
the social roles,
where there's a little bit
of power differential.
We're kind of mimicking the
Stanford-like variables here.
[Michael]
By now saying that the buzzer
is their "task,"
the participants may feel
a greater license
to use it liberally.
Similar to how instructing
prison guards
in the original experiment
to act tough
may have encouraged
more use of force.
Even though they were
given instructions
to distract the other team,
these participants instead
just started chatting
with one another.
They know that they can be
distracting now,
but they're not pushing the
button.
No.
[man 2]
Oh. Okay.
[woman]
A couple of threes.
[high-pitched squeal]
Over the course of ten minutes,
this group only pushed
the button three times.
Why do you think
they're so uninterested
in blasting
the other team?
Because we have individuals
who have been selected, really,
with that predisposition,
right?
These are individuals
who shouldn't be interested
in retaliating.
It was time to debrief
the participants
on what we were
actually studying.
[Michael]
I'm going to turn the lights on.
Here I am. I'm Michael,
and this is Jared.
We're going to debrief you on
what was really happening today.
There are no other people.
You are the only four here at
this moment.
There was never another team
doing anything.
This is a study related to
the Stanford Prison Experiment.
The standard narrative
we hear about that experiment
is that people
just become cruel.
So, yeah, we're trying to see if
we get the nicest people we can,
and we give them complete
anonymity
and the ability to be cruel,
but never encourage them to,
will they still do it?
And you guys didn't.
Did you have any suspicions
about what we were studying
or what was going on?
Right, but I think
that's good.
We just want to make sure
you don't think
that what we're really
looking at
is how high you turn
your own dial.
That's really
what we're looking at.
It was time to bring in our
second group of participants,
who, like the first group,
were screened to be individuals
with high morality
characteristics.
Anything up to 7
should be safe.
[laughs]
Yeah.
[woman]
So once I leave,
you can go ahead
and get started.
[woman 1]
[laughs]
Oh...
[high-pitched squeal]
Right off the bat she went to 7
and pushed the button.
Yeah.
[loud, discordant horn]
[high-pitched squeal]
[Michael]
Number two's pushing it at a 3.
[discordant horn blares]
[woman 1]
Okay, here comes number two.
[high-pitched squeal]
Number two is still
at a volume 3.
[Michael] This team seemed
more willing to retaliate.
Let's see what will happen
if we continue buzzing them.
Will they escalate
their behaviors?
Derek, let's blast them again.
Number 3.
[loud horn]
Okay, let's...
All right, so two just pushed
at a 3.
But she's not touching the dial.
[Jared]
She's not.
[loud, discordant horn]
[woman 2]
It's just annoying.
[blaring horn]
[high-pitched squeal]
[all laugh]
It was clear
that participant number two
was really the only one
hitting the distractor button,
but it appeared that she only
did it in retaliation
to our buzzes.
So we decided to see
what would happen
if we laid off.
[man 1]
It's been probably
four or five minutes,
and we have not blasted them
with the noise,
and they haven't
played one either.
I have a feeling like if we
never played a noise in their
room,
they would never touch
the distractor button.
[Jared]
Probably not at this point.
In the end, we buzzed
this group a total of 44 times,
and they buzzed us 38 times,
37 of which came
from number two
but always in retaliation,
and never above a 5.
All right, guys.
Puzzle pieces down.
The situational factors
did not seem to be sufficient
to make this group sadistic.
It was time
for phase 2.
[woman 1]
Yeah.
-Oh, she...
-[high-pitch squeal]
It looks like it's at 7.
-Wow.
-Yeah, she's--
She's going nuts.
At a 7.
So number three believes
there is no other team.
That might explain why she was
just going nuts on the button,
because she doesn't feel bad
about it.
[buttons clicking]
Okay, they're all pushing
the button a lot more.
And they were told
this time
that it was their
only task.
[buttons clicking]
[all laugh]
What a difference
this has made.
Just like in the Stanford
Prison Experiment.
If you tell people
that they have a certain task
to do, they'll do it,
even if it means
that they feel they've been broken.
The thing is, they never hit
upon what we really cared about,
which is turning the dial
to an unsafe level.
Yeah.
[buttons click]
[Michael] Hello, everyone.
I'm going to turn the lights on
in this room.
[woman 1]
Okay.
-And slowly...
-Ah, it hurts.
...you can look.
So, hello.
-I'm Michael,
and this is Jared.
-Hi.
I'll give you time
to adjust your eyes.
Today, you've been part
of a study where all we wanted
was to see what would happen
when we put people in a room
and gave them that feeling
of anonymity
that comes from, well,
if I crank my dial up
really high,
no one will know
it's me.
So you have this opportunity
to be cruel.
I thought
I went nuts.
Like, when the other person
was pressing--
Sure, but that's--
that's just in-kind retribution.
As it turns out,
so far,
everyone stays in that
"7 or under" range.
-Yeah.
-This final phase was us
trying to ramp up
the demand characteristics.
And I believe number one, right,
you did say at one point,
"You've broken me.
I did it, fine."
So I loved that phrase,
because it says
"I didn't want to do this,
but I'm doing it because I
believe it was expected of me."
[all]
Thank you. Thanks.
[Michael] After dismissing
our participants,
Jared and I sat down
to discuss our results.
Really fascinating.
We brought in people who had
very different personalities
than those Zimbardo chose.
We put them in a situation that
did not demand things from them.
And they behaved according
to that personality.
I think we have some intriguing
support for the idea
that it's more than just
the situation.
We really saw personality
kind of shine through.
For the most part,
they seemed to be aware
-of where that line is...
-Yeah.
...that they shouldn't cross,
and they didn't.
None of them did.
It was now time to speak
with the man himself,
Dr. Philip Zimbardo,
who I worked with
on last season's episode,
"How to Make a Hero."
Okay. Lisa, Bear,
you guys ready?
For years, Dr. Zimbardo
has responded to criticisms
of his famous study,
always maintaining
that they aren't valid.
I asked him about
whether his study
is better seen
as one on the power
of demands from authority,
but he wasn't receptive
to that idea.
I then told him about the study
we ran to get his reaction.
I wanted to know what the
sufficient conditions might be
to make anyone
do something evil.
And we struggled
to get that to happen.
We couldn't get anyone
to be cruel.
Just giving them anonymity,
and a dehumanized other,
and the power
to hurt that other,
they didn't take
advantage of it.
Well, I mean,
maybe the problem was,
here's a case where,
by picking people
who were extremely
conscientious,
extremely mindful,
by selecting people
who are high on compassion,
high on mindfulness,
you broke the power
of the situation.
In the Stanford
Prison Experiment,
we had, I presume,
a relatively normal
distribution.
We gave them
six personality scales.
And we picked people who,
on those scales,
were mostly
in the mid-range.
In that situation,
some people behave cruelly,
evilly.
Not everybody, but more
of the guards than not.
So, again, I think that
your study is a demonstration
of one way in which personality
dominates situation.
-Ah.
-Where the personalities are--
so I would say
it's a positive result.
The personalities
are special.
Where does this balance lie
between the personal,
the disposition,
the personality,
and the situation,
the environment?
No, that's the big--
that's the ultimate question.
Where is, you know,
how much of one
and how much of the other...?
Right.
Zimbardo insists
that demand characteristics
played little role
in his subjects' behavior.
Critics like Ben Blum
say they played a big role,
that what happened
was what was asked for.
If that's true,
then the Stanford
Prison Experiment,
like the classic Milgram study,
still has an important lesson.
People are quick to be cruel
if an authority figure suggests
that doing so
will serve a greater cause.
In our test, we made sure that
such influences didn't exist.
And not one participant
acted maliciously.
Personality rose above
the situation.
Learning how that happens
is vital
if we want to improve conditions
where power is involved.
So it's great that this debate
is still ongoing.
And look, questioning methods
and interpretations
is not a personal attack.
It's how we improve
our confidence in what we know.
And that's how science works.
So stay curious,
never stop asking questions,
and, as always,
thanks for watching.
Hey, Mind Field.
Michael Stevens here.
There is so much more
to satisfy your hunger
for psychological knowledge
right on this show.
Click below to check out
more episodes.