Friday, February 29, 2008

"It's Not Bad Apples, But Bad Barrel-Makers" 

As part of his presentation earlier today at TED 2008, Prof. Philip Zimbardo, creator of the Stanford Prison Experiment and author of The Lucifer Effect: Understanding How Good People Turn Evil, showed a video including a number of previously unreleased photographs from Abu Ghraib prison in Iraq. (Zimbardo gained access to the pictures while serving as an expert witness in the defense of Sgt. Chip Frederick, one of the "bad apples" court-martialed for his treatment of Iraqi detainees at that infamous facility.) The public appetite for photographic evidence of American atrocities has apparently diminished over the last couple of years: as we discovered when we ran a Google search on "new photos Abu Ghraib," most domestic news outlets have chosen to ignore the story altogether, and the bulk of the coverage emanates from such far-flung locales as South Africa, Egypt, Iran, Canada, France, and Australia.

Wired, however, has posted both the photos and the video, as well as a fascinating interview with Zimbardo himself:
Wired: How did what happened at Abu Ghraib compare to your Stanford prison study?

Zimbardo: The military intelligence, the CIA and the civilian interrogator corporation, Titan, told the MPs [at Abu Ghraib], "It is your job to soften the prisoners up. We give you permission to do something you ordinarily are not allowed to do as a military policeman -- to break the prisoners, to soften them up, to prepare them for interrogation." That's permission to step across the line from what is typically restricted behavior to now unrestricted behavior.

In the same way in the Stanford prison study, I was saying [to the student guards], "You have to be powerful to prevent further rebellion." I tell them, "You're not allowed, however, to use physical force." By default, I allow them to use psychological force. In five days, five prisoners are having emotional breakdowns.

The situational forces that were going on in [Abu Ghraib] -- the dehumanization, the lack of personal accountability, the lack of surveillance, the permission to get away with anti-social actions -- it was like the Stanford prison study, but in spades.

Those sets of things are found any time you really see an evil situation occurring, whether it's Rwanda or Nazi Germany or the Khmer Rouge.

Wired: But not everyone at Abu Ghraib responded to the situation in the same way. So what makes one person in a situation commit evil acts while another in the same situation becomes a whistle-blower?

Zimbardo: There's no answer, based on what we know about a person, that we can predict whether they're going to be a hero whistle-blower or the brutal guard. We want to believe that if I was in some situation [like that], I would bring with it my usual compassion and empathy. But you know what? When I was the superintendent of the Stanford prison study, I was totally indifferent to the suffering of the prisoners, because my job as prison superintendent was to focus on the guards.

As principal [scientific] investigator [of the experiment], my job was to care about what happened to everybody because they were all under my experimental control. But once I switched to being the prison superintendent, I was a different person. It's hard to believe that, but I was transformed . . . .

Wired: You've said that the way to prevent evil actions is to teach the "banality of kindness" -- that is, to get society to exemplify ordinary people who engage in extraordinary moral actions. How do you do this?

Zimbardo: If you can agree on a certain number of things that are morally wrong, then one way to counteract them is by training kids. There are some programs, starting in the fifth grade, which get kids to think about the heroic mentality, the heroic imagination.

To be a hero you have to take action on behalf of someone else or some principle and you have to be deviant in your society, because the group is always saying don't do it; don't step out of line. If you're an accountant at Arthur Andersen, everyone who is doing the defrauding is telling you, "Hey, be one of the team."

Heroes have to always, at the heroic decisive moment, break from the crowd and do something different. But a heroic act involves a risk. If you're a whistle-blower you're going to get fired, you're not going to get promoted, you're going to get ostracized. And you have to say it doesn't matter.

Most heroes are more effective when they're social heroes rather than isolated heroes. A single person or even two can get dismissed by the system. But once you have three people, then it's the start of an opposition.

So what I'm trying to promote is not only the importance of each individual thinking "I'm a hero" and waiting for the right situation to come along in which I will act on behalf of some people or some principle, but also, "I'm going to learn the skills to influence other people to join me in that heroic action."
