Even good people can find themselves doing evil things when placed in a powerful, overwhelming situation, research shows.
By Daniel DeNoon
Reviewed by Michael Smith
"The line between good and evil does not run between nations, but through every human heart." -- Writer Aleksandr Solzhenitsyn
There are two reasons why it's hard to look at the pictures of mistreatment and torture from Abu Ghraib prison in Iraq.
One is that the images portray abuse being heaped upon imprisoned individuals. But the second is even more disturbing. It's that the faces of the torturers are very much like our own -- and like those of our grown children, says Charles B. Strozier, PhD, director of the center on terrorism and public safety at John Jay College of Criminal Justice in New York.
"These are just ordinary, well-fed, American faces," Strozier tells WebMD. "The potential for torture and evil and masochism -- there is a potential for this in all of us. Even without any external pressure, evil can be evoked by certain situations."
That's not what we like to think. But it's true, says Thomas Blass, PhD, professor of psychology at University of Maryland Baltimore County.
"We think that what we do comes in a direct line from who we are: that good people do good things and evil people do evil," Blass tells WebMD. "But it is not the kind of person you are, it is the situation you find yourself in. The pressure of the situation we find ourselves in has a stronger effect on our behavior than we think -- strong enough to override our moral principles."
A Shocking Experiment
"First they came for the Communists, but I was not a Communist so I did not speak out. Then they came for the Socialists and the Trade Unionists, but I was neither, so I did not speak out. Then they came for the Jews, but I was not a Jew so I did not speak out. And when they came for me, there was no one left to speak out for me." --Pastor Martin Niemoeller
Blass is the biographer of Stanley Milgram, PhD. Published this year, his book, The Man Who Shocked the World: The Life and Legacy of Stanley Milgram, chronicles the life of the man who brought the world one of its most famous, and appalling, experiments.
In 1961, at his Yale laboratory, Milgram set up a fiendish-looking machine. It had a bank of switches, labeled with progressively higher voltages. Milgram told volunteers they were helping with a study of whether punishment helped people learn. The volunteer "teacher" was supposed to shock a volunteer "learner" when the learner made a mistake. A researcher in a lab coat told the "teacher" to administer more and more severe shocks -- even when the "learner" begged them to stop, screamed, and eventually went ominously silent. (The "learner" was actually an actor who did not really get shocked.)
Before the experiment, Milgram asked a wide range of professors to predict what would happen. They were unanimous in predicting that nearly all of the "teachers" would refuse to continue once it was apparent they were hurting the "learner."
That didn't happen. Nearly all of the participants administered painful shocks to the learner. About two-thirds of them went all the way, repeatedly administering 450-volt shocks.
"Over 60% of people were willing to give increasingly severe shocks to the maximum, in spite of the fact that there were clear-cut protests of increasing intensity," Blass says. "The 'learner' says he wants to get out. He says he has a heart condition. And then he finally goes silent. Here is a condition in which all humane instinct tells you this shouldn't be happening. But it was happening."
Under certain conditions, Milgram found he could get up to 90% of people to go all the way.
"And that is because a person whose legitimacy they accepted told them to," Blass says. "Once that happens, mental transformations take place. You don't just go along. Certain internal changes take place. One is a shift in sense of responsibility. Whereas normally, as autonomous individuals, our behavior is rooted in moral sensibility, when I accept authority I shed the responsibility for my actions. I am no longer directed by my values."
This shows that pressure to conform can, under certain circumstances, cause us to do things we know are wrong, says psychologist David Silber, PhD, professor emeritus at George Washington University. Silber is an expert in crime, violence, and personality.
"What is it inside of us that allows us to do evil? That is hard to say," Silber tells WebMD. "The willingness of us to be conformists, not to rock the boat, rationalizing we are not really going to hurt anybody. I don't think Americans are any less likely to act in reprehensible ways under some conditions than Germans or other people."
Barrel of Evil: The Stanford Prison Experiment
Ten years after Milgram's first experiment, Stanford University psychologist Philip Zimbardo conducted the Stanford Prison Experiment.
Zimbardo recruited 24 psychologically normal, drug-free, healthy college men and randomly assigned them to be "prisoners" or "guards." The "guards" were given uniforms, sunglasses, and nightsticks. They were given vague instructions to make the "prisoners" feel helpless. The Stanford psychology department was turned into a makeshift prison, and the "prisoners" were arrested by real police, given numbers, dressed in humiliating smocks, and handed over to the guards.
Within days, the make-believe guards became increasingly brutal. The two-week experiment was halted after six days. Zimbardo himself became caught up in the role of "prison superintendent." He ended the experiment only after the objections of a colleague brought him to his senses.
"The behavior was uncannily similar to Abu Ghraib," Blass says. "They put hoods over the prisoners' heads. They sexually abused them. It shows that the circumstances you find yourself in have a surprising effect on your behavior. In the Zimbardo experiment, it was the brutalizing prison environment and peer pressure. Even those who were not as brutal as others did not do anything to break the solidarity. Even the good guards were not that good. They were just passively not engaged."
Zimbardo, a recent past president of the American Psychological Association, also makes the connection to Abu Ghraib.
"We must separate guilt from blame," he wrote in a recent New York Times editorial. "Should these few Army reservists be blamed as the 'bad apples' in a good barrel of American soldiers, as our leaders have characterized them? Or are they the once-good apples soured and corrupted by an evil barrel? I argue for the latter perspective after having studied the psychology of evil for many decades."
Strozier notes that all Americans now find themselves in a kind of evil barrel. Before the events of 9/11, he says, the idea of torture was alien to Americans. Now even civil libertarians such as Alan Dershowitz argue for legalizing torture.
"There are two things one can say about torture," Strozier says. "One is that it doesn't work. Torturers brutalize their own society and enrage the citizens of other nations. And two, it happens to be wrong."
How to Resist Evil
"It may be that we are puppets -- puppets controlled by the strings of society. But at least we are puppets with perception, with awareness. And perhaps our awareness is the first step to our liberation." -- Psychologist Stanley Milgram
Even when immersed in a barrel of evil, some people do not sour. Often overlooked in the Milgram experiment are those who resisted evil orders.
One of those people is Joseph Dimow, now a columnist and editorial board member for Jewish Currents magazine. Dimow was 41 when he signed up for the 1961 experiment.
How did he resist?
"I'm not certain about it, but I would attribute it to my upbringing, background, education -- things of that sort," Dimow tells WebMD. "And I think something happening in one's life that makes them, not an outsider, but skeptical of going along with the crowd. And it's probably helpful to have some determination to think in an unorthodox manner and to question assumptions."
Dimow says his parents told him at an early age that while he should listen to his teachers, he should know that teachers aren't always right. He taught this lesson to his own children.
While having the right kind of childhood can help, there are concrete things one can do to resist evil.
Blass says that if you find yourself in a situation where you are pressed to carry out acts you find distasteful -- or worse -- there are a number of things you can do:
Question the authority's legitimacy. "That concept is central," Blass says. "We often give too wide a berth to people who project authority, who have a commanding presence. We allow them to direct our behavior even in a domain not relevant to their authority."
Stop early. Distasteful acts often have an escalating quality. Once you start doing something you find objectionable, you're on a slippery slope. Stopping at a later point would mean admitting that what you did before was wrong -- so it becomes increasingly difficult to break off. "So if someone asks you to do something slightly wrong, don't do it," Blass says. "If it smells fishy and you are already uneasy about it, the chances are it will get worse. If you stop right now, the chances are you can pull out of it earlier rather than later."
Find an ally. "If two people get together and say, 'This is wrong,' that would empower them to withstand the pressures to behave abominably," Blass says. "It is much more difficult to withstand when you are the lone dissenter. Recruitment of allies is immensely liberating."
Training is important. "Part of the trap is that avenues for disobedience aren't always clear," Blass says. He notes that good military training includes learning exactly what to do when given a command one feels is illegal.
Work to strengthen your inner principles. "To the extent a person has strong, well-defined principles, the more likely that person is to speak up and say, 'That is wrong,'" Silber says.
Teach your children. "Pay attention to what you are communicating to your kids," Silber says. "And be sure you behave in front of them the way you say they should."
Published May 21, 2004.
SOURCES: Charles B. Strozier, PhD, director, center on terrorism and public safety, John Jay College of Criminal Justice, New York. Thomas Blass, PhD, professor of psychology at University of Maryland Baltimore County; author, The Man Who Shocked the World: The Life and Legacy of Stanley Milgram. David Silber, PhD, professor emeritus of psychology, George Washington University, Washington, D.C. Joseph Dimow, columnist and editorial board member, Jewish Currents magazine. The New York Times, May 9, 2004.
©1996-2005 WebMD Inc. All rights reserved.