Saturday, July 26, 2008

sheep II


This is about how normal people come to commit torture, genocide, and a multitude of lesser sins when circumstances are created in which those actions become normal. It's taken, slightly abridged, from Philip Zimbardo's The Lucifer Effect, pp. 273-275, bold emphasis mine.


Ten Lessons From the Milgram Studies: Creating Evil Traps for Good People

Let's outline some of the procedures in this research paradigm that seduced many ordinary citizens to engage in this apparently harmful behavior. In doing so, I want to draw parallels to compliance strategies used by "influence professionals" in real-world settings, such as salespeople, cult and military recruiters, media advertisers, and others. There are ten methods we can extract from Milgram's paradigm for this purpose:

1. Prearranging some form of contractual obligation, verbal or written, to control the individual's behavior in pseudolegal fashion. (In Milgram's experiment, this was done by publicly agreeing to accept the tasks and the procedures.)

2. Giving participants meaningful roles to play ("teacher," "learner") that carry with them previously learned positive values and automatically activate response scripts.

3. Presenting basic rules to be followed that seem to make sense before their actual use but can then be used arbitrarily and impersonally to justify mindless compliance. Also, systems control people by making their rules vague and changing them as necessary but insisting that "rules are rules" and thus must be followed.

4. Altering the semantics of the act, the actor, and the action (from "hurting victims" to "helping the experimenter," punishing the former for the lofty goals of scientific discovery)--replacing unpleasant reality with desirable rhetoric, gilding the frame so that the real picture is disguised.

5. Creating opportunities for the diffusion of responsibility or abdication of responsibility for negative outcomes: others will be responsible, or the actor won't be held liable. (In Milgram's experiment, the authority figure said, when questioned by any "teacher," that he would take responsibility for anything that happened to the "learner.")

6. Starting the path toward the ultimate evil act with a small, seemingly insignificant first step, the easy "foot in the door" that swings open subsequent greater compliance pressures, and leads down a slippery slope. (In the obedience study, the initial shock was only a mild 15 volts.)

7. Having successively increasing steps on the pathway that are gradual, so that they are hardly noticeably different from one's most recent prior action. "Just a little bit more." (By increasing each level of aggression in gradual steps of only 15-volt increments, over the thirty switches, no new level of harm seemed like a noticeable difference from the prior level to Milgram's participants.)

8. Gradually changing the nature of the authority figure (the researcher, in Milgram's study) from initially "just" and reasonable to "unjust" and demanding, even irrational. This tactic elicits initial compliance and later confusion, since we expect consistency from authorities and friends. Not acknowledging that this transformation has occurred leads to mindless obedience (and it is part of many "date rape" scenarios and a reason why abused women stay with abusing spouses).

9. Making the "exit costs" high and making the process of exiting difficult by allowing verbal dissent (which makes people feel better about themselves) while insisting on behavioral compliance.

10. Offering an ideology, or a big lie, to justify the use of any means to achieve the seemingly desirable, essential goal. (In Milgram's research this came in the form of providing an acceptable justification, or rationale, for engaging in the undesirable action, such as that science wants to help people improve their memory by judicious use of reward and punishment.) In social psychology experiments, this tactic is known as the "cover story" because it is a cover-up for the procedures that follow, which might be challenged because they do not make sense on their own. The real-world equivalent is known as an "ideology." Most nations rely on an ideology, typically "threats to national security," before going to war or suppressing dissident political opposition. When citizens fear that their national security is being threatened, they become willing to surrender their basic freedoms to a government that offers them that exchange. Erich Fromm's classic analysis Escape from Freedom made us aware of this trade-off, which Hitler and other dictators have long used to gain and maintain power: namely, the claim that they will be able to provide security in exchange for citizens giving up their freedoms, which will give them the ability to control things better.

Such procedures are utilized in varied influence situations where those in authority want others to do their bidding but know that few would engage in the "end game" without first being properly prepared psychologically to do the "unthinkable." In the future, when you are in a compromising position where your compliance is at stake, thinking back to those stepping-stones to mindless obedience may enable you to step back and not go all the way down the path--their path. A good way to avoid crimes of obedience is to assert one's personal authority and always take full responsibility for one's actions.


P.S. On an entirely unrelated note--happy birthday to Tony and Toad. :)

2 comments:

Anonymous said...

Hmm.... some tricks I need to remember!

Klari said...

I'd like to see this list re-written for easier understanding for those with smaller vocabularies. If it could be published in a more accessible form, its power to spark change would increase dramatically.