
Contesting the “Nature” of Conformity: What Milgram and Zimbardo's Studies Really Show

Abstract

Understanding of the psychology of tyranny is dominated by classic studies from the 1960s and 1970s: Milgram's research on obedience to authority and Zimbardo's Stanford Prison Experiment. Supporting popular notions of the banality of evil, this research has been taken to show that people conform passively and unthinkingly to both the instructions and the roles that authorities provide, however malevolent these may be. Recently, though, this consensus has been challenged by empirical work informed by social identity theorizing. This suggests that individuals' willingness to follow authorities is conditional on identification with the authority in question and an associated belief that the authority is right.

Introduction

If men make war in slavish obedience to rules, they will fail.

Ulysses S. Grant [1]

Conformity is often criticized on grounds of morality. Many, if not all, of the greatest human atrocities have been described as “crimes of obedience” [2]. However, as the victorious American Civil War General and later President Grant makes clear, conformity is equally problematic on grounds of efficacy. Success requires leaders and followers who do not adhere rigidly to a pre-determined script. Rigidity cannot steel them for the challenges of their task or for the creativity of their opponents.

Given these problems, it would seem even more unfortunate if human beings were somehow programmed for conformity. Yet this is a view that has become dominant over the last half-century. Its influence can be traced to two landmark empirical programs led by social psychologists in the 1960s and early 1970s: Milgram's Obedience to Authority research and Zimbardo's Stanford Prison Experiment. These studies have not only had influence in academic spheres. They have spilled over into our general culture and shaped popular understanding, such that “everyone knows” that people inevitably succumb to the demands of authority, however immoral the consequences [3],[4]. As Parker puts it, “the hopeless moral of the [studies'] story is that resistance is futile” [5]. What is more, this work has shaped our understanding not only of conformity but of human nature more broadly [6].

Building on an established body of theorizing in the social identity tradition—which sees group-based influence as meaningful and conditional [7],[8]—we argue, however, that these understandings are mistaken. Moreover, we contend that evidence from the studies themselves (as well as from subsequent research) supports a very different analysis of the psychology of conformity.

The Classic Studies: Conformity, Obedience, and the Banality of Evil

In Milgram's work [9],[10], members of the general public (predominantly men) volunteered to take part in a scientific study of memory. They found themselves cast in the role of a “Teacher” with the task of administering shocks of increasing magnitude (from 15 V to 450 V in 15-V increments) to another man (the “Learner”) every time he failed to recall the correct word in a previously learned pair. Unbeknown to the Teacher, the Learner was Milgram's confederate, and the shocks were not real. Moreover, rather than being interested in memory, Milgram was actually interested in seeing how far the men would go in carrying out the task. To his—and everyone else's [11]—shock, the answer was “very far.” In what came to be termed the “baseline” study [12], all participants proved willing to administer shocks of 300 V, and 65% went all the way to 450 V. This appeared to provide compelling evidence that normal, well-adjusted men would be willing to kill a complete stranger simply because they were ordered to do so by an authority.

Zimbardo's Stanford Prison Experiment took these ideas further by exploring the destructive behavior of groups of men over an extended period [13],[14]. Students were randomly assigned to be either guards or prisoners within a mock prison that had been constructed in the Stanford Psychology Department. In contrast to Milgram's studies, the objective was to observe the interaction within and between the two groups in the absence of an obviously malevolent authority. Here, again, the results proved shocking. Such was the abuse meted out to the prisoners by the guards that the study had to be terminated after just 6 days. Zimbardo's conclusion from this was even more alarming than Milgram's. People descend into tyranny, he suggested, because they conform unthinkingly to the toxic roles that authorities prescribe without the need for specific orders: brutality was “a ‘natural’ consequence of being in the uniform of a ‘guard’ and asserting the power inherent in that role” [15].

Within psychology, Milgram and Zimbardo helped consolidate a growing “conformity bias” [16] in which the focus on compliance is so strong as to obscure evidence of resistance and disobedience [17]. However, their arguments proved particularly potent because they seemed to mesh with real-world examples—particularly evidence of the “banality of evil.” This term was coined in Hannah Arendt's account of the trial of Adolf Eichmann [18], a chief architect of the Nazis' “final solution to the Jewish question” [19]. Although Eichmann was responsible for the transportation of millions of people to their deaths, Arendt suggested that he was no psychopathic monster. Instead, his trial revealed him to be a diligent and efficient bureaucrat—a man more concerned with following orders than with asking deep questions about their morality or consequence.

Much of the power of Milgram and Zimbardo's research derives from the fact that it appears to give empirical substance to this claim that evil is banal [3]. It seems to show that tyranny is a natural and unavoidable consequence of humans' inherent motivation to bend to the wishes of those in authority—whoever they may be and whatever it is that they want us to do. Put slightly differently, it operationalizes an apparent tragedy of the human condition: our desire to be good subjects is stronger than our desire to be subjects who do good.

Questioning the Consensus: Conformity Isn't Natural and It Doesn't Explain Tyranny

The banality of evil thesis appears to be a truth almost universally acknowledged. Not only is it given prominence in social psychology textbooks [20], but it also informs the thinking of historians [21],[22], political scientists [23], economists [24], and neuroscientists [25]. Indeed, via a range of social commentators, it has shaped the public consciousness much more broadly [26] and, in this respect, can lay claim to being the most influential data-driven thesis in the whole of psychology [27],[28].

Yet despite the breadth of this consensus, in recent years, we and others have reinterrogated its two principal underpinnings—the archival evidence pertaining to Eichmann and his ilk, and the specifics of Milgram and Zimbardo's empirical demonstrations—in ways that tell a very different story [29].

First, a series of thoroughgoing historical examinations have challenged the idea that Nazi bureaucrats were ever simply following orders [19],[26],[30]. This may have been the defense they relied upon when seeking to minimize their culpability [31], but evidence suggests that functionaries like Eichmann had a very good understanding of what they were doing and took pride in the energy and application that they brought to their work. Typically too, roles and orders were vague, and hence for those who wanted to advance the Nazi cause (and not all did), creativity and imagination were required in order to work towards the regime's assumed goals and to overcome the challenges associated with any given task [32]. Emblematic of this, the practical details of “the final solution” were not handed down from on high, but had to be elaborated by Eichmann himself. He then felt compelled to confront and disobey his superiors—most particularly Himmler—when he believed that they were not sufficiently faithful to eliminationist Nazi principles [19].

Second, much the same analysis can be used to account for behavior in the Stanford Prison Experiment. So while it may be true that Zimbardo gave his guards no direct orders, he certainly gave them a general sense of how he expected them to behave [33]. During the orientation session he told them, amongst other things, “You can create in the prisoners feelings of boredom, a sense of fear to some degree, you can create a notion of arbitrariness that their life is totally controlled by us, by the system, you, me… We're going to take away their individuality in various ways. In general what all this leads to is a sense of powerlessness” [34]. This contradicts Zimbardo's assertion that “behavioral scripts associated with the oppositional roles of prisoner and guard [were] the sole source of guidance” [35] and leads us to question the claim that conformity to these role-related scripts was the primary cause of guard brutality.

But even with such guidance, not all guards acted brutally. And those who did used ingenuity and initiative in responding to Zimbardo's brief. Accordingly, after the experiment was over, one prisoner confronted his chief tormentor with the observation that “If I had been a guard I don't think it would have been such a masterpiece” [34]. Contrary to the banality of evil thesis, the Zimbardo-inspired tyranny was made possible by the active engagement of enthusiasts rather than the leaden conformity of automatons.

Turning, third, to the specifics of Milgram's studies, the first point to note is that the primary dependent measure (flicking a switch) offers few opportunities for creativity in carrying out the task. Nevertheless, several of Milgram's findings typically escape standard reviews in which the paradigm is portrayed as only yielding evidence of obedience. To begin with, it is clear that the “baseline study” is not especially typical of the 30 or so variants of the paradigm that Milgram conducted. Across these variants, the percentage of participants going to 450 V ranged from 0% to nearly 100%, but across the studies as a whole, a majority of participants chose not to go this far [10],[36],[37].

Furthermore, close analysis of the experimental sessions shows that participants are attentive to the demands made on them by the Learner as well as the Experimenter [38]. They are torn between two voices confronting them with irreconcilable moral imperatives, and the fact that they have to choose between them is a source of considerable anguish. They sweat, they laugh, they try to talk and argue their way out of the situation. But the experimental set-up does not allow them to do so. Ultimately, they tend to go along with the Experimenter if he justifies their actions in terms of the scientific benefits of the study (as he does with the prod “The experiment requires that you continue”) [39]. But if he gives them a direct order (“You have no other choice, you must go on”) participants typically refuse. Once again, received wisdom proves questionable. The Milgram studies seem to be less about people blindly conforming to orders than about getting people to believe in the importance of what they are doing [40].

Tyranny as a Product of Identification-Based Followership

Our suspicions about the plausibility of the banality of evil thesis and its various empirical substrates were first raised through our work on the BBC Prison Study (BPS) [41]. Like the Stanford study, the BPS randomly assigned men to groups as guards and prisoners and examined their behavior within a specially created “prison.” Unlike Zimbardo, however, we took no leadership role in the study. Without such leadership, would participants conform to a hierarchical script or resist it?

The study generated three clear findings. First, participants did not conform automatically to their assigned role. Second, they acted in terms of group membership only to the extent that they actively identified with the group (such that they took on a social identity) [42]. Third, group identity did not mean that people simply accepted their assigned position; instead, it empowered them to resist it. Early in the study, the prisoners' identification as a group allowed them successfully to challenge the authority of the guards and create a more egalitarian system. Later on, though, a highly committed group emerged out of dissatisfaction with this system and conspired to create a new hierarchy that was far more draconian.

Ultimately, then, the BBC Prison Study came close to recreating the tyranny of the Stanford Prison Experiment. However, it was neither passive conformity to roles nor blind obedience to rules that brought the study to this point. On the contrary, it was only when participants had internalized roles and rules as aspects of a system with which they identified that they used them as a guide to action. Moreover, on the basis of this shared identification, the hallmark of the tyrannical regime was not conformity but creative leadership and engaged followership within a group of true believers (see also [43],[44]). As we have seen, this analysis mirrors recent conclusions about the Nazi tyranny. To complete the argument, we suggest that it is also applicable to Milgram's paradigm.

The evidence, noted above, about the efficacy of different “prods” already points to the fact that compliance is bound up with a sense of commitment to the experiment and the Experimenter over and above commitment to the Learner (SA Haslam, SD Reicher, M Birney, unpublished data) [39]. This use of prods is but one aspect of Milgram's careful management of the paradigm [12] that is aimed at securing participants' identification with the scientific enterprise.

Significantly, though, the degree of identification is not constant across all variants of the study. For instance, when the study is conducted in commercial premises as opposed to prestigious Yale University labs, one might expect identification with the scientific enterprise to diminish and (as our argument implies) compliance to decrease. It does. More systematically, we have examined variations in participants' identification with the Experimenter and the science that he represents as opposed to their identification with the Learner and the general community. They always identify with both to some degree—hence the drama and the tension of the paradigm. But the degree matters, and greater identification with the Experimenter is highly predictive of a greater willingness among Milgram's participants to administer the maximum shock across the paradigm's many variants [37].

However, some of the most compelling evidence that participants' administration of shocks results from their identification with Milgram's scientific goals comes from what happened after the study had ended. In his debriefing, Milgram praised participants for their commitment to the advancement of science, especially as it had come at the cost of personal discomfort. This not only inoculated them against doubts concerning their own punitive actions but also led them to support more such actions in the future. “I am happy to have been of service,” one typical participant responded. “Continue your experiments by all means as long as good can come of them. In this crazy mixed up world of ours, every bit of goodness is needed” (SA Haslam, SD Reicher, K Millward, R MacDonald, unpublished data).

Conclusion

The banality of evil thesis shocks us by claiming that decent people can be transformed into oppressors as a result of their “natural” conformity to the roles and rules handed down by authorities. More particularly, the inclination to conform is thought to suppress oppressors' ability to engage intellectually with the fact that what they are doing is wrong.

Although it remains highly influential, this thesis loses credibility under close empirical scrutiny. On the one hand, it ignores copious evidence of resistance even in studies held up as demonstrating that conformity is inevitable [17]. On the other hand, it ignores the evidence that those who do heed authority in doing evil do so knowingly not blindly, actively not passively, creatively not automatically. They do so out of belief not by nature, out of choice not by necessity. In short, they should be seen—and judged—as engaged followers not as blind conformists [45].

What was truly frightening about Eichmann was not that he was unaware of what he was doing, but rather that he knew what he was doing and believed it to be right. Indeed, his one regret, expressed prior to his trial, was that he had not killed more Jews [19]. Equally, what is shocking about Milgram's experiments is that rather than being distressed by their actions [46], participants could be led to construe them as “service” in the cause of “goodness.”

To understand tyranny, then, we need to transcend the prevailing orthodoxy that it derives from something to which humans are naturally inclined—a “Lucifer effect” to which they succumb thoughtlessly and helplessly (and for which, therefore, they cannot be held accountable). Instead, we need to understand two sets of inter-related processes: those by which authorities advocate oppression of others and those that lead followers to identify with these authorities. How did Milgram and Zimbardo justify the harmful acts they required of their participants, and why did participants identify with them—some more than others?

These questions are complex, and full answers fall beyond the scope of this essay. Yet, regarding advocacy, it is striking how destructive acts were presented as constructive, particularly in Milgram's case, where scientific progress was the warrant for abuse. Regarding identification, several elements appear relevant: the personal histories of individuals that render some group memberships more plausible than others as a source of self-definition; the relationship between the identities on offer in the immediate context and other identities that are held and valued in other contexts; and the structure of the local context that makes certain ways of orienting oneself to the social world seem more “fitting” than others [41],[47],[48].

At root, the point is that tyranny does not flourish because perpetrators are helpless and ignorant of their actions. It flourishes because they actively identify with those who promote vicious acts as virtuous [49]. It is this conviction that steels participants to do their dirty work and that makes them work energetically and creatively to ensure its success. Moreover, this is work for which they actively wish to be held accountable—so long as it secures the approbation of those in power.

References

  1. Strachan H (1983) European armies and the conduct of war. London: Unwin Hyman. p. 3.
  2. Kelman HC, Hamilton VL (1990) Crimes of obedience. New Haven: Yale University Press.
  3. Novick P (1999) The Holocaust in American life. Boston: Houghton Mifflin.
  4. Jetten J, Hornsey MJ, editors (2011) Rebels in groups: dissent, deviance, difference and defiance. Chichester, UK: Wiley-Blackwell.
  5. Parker I (2007) Revolution in social psychology: alienation to emancipation. London: Pluto Press. p. 84.
  6. Smith JR, Haslam SA, editors (2012) Social psychology: revisiting the classic studies. London: Sage.
  7. Turner JC (1991) Social influence. Buckingham, UK: Open University Press.
  8. Turner JC, Hogg MA, Oakes PJ, Reicher SD, Wetherell MS (1987) Rediscovering the social group: a self-categorization theory. Oxford: Blackwell.
  9. Milgram S (1963) Behavioral study of obedience. J Abnorm Soc Psychol 67: 371–378.
  10. Milgram S (1974) Obedience to authority: an experimental view. New York: Harper & Row.
  11. Blass T (2004) The man who shocked the world: the life and legacy of Stanley Milgram. New York: Basic Books.
  12. Russell NJ (2011) Milgram's obedience to authority experiments: origins and early evolution. Br J Soc Psychol 50: 140–162.
  13. Haney C, Banks C, Zimbardo P (1973) A study of prisoners and guards in a simulated prison. Nav Res Rev September: 1–17. Washington (D.C.): Office of Naval Research. p. 11.
  14. Zimbardo P (2007) The Lucifer effect: how good people turn evil. London, UK: Random House.
  15. Haney C, Banks C, Zimbardo P (1973) A study of prisoners and guards in a simulated prison. Nav Res Rev September: 1–17. Washington (D.C.): Office of Naval Research. p. 12.
  16. Moscovici S (1976) Social influence and social change. London, UK: Academic Press.
  17. Haslam SA, Reicher SD (2012) When prisoners take over the prison: a social psychology of resistance. Pers Soc Psychol Rev 16: 152–179.
  18. Arendt H (1963) Eichmann in Jerusalem: a report on the banality of evil. New York: Penguin.
  19. Cesarani D (2004) Eichmann: his life and crimes. London: Heinemann.
  20. Miller A (2004) What can the Milgram obedience experiments tell us about the Holocaust? Generalizing from the social psychology laboratory. In: Miller A, editor. The social psychology of good and evil. New York: Guilford. pp. 193–239.
  21. Browning C (1992) Ordinary men: Reserve Police Battalion 101 and the Final Solution in Poland. London: Penguin Books.
  22. Overy R (2011) Milgram and the historians. Psychologist 24: 662–663.
  23. Helm C, Morelli M (1979) Stanley Milgram and the obedience experiment: authority, legitimacy, and human action. Polit Theory 7: 321–346.
  24. Akerlof GA (1991) Procrastination and obedience. Am Econ Rev 81: 1–19.
  25. Harris LT (2009) The influence of social group and context on punishment decisions: insights from social neuroscience. Gruter Institute Squaw Valley Conference, May 21, 2009: Law, Behavior & the Brain. Available at SSRN: http://ssrn.com/abstract=1405319.
  26. Lozowick Y (2002) Hitler's bureaucrats: the Nazi Security Police and the banality of evil. Watzman H, translator. London: Continuum.
  27. Blass T, editor (2000) Obedience to authority: current perspectives on the Milgram paradigm. Mahwah (New Jersey): Erlbaum.
  28. Benjamin LT, Simpson JA (2009) The power of the situation: the impact of Milgram's obedience studies on personality and social psychology. Am Psychol 64: 12–19.
  29. Haslam SA, Reicher SD (2007) Beyond the banality of evil: three dynamics of an interactionist social psychology of tyranny. Pers Soc Psychol Bull 33: 615–622.
  30. Vetlesen AJ (2005) Evil and human agency: understanding collective evildoing. Cambridge: Cambridge University Press.
  31. Mandel DR (1998) The obedience alibi: Milgram's account of the Holocaust reconsidered. Analyse und Kritik 20: 74–94.
  32. Kershaw I (1993) Working towards the Führer: reflections on the nature of the Hitler dictatorship. Contemp Eur Hist 2: 103–118.
  33. Banyard P (2007) Tyranny and the tyrant. Psychologist 20: 494–495.
  34. Zimbardo P (1989) Quiet rage: the Stanford prison study [video]. Stanford: Stanford University.
  35. Zimbardo P (2004) A situationist perspective on the psychology of evil: understanding how good people are transformed into perpetrators. In: Miller A, editor. The social psychology of good and evil. New York: Guilford. pp. 21–50.
  36. Milgram S (1965) Some conditions of obedience and disobedience to authority. Hum Relat 18: 57–76.
  37. Reicher SD, Haslam SA, Smith JR (2012) Working towards the experimenter: reconceptualizing obedience within the Milgram paradigm as identification-based followership. Perspect Psychol Sci 7: 315–324.
  38. Packer DJ (2008) Identifying systematic disobedience in Milgram's obedience experiments: a meta-analytic review. Perspect Psychol Sci 3: 301–304.
  39. Burger JM, Girgis ZM, Manning CM (2011) In their own words: explaining obedience to authority through an examination of participants' comments. Soc Psychol Personal Sci 2: 460–466.
  40. Reicher SD, Haslam SA (2011) After shock? Towards a social identity explanation of the Milgram ‘obedience’ studies. Br J Soc Psychol 50: 163–169.
  41. Reicher SD, Haslam SA (2006) Rethinking the psychology of tyranny: the BBC prison study. Br J Soc Psychol 45: 1–40.
  42. Tajfel H, Turner JC (1979) An integrative theory of intergroup conflict. In: Austin WG, Worchel S, editors. The social psychology of intergroup relations. Monterey (California): Brooks/Cole. pp. 33–47.
  43. Packer DJ (2008) On being both with us and against us: a normative conflict model of dissent in social groups. Pers Soc Psychol Rev 12: 50–72.
  44. Packer DJ, Chasteen AL (2010) Loyal deviance: testing the normative conflict model of dissent in social groups. Pers Soc Psychol Bull 36: 5–18.
  45. Haslam SA, Reicher SD, Platow MJ (2008) The new psychology of leadership: identity, influence and power. Hove, UK: Psychology Press.
  46. Milgram S (1964) Issues in the study of obedience: a reply to Baumrind. Am Psychol 19: 848–852.
  47. Bruner JS (1957) On perceptual readiness. Psychol Rev 64: 123–152.
  48. Oakes PJ, Haslam SA, Turner JC (1994) Stereotyping and social reality. Oxford: Blackwell.
  49. Reicher SD, Haslam SA, Rath R (2008) Making a virtue of evil: a five-step model of the development of collective hate. Soc Personal Psychol Compass 2: 1313–1344.