Stanley Milgram Shock Experiment

Saul McLeod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul McLeod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.

Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.

Stanley Milgram, a psychologist at Yale University, carried out one of the most famous studies of obedience in psychology.

He conducted an experiment focusing on the conflict between obedience to authority and personal conscience.

Milgram (1963) examined justifications for acts of genocide offered by those accused at the Nuremberg war crimes trials after World War II. Their defense was often based on obedience: they claimed they were just following orders from their superiors.

The experiments began in July 1961, three months after the start of the trial of Adolf Eichmann in Jerusalem. Milgram devised the experiment to answer the question:

“Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?” (Milgram, 1974).

Milgram (1963) wanted to investigate whether Germans were particularly obedient to authority figures, as this was a common explanation for the Nazi killings in World War II.

Milgram recruited participants for his experiment by placing a newspaper advertisement seeking male volunteers to take part in a study of learning at Yale University.

The procedure was that the participant was paired with another person and they drew lots to find out who would be the ‘learner’ and who would be the ‘teacher.’  The draw was fixed so that the participant was always the teacher, and the learner was one of Milgram’s confederates (pretending to be a real participant).

The learner (a confederate called Mr. Wallace) was taken into a room and had electrodes attached to his arms, and the teacher and researcher went into a room next door that contained an electric shock generator and a row of switches marked from 15 volts (Slight Shock) to 375 volts (Danger: Severe Shock) to 450 volts (XXX).

The shocks in Stanley Milgram’s obedience experiments were not real. The “learners” were actors who were part of the experiment and did not actually receive any shocks.

However, the “teachers” (the real participants of the study) believed the shocks were real, which was crucial for the experiment to measure obedience to authority figures even when it involved causing harm to others.

Milgram’s Experiment (1963)

Milgram (1963) was interested in researching how far people would go in obeying an instruction if it involved harming another person.

Stanley Milgram was interested in how easily ordinary people could be influenced into committing atrocities, for example, Germans in WWII.

Volunteers were recruited for a controlled experiment they were told was investigating “learning” (this deception is one of the study’s ethical issues).

Participants were 40 males, aged between 20 and 50, whose jobs ranged from unskilled to professional, from the New Haven area. They were paid $4.50 for just turning up.

At the beginning of the experiment, they were introduced to another participant, a confederate of the experimenter (Milgram).

They drew straws to determine their roles – learner or teacher – although this was fixed, and the confederate was always the learner. There was also an “experimenter” dressed in a gray lab coat, played by an actor (not Milgram).

Two rooms in the Yale Interaction Laboratory were used – one for the learner (with an electric chair) and another for the teacher and experimenter with an electric shock generator.

The “learner” (Mr. Wallace) was strapped to a chair with electrodes.

After the learner had studied a list of word pairs, the “teacher” tested him by naming a word and asking the learner to recall its partner from a list of four possible choices.

The teacher was told to administer an electric shock every time the learner made a mistake, increasing the level of shock each time. There were 30 switches on the shock generator, marked from 15 volts (slight shock) to 450 volts (danger – severe shock).

The learner gave mainly wrong answers (on purpose), and for each mistake, the teacher gave him an electric shock. When the teacher refused to administer a shock, the experimenter gave a series of orders/prods to ensure they continued.

There were four prods, and if one was not obeyed, then the experimenter (Mr. Williams) read out the next prod, and so on.

  • Prod 1: Please continue.
  • Prod 2: The experiment requires that you continue.
  • Prod 3: It is absolutely essential that you continue.
  • Prod 4: You have no other choice, you must go on.

These prods were to be used in order, and begun afresh for each new attempt at defiance (Milgram, 1974, p. 21). The experimenter also had two special prods available. These could be used as required by the situation:

  • ‘Although the shocks may be painful, there is no permanent tissue damage, so please go on’ (ibid.).
  • ‘Whether the learner likes it or not, you must go on until he has learned all the word pairs correctly. So please go on’ (ibid., p. 22).

65% (two-thirds) of participants (i.e., teachers) continued to the highest level of 450 volts. All the participants continued to 300 volts.

Milgram did more than one experiment – he carried out 18 variations of his study.  All he did was alter the situation (IV) to see how this affected obedience (DV).

Conclusion 

An individual (dispositional) explanation of the participants’ behavior would be that something about them as people caused them to obey; a more realistic (situational) explanation is that features of the situation they were in caused them to behave as they did.

Some aspects of the situation that may have influenced their behavior include the formality of the location, the behavior of the experimenter, and the fact that it was an experiment for which they had volunteered and been paid.

Ordinary people are likely to follow orders given by an authority figure, even to the extent of killing an innocent human being.  Obedience to authority is ingrained in us all from the way we are brought up.

People tend to obey orders from other people if they recognize their authority as morally right and/or legally based. This response to legitimate authority is learned in a variety of situations, for example in the family, school, and workplace.

Milgram summed up his findings in the article “The Perils of Obedience” (Milgram, 1974), writing:

“The legal and philosophic aspects of obedience are of enormous import, but they say very little about how most people behave in concrete situations. I set up a simple experiment at Yale University to test how much pain an ordinary citizen would inflict on another person simply because he was ordered to by an experimental scientist. Stark authority was pitted against the subjects’ [participants’] strongest moral imperatives against hurting others, and, with the subjects’ [participants’] ears ringing with the screams of the victims, authority won more often than not. The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.”

Milgram’s Agency Theory

Milgram (1974) explained the behavior of his participants by suggesting that people have two states of behavior when they are in a social situation:

  • The autonomous state – people direct their own actions, and they take responsibility for the results of those actions.
  • The agentic state – people allow others to direct their actions and then pass off the responsibility for the consequences to the person giving the orders. In other words, they act as agents for another person’s will.

Milgram suggested that two things must be in place for a person to enter the agentic state:

  • The person giving the orders is perceived as being qualified to direct other people’s behavior. That is, they are seen as legitimate.
  • The person being ordered about is able to believe that the authority will accept responsibility for what happens.

According to Milgram, when in this agentic state, the participant in the obedience studies “defines himself in a social situation in a manner that renders him open to regulation by a person of higher status. In this condition the individual no longer views himself as responsible for his own actions but defines himself as an instrument for carrying out the wishes of others” (Milgram, 1974, p. 134).

Agency theory says that people will obey an authority when they believe that the authority will take responsibility for the consequences of their actions. This is supported by some aspects of Milgram’s evidence.

For example, when participants were reminded that they had responsibility for their own actions, almost none of them were prepared to obey.

In contrast, many participants who were refusing to go on continued if the experimenter said that he would take responsibility.

According to Milgram (1974, p. 188):

“The behavior revealed in the experiments reported here is normal human behavior but revealed under conditions that show with particular clarity the danger to human survival inherent in our make-up.

And what is it we have seen? Not aggression, for there is no anger, vindictiveness, or hatred in those who shocked the victim….

Something far more dangerous is revealed: the capacity for man to abandon his humanity, indeed, the inevitability that he does so, as he merges his unique personality into larger institutional structures.”

Milgram Experiment Variations

Milgram (1965) carried out many variations of the basic procedure, changing the situation (the IV) to identify which factors affected obedience (the DV).

Obedience was measured by how many participants shocked to the maximum 450 volts (65% in the original study). Stanley Milgram conducted a total of 23 variations (also called conditions or experiments) of his original obedience study; across the 18 variations he reported, a total of 636 participants were tested between 1961 and 1962 at Yale University.

In the original baseline study, the experimenter wore a gray lab coat to symbolize his authority (a kind of uniform). The lab coat served as a crucial symbol of scientific authority: it conveyed expertise and legitimacy, making participants see the experimenter as more credible and trustworthy, and thereby increased obedience.

Milgram carried out a variation in which the experimenter was called away because of a phone call right at the start of the procedure.

The role of the experimenter was then taken over by an ‘ordinary member of the public’ (a confederate) in everyday clothes rather than a lab coat. The obedience level dropped to 20%.

Change of Location:  The Mountain View Facility Study (1963, unpublished)

Milgram conducted this variation in a set of offices in a rundown building, claiming it was associated with “Research Associates of Bridgeport” rather than Yale.

The lab’s ordinary appearance was designed to test whether Yale’s prestige encouraged obedience. Participants were led to believe that the experiment was being run by a private research firm.

In this non-university setting, obedience rates dropped to 47.5% compared to 65% in the original Yale experiments. This suggests that the status of location affects obedience.

Private research firms are viewed as less prestigious than elite universities, which affected participants’ behavior: it was easier under these conditions to abandon the belief in the experimenter’s essential decency.

The impressive university setting reinforced the experimenter’s authority and conveyed an implicit approval of the research.

Milgram filmed this variation for his documentary Obedience, but did not publish the results in his academic papers. The study only came to wider attention when archival materials, including his notes, films, and data, were studied by later researchers such as Perry (2013) in the decades after Milgram’s death.

Two Teacher Condition

When participants could instruct an assistant (confederate) to press the switches, 92.5% shocked to the maximum of 450 volts.

Allowing the participant to instruct an assistant to press the shock switches diffused personal responsibility and likely reduced perceptions of causing direct harm.

By attributing the actions to the assistant rather than themselves, participants could more easily justify shocking to the maximum 450 volts, reflected in the 92.5% obedience rate.

When there is less personal responsibility, obedience increases. This relates to Milgram’s Agency Theory.

Touch Proximity Condition

The teacher had to force the learner’s hand down onto a shock plate when the learner refused to participate after 150 volts. Obedience fell to 30%.

Forcing the learner’s hand onto the shock plate after 150 volts physically connected the teacher to the consequences of their actions. This direct tactile feedback increased the teacher’s personal responsibility.

No longer shielded from the learner’s reactions, the proximity enabled participants to more clearly perceive the harm they were causing, reducing obedience to 30%. Physical distance and indirect actions in the original setup made it easier to rationalize obeying the experimenter.

The participant is no longer buffered/protected from seeing the consequences of their actions.

Social Support Condition

When the two confederates set an example of defiance by refusing to continue the shocks, especially early on at 150 volts, it permitted the real participant also to resist authority.

Two other participants (confederates) were also teachers but refused to obey. Confederate 1 stopped at 150 volts, and Confederate 2 stopped at 210 volts.

Their disobedience provided social proof that it was acceptable to disobey. This modeling of defiance lowered obedience to only 10% compared to 65% without such social support. It demonstrated that social modeling can validate challenging authority.

The presence of others who are seen to disobey the authority figure reduces the level of obedience to 10%.

Absent Experimenter Condition 

It is easier to resist the orders from an authority figure if they are not close by. When the experimenter instructed and prompted the teacher by telephone from another room, obedience fell to 20.5%.

Many participants cheated and skipped shocks or gave less voltage than ordered by the experimenter. The proximity of authority figures affects obedience.

The physical absence of the authority figure enabled participants to act more freely on their own moral inclinations rather than the experimenter’s commands. This highlighted the role of an authority’s direct presence in influencing behavior.

Critical Evaluation

Inaccurate description of the prod methodology:

A key reason the obedience studies fascinate people is that Milgram (1974) presented them as a scientific experiment, casting himself as an “empirically grounded scientist” in contrast to philosophers. He claimed he systematically varied factors to alter obedience rates.

However, recent scholarship using archival records shows Milgram’s account of standardizing the procedure was misleading. For example, he published a list of standardized prods the experimenter used when participants questioned continuing. Milgram said these were delivered uniformly in a firm but polite tone (Gibson, 2013; Perry, 2013; Russell, 2010).

Perry’s (2013) archival research revealed another discrepancy between Milgram’s published account and the actual events. Milgram claimed standardized prods were used when participants resisted, but Perry’s audiotape analysis showed the experimenter often improvised more coercive prods beyond the supposed script.

This off-script prodding varied between experiments and participants, and was especially prevalent with female participants, among whom no gender difference in obedience was found – suggesting the improvisation influenced results. Gibson (2013) and Russell (2009) corroborated the experimenter’s departures from the supposedly fixed prods.

Prods were often combined or modified rather than used verbatim as published.

Russell speculated the improvisation aimed to achieve outcomes the experimenter believed Milgram wanted. Milgram seemed to tacitly approve of the deviations by not correcting them when observing.

This raises significant issues around experimenter bias influencing results, lack of standardization compromising validity, and ethical problems with Milgram misrepresenting procedures. The point is not that Milgram did poor science, but that the archival materials – participant feedback, Milgram’s notes, and recordings of the researchers’ actions – provide a fuller, messier picture than the obedience studies’ “official” story, and show how scientific reporting can polish findings in a way that strays from the less tidy reality.

Milgram’s experiment lacked external validity:

The Milgram studies were conducted in laboratory-type conditions, and we must ask if this tells us much about real-life situations.

We obey in a variety of real-life situations that are far more subtle than instructions to give people electric shocks, and it would be interesting to see what factors operate in everyday obedience. The sort of situation Milgram investigated would be more suited to a military context.

Orne and Holland (1968) accused Milgram’s study of lacking ‘experimental realism,’ i.e., participants might not have believed the experimental set-up they found themselves in and knew the learner wasn’t receiving electric shocks.

“It’s more truthful to say that only half of the people who undertook the experiment fully believed it was real, and of those, two-thirds disobeyed the experimenter,” observes Perry (2013, p. 139).

Milgram’s sample was biased:

  • The participants in Milgram’s study were all male. Do the findings transfer to females?
  • Milgram’s study cannot be seen as representative of the American population as his sample was self-selected. This is because they became participants only by electing to respond to a newspaper advertisement (selecting themselves).
  • They may also have a typical “volunteer personality” – not all the newspaper readers responded, so perhaps it takes this personality type to do so.

Yet a total of 636 participants were tested in 18 separate experiments across the New Haven area, which was seen as being reasonably representative of a typical American town.

Milgram’s findings have been replicated in a variety of cultures; most replications lead to the same conclusions as Milgram’s original study, and some report even higher obedience rates.

However, Smith and Bond (1998) point out that with the exception of Jordan (Shanab & Yahya, 1978), the majority of these studies have been conducted in industrialized Western cultures, and we should be cautious before we conclude that a universal trait of social behavior has been identified.

Selective reporting of experimental findings:

Perry (2013) found Milgram omitted findings from some obedience experiments he conducted, reporting only results supporting his conclusions. A key omission was the Relationship condition (conducted in 1962 but unpublished), where participant pairs were relatives or close acquaintances.

When the learner protested being shocked, most teachers disobeyed, contradicting Milgram’s emphasis on obedience to authority.

Perry argued Milgram likely did not publish this 85% disobedience rate because it undermined his narrative and would be difficult to defend ethically since the teacher and learner knew each other closely.

Milgram’s selective reporting biased interpretations of his findings. His failure to publish all his experiments raises issues around researchers’ ethical obligation to completely and responsibly report their results, not just those fitting their expectations.

Unreported analysis of participants’ skepticism and its impact on their behavior:

Perry (2013) found archival evidence that many participants expressed doubt about the experiment’s setup, impacting their behavior. This supports Orne and Holland’s (1968) criticism that Milgram overlooked participants’ perceptions.

Incongruities, such as apparent danger paired with an unconcerned experimenter, likely cued participants that no real harm would occur. Trust in Yale’s ethics reinforced this. Yet Milgram did not publish his assistant’s analysis showing that participant skepticism correlated with disobedience rates and varied by condition.

Obedient participants were more skeptical that the learner was harmed. This selective reporting biased interpretations. Additional unreported findings further challenge Milgram’s conclusions.

This highlights issues around thoroughly and responsibly reporting all results, not just those fitting expectations. It shows how archival evidence makes Milgram’s study a contentious classic with questionable methods and conclusions.

Ethical Issues

What are the potential ethical concerns associated with Milgram’s research on obedience?

Beyond their findings, Milgram’s obedience experiments sparked significant debate about the ethics of psychological research.

Baumrind (1964) criticized the ethics of Milgram’s research as participants were prevented from giving their informed consent to take part in the study. 

Participants assumed the experiment was benign and expected to be treated with dignity.

As a result of studies like Milgram’s, the APA and BPS now require researchers to give participants more information before they agree to take part in a study.

The participants actually believed they were shocking a real person and were unaware the learner was a confederate of Milgram’s.

However, Milgram argued that “illusion is used when necessary in order to set the stage for the revelation of certain difficult-to-get-at truths.”

Milgram also interviewed participants afterward to find out the effect of the deception. Apparently, 83.7% said that they were “glad to be in the experiment,” and 1.3% said that they wished they had not been involved.

Protection of participants 

Participants were exposed to extremely stressful situations that may have the potential to cause psychological harm. Many of the participants were visibly distressed (Baumrind, 1964).

Signs of tension included trembling, sweating, stuttering, laughing nervously, biting lips and digging fingernails into palms of hands. Three participants had uncontrollable seizures, and many pleaded to be allowed to stop the experiment.

Milgram described a businessman reduced to a “twitching, stuttering wreck” (1963, p. 377).

In his defense, Milgram argued that these effects were only short-term. Once the participants were debriefed (and could see the confederate was OK), their stress levels decreased.

“At no point,” Milgram (1964) stated, “were subjects exposed to danger and at no point did they run the risk of injurious effects resulting from participation” (p. 849).

To defend himself against criticisms about the ethics of his obedience research, Milgram cited follow-up survey data showing that 84% of participants said they were glad they had taken part in the study.

Milgram used this to claim that the study caused no serious or lasting harm, since most participants retrospectively did not regret their involvement.

Milgram also maintained that he debriefed all his participants straight after the experiment, disclosing the true nature of the study; that participants were assured their behavior was common; and that a follow-up a year later found no signs of long-term psychological harm. The majority of the participants (83.7%) said that they were pleased that they had participated, and 74% said they had learned something of personal importance.

Yet archival accounts show many participants endured lasting distress, even trauma, refuting Milgram’s insistence that the study caused only fleeting “excitement.” By not debriefing everyone properly, Milgram misled participants about the true risks involved (Perry, 2013).

Perry’s (2013) archival research found Milgram misrepresented debriefing – around 600 participants were not properly debriefed soon after the study, contrary to his claims. Many only learned that no real shocks had occurred when reading a mailed study report months later, which some may never have received.

Milgram likely misreported debriefing details to protect his credibility and enable future obedience research. This raises issues around properly informing and debriefing participants that connect to APA ethics codes developed partly in response to Milgram’s study.

Right to Withdraw

The BPS states that researchers should make it plain to participants that they are free to withdraw at any time (regardless of payment).

When participants expressed doubts, the experimenter assured them all was well. Trusting Yale scientists, many took the experimenter at his word that “no permanent tissue damage” would occur, and continued administering shocks despite their reservations.

Did Milgram give participants an opportunity to withdraw? The experimenter gave four verbal prods which mostly discouraged withdrawal from the experiment:

  • Please continue.
  • The experiment requires that you continue.
  • It is absolutely essential that you continue.
  • You have no other choice, you must go on.

Milgram argued that they were justified as the study was about obedience, so orders were necessary.

Milgram pointed out that although the right to withdraw was made difficult, it remained possible, as 35% of participants chose to withdraw.

Replications

Direct replications have not been possible due to current ethical standards. However, several researchers have conducted partial replications and variations that aim to reproduce some aspects of Milgram’s methods ethically.

One important replication was conducted by Jerry Burger in 2009. Burger’s partial replication included several safeguards to protect participant welfare, such as screening out high-risk individuals, repeatedly reminding participants they could withdraw, and stopping at the 150-volt shock level. This was the point where Milgram’s participants first heard the learner’s protests.

As 79% of Milgram’s participants who went past 150 volts continued to the maximum 450 volts, Burger (2009) argued that 150 volts provided a reasonable estimate for obedience levels. He found 70% of participants continued to 150 volts, compared to 82.5% in Milgram’s comparable condition.
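A quick arithmetic check, using only the figures quoted above (this back-of-the-envelope calculation is ours, not one Burger reports in this form): if 82.5% of participants reached 150 volts, and 79% of those who did so continued to the end, the implied rate of full obedience recovers the familiar 65% baseline figure:

\[
0.825 \times 0.79 \approx 0.65
\]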

Another replication by Thomas Blass (1999) examined whether obedience rates had declined over time due to greater public awareness of the experiments. Blass correlated obedience rates from replication studies between 1963 and 1985 and found no relationship between year and obedience level. He concluded that obedience rates have not systematically changed, providing evidence against the idea of “enlightenment effects”.

Some variations have explored the role of gender. Milgram found equal rates of obedience for male and female participants. Reviews have found most replications also show no gender difference, with a couple of exceptions (Blass, 1999). For example, Kilham and Mann (1974) found lower obedience in female participants.

Partial replications have also examined situational factors. Having another person model defiance reduced obedience compared to a solo participant in one study, but did not eliminate it (Burger, 2009). The authority figure’s perceived expertise seems to be an influential factor (Blass, 1999). Replications have supported Milgram’s observation that stepwise increases in demands promote obedience.

Personality factors have been studied as well. Traits like high empathy and desire for control correlate with some minor early hesitation, but do not greatly impact eventual obedience levels (Burger, 2009). Authoritarian tendencies may contribute to obedience (Elms, 2009).

In sum, the partial replications broadly confirm the levels of obedience Milgram reported. Though ethical constraints prevent full reproductions, the key elements of his procedure seem to consistently elicit high levels of compliance across studies, samples, and eras. The replications continue to highlight the power of situational pressures to yield obedience.

Why was the Milgram experiment so controversial?

The Milgram experiment was controversial because it revealed people’s willingness to obey authority figures even when causing harm to others, raising ethical concerns about the psychological distress inflicted upon participants and the deception involved in the study.

Would Milgram’s experiment be allowed today?

Milgram’s experiment would likely not be allowed today in its original form, as it violates modern ethical guidelines for research involving human participants, particularly regarding informed consent, deception, and protection from psychological harm.

Did anyone refuse the Milgram experiment?

Yes, in the Milgram experiment, some participants refused to continue administering shocks, demonstrating individual variation in obedience to authority figures. In the original Milgram experiment, approximately 35% of participants refused to administer the highest shock level of 450 volts, while 65% obeyed and delivered the 450-volt shock.

How can Milgram’s study be applied to real life?

Milgram’s study can be applied to real life by demonstrating the potential for ordinary individuals to obey authority figures even when it involves causing harm, emphasizing the importance of questioning authority, ethical decision-making, and fostering critical thinking in societal contexts.

Were all participants in Milgram’s experiments male?

Yes, in the original Milgram experiment conducted in 1961, all participants were male, limiting the generalizability of the findings to women and diverse populations.

Why was the Milgram experiment unethical?

The Milgram experiment was considered unethical because participants were deceived about the true nature of the study and subjected to severe emotional distress. They believed they were causing harm to another person under the instruction of authority.

Additionally, participants were not given the right to withdraw freely and were subjected to intense pressure to continue. The psychological harm and lack of informed consent violate modern ethical guidelines for research.

Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram's "Behavioral Study of Obedience." American Psychologist, 19(6), 421.

Blass, T. (1999). The Milgram paradigm after 35 years: Some things we now know about obedience to authority. Journal of Applied Social Psychology, 29(5), 955-978.

Brannigan, A., Nicholson, I., & Cherry, F. (2015). Introduction to the special issue: Unplugging the Milgram machine. Theory & Psychology, 25(5), 551-563.

Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64(1), 1-11.

Elms, A. C. (2009). Obedience lite. American Psychologist, 64(1), 32-36.

Gibson, S. (2013). Milgram's obedience experiments: A rhetorical analysis. British Journal of Social Psychology, 52, 290-309.

Gibson, S. (2017). Developing psychology's archival sensibilities: Revisiting Milgram's 'obedience' experiments. Qualitative Psychology, 4(1), 73.

Griggs, R. A., Blyler, J., & Jackson, S. L. (2020). Using research ethics as a springboard for teaching Milgram's obedience study as a contentious classic. Scholarship of Teaching and Learning in Psychology, 6(4), 350.

Haslam, S. A., & Reicher, S. D. (2018). A truth that does not always speak its name: How Hollander and Turowetz's findings confirm and extend the engaged followership analysis of harm-doing in the Milgram paradigm. British Journal of Social Psychology, 57, 292-300.

Haslam, S. A., Reicher, S. D., & Birney, M. E. (2016). Questioning authority: New perspectives on Milgram's 'obedience' research and its implications for intergroup relations. Current Opinion in Psychology, 11, 6-9.

Haslam, S. A., Reicher, S. D., Birney, M. E., Millard, K., & McDonald, R. (2015). 'Happy to have been of service': The Yale archive as a window into the engaged followership of participants in Milgram's 'obedience' experiment. British Journal of Social Psychology, 54, 55-83.

Kaplan, D. E. (1996). The Stanley Milgram papers: A case study on appraisal of and access to confidential data files. American Archivist, 59, 288-297.

Kaposi, D. (2022). The second wave of critical engagement with Stanley Milgram's 'obedience to authority' experiments: What did we learn? Social and Personality Psychology Compass, 16(6), e12667.

Kilham, W., & Mann, L. (1974). Level of destructive obedience as a function of transmitter and executant roles in the Milgram obedience paradigm. Journal of Personality and Social Psychology, 29(5), 696-702.

Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371-378.

Milgram, S. (1964). Issues in the study of obedience: A reply to Baumrind. American Psychologist, 19, 848-852.

Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human Relations, 18(1), 57-76.

Milgram, S. (1974). Obedience to authority: An experimental view. Harper & Row.

Miller, A. G. (2009). Reflections on "Replicating Milgram" (Burger, 2009). American Psychologist, 64(1), 20-27.

Nicholson, I. (2011). "Torture at Yale": Experimental subjects, laboratory torment and the "rehabilitation" of Milgram's "obedience to authority". Theory & Psychology, 21, 737-761.

Nicholson, I. (2015). The normalization of torment: Producing and managing anguish in Milgram's "obedience" laboratory. Theory & Psychology, 25, 639-656.

Orne, M. T., & Holland, C. H. (1968). On the ecological validity of laboratory deceptions. International Journal of Psychiatry, 6(4), 282-293.

Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. The New Press.

Reicher, S., Haslam, A., & Miller, A. (Eds.). (2014). Milgram at 50: Exploring the enduring relevance of psychology's most famous studies [Special issue]. Journal of Social Issues, 70(3), 393-602.

Russell, N. (2014). Stanley Milgram's obedience to authority "relationship condition": Some methodological and theoretical implications. Social Sciences, 3, 194-214.

Shanab, M. E., & Yahya, K. A. (1978). A cross-cultural study of obedience. Bulletin of the Psychonomic Society.

Smith, P. B., & Bond, M. H. (1998). Social psychology across cultures (2nd ed.). Prentice Hall.

Further Reading

  • The power of the situation: The impact of Milgram’s obedience studies on personality and social psychology
  • Seeing is believing: The role of the film Obedience in shaping perceptions of Milgram’s Obedience to Authority Experiments
  • Replicating Milgram: Would people still obey today?

Learning Check

Which is true regarding the Milgram obedience study?
  • The aim was to see how obedient people would be in a situation where following orders would mean causing harm to another person.
  • Participants were under the impression they were part of a learning and memory experiment.
  • The “learners” in the study were actual participants who volunteered to be shocked as part of the experiment.
  • The “learner” was an actor who was in on the experiment and never actually received any real shocks.
  • Although the participant could not see the “learner”, he was able to hear him clearly through the wall.
  • The study was directly influenced by Milgram’s observations of obedience patterns in post-war Europe.
  • The experiment was designed to understand the psychological mechanisms behind war crimes committed during World War II.
  • The Milgram study was universally accepted in the psychological community, and no ethical concerns were raised about its methodology.
  • When Milgram’s experiment was repeated in a rundown office building in Bridgeport, the percentage of the participants who fully complied with the commands of the experimenter remained unchanged.
  • The experimenter (authority figure) delivered verbal prods to encourage the teacher to continue, such as ‘Please continue’ or ‘Please go on’.
  • Over 80% of participants went on to deliver the maximum level of shock.
  • Milgram sent participants questionnaires after the study to assess the effects and found that most felt no remorse or guilt, so it was ethical.
  • The aftermath of the study led to stricter ethical guidelines in psychological research.
  • The study emphasized the role of situational factors over personality traits in determining obedience.

Answers : Items 3, 8, 9, and 11 are the false statements.

Short Answer Questions
  • Briefly explain the results of the original Milgram experiments. What did these results prove?
  • List one scenario on how an authority figure can abuse obedience principles.
  • List one scenario on how an individual could use these principles to defend their fellow peers.
  • In a hospital, you are very likely to obey a nurse. However, if you meet her outside the hospital, for example in a shop, you are much less likely to obey. Using your knowledge of how people resist pressure to obey, explain why you are less likely to obey the nurse outside the hospital.
  • Describe the shock instructions the participant (teacher) was told to follow when the victim (learner) gave an incorrect answer.
  • State the lowest voltage shock that was labeled on the shock generator.
  • What would likely happen if Milgram’s experiment included a condition in which the participant (teacher) had to give a high-level electric shock for the first wrong answer?
Group Activity

Gather in groups of three or four to discuss answers to the short answer questions above.

For question 2, review the different scenarios you each came up with. Then brainstorm on how these situations could be flipped.

For question 2, discuss how an authority figure could instead empower those below them in the examples your groupmates provide.

For question 3, discuss how a peer could do harm by using the obedience principles in the scenarios your groupmates provide.

Essay Topic
  • What’s the most important lesson of Milgram’s Obedience Experiments? Fully explain and defend your answer.
  • Milgram selectively edited his film of the obedience experiments to emphasize obedient behavior and minimize footage of disobedience. What are the ethical implications of a researcher selectively presenting findings in a way that fits their expected conclusions?

Understanding the Milgram Experiment in Psychology

A closer look at Milgram's controversial studies of obedience

How far do you think people would go to obey an authority figure? Would they refuse to obey if the order went against their values or social expectations? Those questions were at the heart of an infamous and controversial study known as the Milgram obedience experiments.

Yale University psychologist Stanley Milgram conducted these experiments during the 1960s. They explored the effects of authority on obedience. In the experiments, an authority figure ordered participants to deliver what they believed were dangerous electrical shocks to another person. These results suggested that people are highly influenced by authority and highly obedient. More recent investigations cast doubt on some of the implications of Milgram's findings and even the results and procedures themselves. Despite its problems, the study has, without question, made a significant impact on psychology.

At a Glance

Milgram's experiments posed the question: Would people obey orders, even if they believed doing so would harm another person? Milgram's findings suggested the answer was yes, they would. The experiments have long been controversial, both because of the startling findings and the ethical problems with the research. More recently, experts have re-examined the studies, suggesting that participants were often coerced into obeying and that at least some participants recognized that the other person was just pretending to be shocked. Such findings call into question the study's validity and authenticity, but some replications suggest that people are surprisingly prone to obeying authority.

History of the Milgram Experiments

Milgram started his experiments in 1961, shortly after the trial of the World War II war criminal Adolf Eichmann had begun. Eichmann’s defense that he was merely following instructions when he ordered the deaths of millions of Jews roused Milgram’s interest.

In his 1974 book "Obedience to Authority," Milgram posed the question, "Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?"

Procedure in the Milgram Experiment

The participants in the most famous variation of the Milgram experiment were 40 men recruited using newspaper ads. In exchange for their participation, each person was paid $4.50.

Milgram developed an intimidating shock generator, with shock levels starting at 15 volts and increasing in 15-volt increments all the way up to 450 volts. The many switches were labeled with terms including "slight shock," "moderate shock," and "danger: severe shock." The final three switches were labeled simply with an ominous "XXX."

Each participant took the role of a "teacher" who would then deliver a shock to the "student" in a neighboring room whenever an incorrect answer was given. While participants believed that they were delivering real shocks to the student, the “student” was a confederate in the experiment who was only pretending to be shocked.

As the experiment progressed, the participant would hear the learner plead to be released or even complain about a heart condition. Once they reached the 300-volt level, the learner would bang on the wall and demand to be released.

Beyond this point, the learner became completely silent and refused to answer any more questions. The experimenter then instructed the participant to treat this silence as an incorrect response and deliver a further shock.

Most participants asked the experimenter whether they should continue. The experimenter then responded with a series of commands to prod the participant along:

  • "Please continue."
  • "The experiment requires that you continue."
  • "It is absolutely essential that you continue."
  • "You have no other choice; you must go on."

Results of the Milgram Experiment

In the Milgram experiment, obedience was measured by the level of shock that the participant was willing to deliver. While many of the subjects became extremely agitated, distraught, and angry at the experimenter, they nevertheless continued to follow orders all the way to the end.

Milgram's results showed that 65% of the participants in the study delivered the maximum shocks. Of the 40 participants in the study, 26 delivered the maximum shocks, while 14 stopped before reaching the highest levels.
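As a quick check, the headline percentage follows directly from the counts reported above:

\[
\frac{26}{40} = 0.65 = 65\%
\]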

Why did so many of the participants in this experiment perform a seemingly brutal act when instructed by an authority figure? According to Milgram, there are some situational factors that can explain such high levels of obedience:

  • The physical presence of an authority figure dramatically increased compliance.
  • The fact that Yale (a trusted and authoritative academic institution) sponsored the study led many participants to believe that the experiment must be safe.
  • The selection of teacher and learner status seemed random.
  • Participants assumed that the experimenter was a competent expert.
  • The shocks were said to be painful, not dangerous.

Later experiments conducted by Milgram indicated that the presence of rebellious peers dramatically reduced obedience levels. When other people refused to go along with the experimenter's orders, 36 out of 40 participants refused to deliver the maximum shocks.

More recent work by researchers suggests that while people do tend to obey authority figures, the process is not necessarily as cut-and-dried as Milgram depicted it.

In a 2012 essay published in PLoS Biology, researchers suggested that the degree to which people are willing to obey the questionable orders of an authority figure depends largely on two key factors:

  • How much the individual agrees with the orders
  • How much they identify with the person giving the orders

While it is clear that people are often far more susceptible to influence, persuasion, and obedience than they would often like to be, they are far from mindless machines just taking orders.

Another study that analyzed Milgram's results concluded that eight factors influenced the likelihood that people would progress up to the 450-volt shock:

  • The experimenter's directiveness
  • Legitimacy and consistency
  • Group pressure to disobey
  • Indirectness of proximity
  • Intimacy of the relation between the teacher and learner
  • Distance between the teacher and learner

Ethical Concerns in the Milgram Experiment

Milgram's experiments have long been the source of considerable criticism and controversy. From the get-go, the ethics of his experiments were highly dubious. Participants were subjected to significant psychological and emotional distress.

Some of the major ethical issues in the experiment were related to:

  • The use of deception
  • The lack of protection for the participants who were involved
  • Pressure from the experimenter to continue even after asking to stop, interfering with participants' right to withdraw

Due to concerns about the amount of anxiety experienced by many of the participants, everyone was supposedly debriefed at the end of the experiment. The researchers reported that they explained the procedures and the use of deception.

Critics of the study have argued that many of the participants were still confused about the exact nature of the experiment, and recent findings suggest that many participants were not debriefed at all.

Replications of the Milgram Experiment

While Milgram’s research raised serious ethical questions about the use of human subjects in psychology experiments, his results have also been consistently replicated in further experiments. One review of subsequent research on obedience found that Milgram’s findings hold true in other experiments. In one notable replication, researchers reproduced Milgram's classic obedience experiment, making several alterations:

  • The maximum shock level was 150 volts as opposed to the original 450 volts.
  • Participants were also carefully screened to eliminate those who might experience adverse reactions to the experiment.

The results of the new experiment revealed that participants obeyed at roughly the same rate that they did when Milgram conducted his original study more than 40 years ago.

Some psychologists suggested that in spite of the changes made in the replication, the study still had merit and could be used to further explore some of the situational factors that also influenced the results of Milgram's study. But other psychologists suggested that the replication was too dissimilar to Milgram's original study to draw any meaningful comparisons.

One study examined people's beliefs about how they would do compared to the participants in Milgram's experiments. They found that most people believed they would stop sooner than the average participants. These findings applied to both those who had never heard of Milgram's experiments and those who were familiar with them. In fact, those who knew about Milgram's experiments actually believed that they would stop even sooner than other people.

Another novel replication involved recruiting participants in pairs and having them take turns acting as either an 'agent' or 'victim.' Agents then received orders to shock the victim. The results suggest that only around 3.3% disobeyed the experimenter's orders.

Recent Criticisms and New Findings

Psychologist Gina Perry suggests that much of what we think we know about Milgram's famous experiments is only part of the story. While researching an article on the topic, she stumbled across hundreds of audiotapes found in Yale archives that documented numerous variations of Milgram's shock experiments.

Participants Were Often Coerced

While Milgram's published reports describe methodical and uniform procedures, the audiotapes reveal something different. During the experimental sessions, the experimenters often went off-script and coerced the subjects into continuing the shocks.

"The slavish obedience to authority we have come to associate with Milgram’s experiments comes to sound much more like bullying and coercion when you listen to these recordings," Perry suggested in an article for Discover Magazine.

Few Participants Were Really Debriefed

Milgram suggested that the subjects were "de-hoaxed" after the experiments. He claimed he later surveyed the participants and found that 84% were glad to have participated, while only 1% regretted their involvement.

However, Perry's findings revealed that of the 700 or so people who took part in different variations of his studies between 1961 and 1962, very few were truly debriefed.

A true debriefing would have involved explaining that the shocks weren't real and that the other person was not injured. Instead, Milgram's sessions were mainly focused on calming the subjects down before sending them on their way.

Many participants left the experiment in a state of considerable distress. While the truth was revealed to some months or even years later, many were simply never told a thing.

Variations Led to Differing Results

Another problem is that the version of the study presented by Milgram and the one that's most often retold does not tell the whole story. The statistic that 65% of people obeyed orders applied only to one variation of the experiment, in which 26 out of 40 subjects obeyed.

In other variations, far fewer people were willing to follow the experimenters' orders, and in some versions of the study, not a single participant obeyed.

Participants Guessed the Learner Was Faking

Perry even tracked down some of the people who took part in the experiments, as well as Milgram's research assistants. What she discovered is that many of his subjects had deduced what Milgram's intent was and knew that the "learner" was merely pretending.

Such findings cast Milgram's results in a new light. They suggest that not only did Milgram intentionally engage in some hefty misdirection to obtain the results he wanted, but also that many of his participants were simply playing along.

An analysis of an unpublished study by Milgram's assistant, Taketo Murata, found that participants who believed they were really delivering a shock were less likely to obey, while those who did not believe they were actually inflicting pain were more willing to obey. In other words, the perception of pain increased defiance, while skepticism of pain increased obedience.

A review of Milgram's research materials suggests that the experiments exerted more pressure to obey than the original results suggested. Other variations of the experiment revealed much lower rates of obedience, and many of the participants actually altered their behavior when they guessed the true nature of the experiment.

Impact of the Milgram Experiment

Since there is no way to truly replicate the experiment due to its serious ethical and moral problems, it is impossible to determine whether Milgram's experiment really tells us anything about the power of obedience.

So why does Milgram's experiment maintain such a powerful hold on our imaginations, even decades after the fact? Perry believes that despite all its ethical issues and the problem of never truly being able to replicate Milgram's procedures, the study has taken on the role of what she calls a "powerful parable."

Milgram's work might not hold the answers to what makes people obey or even the degree to which they truly obey. It has, however, inspired other researchers to explore what makes people follow orders and, perhaps more importantly, what leads them to question authority.

Recent findings undermine the scientific validity of the study. Milgram's work is also not truly replicable due to its ethical problems. However, the study has led to additional research on how situational factors can affect obedience to authority.

Milgram’s experiment has become a classic in psychology, demonstrating the dangers of obedience. The research suggests that situational variables have a stronger sway than personality factors in determining whether people will obey an authority figure. However, other psychologists argue that obedience is heavily influenced by both external and internal factors, such as personal beliefs and overall temperament.

Milgram S. Obedience to Authority: An Experimental View. Harper & Row.

Russell N, Gregory R. The Milgram-Holocaust linkage: challenging the present consensus. State Crim J. 2015;4(2):128-153.

Russell NJC. Milgram's obedience to authority experiments: origins and early evolution. Br J Soc Psychol. 2011;50:140-162. doi:10.1348/014466610X492205

Haslam SA, Reicher SD. Contesting the "nature" of conformity: What Milgram and Zimbardo's studies really show. PLoS Biol. 2012;10(11):e1001426. doi:10.1371/journal.pbio.1001426

Milgram S. Liberating effects of group pressure. J Person Soc Psychol. 1965;1(2):127-134. doi:10.1037/h0021650

Haslam N, Loughnan S, Perry G. Meta-Milgram: an empirical synthesis of the obedience experiments. PLoS One. 2014;9(4):e93927. doi:10.1371/journal.pone.0093927

Perry G. Deception and illusion in Milgram's accounts of the obedience experiments. Theory Appl Ethics. 2013;2(2):79-92.

Blass T. The Milgram paradigm after 35 years: some things we now know about obedience to authority. J Appl Soc Psychol. 1999;29(5):955-978. doi:10.1111/j.1559-1816.1999.tb00134.x

Burger J. Replicating Milgram: Would people still obey today? Am Psychol. 2009;64(1):1-11. doi:10.1037/a0010932

Elms AC. Obedience lite. Am Psychol. 2009;64(1):32-36. doi:10.1037/a0014473

Miller AG. Reflections on "replicating Milgram" (Burger, 2009). Am Psychol. 2009;64(1):20-27. doi:10.1037/a0014407

Grzyb T, Dolinski D. Beliefs about obedience levels in studies conducted within the Milgram paradigm: Better than average effect and comparisons of typical behaviors by residents of various nations. Front Psychol. 2017;8:1632. doi:10.3389/fpsyg.2017.01632

Caspar EA. A novel experimental approach to study disobedience to authority. Sci Rep. 2021;11(1):22927. doi:10.1038/s41598-021-02334-8

Haslam SA, Reicher SD, Millard K, McDonald R. 'Happy to have been of service': The Yale archive as a window into the engaged followership of participants in Milgram's 'obedience' experiments. Br J Soc Psychol. 2015;54:55-83. doi:10.1111/bjso.12074

Perry G, Brannigan A, Wanner RA, Stam H. Credibility and incredulity in Milgram's obedience experiments: A reanalysis of an unpublished test. Soc Psychol Q. 2020;83(1):88-106. doi:10.1177/0190272519861952

By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."

The Milgram Experiment: How Far Will You Go to Obey an Order?

Understand the infamous study and its conclusions about human nature


A brief Milgram experiment summary is as follows: In the 1960s, psychologist Stanley Milgram conducted a series of studies on the concepts of obedience and authority. His experiments involved instructing study participants to deliver increasingly high-voltage shocks to an actor in another room, who would scream and eventually go silent as the shocks became stronger. The shocks weren't real, but study participants were made to believe that they were.

Today, the Milgram experiment is widely criticized on both ethical and scientific grounds. However, Milgram's conclusions about humanity's willingness to obey authority figures remain influential and well-known.

Key Takeaways: The Milgram Experiment

  • The goal of the Milgram experiment was to test the extent of humans' willingness to obey orders from an authority figure.
  • Participants were told by an experimenter to administer increasingly powerful electric shocks to another individual. Unbeknownst to the participants, the shocks were fake and the individual being shocked was an actor.
  • The majority of participants obeyed, even when the individual being shocked screamed in pain.
  • The experiment has been widely criticized on ethical and scientific grounds.

A Detailed Summary of Milgram’s Experiment

In the most well-known version of the Milgram experiment, the 40 male participants were told that the experiment focused on the relationship between punishment, learning, and memory. The experimenter then introduced each participant to a second individual, explaining that this second individual was participating in the study as well. Participants were told that they would be randomly assigned to roles of "teacher" and "learner." However, the "second individual" was an actor hired by the research team, and the study was set up so that the true participant would always be assigned to the "teacher" role.

During the Milgram experiment, the learner was located in a separate room from the teacher (the real participant), but the teacher could hear the learner through the wall. The experimenter told the teacher that the learner would memorize word pairs and instructed the teacher to ask the learner questions. If the learner responded incorrectly to a question, the teacher would be asked to administer an electric shock. The shocks started at a relatively mild level (15 volts) but increased in 15-volt increments up to 450 volts. (In actuality, the shocks were fake, but the participant was led to believe they were real.)

Participants were instructed to give a higher shock to the learner with each wrong answer. When the 150-volt shock was administered, the learner would cry out in pain and ask to leave the study. He would then continue crying out with each shock until the 330-volt level, at which point he would stop responding.

During this process, whenever participants expressed hesitation about continuing with the study, the experimenter would urge them to go on with increasingly firm instructions, culminating in the statement, "You have no other choice, you must go on." The study ended when participants refused to obey the experimenter’s demand, or when they gave the learner the highest level of shock on the machine (450 volts).

Milgram found that participants obeyed the experimenter at an unexpectedly high rate: 65% of the participants gave the learner the 450-volt shock.

Critiques of the Milgram Experiment

The Milgram experiment has been widely criticized on ethical grounds. Milgram’s participants were led to believe that they acted in a way that harmed someone else, an experience that could have had long-term consequences. Moreover, an investigation by writer Gina Perry uncovered that some participants appear not to have been fully debriefed after the study; they were told months later, or not at all, that the shocks were fake and the learner wasn’t harmed. Milgram’s studies could not be perfectly recreated today, because researchers are now required to pay much more attention to the safety and well-being of human research subjects.

Researchers have also questioned the scientific validity of Milgram’s results. In her examination of the study, Perry found that Milgram’s experimenter may have gone off script and told participants to obey many more times than the script specified. Additionally, some research suggests that participants may have figured out that the learner was not harmed: in interviews conducted after the Milgram experiment, some participants reported that they didn’t think the learner was in any real danger. This mindset is likely to have affected their behavior in the study.

Variations on the Milgram Experiment

Milgram and other researchers conducted numerous versions of the experiment over time. The participants' levels of compliance with the experimenter’s demands varied greatly from one study to the next. For example, when participants were in closer proximity to the learner (e.g. in the same room), they were less likely to give the learner the highest level of shock.

Another version of the Milgram experiment brought three "teachers" into the experiment room at once. One was a real participant, and the other two were actors hired by the research team. During the experiment, the two non-participant teachers would quit as the level of shocks began to increase. Milgram found that these conditions made the real participant far more likely to "disobey" the experimenter, too: only 10% of participants gave the 450-volt shock to the learner.

In yet another version of the Milgram experiment, two experimenters were present, and during the experiment, they would begin arguing with one another about whether it was right to continue the study. In this version, none of the participants gave the learner the 450-volt shock.

Replicating the Milgram Experiment

Researchers have sought to replicate Milgram's original study with additional safeguards in place to protect participants. In 2009, Jerry Burger replicated Milgram’s famous experiment at Santa Clara University with new safeguards in place: the highest shock level was 150 volts, and participants were told that the shocks were fake immediately after the experiment ended. Additionally, participants were screened by a clinical psychologist before the experiment began, and those found to be at risk of a negative reaction to the study were deemed ineligible to participate.

Burger found that participants obeyed at similar levels as Milgram’s participants: 82.5% of Milgram’s participants gave the learner the 150-volt shock, and 70% of Burger’s participants did the same.

The Legacy of the Milgram Experiment

Milgram’s interpretation of his research was that everyday people are capable of carrying out unthinkable actions in certain circumstances. His research has been used to explain atrocities such as the Holocaust and the Rwandan genocide, though these applications are by no means widely accepted or agreed upon.

Importantly, not all participants obeyed the experimenter’s demands, and Milgram’s studies shed light on the factors that enable people to stand up to authority. In fact, as sociologist Matthew Hollander writes, we may be able to learn from the participants who disobeyed, as their strategies may enable us to respond more effectively to an unethical situation. The Milgram experiment suggested that human beings are susceptible to obeying authority, but it also demonstrated that obedience is not inevitable.

  • Baker, Peter C. “Electric Schlock: Did Stanley Milgram's Famous Obedience Experiments Prove Anything?” Pacific Standard (2013, Sep. 10). https://psmag.com/social-justice/electric-schlock-65377
  • Burger, Jerry M. "Replicating Milgram: Would People Still Obey Today?" American Psychologist 64.1 (2009): 1-11. http://psycnet.apa.org/buy/2008-19206-001
  • Gilovich, Thomas, Dacher Keltner, and Richard E. Nisbett. Social Psychology. 1st edition, W.W. Norton & Company, 2006.
  • Hollander, Matthew. “How to Be a Hero: Insight From the Milgram Experiment.” HuffPost Contributor Network (2015, Apr. 29). https://www.huffingtonpost.com/entry/how-to-be-a-hero-insight-_b_6566882
  • Jarrett, Christian. “New Analysis Suggests Most Milgram Participants Realised the ‘Obedience Experiments’ Were Not Really Dangerous.” The British Psychological Society: Research Digest (2017, Dec. 12). https://digest.bps.org.uk/2017/12/12/interviews-with-milgram-participants-provide-little-support-for-the-contemporary-theory-of-engaged-followership/
  • Perry, Gina. “The Shocking Truth of the Notorious Milgram Obedience Experiments.” Discover Magazine Blogs (2013, Oct. 2). http://blogs.discovermagazine.com/crux/2013/10/02/the-shocking-truth-of-the-notorious-milgram-obedience-experiments/
  • Romm, Cari. “Rethinking One of Psychology's Most Infamous Experiments.” The Atlantic (2015, Jan. 28). https://www.theatlantic.com/health/archive/2015/01/rethinking-one-of-psychologys-most-infamous-experiments/384913/

Contesting the “Nature” of Conformity: What Milgram and Zimbardo's Studies Really Show

S. Alexander Haslam

School of Psychology, University of Queensland, St. Lucia, Australia

Stephen D. Reicher

School of Psychology, University of St. Andrews, St Andrews, Scotland

A re-analysis of classic psychology studies suggests that tyranny does not result from blind conformity to rules and roles, but may involve identification with authorities who represent vicious acts as virtuous.

Understanding of the psychology of tyranny is dominated by classic studies from the 1960s and 1970s: Milgram's research on obedience to authority and Zimbardo's Stanford Prison Experiment. Supporting popular notions of the banality of evil, this research has been taken to show that people conform passively and unthinkingly to both the instructions and the roles that authorities provide, however malevolent these may be. Recently, though, this consensus has been challenged by empirical work informed by social identity theorizing. This suggests that individuals' willingness to follow authorities is conditional on identification with the authority in question and an associated belief that the authority is right.

Introduction

If men make war in slavish obedience to rules, they will fail. Ulysses S. Grant [1]

Conformity is often criticized on grounds of morality. Many, if not all, of the greatest human atrocities have been described as “crimes of obedience” [2] . However, as the victorious American Civil War General and later President Grant makes clear, conformity is equally problematic on grounds of efficacy. Success requires leaders and followers who do not adhere rigidly to a pre-determined script. Rigidity cannot steel them for the challenges of their task or for the creativity of their opponents.

Given these problems, it would seem even more unfortunate if human beings were somehow programmed for conformity. Yet this is a view that has become dominant over the last half-century. Its influence can be traced to two landmark empirical programs led by social psychologists in the 1960s and early 1970s: Milgram's Obedience to Authority research and Zimbardo's Stanford Prison Experiment. These studies have not only had influence in academic spheres. They have spilled over into our general culture and shaped popular understanding, such that “everyone knows” that people inevitably succumb to the demands of authority, however immoral the consequences [3] , [4] . As Parker puts it, “the hopeless moral of the [studies'] story is that resistance is futile” [5] . What is more, this work has shaped our understanding not only of conformity but of human nature more broadly [6] .

Building on an established body of theorizing in the social identity tradition—which sees group-based influence as meaningful and conditional [7] , [8] —we argue, however, that these understandings are mistaken. Moreover, we contend that evidence from the studies themselves (as well as from subsequent research) supports a very different analysis of the psychology of conformity.

The Classic Studies: Conformity, Obedience, and the Banality of Evil

In Milgram's work [9] , [10] members of the general public (predominantly men) volunteered to take part in a scientific study of memory. They found themselves cast in the role of a “Teacher” with the task of administering shocks of increasing magnitude (from 15 V to 450 V in 15-V increments) to another man (the “Learner”) every time he failed to recall the correct word in a previously learned pair. Unbeknown to the Teacher, the Learner was Milgram's confederate, and the shocks were not real. Moreover, rather than being interested in memory, Milgram was actually interested in seeing how far the men would go in carrying out the task. To his—and everyone else's [11] —shock, the answer was “very far.” In what came to be termed the “baseline” study [12] all participants proved willing to administer shocks of 300 V and 65% went all the way to 450 V. This appeared to provide compelling evidence that normal well-adjusted men would be willing to kill a complete stranger simply because they were ordered to do so by an authority.

Zimbardo's Stanford Prison Experiment took these ideas further by exploring the destructive behaviour of groups of men over an extended period [13] , [14] . Students were randomly assigned to be either guards or prisoners within a mock prison that had been constructed in the Stanford Psychology Department. In contrast to Milgram's studies, the objective was to observe the interaction within and between the two groups in the absence of an obviously malevolent authority. Here, again, the results proved shocking. Such was the abuse meted out to the prisoners by the guards that the study had to be terminated after just 6 days. Zimbardo's conclusion from this was even more alarming than Milgram's. People descend into tyranny, he suggested, because they conform unthinkingly to the toxic roles that authorities prescribe without the need for specific orders: brutality was “a ‘natural’ consequence of being in the uniform of a ‘guard’ and asserting the power inherent in that role” [15] .

Within psychology, Milgram and Zimbardo helped consolidate a growing “conformity bias” [16] in which the focus on compliance is so strong as to obscure evidence of resistance and disobedience [17] . However their arguments proved particularly potent because they seemed to mesh with real-world examples—particularly evidence of the “banality of evil.” This term was coined in Hannah Arendt's account of the trial of Adolf Eichmann [18] , a chief architect of the Nazis' “final solution to the Jewish question” [19] . Despite being responsible for the transportation of millions of people to their death, Arendt suggested that Eichmann was no psychopathic monster. Instead his trial revealed him to be a diligent and efficient bureaucrat—a man more concerned with following orders than with asking deep questions about their morality or consequence.

Much of the power of Milgram and Zimbardo's research derives from the fact that it appears to give empirical substance to this claim that evil is banal [3] . It seems to show that tyranny is a natural and unavoidable consequence of humans' inherent motivation to bend to the wishes of those in authority—whoever they may be and whatever it is that they want us to do. Put slightly differently, it operationalizes an apparent tragedy of the human condition: our desire to be good subjects is stronger than our desire to be subjects who do good.

Questioning the Consensus: Conformity Isn't Natural and It Doesn't Explain Tyranny

The banality of evil thesis appears to be a truth almost universally acknowledged. Not only is it given prominence in social psychology textbooks [20] , but so too it informs the thinking of historians [21] , [22] , political scientists [23] , economists [24] , and neuroscientists [25] . Indeed, via a range of social commentators, it has shaped the public consciousness much more broadly [26] , and, in this respect, can lay claim to being the most influential data-driven thesis in the whole of psychology [27] , [28] .

Yet despite the breadth of this consensus, in recent years, we and others have reinterrogated its two principal underpinnings—the archival evidence pertaining to Eichmann and his ilk, and the specifics of Milgram and Zimbardo's empirical demonstrations—in ways that tell a very different story [29] .

First, a series of thoroughgoing historical examinations have challenged the idea that Nazi bureaucrats were ever simply following orders [19] , [26] , [30] . This may have been the defense they relied upon when seeking to minimize their culpability [31] , but evidence suggests that functionaries like Eichmann had a very good understanding of what they were doing and took pride in the energy and application that they brought to their work. Typically too, roles and orders were vague, and hence for those who wanted to advance the Nazi cause (and not all did), creativity and imagination were required in order to work towards the regime's assumed goals and to overcome the challenges associated with any given task [32] . Emblematic of this, the practical details of “the final solution” were not handed down from on high, but had to be elaborated by Eichmann himself. He then felt compelled to confront and disobey his superiors—most particularly Himmler—when he believed that they were not sufficiently faithful to eliminationist Nazi principles [19] .

Second, much the same analysis can be used to account for behavior in the Stanford Prison Experiment. So while it may be true that Zimbardo gave his guards no direct orders, he certainly gave them a general sense of how he expected them to behave [33] . During the orientation session he told them, amongst other things, “You can create in the prisoners feelings of boredom, a sense of fear to some degree, you can create a notion of arbitrariness that their life is totally controlled by us, by the system, you, me… We're going to take away their individuality in various ways. In general what all this leads to is a sense of powerlessness” [34] . This contradicts Zimbardo's assertion that “behavioral scripts associated with the oppositional roles of prisoner and guard [were] the sole source of guidance” [35] and leads us to question the claim that conformity to these role-related scripts was the primary cause of guard brutality.

But even with such guidance, not all guards acted brutally. And those who did used ingenuity and initiative in responding to Zimbardo's brief. Accordingly, after the experiment was over, one prisoner confronted his chief tormentor with the observation that “If I had been a guard I don't think it would have been such a masterpiece” [34] . Contrary to the banality of evil thesis, the Zimbardo-inspired tyranny was made possible by the active engagement of enthusiasts rather than the leaden conformity of automatons.

Turning, third, to the specifics of Milgram's studies, the first point to note is that the primary dependent measure (flicking a switch) offers few opportunities for creativity in carrying out the task. Nevertheless, several of Milgram's findings typically escape standard reviews in which the paradigm is portrayed as only yielding up evidence of obedience. Initially, it is clear that the “baseline study” is not especially typical of the 30 or so variants of the paradigm that Milgram conducted. Here the percentage of participants going to 450 V varied from 0% to nearly 100%, but across the studies as a whole, a majority of participants chose not to go this far [10] , [36] , [37] .

Furthermore, close analysis of the experimental sessions shows that participants are attentive to the demands made on them by the Learner as well as the Experimenter [38] . They are torn between two voices confronting them with irreconcilable moral imperatives, and the fact that they have to choose between them is a source of considerable anguish. They sweat, they laugh, they try to talk and argue their way out of the situation. But the experimental set-up does not allow them to do so. Ultimately, they tend to go along with the Experimenter if he justifies their actions in terms of the scientific benefits of the study (as he does with the prod “The experiment requires that you continue”) [39] . But if he gives them a direct order (“You have no other choice, you must go on”) participants typically refuse. Once again, received wisdom proves questionable. The Milgram studies seem to be less about people blindly conforming to orders than about getting people to believe in the importance of what they are doing [40] .

Tyranny as a Product of Identification-Based Followership

Our suspicions about the plausibility of the banality of evil thesis and its various empirical substrates were first raised through our work on the BBC Prison Study (BPS [41]). Like the Stanford study, this study randomly assigned men to groups as guards and prisoners and examined their behaviour within a specially created “prison.” Unlike Zimbardo, however, we took no leadership role in the study. Without this, would participants conform to a hierarchical script or resist it?

The study generated three clear findings. First, participants did not conform automatically to their assigned role. Second, they only acted in terms of group membership to the extent that they actively identified with the group (such that they took on a social identification) [42] . Third, group identity did not mean that people simply accepted their assigned position; instead, it empowered them to resist it. Early in the study, the Prisoners' identification as a group allowed them successfully to challenge the authority of the Guards and create a more egalitarian system. Later on, though, a highly committed group emerged out of dissatisfaction with this system and conspired to create a new hierarchy that was far more draconian.

Ultimately, then, the BBC Prison Study came close to recreating the tyranny of the Stanford Prison Experiment. However it was neither passive conformity to roles nor blind obedience to rules that brought the study to this point. On the contrary, it was only when they had internalized roles and rules as aspects of a system with which they identified that participants used them as a guide to action. Moreover, on the basis of this shared identification, the hallmark of the tyrannical regime was not conformity but creative leadership and engaged followership within a group of true believers (see also [43] , [44] ). As we have seen, this analysis mirrors recent conclusions about the Nazi tyranny. To complete the argument, we suggest that it is also applicable to Milgram's paradigm.

The evidence, noted above, about the efficacy of different “prods” already points to the fact that compliance is bound up with a sense of commitment to the experiment and the experimenter over and above commitment to the learner (S. Haslam, SD Reicher, M. Birney, unpublished data) [39] . This use of prods is but one aspect of Milgram's careful management of the paradigm [13] that is aimed at securing participants' identification with the scientific enterprise.

Significantly, though, the degree of identification is not constant across all variants of the study. For instance, when the study is conducted in commercial premises as opposed to prestigious Yale University labs one might expect the identification to diminish and (as our argument implies) compliance to decrease. It does. More systematically, we have examined variations in participants' identification with the Experimenter and the science that he represents as opposed to their identification with the Learner and the general community. They always identify with both to some degree—hence the drama and the tension of the paradigm. But the degree matters, and greater identification with the Experimenter is highly predictive of a greater willingness among Milgram's participants to administer the maximum shock across the paradigm's many variants [37] .

However, some of the most compelling evidence that participants' administration of shocks results from their identification with Milgram's scientific goals comes from what happened after the study had ended. In his debriefing, Milgram praised participants for their commitment to the advancement of science, especially as it had come at the cost of personal discomfort. This inoculated them against doubts concerning their own punitive actions, but it also led them to support more of such actions in the future. “I am happy to have been of service,” one typical participant responded, “Continue your experiments by all means as long as good can come of them. In this crazy mixed up world of ours, every bit of goodness is needed” (S. Haslam, SD Reicher, K Millard, R McDonald, unpublished data).

The banality of evil thesis shocks us by claiming that decent people can be transformed into oppressors as a result of their “natural” conformity to the roles and rules handed down by authorities. More particularly, the inclination to conform is thought to suppress oppressors' ability to engage intellectually with the fact that what they are doing is wrong.

Although it remains highly influential, this thesis loses credibility under close empirical scrutiny. On the one hand, it ignores copious evidence of resistance even in studies held up as demonstrating that conformity is inevitable [17] . On the other hand, it ignores the evidence that those who do heed authority in doing evil do so knowingly not blindly, actively not passively, creatively not automatically. They do so out of belief not by nature, out of choice not by necessity. In short, they should be seen—and judged—as engaged followers not as blind conformists [45] .

What was truly frightening about Eichmann was not that he was unaware of what he was doing, but rather that he knew what he was doing and believed it to be right. Indeed, his one regret, expressed prior to his trial, was that he had not killed more Jews [19] . Equally, what is shocking about Milgram's experiments is that rather than being distressed by their actions [46] , participants could be led to construe them as “service” in the cause of “goodness.”

To understand tyranny, then, we need to transcend the prevailing orthodoxy that this derives from something for which humans have a natural inclination—a “Lucifer effect” to which they succumb thoughtlessly and helplessly (and for which, therefore, they cannot be held accountable). Instead, we need to understand two sets of inter-related processes: those by which authorities advocate oppression of others and those that lead followers to identify with these authorities. How did Milgram and Zimbardo justify the harmful acts they required of their participants and why did participants identify with them—some more than others?

These questions are complex and full answers fall beyond the scope of this essay. Yet, regarding advocacy, it is striking how destructive acts were presented as constructive, particularly in Milgram's case, where scientific progress was the warrant for abuse. Regarding identification, this reflects several elements: the personal histories of individuals that render some group memberships more plausible than others as a source of self-definition; the relationship between the identities on offer in the immediate context and other identities that are held and valued in other contexts; and the structure of the local context that makes certain ways of orienting oneself to the social world seem more “fitting” than others [41] , [47] , [48] .

At root, the fundamental point is that tyranny does not flourish because perpetrators are helpless and ignorant of their actions. It flourishes because they actively identify with those who promote vicious acts as virtuous [49] . It is this conviction that steels participants to do their dirty work and that makes them work energetically and creatively to ensure its success. Moreover, this work is something for which they actively wish to be held accountable—so long as it secures the approbation of those in power.

Funding Statement

The authors received no specific funding for this work.

Rosemary K.M. Sword and Philip Zimbardo Ph.D.

50 Years On: What We've Learned From the Stanford Prison Experiment

The experiment generated important research into unexplored territories.

Posted August 16, 2021 | Reviewed by Tyler Woods

  • I developed 3 new areas of research after the Stanford prison experiment (SPE): good and evil, time perspective, and shyness.
  • The SPE was closed down after 6 days because the "guards" became so brutal and as Superintendent, I was too caught up in my role.
  • The Heroic Imagination Project teaches people how to be Everyday Heroes and take effective actions in challenging situations.


Fifty years ago this month I conducted a research experiment that could have been a blight on my career. Instead, what has become known as the Stanford prison experiment (SPE) drove me to extensively pursue the question: Why do good people do evil things? After three decades of research on this subject, I recorded my findings in The Lucifer Effect: Understanding How Good People Turn Evil (Random House, 2007).

But the SPE also led me to research three new topics that hadn’t previously been studied:

1) Heroism: Why, in difficult situations, some people heroically step forward to help others, oftentimes complete strangers, while others stand by and watch.

2) Time Perspective: The psychological time warp experienced by participants of the SPE—not knowing if it was day or night or what day it was—led to my research into people’s individual time perspectives and how these affect our lives.

3) Shyness: Rethinking shyness as a self-imposed psychological prison led me to conduct research on shyness in adults, and then create a clinic in the community designed to cure shyness.

The Experiment in a Nutshell

In August 1971, I led a team of researchers at Stanford University to determine the psychological effects of being a guard or a prisoner. The study was funded by the US Office of Naval Research as both the US Navy and the US Marine Corps were interested in the causes of conflict between military guards and prisoners. In the study, 24 normal college students were randomly assigned to play the role of guard or inmate for two weeks in a simulated prison located in the basement of the Stanford Psychology Department building. But the guards quickly became so brutal, and I had become so caught up in my role as Superintendent, that I shut down the experiment after only six days.

Challenging the Truth

There seem to be powerful silent barriers to dealing with new truths emanating from psychological laboratories and field experiments that tell us things about how the mind works, which challenge our basic assumptions. We want to believe our decisions are wisely informed, that our actions are rational, that our personal conscience buffers us against tyrannical authorities. Moreover, we want to believe in the dominating influence of our good character despite social circumstances. Yes, those personal beliefs are sometimes true, but often they are not, and rigidly defending them can get us in trouble individually and collectively. Let’s see how.

Denial and Finger Pointing

When we discover that two of three ordinary American citizens administered extreme electric shocks to an innocent victim on the relentless commands of a heartless authority, we say, “no way, not me.” Yale University psychologist Stanley Milgram’s obedience to authority research has been in the public arena for decades, yet we ignore its message of the power of unjust authority in undercutting our moral conscience. Similarly, the SPE research made vivid the power of hostile situational forces in overwhelming dispositional tendencies toward compassion and human dignity. Still, many who insist on honoring the dominance of character over circumstance reject its situational power message.

In 2004, people around the world witnessed online photos of horrific actions of American Military Police guards in Iraq’s Abu Ghraib Prison against prisoners in their charge. It was portrayed as the work of a “few bad apples” according to military brass and Bush administration spokespeople. I publicly challenged this traditional focus on individual dispositions by portraying American servicemen as good apples that were forced to operate in a Bad Barrel (the Situation) created by Bad Barrel Makers (the System).

I became an expert witness in the defense of the Staff Sergeant in charge of the night shift, where all the abuses took place. In that capacity, I had personal access to the defendant, to all 1,000 photos and videos, to all dozen military investigations, and more. It was sufficient to validate my view of that prison as a replica of the Stanford prison experiment on steroids, and of my defendant, Chip Frederick, as really a Good Apple corrupted by being forced to function for 12 hours every night for many months in the worst barrel imaginable. My situation-based testimony to the military Court Martial hearings helped reduce the severity of his sentence from 15 years down to only four years.

The January 6, 2021 insurrection is a recent example of some Good Apples being corrupted by a Bad Barrel. In this case, the Bad Barrel is the insidiousness of fascism led by the former president and other fraudulent politicians as well as media personalities. These “leaders” have been generously dumping poison in the Barrel and over the Apples with lies that feed the Apples’ deepest fears.

“The Stanford Prison Experiment” Film

In 2015, The Stanford Prison Experiment was made into a film starring Billy Crudup as me and Olivia Thirlby as Christina Maslach, the whistle-blowing graduate student (whom I later married) who pointed out that the experiment had gone awry and had changed me to such a degree that she didn’t know who I was anymore. Her personal challenge led me to end the study the next day. The film received two awards at the Sundance Film Festival: best screenwriting and best science feature.


The Stanford Prison Experiment movie enables viewers to look through the observation window as if they were part of the prison staff watching this remarkable drama slowly unfold, and simultaneously observe those observers as well. They are witnesses to the gradual transformations taking place, hour by hour, day by day, and guard shift by guard shift. Viewers see what readers of The Lucifer Effect book account can only imagine. As these young students become the characters inhabited in their roles and dressed in their costumes, as prisoners or guards, a Pirandellian drama emerges.

The fixed line between Good, like us, and Evil, like them, is relentlessly blurred as it becomes ever more permeable. Ordinary people soon slip into doing extraordinarily bad things to other people, who are actually just like them except for a random coin flip. Other healthy people soon get sick mentally, being unable to cope with the learned helplessness imposed on them in that unique, unfamiliar setting. They do not offer comfort to their buddies as they break down, nor do those who adopt a “good guard” persona ever do anything to limit the sadistic excesses of the cruel guards heading their shifts.

Finally, the movie also tracks the emotional changes in the lead character (me) as his compassion and intellectual curiosity get distilled and submerged over time. The initial roles of research creator and objective observer are dominated by power and insensitivity to prisoners' suffering in the new role of Prison Superintendent.

Visit the official Stanford Prison Experiment website to learn more about the experiment.

Heroic Imagination


I should add that, along with continuing research in time perspectives and time perspective therapy, my new mission in life has been to empower everyone to wisely resist negative situational forces and evil by becoming Everyday Heroes in Training. Our non-profit Heroic Imagination Project (HIP) teaches ordinary people how to stand up, speak out and take effective actions in challenging situations in their lives.


Rosemary K.M. Sword and Philip Zimbardo are authors, along with Richard M. Sword, of The Time Cure: Overcoming PTSD with the New Psychology of Time Perspective Therapy.


Stanley Milgram

Black and white photograph of Stanley Milgram as a young man. (Image Source: Harvard Faculty Registry)

In 1954, Harvard’s Department of Social Relations took the unusual step of admitting a bright young student who had not taken a single psychology course. Fortunately, Stanley Milgram was soon up to speed in social psychology, and in the course of his doctoral work at Harvard he conducted an innovative cross-cultural comparison of conformity in Norway and France under the guidance of Gordon Allport.

Obtaining his Ph.D. in 1960, Milgram was ready to expand his work on conformity with a series of experiments on obedience to authority that he conducted as an assistant professor at Yale from 1960 to 1963. Inspired by Hannah Arendt’s report on the trial of Adolf Eichmann in Jerusalem, Milgram wondered whether her claims about “the banality of evil” – that evil acts can come from ordinary people following orders as they do their jobs – could be demonstrated in the lab. Milgram staged meticulously designed sham experiments in which subjects were ordered to administer dangerous shocks to fellow volunteers (in reality, the other volunteers were confederates and the shocks were fake). Contradicting the predictions of every expert he polled, Milgram found that roughly two-thirds of the subjects administered what they thought might be fatal shocks to an innocent stranger. Collectively known as The Milgram Experiment, this groundbreaking work demonstrated the human tendency to obey commands issued by an authority figure, and more generally, the tendency for behavior to be controlled more by the demands of the situation than by idiosyncratic traits of the person.

The Milgram Experiment is one of the best-known social psychology studies of the 20th century. With this remarkable accomplishment under his belt, young Dr. Milgram returned to Harvard in 1963 to take a position as Assistant Professor of Social Psychology.

During this time at Harvard, Milgram undertook a new, equally innovative line of research, known as the Small World Experiment.  Milgram asked a sample of people to trace out a chain of personal connections to a designated stranger living thousands of miles away. His finding that most people could do this successfully with a chain of six or fewer links yielded the familiar expression “Six Degrees of Separation,” which later became the name of a play and a movie,  a source for the game “Six Degrees of Kevin Bacon,” and a major theme of Malcolm Gladwell’s 2000 bestseller,  The Tipping Point . The internet has made it easier to study social networks, and several decades after its discovery, the phenomenon has become a subject of intense new research.
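
The small-world finding is easy to get a feel for computationally. Below is an illustrative sketch, not Milgram's method: it builds a Watts-Strogatz network, a modern model inspired by findings like his, in which people mostly know their near neighbours, yet a small fraction of randomly rewired ties is enough to make typical chains between strangers very short. The parameter values are arbitrary demonstration choices, and the sketch assumes the third-party networkx library is installed.

```python
# Illustrative small-world demo (not Milgram's procedure or data).
# Assumes the third-party networkx library: pip install networkx
import networkx as nx

# 2,000 "people", each initially tied to 10 near neighbours, with 5% of
# ties rewired to random strangers; all parameter values are arbitrary.
G = nx.connected_watts_strogatz_graph(n=2000, k=10, p=0.05, seed=42)

# Mean number of links separating two randomly chosen people; despite the
# mostly local ties, this comes out as a small single-digit number.
print(nx.average_shortest_path_length(G))
```

The few long-range ties do the work: with no rewiring (p=0) the average chain grows roughly in proportion to the size of the ring, while even a little rewiring collapses it to something on the order of Milgram's six links.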

Stanley Milgram left Harvard in 1967 to return to his hometown, New York City, accepting a position as head of the social psychology program at the Graduate Center of the City University of New York.  Tragically, he died of a heart attack at the age of 51. Milgram is listed as number 46 on the American Psychological Association’s list of the 100 most eminent psychologists of the 20th century.

Blass, T. (2002). The man who shocked the world. Psychology Today, March/April 2002, 35(2), p. 68.

Eminent psychologists of the 20th century. (2002, July/August). Monitor on Psychology, 33(7), p. 29.

Milgram, S. (1977). The individual in a social world. Reading, MA: Addison-Wesley Publishing Co.



Stanley Milgram’s Experiment

Stanley Milgram was one of the most influential social psychologists of the twentieth century. Born in 1933 in New York, he obtained a BA from Queens College and went on to receive a PhD in psychology from Harvard. Subsequently, Milgram held faculty positions in psychology at Yale University and the City University of New York until his untimely death in 1984. Although Milgram never held a formal appointment in sociology, his work was centrally focused on the social psychological aspects of social structure.


In a historic coincidence, in 1961, just as Milgram was about to begin work on his famous obedience experiments, the world witnessed the trial of Adolf Eichmann, a high-ranking Nazi official who was in charge of organizing the transport of millions of Jews to the death camps. To many, Eichmann appeared not at all to be the fervent anti-Semite that many had suspected him to be; rather, his main defense was that he was only “following orders” as an administrator. To the political theorist Hannah Arendt, Eichmann’s case illustrated the “banality of evil,” in which personal malice appeared to matter less than the desire of individuals to fulfill their roles in the larger context of a bureaucracy. Milgram’s research is arguably the most striking example to illustrate this dynamic.

Milgram planned and conducted his obedience experiments between 1960 and 1963 at Yale University. In order to be able to study obedience to authority, he put unsuspecting research participants in a novel situation, which he staged in the laboratory. With the help of actors and props, Milgram set up an experimental ruse that was so real that hardly any of his research participants suspected that, in reality, nothing was what it pretended to be.

For this initial study, using newspaper ads promising $4.50 for participation in a psychological study, Milgram recruited men aged 20 to 50, ranging from elementary school dropouts to PhDs. Each research participant arrived in the lab along with another man, white and roughly 30 years of age, whom they thought to be another research participant. In reality, this person was a confederate, that is, an actor in cahoots with the experimenter. The experimenter explained that both men were about to take part in a study that explored the effect of punishment on memory. One man would assume the role of a “teacher” who would read a series of word pairings (e.g., nice day, blue box), which the other (the “learner”) was supposed to memorize. Subsequently, the teacher would read the first word of the pair, with the learner having to select the correct second word from a list. Every mistake by the learner would be punished with an electric shock. It was further made clear that, although the shocks would be painful, they would not do any permanent harm.

Following this explanation, the experimenter assigned both men to their roles. Because the procedure was rigged, the unsuspecting research participant was always assigned to the role of teacher. As the first order of business, the learner was seated in an armchair in an adjoining room, separated by a wall from the teacher but otherwise able to hear him from the main room. Electrodes were affixed to the learner’s arms, and he was strapped to the chair, apparently to make sure that improper movements would not endanger the success of the experiment.

In the main room, the teacher was told that he would have to apply electric shocks every time the learner made a mistake. For this purpose, the teacher was seated in front of an electric shock generator with various levers. The experimenter instructed the teacher to steadily increase the voltage of the shock each time the learner made a new mistake. The shock generator showed a row of levers ranging from 15 volts on the left to 450 volts on the right, with each lever in between delivering a shock 15 volts higher than its neighbor on the left. Milgram labeled the voltage levels, left to right, from “Slight Shock” to “Danger: Severe Shock,” with the last two switches being marked “XXX.” The teacher was told that he should simply work his way from the left to the right without using any lever twice. To give the teacher an idea of the electric current he would deliver to the learner, he received a sample shock of 45 volts, which most research participants found surprisingly painful. However, despite its appearance, in reality the generator never emitted any electric shocks. It was merely a device that allowed Milgram to examine how far the teacher would go in harming another person on the experimenter’s say-so.

As learning trials started, the teacher applied electric shocks to the learner. The learner’s responses were scripted such that he apparently made many mistakes, requiring the teacher to increase shock levels by 15 volts with every new mistake. As the strength of electric shocks increased, occasional grunts and moans of pain were heard from the learner. At 120 volts the learner started complaining about the pain. At 150 volts, the learner demanded to be released on account of a heart condition, and the protest continued until the shocks reached 300 volts and the learner started pounding on the wall. At 315 volts the learner stopped responding altogether.

As the complaints by the learner started, the teacher would often turn to the experimenter, who was seated at a nearby desk, wondering whether and how to proceed. The experimenter, instead of terminating the experiment, replied with a scripted succession of prods:

  • Prod 1: ‘‘Please continue.’’
  • Prod 2: ‘‘The experiment requires that you continue.’’
  • Prod 3: ‘‘It is absolutely necessary to continue.’’
  • Prod 4: ‘‘You have no other choice: you must go on.’’

These prods were successful in coaxing many teachers into continuing to apply electric shocks even when the learner no longer responded to the word-memory questions. Indeed, in the first of Milgram’s experiments, a stunning 65 percent of all participants continued all the way to 450 volts, and not a single participant refused to continue the shocks before they reached the 300-volt level! The high levels of compliance illustrate the powerful effect of the social structure that participants had entered. By accepting the role of teacher in the experiment in exchange for the payment of a nominal fee, participants had agreed to accept the authority of the experimenter and carry out his instructions. In other words, just as Milgram suspected, the social forces of hierarchy and obedience could push normal and well-adjusted individuals into harming others.
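
The scripted schedule is compact enough to sketch in a few lines of code. The following is a minimal, hypothetical rendering of the procedure described above (the function and variable names, and the breakoff logic, are illustrative inventions, not Milgram's materials):

```python
# Hypothetical sketch of the scripted session described above; the names
# and breakoff logic are illustrative, not taken from Milgram's materials.

SHOCK_LEVELS = range(15, 451, 15)  # 30 levers: 15 V to 450 V in 15 V steps

SCRIPTED_EVENTS = {                # learner's scripted reactions (see text)
    120: "learner complains about the pain",
    150: "learner demands release, citing a heart condition",
    300: "learner pounds on the wall",
    315: "learner stops responding altogether",
}

def run_session(breakoff_voltage=None):
    """Walk the shock schedule; breakoff_voltage=None means full obedience."""
    for volts in SHOCK_LEVELS:
        if breakoff_voltage is not None and volts >= breakoff_voltage:
            print(f"Teacher refuses to continue at {volts} V.")
            return volts
        if volts in SCRIPTED_EVENTS:
            print(f"{volts} V: {SCRIPTED_EVENTS[volts]}")
    print("Teacher administered the maximum 450 V shock.")
    return 450

run_session()                      # fully obedient case (65% in the baseline)
run_session(breakoff_voltage=330)  # a defiant participant
```

Walking through the fully obedient case makes the structure of the situation plain: the teacher faces a long staircase of thirty small escalations rather than one large decision, with the learner's scripted protests arriving only partway up.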

The overall level of obedience, however, does not reveal the tremendous amount of stress that all teachers experienced. Because the situation was extremely realistic, teachers agonized over whether or not to continue the electric shocks. Should they care for the well-being of the obviously imperiled learner, whose life appeared to be in danger? Or should they abide by a legitimate authority figure, who presented his instructions crisply and confidently? Participants typically sought to resolve this conflict by seeking assurances that the experimenter, and not they themselves, would accept full responsibility for their actions. Once they felt assured, they typically continued to apply shocks at levels that, had the generator been real, would likely have electrocuted the learner.

Milgram expanded his initial research into a series of 19 experiments in which he carefully examined the conditions under which obedience would occur. For instance, the teacher’s proximity to the learner was an important factor in lowering obedience, that is, the proportion of people willing to deliver the full 450 volts. When the teacher was in the same room with the learner, obedience dropped to 40 percent, and when the teacher was required to touch the learner and apply physical force to deliver the electric shock, obedience dropped to 30 percent.

Milgram further suspected that the social status of the experimenter, presumably a serious Yale University researcher in a white lab coat, would have important implications for obedience. Indeed, when there was no obvious connection with Yale and the experiment was repeated in a run-down office building in Bridgeport, Connecticut, obedience dropped to 48 percent. And when it was not the white-coated experimenter but a fellow confederate who encouraged the teacher to continue the shocks, all participants terminated the experiment as soon as the learner complained. Milgram concluded that “a substantial proportion of people do what they are told to do, irrespective of the content of the act and without limitations of conscience, so long as they perceive that the command comes from a legitimate authority” (1965). However, additional studies highlighted that obedience is in part contingent on surveillance. When the experimenter transmitted his orders not in person but via telephone, obedience levels dropped to 20 percent, with many participants only pretending to apply higher and higher electric shocks.

Since its initial publication in 1963, Milgram’s research has drawn a lot of criticism, mainly on ethical grounds. First, it was alleged that it was unethical to deceive participants to the extent that occurred in these studies. It is important to note that all participants were fully debriefed on the deception, and most did not seem to mind and were relieved to find out that they had not shocked the learner. The second ethical criticism is, however, much more serious. As alluded to earlier, Milgram exposed his participants to tremendous levels of stress. Milgram, anticipating this criticism, interviewed participants after the experiment and followed up several weeks later. The overwhelming majority of his participants commented that they enjoyed being in the experiment, and only a small minority experienced regret. Even though Milgram personally rejected allegations of having mistreated his participants, his own work suggests that he may have gone too far: “Subjects were observed to sweat, tremble, bite their lips, groan, and dig their fingernails into their flesh . . . A mature and initially poised businessman entered the laboratory smiling and confident. Within 20 minutes, he was reduced to a twitching, stuttering wreck who was rapidly approaching a point of nervous collapse” (1963: 375). Today, Milgram’s obedience studies are generally considered unethical and would not pass muster with regard to contemporary regulations protecting the well-being of research participants. Ironically, partly because Milgram’s studies illustrated the power of hierarchical social relationships, contemporary researchers are at great pains to avoid coercion and allow participants to terminate their participation in any research study at any time without penalty.

Another type of criticism of the obedience studies has questioned their generality and charged that their usefulness in explaining real-world events is limited. Indeed, Milgram conducted his research when trust in authorities was higher than it is nowadays. However, Milgram’s studies have withstood this criticism. Reviews of research conducted using Milgram’s paradigm have generally found obedience levels to be at roughly 60 percent (see, e.g., Blass 2000). In one of his studies Milgram further documented that there was no apparent difference in the responses of women and men. More recent research using more ethically acceptable methods further testifies to the power of obedience in shaping human action (Blass 2000).

Milgram offers an important approach to explaining the Holocaust by emphasizing the bureaucratic nature of evil, which relegated individuals to executioners of orders issued by a legitimate authority. Sociologists have extended this analysis and provided compelling accounts of obedience as a root cause of many horrific crimes, ranging from the My Lai massacre to Watergate (Hamilton & Kelman 1989). However, it is arguably somewhat unclear to what extent Milgram’s findings can help explain the occurrence of the Holocaust itself. Whereas obedience kept the machinery of death running with frightening efficiency, historians often caution against ignoring the malice and sadism that many of Hitler’s executioners brought to the task (see Blass 2004).

Milgram’s dramatic experiments have left a lasting impression beyond the social sciences. They are the topic of various movies, including the 1975 TV film The Tenth Level starring William Shatner. Further, the experiments were memorialized in a 1986 song by the rock musician Peter Gabriel titled “We Do What We’re Told (Milgram’s 37),” a reference to the 37 of 40 participants who obeyed fully in one variant of the experiment.

References:

  • Blass, T. (Ed.) (2000) Obedience to Authority: Current Perspectives on the Milgram Paradigm. Erlbaum, Mahwah, NJ.
  • Blass, T. (2004) The Man Who Shocked the World: The Life and Legacy of Stanley Milgram. Basic Books, New York.
  • Hamilton, V. L. & Kelman, H. (1989) Crimes of Obedience: Toward a Social Psychology of Authority and Responsibility. Yale University Press, New Haven.
  • Milgram, S. (1963) Behavioral Study of Obedience. Journal of Abnormal and Social Psychology 67: 371-378.
  • Milgram, S. (1965) Some Conditions of Obedience and Disobedience to Authority. Human Relations 18: 57-76.
  • Milgram, S. (1974) Obedience to Authority: An Experimental View. Harper & Row, New York.

Encyclopedia Britannica

Stanley Milgram


Stanley Milgram (born August 15, 1933, New York City, New York, U.S.; died December 20, 1984, New York City) was an American social psychologist known for his controversial and groundbreaking experiments on obedience to authority. Milgram's obedience experiments, in addition to other studies that he carried out during his career, generally are considered to have provided important insight into human social behaviour, particularly conformity and social pressure. See also Milgram experiment.

Milgram was born and raised in the Bronx, the second of three children in a working-class Jewish family. As a youth, he was an exceptional student, with interests in science and the arts. At Queens College (later part of the City University of New York [CUNY]), he studied political science, in addition to taking courses in art, literature, and music. In 1953, following his third year at the college, he toured Europe and became increasingly interested in international relations. He was accepted into the graduate program in international affairs at Columbia University. However, in 1954, after completing a bachelor's degree in political science at Queens College, Milgram instead began graduate studies in the social relations department at Harvard University.

At Harvard, Milgram took classes with leading social psychologists of the day, including Gordon Allport, Jerome Bruner, Roger Brown, and Solomon Asch, all of whom greatly influenced the direction of Milgram's academic career. Of particular interest to Milgram were Asch's conformity experiments, which showed that individual behaviour can be influenced by group behaviour, with individuals conforming to group perspectives even when choices made by the group are obviously incorrect. Milgram set out to apply Asch's group technique, with several variations, to the study of conformity on a national level, seeking to explore national stereotypes. He focused initially on the United States and Norway and later added France, using his connections at Harvard to travel to Oslo and Paris to establish study groups there. He used an auditory task to measure conformity, with participants in closed booths asked to distinguish between the lengths of two tones. Participants also heard the responses of other members of the study group, who supposedly occupied closed booths next to the participant (in fact, the group responses were recorded, and the other booths were empty). Milgram's findings suggested that Americans and Norwegians differed little in conformity rates and that, of the three groups, the French were the least conforming.


In 1960, after earning a Ph.D. from Harvard, Milgram accepted a position as assistant professor at Yale University. There he narrowed his research to obedience. Having been acutely aware from his youth of his Jewish heritage and of the tragedies suffered by Jews in Europe during the Holocaust, he was interested in understanding the factors that led people to inflict harm on others. He designed an unprecedented experiment (later known as the Milgram experiment) whereby study subjects, who believed that they were participating in a learning experiment about punishment and memory, were instructed by an authority figure (the experimenter) to inflict seemingly painful shocks on a helpless victim (the learner). Both the experimenter and the learner were actors hired by Milgram, and the shocks were simulated via an authentic-appearing shock generator that was equipped with 30 voltage levels, increasing in 15-volt increments from 15 to 450 volts. Subjects were instructed by the experimenter to deliver a shock to the learner whenever the latter gave an incorrect answer to a question. With each incorrect response, the shock intensity increased. At predetermined voltage levels, the learner (usually in a separate room) either banged on the adjoining wall, cried out in pain and pleaded with the participant to stop, or complained about a fictitious heart condition.
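To make the mechanics of the procedure concrete, the following is a minimal Python sketch of the escalation protocol, not Milgram's actual materials. Only the 30-level, 15-to-450-volt scale and the one-switch-per-error rule come from the description above; the voltages at which the scripted reactions fire are illustrative assumptions.

    # Toy model of the escalation protocol; the reaction thresholds below
    # are illustrative assumptions, not Milgram's actual script.
    levels = list(range(15, 451, 15))          # 30 switches, 15 V apart
    assert len(levels) == 30 and levels[-1] == 450

    reactions = {                              # assumed scripted reactions
        75: "learner grunts",
        150: "learner demands to be released",
        300: "learner pounds on the wall",
        315: "learner falls silent",
    }

    def run_session(n_errors):
        """Escalate one switch per wrong answer, printing any scripted reaction."""
        for volts in levels[:n_errors]:
            print(f"{volts:3d} V  {reactions.get(volts, '')}")

    run_session(12)                            # 12 errors reach 180 V

Note how the structure bears on the findings: no single step asks much more of the subject than the previous one, which is the incremental quality discussed below.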

Prior to carrying out the experiments, Milgram and the Yale psychology students whom he polled about possible outcomes of such a study predicted that only a very small percentage of people (from 0 to 3 percent) would inflict the most extreme shock. Hence, Milgram was surprised by the results of early pilot studies, in which most participants continued through to the extreme 450-volt limit. The first official experiments, carried out by Milgram in 1961, yielded similar results: 26 of the 40 men recruited for the study proved to be fully obedient to the experimenter, delivering shocks through 450 volts. Variations in the experimental design showed that obedience was highest when the learner was in a separate room, as opposed to being in close proximity to the subject (e.g., in the same room or near enough to touch). Subjects persisted in their obedience despite verbally expressing their disapproval of continuing with the shocks.

Milgram suspected that subjects struggled to disengage from the experiment because of its incremental (“slippery slope”) progression: small demands, seemingly benign, became increasingly adverse. Subjects also may have been readily conforming, seeing themselves as inferior to the experimenter in their knowledge of learning, or they may have viewed themselves as free of responsibility, simply carrying out the experimenter's commands.

Although thought-provoking, the experiments and their findings were highly controversial. The situation placed extreme stress on the subjects, some of whom experienced nervous laughter that culminated in seizures. In debriefing, Milgram did not reveal the full truth about the experiments to his subjects, leaving some to think that they really had shocked another person; it was not until many months later that subjects learned the true nature of the experiments. The validity of the findings was also later drawn into question by reports claiming that some participants suspected that they were the subjects being studied, with the aim of the study being to see how far they would obey the experimenter.


The shocking truth of Stanley Milgram's obedience experiments

Milgram dismayed the world when he revealed how little it took to turn everyday people into torturers – but we were misled

By Gina Perry

14 March 2018


Wearing a neat suit and tie, Adolf Eichmann brought the horror of Nazi concentration camps into American living rooms, making a new generation aware of the second world war's atrocities. Eichmann was a high-ranking officer of the Third Reich, and his trial for war crimes was televised nightly across the US from April to August 1961.

Stanley Milgram was riveted. He was a 26-year-old assistant professor at Yale University with childhood memories of the war, such as gathering around the radio with his family in their Brooklyn apartment for news of Jewish relatives in Eastern Europe. As the trial unfolded, Eichmann insisted he was merely following orders. This gave Milgram an idea for a research project that would become one of the most controversial experiments in the history of psychology.

Milgram’s exploration into the limits of obedience to authority captured the public imagination, not least because of his chilling conclusion: that the majority of us could become torturers with just a few words of encouragement from a single authority figure.

I arrived at Yale in 2007, excited to take a close look at this classic experiment and its recently released archive material. But what I found revealed a disturbing, twisted tale. This landmark research is as misunderstood as it is famous.

In the early 1960s, social psychology was still an emerging discipline, one that quickly gained a reputation for experiments that concealed their true nature so as to trick people into behaving naturally. Pioneers like Milgram were expected to develop storytelling, acting and stagecraft skills as part of their research toolkit.

Milgram advertised in the local paper for paid volunteers.…


(CNN) -- If someone told you to press a button to deliver a 450-volt electrical shock to an innocent person in the next room, would you do it?

Stanley Milgram began conducting his famous psychology experiments in 1961.

Common sense may say no, but decades of research suggests otherwise.

In the early 1960s, a young psychologist at Yale began what became one of the most widely recognized experiments in his field. In the first series, he found that about two-thirds of subjects were willing to inflict what they believed were increasingly painful shocks on an innocent person when the experimenter told them to do so, even when the victim screamed and pleaded.

The legacy of Stanley Milgram, who died 24 years ago on December 20, reaches far beyond that initial round of experiments. Researchers have been working on the questions he posed for decades, and have not settled on a brighter vision of human obedience.

A new study to be published in the January issue of American Psychologist confirmed these results in an experiment that mimics many of Milgram's original conditions. This and other studies have corroborated the startling conclusion that the majority of people, when placed in certain kinds of situations, will follow orders, even if those orders entail harming another person.

"It's situations that make ordinary people into evil monsters, and it's situations that make ordinary people into heroes," said Philip Zimbardo, professor emeritus of psychology at Stanford University and author of "The Lucifer Effect: Understanding How Good People Turn Evil."

How Milgram's experiments worked

Milgram, who also came up with the theory behind "six degrees of separation" -- the idea that everyone is connected to everyone else through a small number of acquaintances -- set out to figure out why people would turn against their own neighbors in circumstances such as Nazi-occupied Europe. Referring to Nazi leader Adolf Eichmann, Milgram wrote in 1974, "Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?"

His experiment in its standard form included a fake shock machine, a "teacher," a "learner" and an experimenter in a laboratory setting. The participant was told that he or she had to teach the student to memorize a pair of words, and the punishment for a wrong answer was a shock from the machine.


The teacher sat in front of the shock machine, which had 30 levers, each corresponding to an additional 15 volts. With each mistake the student made, the teacher had to pull the next lever to deliver a more painful punishment.

While the machine didn't generate shocks and a recorded voice track simulated painful reactions, the teacher was led to believe that he or she was shocking a student, who screamed and asked to leave at higher voltages, and eventually fell silent.

If the teacher questioned continuing as instructed, the experimenter simply said, "The experiment requires that you go on," said Thomas Blass, author of the biography "The Man Who Shocked The World: The Life and Legacy of Stanley Milgram" and the Web site StanleyMilgram.com.

About 65 percent of participants pulled levers corresponding to the maximum voltage -- 450 volts -- in spite of the screams of agony from the learner.

"What the experiment shows is that the person whose authority I consider to be legitimate, that he has a right to tell me what to do and therefore I have obligation to follow his orders, that person could make me, make most people, act contrary to their conscience," Blass said.

The design of the new study, led by Jerry Burger, a psychologist at Santa Clara University, imitated Milgram's, even using the same scripts for the experimenter and suffering learner, but the key difference was that this experiment stopped at 150 volts -- the point at which the learner starts asking to leave. In Milgram's experiment, 79 percent of participants who reached that point went all the way to the maximum shock, he said.

To eliminate bias from the fame of Milgram's experiment, Burger ruled out anyone who had taken two or more college-level psychology classes, and anyone who expressed familiarity with it in the debriefing. The "teachers" in this recent experiment, conducted in 2006, also received several reminders that they could quit whenever they wanted, unlike in Milgram's study.

The new results correlate well with Milgram's: 70 percent of the 40 participants were willing to continue after 150 volts, compared with 82.5 percent in Milgram's study -- a difference that is not statistically significant, Burger said.
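For readers who want to check that claim, here is a minimal sketch (not from the article or from Burger's published analysis) of the standard pooled two-proportion z-test applied to the reported figures. The counts 28/40 and 33/40 are assumptions reconstructed from the stated percentages and the group size of 40.

    import math

    def two_prop_ztest(x1, n1, x2, n2):
        """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        return z, math.erfc(abs(z) / math.sqrt(2))   # two-sided normal tail

    # Counts reconstructed from the article (our assumption):
    # 70% of 40 = 28 (Burger) vs. 82.5% of 40 = 33 (Milgram).
    z, p = two_prop_ztest(28, 40, 33, 40)
    print(f"z = {z:.2f}, p = {p:.2f}")               # roughly z = -1.31, p = 0.19

A p-value near 0.19 is well above the conventional 0.05 threshold, consistent with Burger's statement. The article's figures also hang together: 82.5 percent of Milgram's subjects reached 150 volts, 79 percent of those continued to the end, and 0.825 x 0.79 is about 0.65, the famous 65 percent obedience rate.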

Still, some psychologists quoted in the same issue of American Psychologist questioned how comparable this study is to Milgram's, given the differences in methods.

The idea of blind obedience isn't as important in these studies as the larger message about the power of the situation, Burger said. It's also significant that the participant begins with small voltages that increase in small doses over time.

"It's that gradual incremental nature that, as we know, is a very powerful way to change attitudes and behaviors," he said.

Stanford Prison Experiment

This idea of circumstances driving immoral behavior also came out in the Stanford Prison Experiment, a study done in 1971 that is the subject of a film in preproduction, written and directed by Christopher McQuarrie. Work on the film will resume in 2009 after McQuarrie's "Valkyrie" is released, his spokesperson said.

In this study, designed by Stanford's Zimbardo, two dozen male college students were randomly designated as either prison guards or prisoners, and lived in the basement of the university's psychology building playing these roles in their respective uniforms.

Within three days, participants had extreme stress reactions, Zimbardo said. The guards became abusive to the prisoners -- sexually taunting them, asking them to strip naked and demanding that they clean toilet bowls with their bare hands, Zimbardo said. Five prisoners had to be released before the study was over.

Zimbardo's own role illustrated his point: Because he took on the role of prison administrator, he became so engrossed in the jail system that he didn't stop the experiment as soon as this cruelty began, he said.

"If I were simply the principal experimenter, I would have ended it after the second kid broke down," he said. "We all did bad things in this study, including me, but it's diagnostic of the power situation."

Turning the principle around

But while ordinary people have the potential to do evil, they also have the power to do good. That's the subject of the Everyday Heroism project, a collection of social scientists, including Zimbardo, seeking to understand heroic activity -- an area in which almost no research has been done, he said.

Acts such as learning first aid, leading others to the exit in an emergency and encouraging family members to recycle are some heroic behaviors that Zimbardo seeks to encourage.



The Stanford Prison Experiment was massively influential. We just learned it was a fraud.

The most famous psychological studies are often wrong, fraudulent, or outdated. Textbooks need to catch up.

by Brian Resnick


The Stanford Prison Experiment, one of the most famous and compelling psychological studies of all time, told us a tantalizingly simple story about human nature.

The study took paid participants and assigned them to be “inmates” or “guards” in a mock prison at Stanford University. Soon after the experiment began, the “guards” began mistreating the “prisoners,” implying evil is brought out by circumstance. The authors, in their conclusions, suggested innocent people, thrown into a situation where they have power over others, will begin to abuse that power. And people who are put into a situation where they are powerless will be driven to submission, even madness.

The Stanford Prison Experiment has been included in many, many introductory psychology textbooks and is often cited uncritically. It's the subject of movies, documentaries, books, television shows, and congressional testimony.

But its findings were wrong. Very wrong. And not just due to its questionable ethics or lack of concrete data — but because of deceit.


A new exposé published by Medium based on previously unpublished recordings of Philip Zimbardo, the Stanford psychologist who ran the study, and interviews with his participants, offers convincing evidence that the guards in the experiment were coached to be cruel. It also shows that the experiment's most memorable moment — of a prisoner descending into a screaming fit, proclaiming, "I'm burning up inside!" — was the result of the prisoner acting. "I took it as a kind of an improv exercise," one of the guards told reporter Ben Blum. "I believed that I was doing what the researchers wanted me to do."

The findings have long been subject to scrutiny — many think of them as more of a dramatic demonstration, a sort of academic reality show, than a serious bit of science. But these new revelations incited an immediate response. "We must stop celebrating this work," personality psychologist Simine Vazire tweeted in response to the article. "It's anti-scientific. Get it out of textbooks." Many other psychologists have expressed similar sentiments.

(Update: Since this article was published, the journal American Psychologist has published a thorough debunking of the Stanford Prison Experiment that goes beyond what Blum found in his piece. There's even more evidence that the "guards" knew the results that Zimbardo wanted to produce, and were trained to meet his goals. It also provides evidence that the conclusions of the experiment were predetermined.)

Many of the classic show-stopping experiments in psychology have lately turned out to be wrong, fraudulent, or outdated. And in recent years, social scientists have begun to reckon with the truth that their old work needs a redo, the “replication crisis.” But there's been a lag — in the popular consciousness and in how psychology is taught by teachers and textbooks. It's time to catch up.

Many classic findings in psychology have been reevaluated recently


The Zimbardo prison experiment is not the only classic study that has been recently scrutinized, reevaluated, or outright exposed as a fraud. Recently, science journalist Gina Perry found that the infamous “Robbers Cave” experiment in the 1950s — in which young boys at summer camp were essentially manipulated into joining warring factions — was a do-over from a failed previous version of an experiment, which the scientists never mentioned in an academic paper. That's a glaring omission. It's wrong to throw out data that refutes your hypothesis and only publicize data that supports it.

Perry has also revealed inconsistencies in another major early work in psychology: the Milgram electroshock test, in which participants were told by an authority figure to deliver seemingly lethal doses of electricity to an unseen hapless soul. Her investigations show some evidence of researchers going off the study script and possibly coercing participants to deliver the desired results. (Somewhat ironically, the new revelations about the prison experiment also show the power an authority figure — in this case Zimbardo himself and his “warden” — has in manipulating others to be cruel.)


Other studies have been reevaluated for more honest methodological snafus. Recently, I wrote about the “marshmallow test,” a series of studies from the early ’90s that suggested the ability to delay gratification at a young age is correlated with success later in life. New research finds that if the original marshmallow test authors had had a larger sample size and greater research controls, their results would not have been the showstoppers they were in the ’90s. I can list so many more textbook psychology findings that have either not replicated or are currently in the midst of a serious reevaluation.

  • Social priming: People who read “old”-sounding words (like “nursing home”) were more likely to walk slowly — showing how our brains can be subtly “primed” with thoughts and actions.
  • The facial feedback hypothesis: Merely activating muscles around the mouth caused people to become happier — demonstrating how our bodies tell our brains what emotions to feel.
  • Stereotype threat: Minorities and maligned social groups don't perform as well on tests due to anxiety about confirming a negative stereotype about their group.
  • Ego depletion: The idea that willpower is a finite mental resource.

Alas, the past few years have brought about a reckoning for these ideas and social psychology as a whole.

Many psychological theories have been debunked or diminished in rigorous replication attempts. Psychologists are now realizing it's more likely that false positives will make it through to publication than inconclusive results. And they've realized that experimental methods commonly used just a few years ago aren't rigorous enough. For instance, it used to be commonplace for scientists to publish experiments that sampled about 50 undergraduate students. Today, scientists realize this is a recipe for false positives, and strive for sample sizes in the hundreds and ideally from a more representative subject pool.
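To see why a sample of about 50 per group is such a weak instrument, here is a rough power calculation (mine, not from the article). Both the normal approximation to the two-sample t-test and the assumed effect size of d = 0.3, a typical smallish psychology effect, are simplifying assumptions.

    import math
    from statistics import NormalDist

    def approx_power(d, n_per_group, alpha=0.05):
        """Approximate power of a two-sided, two-sample comparison of means
        (normal approximation to the t-test)."""
        nd = NormalDist()
        z_crit = nd.inv_cdf(1 - alpha / 2)         # cutoff for significance
        z_effect = d * math.sqrt(n_per_group / 2)  # expected z if the effect is real
        return nd.cdf(z_effect - z_crit)

    for n in (50, 200):
        print(f"n = {n:3d} per group -> power ~ {approx_power(0.3, n):.0%}")
    # n =  50 per group -> power ~ 32%
    # n = 200 per group -> power ~ 85%

At roughly 32 percent power, most real effects of that size are missed, and the "significant" results that do surface are disproportionately flukes or inflated estimates, which is the dynamic behind the false positives mentioned above.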

Nevertheless, in so many of these cases, scientists have moved on and corrected errors, and are still doing well-intentioned work to understand the heart of humanity. For instance, work on one of psychology’s oldest fixations — dehumanization, the ability to see another as less than human — continues with methodological rigor, helping us understand the modern-day maltreatment of Muslims and immigrants in America.

In some cases, time has shown that flawed original experiments offer worthwhile reexamination. The original Milgram experiment was flawed. But at least its study design — which brings in participants to administer shocks (not actually carried out) to punish others for failing at a memory test — is basically repeatable today with some ethical tweaks.

And it seems like Milgram's conclusions may hold up: In a recent study, many people found demands from an authority figure to be a compelling reason to shock another. However, it's possible, due to something known as the file-drawer effect, that failed replications of the Milgram experiment have not been published. Replication attempts at the Stanford prison study, on the other hand, have been a mess.

In science, too often, the first demonstration of an idea becomes the lasting one — in both pop culture and academia. But this isn’t how science is supposed to work at all!

Science is a frustrating, iterative process. When we communicate it, we need to get beyond the idea that a single, stunning study ought to stand the test of time. Scientists know this as well, but their institutions have often discouraged them from replicating old work in favor of pursuing new, exciting, attention-grabbing studies. (Journalists are part of the problem too, imbuing small, insignificant studies with more importance and meaning than they're due.)

Thankfully, there are researchers thinking very hard, and very earnestly, about how to make psychology a more replicable, robust science. There's even a whole Society for the Improvement of Psychological Science devoted to these issues.

Follow-up results tend to be less dramatic than original findings, but they are more useful in helping discover the truth. And it's not that the Stanford Prison Experiment has no place in a classroom. It's interesting as history. Psychologists like Zimbardo and Milgram were highly influenced by World War II. Their experiments were, in part, an attempt to figure out why ordinary people would fall for Nazism. That's an important question, one that set the agenda for a huge amount of research in psychological science, and is still echoed in papers today.

Textbooks need to catch up

Psychology has changed tremendously over the past few years. Many studies used to teach the next generation of psychologists have been intensely scrutinized, and found to be in error. But troublingly, the textbooks have not been updated accordingly.

That's the conclusion of a 2016 study in Current Psychology. “By and large,” the study explains (emphasis mine):

introductory textbooks have difficulty accurately portraying controversial topics with care or, in some cases, simply avoid covering them at all. ... readers of introductory textbooks may be unintentionally misinformed on these topics.

The study authors — from Texas A&M and Stetson universities — gathered a stack of 24 popular introductory psych textbooks and began looking for coverage of 12 contested ideas or myths in psychology.

The ideas — like stereotype threat, the Mozart effect, and whether there's a “narcissism epidemic” among millennials — have not necessarily been disproven. Nevertheless, there are credible and noteworthy studies that cast doubt on them. The list of ideas also included some urban legends — like the one about the brain only using 10 percent of its potential at any given time, and a debunked story about how bystanders refused to help a woman named Kitty Genovese while she was being murdered.

The researchers then rated the texts on how they handled these contested ideas. The results found a troubling amount of “biased” coverage on many of the topic areas.


But why wouldn’t these textbooks include more doubt? Replication, after all, is a cornerstone of any science.

One idea is that textbooks, in the pursuit of covering a wide range of topics, aren’t meant to be authoritative on these individual controversies. But something else might be going on. The study authors suggest these textbook authors are trying to “oversell” psychology as a discipline, to get more undergraduates to study it full time. (I have to admit that it might have worked on me back when I was an undeclared undergraduate.)

There are some caveats to mention with the study: One is that the 12 topics the authors chose to scrutinize are completely arbitrary. “And many other potential issues were left out of our analysis,” they note. Also, the textbooks included were printed in the spring of 2012; it’s possible they have been updated since then.

Recently, I asked on Twitter how intro psychology professors deal with inconsistencies in their textbooks. Their answers were simple. Some say they decided to get rid of textbooks (which saves students money) and focus on teaching individual articles. Others have another solution that's just as simple: “You point out the wrong, outdated, and less-than-replicable sections,” Daniël Lakens, a professor at Eindhoven University of Technology in the Netherlands, said. He offered a useful example of one of the slides he uses in class.

Anecdotally, Illinois State University professor Joe Hilgard said he thinks his students appreciate “the ‘cutting-edge’ feeling from knowing something that the textbook didn’t.” (Also, who really, earnestly reads the textbook in an introductory college course?)

And it seems this type of teaching is catching on. A (not perfectly representative) recent survey of 262 psychology professors found more than half said replication issues impacted their teaching. On the other hand, 40 percent said they hadn't. So whether students are exposed to the recent reckoning is all up to the teachers they have.

If it’s true that textbooks and teachers are still neglecting to cover replication issues, then I’d argue they are actually underselling the science. To teach the “replication crisis” is to teach students that science strives to be self-correcting. It would instill in them the value that science ought to be reproducible.

Understanding human behavior is a hard problem. Finding out the answers shouldn’t be easy. If anything, that should give students more motivation to become the generation of scientists who get it right.

“Textbooks may be missing an opportunity for myth busting,” the Current Psychology study's authors write. That's, ideally, what young scientists ought to learn: how to bust myths and find the truth.

Further reading: Psychology’s “replication crisis”

  • The replication crisis, explained. Psychology is currently undergoing a painful period of introspection. It will emerge stronger than before.
  • The “marshmallow test” said patience was a key to success. A new replication tells us s’more.
  • The 7 biggest problems facing science, according to 270 scientists
  • What a nerdy debate about p-values shows about science — and how to fix it
  • Science is often flawed. It’s time we embraced that.


The Partially Examined Life Philosophy Podcast



Episode 176: Situationism in Psych: Milgram & Stanford Prison Experiments (Part One)

November 6, 2017 by Mark Linsenmayer



Do difficult situations make good people act badly? Are there really “good” and “bad” people, or are we all about the same, but put in different situations? Situationism is supported by Milgram's experiment, where most subjects could be easily pressured into delivering shocks to an innocent person (really an actor… punked!). A more immersive example was provided by The Stanford Prison Experiment, where students took on the roles of guard and prisoner and quickly became sadistic and passive, respectively. John Doris argues that situationism is a direct attack on virtue ethics: that there really is no such thing as a virtue like “bravery” or “generosity” that cuts across all sorts of situations. While there are of course consistent personality traits, these don't map onto the virtues as depicted by Aristotle and our common cultural notions. Rather, they're more context-dependent, specific to certain types of situations.

David and his pal Tamler Sommers (who appeared on PEL ep. 93 on free will) previously discussed situationism back in Very Bad Wizards ep. 9. By comparing the two, you can objectively compare the quality of the two podcasts, and/or Dave's virtue in 2012 vs. 2017, and/or how he talks to many fewer listeners vs. a very large audience.

Watch the new version of the Milgram experiment as shown on the BBC. Read about the criticisms of the experiment on Wikipedia. This episode of The Psych Files podcast talks about the recent replication of the study at Santa Clara University.

Watch the documentary on the Stanford Prison Experiment. Watch Zimbardo’s 2007 talk about The Lucifer Effect (the book that the recent film about the experiment is based on) and his experience defending one of the defendants in the Abu Ghraib torture case; he describes Milgram’s experiment and those following it.

Netflix subscribers can see both studies dramatized in films from 2015: Experimenter and The Stanford Prison Experiment.

Listeners may want to revisit PEL's three episodes on Aristotle's Nicomachean Ethics, starting with ep. 5.

Continues with Part Two. Get the full, unbroken, ad-free Citizen Edition. Please support PEL!


November 6, 2017 at 11:35 am

You guys mention that people weren't allowed to leave, but in other places I've read that two people left mid-study. I didn't get the same sense that Zimbardo was someone with an ulterior motive. In his paper he writes, “We were horrified because we saw some boys treat others as if they were despicable animals, taking pleasure in cruelty, while other boys became servile, dehumanized robots who thought only of escape, of their own individual survival and of their mounting hatred of the guards.”

He also says, “I terminated the experiment not only because of the escalating level of violence and degradation by the 'guards' against the 'prisoners'…but also because I was made aware of the personal transformation that I was undergoing personally…I had become a Prison Superintendent, the second role I played in addition to that of Principal Investigator. I began to talk, walk, and act like a rigid institutional authority figure more concerned about the security of 'my prison' than the needs of the young men entrusted to my care as a psychological researcher.” (Zimbardo 2005, A situationist perspective on the psychology of evil: Understanding how good people are transformed into perpetrators; see also Zimbardo, P. G., Maslach, C., & Haney, C. 1999, Reflections on the Stanford Prison Experiment: Genesis, transformations, consequences.)

Also, perhaps I misheard in the podcast, but it seemed like you guys discounted the participants because they responded to an ad for $15. There were nearly a hundred volunteers, he chose 24, and he said in an interview that he purposefully chose bright college students. He wanted to show it could happen to anyone. If anything, he seems to have skewed the participants toward being more intelligent than average. They also had a day when parents could come and visit, and they did, so it's not like these were poor, neglected, psychologically damaged people. I'm not saying that he did the proper screening. All I read was that it was “extensive.” I'm simply saying that it seems you are casting unwarranted doubt on this study. That may be totally justified, but I'm just not seeing it and wish you could have explained more fully why the extreme doubt was placed. Certainly it is unethical, but I don't know if that equals flawed in terms of validity. If anything, I think it makes it more compelling.

November 6, 2017 at 5:15 pm

When you guys were discussing character over time versus just one instance, I thought this was the best part of the conversation. I don't think these experiments can really determine character so much as capacity. As I read these (and I read them a long while ago) I thought the takeaway point was that we have the capacity to do things that we might believe we never would. It made me think about people who are labeled as sex offenders – who bear this Scarlet Letter their whole lives – and while I can never imagine doing what any one of them did, it is in essence pegging their character based on one action, which seems unjustified. Somewhere I read about Zimbardo saying that there were three types of guards: one that followed all the rules, one that did special favors for the prisoners, and the mean ones (I'm summarizing). I don't know what the breakdown was of those three categories, but that sounded pretty true to life to me, and a whole lot different than what I originally thought, which was that everyone was a mean guard and the whole experiment had to be ended because of that. It's a much more ambiguous finding, IMO. To me this is the flawed part of his experiment. We don't see that everyone can be cruel, because not everyone was. They don't stand up for the inmates because, perhaps, they know they are there by choice and might behave differently if it were a real prison and they saw injustices. It seems highly dependent on things that were not considered in the experiment. For example, I am a pleaser. I'm always going to do the right thing and always follow the rules. I often wonder how/if my own personal moral compass comes in, because my drive to conform socially and to make sure I am doing what I am supposed to do is so strong. In this case, I could see myself thinking that I am supposed to be either a “good guard” or a “good prisoner” and never once step out and be bold and say HEY! This whole thing is wrong! This is unethical.



November 8, 2017 at 6:44 pm

One of the main factors in Nazi Germany was anti-semitism: Hitler channeled a pre-existing anti-semitism in the German population.

People generally seem willing to act cruelly towards members of groups which they consider to be “not their peers” or inferior or their enemy. “Normal” white people in the U.S. did nothing about Jim Crow legislation against black people and were complicit in its functioning. “Normal” U.S. military personnel torture “enemy combatants” in Guantanamo.

I put “normal” between quotation marks because first of all, I’m not sure that being normal means being virtuous at all. Hannah Arendt says that “normal” people in Nazi Germany did not help to save Jews: those who did help to save them tended to have been outsiders of sorts, for example, Schindler, who was a “corrupt” businessman and a bit of a con artist. “Normal” people everywhere tend to follow the herd and if the herd hates Jews or blacks or gays, they go along willingly.




Program: Milgram Shock and Stanford Prison — what we misunderstand about the most infamous experiments in psychology

Program: All In the Mind

Brought to you by ABC Radio National

Presented by Sana Qadar

What makes people do evil things?

Psychologist Stanley Milgram wanted to understand if people could be led to do awful things, just by being told to do them.

The experiment he would devise to test this would become one of the most infamous examples of unethical studies in the field of psychology: The Milgram Shock Experiment.

But Professor Alex Haslam says that's not the full picture… And the findings are misunderstood.

Guest: Professor Alex Haslam, Professor of Social and Organizational Psychology and Australian Laureate Fellow, University of Queensland

Producers: Jennifer Leake and Rose Kerr

Audio Engineer: Isabella Tropiano


Sana Qadar: For as long as there have been humans, there has been reason to wonder: why do people do evil things? Psychologist Stanley Milgram wanted to find out the answer to that, and he had good reason.

Alex Haslam: One of the key things to know about him was that he was of Jewish descent. His family, most of them had fled Germany just prior to the Holocaust or to escape it, but not all of them had.

Sana Qadar: Those who were left behind were mostly killed.

Alex Haslam: And the fact that many of his relatives had perished in the Holocaust provided a critical impetus for his own research, which was to understand how it was possible that civilised, decent, ordinary people might have perpetrated or come to perpetrate these acts of horrendous evil.

Sana Qadar: Stanley Milgram was especially interested in the idea of obedience. As Nazi figures faced trial after World War II, many claimed they were simply following orders. Milgram wanted to interrogate this; could people really be led to do awful things, just by being told to do them?

Alex Haslam: The banality of evil, the idea that when we do evil things it's just a rather banal process of obedience and just kowtowing to authority.

Sana Qadar: But the experiment he would devise to test this would become one of the most infamous examples of unethical research in the field of psychology.

You're listening to All in the Mind, I'm Sana Qadar.

Not only was the Milgram Shock Experiment unethical, but what it demonstrates about why people do evil things is also deeply misunderstood, says University of Queensland Psychology Professor Alex Haslam.

Alex Haslam: So what you want people to believe is that this stuff is all thoughtless, whereas actually it's anything but; it requires intellectual and behavioural engagement.

Sana Qadar: So today, we revisit the Milgram Shock Experiment, what happened, and what it really tells us about the nature of evil.

Alex has been studying the Milgram Shock Experiment for the past two decades, but his fascination goes back even further than that.

Alex Haslam: I think every student of psychology is going to come across these studies at some point or other. I think I did my first week at university. But I think pretty much anybody who hears about the study and the line of research that Milgram was doing is going to ask those questions, like why, why did he find what he did, why did people behave as they did in those studies?

Sana Qadar: Before we ponder that, let's talk about how the study was designed. Stanley Milgram, as we've established, wanted to understand what made people do terrible things. He wanted to look at their willingness to follow orders. But he didn't tell study participants that's what he was investigating. Instead, he told them his study was looking at how punishment affected learning.

Alex Haslam: And what happened is they came into the laboratory, and in the laboratory was an experimenter and another person who looked like a participant but was actually a stooge, a confederate of the experimenter.

Sana Qadar: The stooge and the unwitting study participant would then draw lots over who would play the role of the so-called 'teacher' in the experiment, and who would play the role of the 'learner'. But this draw was rigged as well, the stooge was always going to end up the learner, and the participant would always play the teacher.

Alex Haslam: Then the learner had to learn some word pairs, and then subsequently was asked to recall them. And as the task went on, they made errors.

Sana Qadar: As they did, the teacher had to administer shocks to them, using a machine that had a series of buttons on it.

Alex Haslam: Going up in 30 intervals of 15 volts, from 15 volts all the way to 450 volts, which said at the far end, 'danger, extreme shock', and actually at the far end it said 'XXX'. So effectively, they were going to have to administer lethal shocks. And the point was that every time the learner made an error, then the shock increased.

Sana Qadar: To be clear, the shocks weren't real, nothing was actually going to happen. But the study participants, the teacher, they didn't know that.

Alex Haslam: And the question that Milgram really was interested in was nothing to do with the effects of punishment on learning, it's whether or not the participants would go along with the experimenter's instructions to administer these shocks to the learner. And the bottom line and the thing that everybody remembers from the studies is that in one critical variant of that experiment, what became known as the baseline condition, 26 of the 40 participants went all the way to the extreme end of the scale, suggesting that a normal person would be effectively willing to kill somebody else simply because it was required of them in the context of taking part in a science experiment and they were asked to do so by someone in authority.

Sana Qadar: That's incredible: more than half were willing to go all the way to lethal shocks.

Alex Haslam: Yeah, and that's the bit that everybody remembers; that's obviously the headline finding from the study.

Sana Qadar: But Professor Haslam says this is the first thing we get wrong about the Milgram Shock Experiment. That result only occurred in one scenario.

Alex Haslam: As it turns out, there were actually about 25 variants of the study, and in different variants the level of obedience varied between 0% and 100%. So actually it was very contingent behaviour, but that one condition is the one everybody remembers.

Sana Qadar: What was the difference in that baseline condition versus others where there was 0%?

Alex Haslam: In that basic condition, the experimenter was in the room with the teacher, and the learner was in a different room. But in other variants, for example, the learner was in the same room right next to the teacher, and the experimenter was in a different room. Or sometimes they did the experiment at Yale, in the laboratories there, but they did other variants where they did it in a downtown shopping centre.

They did other variants where there were two other confederates who both refused to administer shocks. And then the question was, well, when two other people say 'I'm not going to do it', what do you do? And all of these different variants had very profoundly different results. And I mean, the dynamics of that in themselves are incredibly interesting, but they're a much neglected feature of the paradigm and of its findings.

Sana Qadar: So there was something about the person who was on the receiving end of that shock not being in the room that made them more willing to administer the punishment.

Alex Haslam: Yeah. One of the studies that we've done more recently, trying to interrogate what was going on there, showed that, broadly speaking, things which lead you to identify with the experimenter and with the science make you much more likely to obey; that's the case, for example, if the experimenter is in the room, or if you do it at Yale, and so on and so forth. And things that lead you to identify with the learner actually make you less likely to obey, less likely to punish them. So if the learner is sitting next to you, and you can see them seemingly writhing in agony because you've just administered a shock to them, then you're much less likely to administer it. So actually the power of the Milgram paradigm is that it captures the drama or the tension between these two psychological forces: identification on the one hand with the science and the experimenter, and on the other hand with the learner as a representative of the general community.

Sana Qadar: Professor Alex Haslam says Milgram's early writing on his studies reflected this nuance. But as the years went on, that changed.

Alex Haslam: The studies were conducted in 1961 and 1962, and in the first couple of things that he wrote about them, he focused a lot on these kinds of dynamics and issues to do with identification. By the time he wrote his book Obedience to Authority in 1974, he'd actually settled on a much more straightforward account, which was basically the idea that when people are given instructions by a person in authority, they just cede responsibility to that authority and, if you like, blindly obey them. So effectively his model becomes a model of blind obedience.

What's interesting is that if you look at his accounts and his notebooks, and myself and my colleague, Steve Reicher, have spent a lot of time at Yale in the archives there going through them, it's very clear that early on there was no way that he thought that was what was happening. And when you actually look at the transcripts and all the materials, it's pretty clear that it was anything but blind obedience. But that's what people have come to understand the studies as showing, and that's kind of what it says in most of the psychology textbooks. And if anyone listening to this program has ever studied psychology, they almost definitely will have heard of the Milgram studies, and they almost definitely would have been presented with that account of them.

The problem is it doesn't explain the findings at all. In particular, the idea of blind obedience doesn't explain the variance that you see across the conditions. And realistically, Milgram knew it didn't. But over time, I think he just took the path of least resistance and told the story that people wanted to hear, which was the story of blind obedience; that kind of went down quite well in certain quarters, I think.

In the intervening 60 years, lots of historians and lots of other people have been incredibly critical of Milgram's analysis, arguing that it really is a total misrepresentation of what goes on in the world.

Sana Qadar: So what do the Milgram Shock Experiments tell us then? Well, Alex says a lot about the role of leadership and buy-in.

Alex Haslam: Our core argument is that to the extent that the Milgram study and others like it shed light on the process of tyranny, they do so by showing that this is a process by which people get recruited to particular projects, tyrannical evil projects, but they do so through a process of leadership in which they're led to identify with a particular cause. In Milgram's studies, the cause was science. And they believe that what they are doing is good. So very few people sign up to, you know, evil projects because they're evil, they sign up to them because they think they're good and they're making a worthwhile and valuable contribution to communities that matter to them, whether that's Germany or white America, or science.

And the process of identification is absolutely critical. And on the one hand, you have leadership which cultivates that identification, and gets people to believe in us and our project, and Milgram did that in his studies. And on the part of the participants then, rather than obedience, what you get is something that we call engaged followership, which is where they buy into the project and then they go along with it to the extent that they identify with it. And in that context, they respond creatively to the injunctions and urgings of their leader.

And just to give you one real-world application of those ideas, we recently worked on a paper about the attack on the Capitol on January 6. What you see is that dynamic between Trump and his followers, where Trump is urging them to identify with the cause of overturning the result of the election…

Donald Trump: [archival] Mike Pence is going to have to come through for us, and if he doesn't, that will be a sad day for our country.

Alex Haslam: And then those followers, to the extent that they identify with him, then show that engaged followership and then act and respond creatively to what they perceive as his direction or where they think he wants them to go.

[Audio of Capitol Hill riot]

Sana Qadar: Alex says a similar dynamic played out in Nazi Germany.

Alex Haslam: The British historian Ian Kershaw refers to that in the context of the Nazi state as a process of 'working towards the Führer'. So he says, look, the dynamic of Nazi Germany was not a dynamic of obedience. It wasn't that Hitler told people what to do and they went and did it; on the contrary, one of the things that was characteristic of the German state was that there weren't a lot of orders, and that's exactly why you can have Holocaust deniers who say no one ever ordered that, because in a sense they didn't. But what people did was ask, 'What is it that the Führer wants me to do?' and then they responded creatively to what they understood his will to be. And the same is true of most acts of tyranny, petty or large-scale: when people do those things, it's because they think that they're right and they identify with a leadership that points them in the right direction but doesn't necessarily step them through what they have to do, leaving them instead to their own creative devices.

Sana Qadar: Okay, coming back to the Milgram Shock Experiment, let's talk about what made this study unethical. Because the shocks being administered weren't real; no one was actually getting shocked. So how did this study cross over into unethical territory?

Alex Haslam: Well, it's really fascinating, in the sense that part of the power of the Milgram paradigm is that when you watch it, you see the tension in the eyes and the body language of participants. And the point there is that precisely because Milgram has created this tense situation, this situation of psychological conflict, being torn between the experimenter and the learner, it's a very aversive situation for the teachers. And again, many of them resolve that by ostensibly being willing to, as it were, kill another person. Well, you can imagine that at the end of the experiment there is a big reveal and you say, 'Well, actually, the shocks weren't real and this person was an actor, but, by the way, you were willing to kill them.' The idea that you could confront people with evidence of their own willingness to kill somebody is in itself pretty confronting. And there's plenty of evidence that the participants found the whole experience incredibly stressful. And the idea that you should have to endure that level of psychological torture in the name of science is, I think, something that people correctly have a lot of difficulty with.

Sana Qadar: Do we know what happened to the participants and how this experiment affected them going forward?

Alex Haslam: So one of the really interesting things about the way Milgram managed the process was that at the end of the study, he gave the participants, the people who had been administering the shocks, a debriefing, and in it, he, if you like, managed their guilt and their anxiety and stress by explaining to them that they had indeed done something really wonderful. He said, 'Look, you know, I know this was really stressful for you, but it was really important, because, okay, we didn't find out about the effects of punishment on learning, but what we have found out is something about people's willingness to obey the toxic instructions of an authority figure. And this is really important for understanding things in the world, and you have made a great contribution to science.' And the evidence is that most of them left the laboratory reasonably comfortable with that explanation.

Sana Qadar: Really.

Alex Haslam: They said, okay, I get it now and that's kind of interesting. And they went away…the critical bit there was that that bit of the process was itself really about Milgram's leadership and their engaged followership, because now he's saying, 'Look, the science here was really important, and you've made a really good contribution. Yeah, it was painful, but what you did was good.'

The really interesting thing then is they go out into the world…and there's one really interesting story that's told about one participant where he goes into a bar in New Haven in Connecticut, and he meets a school teacher, this is many years later, and they're talking and the guy says, 'Oh, I teach psychology,' and the other guy goes, 'Oh yeah, I once took part in a psychology study,' and it transpires he'd been a participant in Milgram's research, and this teacher was teaching about Milgram's research. And the teacher says, 'Look, it would be really fascinating if you could come along to my class and talk to my students about your experience.' And the guy says, 'Absolutely, I'll be really happy to come along.'

So he goes along to the class thinking that all the students are going to think he was such a wonderful guy because he'd done this amazing thing for science. But of course, that's not their view at all. They think he's a monster, and they're going, 'How the hell did you do it? Why did you do that?' So the ethics have this sort of nuance to them too, which is that Milgram resolved the ethics by making people feel that what they had done was good and worthy. But you can argue, well, how ethical is that? How ethical is it to get people to feel good about having participated in a study of this form? And that's not a question that people routinely ask.

Sana Qadar: So all of Alex's analysis points to the power of a leader, but the situation people find themselves in, the context, matters too. This is where perhaps the most well-known unethical experiment in psychology comes into play.

Alex Haslam: So we talked about Milgram, and the message people take from Milgram is that we blindly obey the orders of an authority figure and that this leads us to perpetrate these acts of harm. The Stanford Prison Experiment ostensibly tells an even more shocking story, which is that you don't even necessarily need a leader; you just have to put people in situations which seem to require them to behave in a toxic way, and they will conform to the expectations of that context.

Sana Qadar: A quick refresher on the Stanford Prison Experiment. It took place in 1971, in the basement of the Stanford psychology department. Psychologist Philip Zimbardo created a mock prison down there, and he recruited a bunch of students and assigned some of them to play the role of prisoners and others to play the role of guards. The aim was to measure the effect of role playing and social expectations on behaviour over a period of two weeks. But things turned so brutal so quickly that Zimbardo terminated the experiment after just six days.

Alex Haslam: Zimbardo's story then is that he just put the guards and prisoners in there, said, 'Okay, you do your best, see what happens,' and then stood back and observed the events and the dynamics unfold. And they unfolded in such a way that, he argues, because they conformed to the expectations associated with their roles, the guards started to mete out punishment and abuse to the prisoners, because that's what they thought was demanded of their roles; that was the script, if you like, that they had inherited. It became a living hell, and the experiment was stopped after six days, such was the brutality that the prisoners were experiencing. And Zimbardo's argument was that's because, again, normal people just conform to the expectations of a given situation: you don't have to be a monster in order to do monstrous things, you just have to find yourself in a monstrous situation of the form that he had created.

Sana Qadar: But Professor Alex Haslam, again, has a different take.

Alex Haslam: Well, let's start with the leadership bit. Zimbardo, in his own account of the study, says we had no role in the study; the guards' 'behavioural scripts' (his phrase) were the sole source of guidance. But actually, when you look into the Stanford archives, you see a number of things. One is that Zimbardo gave a briefing to his guards before the study, in which he laid out pretty clearly how he thought this thing was going to unfold. And in that context, it's pretty clear too that he really saw the guards as his research assistants; they were the people who were going to make this thing happen. There's other research, actually, which shows very neatly that if you just play people Zimbardo's briefing to the guards, they come away with a very clear idea of what it was that he wanted them to do. Then as things unfold, when some of the guards do that, he obviously doesn't interfere and say, 'No, you shouldn't do that, stop'; he lets them carry on, and he gives the indication that that's kind of what he wants. So there's that bit of it.

Sana Qadar: But not all of the guards were so keen and willing to follow orders.

Alex Haslam: This came to light a couple of years ago through a French researcher who went into the Stanford archive and unearthed some interviews that Zimbardo and his fellow experimenters had conducted with guards who didn't want to brutalise the prisoners. So that for a start tells you that it wasn't blind conformity. And what's interesting there is that Zimbardo and his experimenters then interject and say, 'No, no, it's really important that you go and behave like a tough guard, because we're going to make these really important statements about prison brutality,' and this, that and the other.

So they, rather like Milgram, try to persuade those reluctant guards that this is a worthy cause. And of course that was why they were there. Again, the participants had come to the laboratory to make a contribution to science, and to the extent that they thought this was a worthwhile project, they kind of went along with it; but many of them didn't buy into it, and so they didn't go along. So, you see, Zimbardo's leadership, which has been totally airbrushed out of the analysis, was critical. But so too was that engaged followership on the part of the participants.

Sana Qadar: Zimbardo recorded videos of how the Prison Experiment unfolded over the six days. And there's one exchange that happens towards the end of the study between one of the most brutal guards and one of the prisoners. Alex says it's illustrative of his arguments.

Alex Haslam: There's a bit of a tense interaction, and the guard says to the prisoner, 'If you had been a guard, what do you think you would have done?' And the prisoner says, 'I don't know what I'd have done, all I know is that if I had been a guard, I don't think I would have taken the job on as enthusiastically as you did. And if I had been a guard, I don't think it would have been such a masterpiece,' is the phrase he uses.

So the point there, again, is that we've been handed down the view that evil and tyranny on the part of the perpetrators is just like painting by numbers: you just follow your orders, or you just do what you're told, or you just conform blindly. And our analysis and the work we've done around it, which I think aligns much more with the historical evidence too, is that, no, tyranny where it's effective is not a paint-by-numbers exercise, it's a masterpiece, and it requires that very constructive engagement by perpetrators.

Historians talking about this say that's precisely why the Nazi state was so frightening and so effective: you had that high level of engagement by most Nazis. By the same token, if you look at the Soviet state as it emerged subsequently, it wasn't characterised by those high levels of identification; it didn't have the dynamism and wasn't anything like as effective.

Sana Qadar: So why does this idea of blind obedience being the culprit in a lot of wrongdoing persist then? Why do we so often fail to see the role of leaders? Well, it's self-serving in a sense.

Alex Haslam: We argue the reason for that is that when people are talking about these things in the world at large, often it's the leaders who are ostensibly going to be held to account for them. And leaders want to be able to say, 'No, no, no, it wasn't me, this was just human psychology, and these people were just sheep blindly going along with whatever they were told to do.' So you want an analysis that lets you say that. Take, for example, Volkswagen and Dieselgate.

Sana Qadar: Dieselgate, also known as the Volkswagen emissions scandal, went down in 2015 when Volkswagen was found to have installed devices on its cars in the US that could cheat emissions tests.

Alex Haslam: What you've got there was exactly this dynamic. Volkswagen was saying, look, we've got to sell our cars, we've got these crazy environmental laws, we need to find a way around them. And the engineers go and they devise these instruments that allow them to bypass that, okay, that's engaged followership, because they identify with Volkswagen, they don't like this whole environmental stuff.

But when they get to court, well, Volkswagen don't want to say, 'Yeah, actually, you know what, we created an environment in which we wanted our engineers to solve this problem.' No, they just say, 'No, there were a few bad apples here, and there was a bit of stuff that we weren't really in control of, and that's just what kind of happens.' And of course that's what happened to all the people that Murdoch hung out to dry in the phone hacking thing. It's 'nothing to see here'; it's all about these daft or disengaged followers, who are just kind of conforming blindly without really thinking about it. So what you want people to believe is that this stuff is all thoughtless, whereas actually it's anything but; it requires intellectual and behavioural engagement.

Sana Qadar: One of the biggest legacies of the Milgram Shock Experiment and the Stanford Prison Experiment is that afterwards ethics standards for psychological research were finally introduced in the United States.

Alex Haslam: People said, hang on, this just isn't right. And those two studies really precipitated a tightening up, a clamping down, on that kind of research. And the thing that's most fascinating is that most people, when they persuade themselves or other people to do bad things, ultimately do so because they've persuaded themselves that this is a worthy enterprise. Milgram absolutely believed it was a worthy enterprise, because he was really interested in studying the Holocaust and understanding why half his family had been killed. I mean, that seems a pretty reasonable thing to want to understand. And if the idea is that you're going to stress a few participants in the process of answering that question, you can see how weighing the costs and benefits is going to lead you to say, well, maybe we do have to do some difficult science here.

But ultimately, nobody does those things because, when they do that maths, they come out thinking, you know what, I'm just a bad person. Whenever we do harm, whether as scientists or out in the world, generally speaking we do it because we think it's for the greater good.

Sana Qadar: That's Alex Haslam, a Professor of Social and Organisational Psychology at the University of Queensland. This episode was produced by myself, Jennifer Leake, and Rose Kerr; our sound engineer was Isabella Tropiano. That's it for All in the Mind. I'm Sana Qadar, and I'll catch you next time.

Stanley Milgram (Psychologist Biography)

In 2002, the Review of General Psychology listed Milgram as the 46th most eminent psychologist of the 20th century.

Who Is Stanley Milgram?

Stanley Milgram was an American social psychologist, researcher, and author. He is best known for his infamous obedience experiment. Milgram’s work contributed significantly to a deeper understanding of human nature and helped to establish ethical standards for future psychology experiments. 

Stanley Milgram's Early Life

Stanley Milgram was born on August 15, 1933, in the Bronx, New York. He was the second of three children born to Samuel and Adele Milgram, working-class Jewish immigrants from Eastern Europe. Milgram’s mother was from Romania and his father, from Hungary. His father was a baker and his mother worked in the bakery.

Samuel and Adele Milgram were hardworking people who impressed upon their children the importance of education and having a profession. Milgram’s sister, Marjorie, was a year and a half older than he was, and his brother, Joel, was five years younger. Stanley enjoyed a very close relationship with Joel, who was always proud of his older brother’s achievements.

The Milgram family lived in a neighbourhood that consisted primarily of Jewish immigrants. As a child, Milgram had very little interaction with, or knowledge about, the non-Jewish world. The year of his birth was also the year Adolf Hitler took control of Germany, and with the rise of Nazism, his parents became increasingly concerned about the welfare of their Jewish relatives in Europe. Milgram witnessed their constant worry and as a boy, often huddled with them around the radio listening anxiously to news about the war that had broken out in Europe.

The events in Europe had a significant impact on Milgram, who identified with the suffering of his fellow Jews at the hands of the Nazis. In his bar mitzvah speech, he reflected on their tragic fate and referred to it as part of his own heritage. His concern over the Jewish community in Europe remained with him long into adulthood and even helped to shape his famous obedience experiments.

As a boy, Milgram displayed above-average intelligence. He shied away from sports and while other children played in the streets, he spent his time exploring science. He was gifted a chemistry set by an older cousin and enjoyed mixing different chemicals together and observing the reaction. On one occasion, he and his friends lowered a container of sodium into the Bronx River and the resulting explosion caused fire engines to rush to the site.

Milgram attended James Monroe High School in the Bronx, where he became classmates with another future prominent psychologist, Philip Zimbardo. His primary focus during this time was on getting into college and excelling academically. Milgram was a member of Arista, an honor society, and served as an editor for the school paper. He also had an interest in drama and assisted with stagecraft for his school's productions.

Educational Background

After completing high school, Milgram attended Queens College, which later became part of the City University of New York (CUNY). While there, he was appointed vice president of the International Relations Club and the Debating Club. He graduated in 1954 with a bachelor's degree in political science. By that time, however, Milgram had become dissatisfied with the philosophical nature of political science. His interest shifted to social psychology, which he believed offered a more practical approach to the issues that interested him. For example, he wanted to better understand how Hitler was able to seize control of Germany and initiate the Holocaust.

Milgram applied for graduate studies in social psychology at Harvard but was initially rejected as he had no background in psychology. Determined to pursue his goal, he signed up for several psychology courses during the summer of 1954 at three different institutions: Hunter College, Brooklyn College, and New York University. He was admitted to Harvard in the fall of that year and was awarded a Ford Foundation fellowship to pursue his studies. He graduated with a PhD in social psychology in 1960.

What Inspired Stanley Milgram to Study Social Influence?

Several faculty members at Harvard had a significant impact on Milgram's academic and professional career, including prominent psychologists Jerome Bruner, Gordon Allport, and Roger Brown. His greatest scientific influence, however, was Solomon Asch, who served as a visiting lecturer at Harvard from 1955 to 1956. During that time, Milgram worked as Asch's teaching and research assistant, getting a firsthand view of his experiments.

Milgram was particularly interested in Asch's conformity studies and adapted Asch's methods to a study of cross-cultural differences in conformity between Norway and France. He spent eighteen months between Oslo and Paris conducting this research, which served as the basis of his doctoral dissertation. The study was completed under the direction of Gordon Allport. Between 1959 and 1960, Milgram took a part-time job as a research (and editing) assistant to Asch at the Institute for Advanced Study in Princeton, New Jersey.

After completing his PhD, Milgram was offered a position as assistant professor of social psychology at Yale University, where his research focus shifted to the subject of obedience. In 1963, he accepted a similar position in Harvard’s social relations department but was unable to secure tenure at that university. Disappointed by this fact, he left Harvard in 1967 and joined the faculty at CUNY as professor and head of their social psychology graduate program. He did not plan on staying at CUNY for more than five years as he had hopes of working at a more prestigious university. However, the experience turned out to be much better than he had anticipated, and he remained there for the rest of his career.

Milgram’s Obedience Experiment

Milgram conducted his highly influential and controversial obedience experiment while he was an assistant professor at Yale University. He was inspired to design it after Adolf Eichmann, one of the organizers of the Holocaust, was captured by Israeli intelligence agents in Argentina in 1960. Eichmann stood trial in Israel in 1961, fifteen years after escaping from a detention camp at the end of World War II. Like several other Nazi officers who stood trial for war crimes at Nuremberg in 1946, Eichmann based his defense on the claim that he was just following orders from his superiors.

Milgram began his obedience experiment in July 1961, three months after Eichmann's trial began. He was interested in the reasons presented by the defense to justify the acts of genocide committed by the Nazis. Could it be that these Nazi officers were nothing more than obedient soldiers carrying out the grisly orders given to them by authority figures? Milgram wanted to find out.

How the Obedience Experiment Worked

The obedience experiment was designed to show the extent to which ordinary people would obey orders even if it meant seriously hurting another person. Milgram recruited male subjects by running an ad in a local newspaper for what was described as a study of memory and learning. Each subject then drew lots with another man to decide who would play the role of "teacher" and who would play the "learner." The draw was rigged so that the subject was always assigned the role of the teacher. Unknown to the teacher, the learner was actually working with Milgram and was fully aware of the real purpose of the study.

At the start of the experiment, the teacher watched as the learner was taken to a room, strapped securely to a chair, and fitted with electrodes. The teacher was then taken to a second room and seated in front of an electric shock generator. There were thirty switches on the shock generator, labeled from 15 volts (Slight Shock) through 375 volts (Danger: Severe Shock) to 450 volts (XXX). The fact that no real electric shocks were involved in the study was kept secret from the teacher. An experimenter (the authority figure) was also present in the room with the teacher to help direct the experiment. The learner's room adjoined the room with the teacher and the experimenter, so the teacher could hear, but not see, the learner.

The learner was required to learn a list of word pairs. The teacher was asked to test the learner by presenting a word on the list and asking for the matching word from four possible choices. For each incorrect answer, the teacher was told to administer a shock of increasing voltage.

The learner intentionally gave wrong answers on the word tests. As a result, the teacher was required to shock the learner for each wrong answer while increasing the voltage each time. As the “shocks” increased in voltage, the teacher was able to hear the learner banging on the wall, screaming, protesting, and begging from the adjoining room. If the teacher refused to shock the learner any further, the experimenter (authority figure) would give the teacher a simple order to continue the experiment.

There were four orders, or "prods," that the experimenter used. If the first prod was not obeyed, the experimenter gave the next one on the list, moving to the third and then the fourth if necessary. The four prods were:

  • Please continue.
  • The experiment requires that you continue.
  • It is absolutely essential that you continue.
  • You have no other choice, you must go on.

The experimenter also assured the teacher that he (the experimenter) would assume full responsibility for anything that happened. The experiment was stopped if (1) the teacher still refused to shock the learner after the fourth prod, or (2) the teacher had delivered three 450-volt shocks to the learner.
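The escalation and stopping rules form a small decision procedure, which the following sketch in Python makes concrete. This is an illustration only, not Milgram's own materials: the names (run_session, teacher_will_continue, example_teacher) and the shape of the decision function are hypothetical, invented for this example.

```python
# A minimal sketch of the session logic described above.
# All names here are illustrative, not historical.

PRODS = [
    "Please continue.",
    "The experiment requires that you continue.",
    "It is absolutely essential that you continue.",
    "You have no other choice, you must go on.",
]

STEP_VOLTS = 15   # each switch raises the shock by 15 volts
MAX_VOLTS = 450   # the top switch on the generator

def run_session(teacher_will_continue):
    """Simulate one session. `teacher_will_continue(volts, prods_given)`
    stands in for the participant's decision each time a shock is due,
    and is asked again after every prod. Returns the highest voltage
    the teacher delivered."""
    volts_delivered = 0
    shocks_at_max = 0
    while True:
        next_volts = min(volts_delivered + STEP_VOLTS, MAX_VOLTS)
        prods_given = 0
        # On each refusal, give the next prod in order; a refusal that
        # outlasts the fourth prod ends the session.
        while not teacher_will_continue(next_volts, prods_given):
            if prods_given == len(PRODS):
                return volts_delivered
            prods_given += 1
        volts_delivered = next_volts
        # Three shocks at the 450-volt level also end the session.
        if volts_delivered == MAX_VOLTS:
            shocks_at_max += 1
            if shocks_at_max == 3:
                return volts_delivered

# Example: a teacher who balks at 300 volts but yields to the first prod,
# then refuses outright from 330 volts onward, whatever the prod.
def example_teacher(volts, prods_given):
    if volts >= 330:
        return False
    return volts < 300 or prods_given >= 1

print(run_session(example_teacher))  # prints 315
```

Under these rules, a fully obedient teacher works through all thirty switches and then repeats the 450-volt shock until it has been given three times, which is why 450 volts is treated as the maximum obedience score.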

Results of Milgram's Obedience Experiment

In Milgram’s obedience experiment all the subjects (teachers) administered shocks of up to 300 volts. Although many were reluctant, 65% of the subjects maintained their obedience to the experimenter and administered the highest possible shock of 450 volts. The experiment was repeated several times with consistent results.

Milgram proposed two theories to explain why normal people would obey an order even if it meant severely hurting or killing another person. These are:

  • The Theory of Conformism - A person who is not an expert in a particular field will leave decision making to the group he or she belongs to.
  • The Agentic State Theory - A person may view himself or herself as nothing more than a tool or agent to carry out the wishes of someone else. This means that the agent no longer assumes responsibility for his or her actions. Once the agent accepts this change in perspective, obedience is likely to follow.

Variations of Milgram’s obedience experiment showed that the teacher was more likely to obey instructions to shock the learner if the learner was in a completely separate room. Milgram believed the incremental manner of increasing the shocks also contributed to obedience. Before running the experiment, Milgram polled psychology students at Yale on what its outcome would be. Very few people, including Milgram himself, expected any of the subjects to administer the most intense shock.

Milgram's experiment remains one of the most talked-about experiments in the history of psychology.

The Lost-Letter Technique

Milgram developed the lost-letter technique in April 1963. He hypothesized that people who live in large cities are less responsive to the needs of their fellow man than people who live in small towns. Milgram also wanted to measure people’s attitudes towards various organizations in the country. To test his hypothesis, Milgram and his colleagues left sealed letters in public places, addressed to “Medical Research Associates,” “Friends of the Nazi Party,” “Friends of the Communist Party,” and an individual by the name of “Mr. Walter Carnup.”

While 70% of the letters addressed to the medical facility and the letters to Mr. Carnup were posted, the other letters caused quite a stir in the community. For example, several of the letters addressed to the “Friends of the Nazi Party” were sent to the FBI for fingerprinting. Milgram tried to conduct a second lost-letter experiment in 1964 by dropping the letters from an airplane. However, many of the letters landed on rooftops, in trees, and in rivers, so he was forced to abort the study.

The Small World Problem

In 1967, Milgram designed the “small-world experiment.” He was curious about how many acquaintances it took to connect two people who didn’t know each other. To find the answer, he sent packages to 160 random people in Omaha, Nebraska, with instructions to send the package to someone they knew personally who they thought could get it closer to its final destination: a stockbroker in Boston, Massachusetts. Along with the package, each sender was also asked to pass on the same set of instructions, so that every new recipient could keep the chain moving.

By the time the packages arrived in Boston, some chains had passed through as many as ten acquaintances and others as few as two. Milgram concluded that about five acquaintances, on average, separated the original senders in Omaha from the final recipient in Boston. Five intermediaries means six links in the chain, which is where the famous phrase “six degrees of separation” comes from.
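To see the links-versus-intermediaries arithmetic in action, here is a minimal, hypothetical sketch in Python using the networkx library. It builds a synthetic Watts-Strogatz small-world network (a later model inspired by results like Milgram’s, not his actual data) and measures shortest-path lengths from 160 random senders to a single target. All parameters are illustrative assumptions, and because participants in the real study routed packages greedily, knowing only their own acquaintances, real chains would typically be longer than these shortest paths.

```python
# A minimal sketch, not Milgram's data: chain lengths on a synthetic
# small-world network. Every parameter below is an illustrative assumption.
import random
import networkx as nx

# 10,000 people; each begins with ties to 10 ring neighbours; 10% of
# ties are rewired to random strangers, creating long-range shortcuts.
G = nx.connected_watts_strogatz_graph(n=10_000, k=10, p=0.1, seed=42)

rng = random.Random(0)
target = 0                    # stand-in for the Boston stockbroker
link_counts = []
for _ in range(160):          # 160 starting senders, as in the study
    source = rng.randrange(1, G.number_of_nodes())
    # Shortest-path length counts links ("degrees of separation");
    # the number of intermediate acquaintances is one less than that.
    link_counts.append(nx.shortest_path_length(G, source, target))

link_counts.sort()
median_links = link_counts[len(link_counts) // 2]
print("median degrees of separation:", median_links)
print("median intermediaries:", median_links - 1)
```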

Applications of Milgram’s Obedience Theory

Although it was already well known that people have a tendency to obey authority figures, Milgram’s experiment showed that, in certain settings, individuals may obey destructive orders that conflict with their moral principles and do things they would never decide to do on their own. Milgram believed that once these individuals allowed an authority figure to direct their actions, they also gave up the responsibility of distinguishing what was right from what was wrong.

The findings from Milgram’s obedience experiment have been applied to several fields. Some of these include:

  • Criminal justice - to explain why people may commit atrocities if they are placed in certain social contexts. Milgram’s findings have resulted in some condemned criminals receiving life imprisonment rather than the death penalty.
  • Counseling - to give people deeper insight into themselves and help them to understand why and how their behavior can change.
  • Business - to help workers recognize unethical demands that may be placed on them by their employers.
  • Law Enforcement - to help people to think for themselves rather than agreeing with everything a police officer says or suggests.

Milgram’s obedience experiment also sheds light on actions people can take to resist unwanted pressure from authority figures. These actions include:

  • Questioning the legitimacy of the authority figure.
  • Asking yourself whether you would take the suggested action on your own initiative.
  • Resisting the urge to comply with small demands that make you uncomfortable, because they may escalate to larger demands.
  • Finding an ally in your social group who agrees with your view, as being a lone dissenter can be very difficult without social support.

Criticisms of Milgram’s Theory

Despite making a major cultural and scholarly impact, the Milgram Obedience Experiment is one of the most widely criticized psychology studies in history. The criticism rests on three primary grounds:

Ethical Concerns

Many critics claim the experiment was unethical because it caused many of the subjects to experience severe distress. Some subjects were seen sweating and trembling in their discomfort but were told to continue the experiment. Milgram also did not debrief the subjects immediately after their participation ended; they were not informed about the true intentions of the study, and Milgram did not explain why deception was necessary.

Suggested Relevance to the Holocaust

Milgram claimed that his experiment mimicked the psychological processes at play during the Holocaust. However, some critics have pointed out several key differences between the events of the Holocaust and Milgram’s obedience experiment:

  • Holocaust perpetrators knew they were killing innocent people, while subjects in the experiment were assured before the study that no permanent physical damage would be done.
  • Holocaust perpetrators were motivated by racism, while subjects in the experiment did not know the identity of their learners.
  • Holocaust perpetrators had a clear goal of exterminating Jewish people, while subjects in the experiment were often reluctant to continue.
  • The Holocaust lasted for years, giving perpetrators ample time to assess their own morals, while Milgram’s experiment lasted only an hour, leaving subjects little time to think deeply about the consequences of their actions.

Discrepancies in Reporting

Some critics are disturbed by discrepancies between how the experiment was described and what actually happened. One such critic, psychologist Gina Perry, argued that Milgram intentionally manipulated the data he collected to make the results appear more impressive.

Stanley Milgram Books, Awards, and Accomplishments

Milgram wrote several books that describe his various experiments, including:

  • Obedience to Authority: An Experimental View (1974)
  • Television and Antisocial Behavior: Field Experiments (1974)
  • The Individual in a Social World: Essays and Experiments (1977)

He also produced the films Obedience (1965) and The City and the Self (1972).

Milgram's other awards and accomplishments include:

  • Awarded the Prize for Behavioral Science Research by the American Association for the Advancement of Science, 1964
  • Elected member of the American Association for the Advancement of Science
  • Elected member of the American Academy of Arts and Sciences
  • Elected member of the American Psychological Association

Personal Life

In January 1961, Milgram met Alexandra Menkin at a party in Manhattan, and the two married in December of that year. Alexandra, whom Milgram fondly called Sasha, was a social worker who spent much of her time assisting Holocaust survivors. The couple had two children, Michele and Marc, and Milgram enjoyed playing with and caring for them.

Milgram was very much invested in his family and set aside time for them in the evenings and on weekends. They went on frequent outings to museums, parks and the movies, and took annual trips to destinations in the Caribbean, United States, Europe, Israel, and Morocco. Milgram insisted on having his wife by his side while watching television and enjoyed taking her out to gourmet restaurants. He also enjoyed playing board games like chess and Monopoly with his family.

Philanthropy

Milgram also gave generously of his time and energy to students, colleagues, and even strangers, responding to virtually all of the many letters he received. He developed several interests outside of academia, including painting, drawing, music composition, and film-making. At home, he would develop fictional plot lines for movies in which his children performed as the stars. He also wrote children’s stories and poems and in his later years, became more keenly interested in the religious and spiritual aspects of the Jewish faith.

Milgram maintained his interest in science and tried to keep abreast of the latest scientific breakthroughs and the scientists behind them. He also spent a great deal of time thinking of and carefully recording various inventions and games. Among these was a machine for rewinding carbon ribbons used in typewriters and a board game centered on the world of art, including auctions and collections.

Is Stanley Milgram Still Alive?

In 1980, Milgram experienced the first of several major heart attacks. He died of his fifth heart attack in 1984. He was 51 years of age.

American Psychological Association. (2004). Obeying and resisting malevolent orders. Retrieved from https://www.apa.org/research/action/order

Blass, T. (1996). Stanley Milgram: A life of inventiveness and controversy. In G. A. Kimble, C. A. Boneau, & M. Wertheimer (Eds.), Portraits of pioneers in psychology: Volume 2 (pp. 315-331). Washington, DC: American Psychological Association.

Korn, J. H. (1997). Illusions of reality: A history of deception in social psychology. Albany, NY: State University of New York Press.

Milgram, A. (2000). My personal view of Stanley Milgram. In T. Blass (Ed.), Obedience to authority: Current perspectives on the Milgram paradigm (pp. 1-7). Mahwah, NJ: Erlbaum.

Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New York: The New Press.

Rogers, K. (n.d.). Stanley Milgram. In Encyclopedia Britannica. Retrieved from https://www.britannica.com/biography/Stanley-Milgram/Later-experiments-and-publications

Stanley Milgram. (n.d.). Retrieved from https://psychology.fas.harvard.edu/people/stanley-milgram

Psychologists add caveat to ‘blind conformity’ research

February 2013, Vol 44, No. 2

Print version: page 9

Two iconic sets of research — Stanley Milgram's 1960s "obedience to authority" studies and Philip Zimbardo's 1971 Stanford Prison Experiment — highlighted the unsavory reality that people can be prodded into harming others. Milgram found that participants were willing to administer apparently lethal electric shocks in the context of a scientific experiment, while Zimbardo demonstrated that some people assigned to the role of prison guard ended up treating prisoners brutally.

Are we all doomed to carry out evil deeds robotically under the right circumstances? Not necessarily, say psychologists S. Alexander Haslam, PhD, of the University of Queensland, and Stephen D. Reicher, PhD, of the University of St. Andrews. In a November essay in PLOS Biology, they offer evidence from history, from Zimbardo's and Milgram's work, and from their own research showing that people who tend to follow authority aren't sheep or robots, but rather people who enthusiastically identify with a group's or leader's agenda.

"We have this model of evil as a slippery slope, as something we fall carelessly into," says Haslam. "But there's plenty of evidence that many people don't go along with paradigms they don't believe in, and that when people do commit harmful actions in a group context, it's because they strongly identify with the cause."

A historical example is Adolf Eichmann, a chief organizer of the Holocaust who is often touted as the exemplar of a bland bureaucrat following orders. But historical texts show he was highly creative, elaborating many of the practical details of the "final solution" himself. What's more, Eichmann expressed no regret during his trial, justifying his decision to send millions of Jews and others to their deaths because he believed it would build a better Germany.

Similarly, a closer look at the Milgram and Zimbardo studies suggests that many participants don't fit the mold of blind conformist. Not all the guards in Zimbardo's study treated prisoners badly, and those who did were unusually ingenious in responding to Zimbardo's initial suggestion that they could create feelings in the prisoners, such as boredom or fear. In Milgram's studies, many subjects refused to deliver the highest level of shock allowed, and many obeyed the experimenter only when he justified their actions in terms of benefiting science — and even then, they were torn.

Newer studies are starting to add empirical teeth to these observations. Haslam and Reicher conducted a version of the Stanford Prison Experiment televised by the British Broadcasting Corporation in 2002, showing that participants didn't automatically conform to their assigned roles and only acted in line with group membership if they identified with the group. They're conducting other related studies now, as well.

These new perspectives suggest the need for more research questioning the notion that evil is always banal, Haslam says.

"It's important we keep examining this issue because the debate is germane to some central issues in psychology — the nature of group influence, processes and dynamics and the role of the individual in those domains."

— Tori DeAngelis
