
Lesson 8: Social Biases - Herd Behavior, Groupthink, Authority

Discover how groups and social pressure systematically distort individual judgment, from conformity experiments to corporate disasters.


Introduction: When Smart People Make Stupid Decisions Together

You've learned about individual cognitive biases in previous lessons: how your brain tricks you when making decisions alone. But what happens when you add other people to the mix? The results can be far more dramatic and dangerous.

In 1961, President John F. Kennedy and his team of brilliant advisors, including the "best and brightest" minds of their generation, approved the Bay of Pigs invasion, a catastrophically flawed plan to overthrow Fidel Castro. The operation collapsed within 72 hours, embarrassing the United States internationally and deepening the Cold War tensions that soon produced the Cuban Missile Crisis. How did such intelligent people make such a terrible decision?

The answer lies in social biases: systematic errors in thinking that emerge when we're influenced by others. These biases don't just affect political leaders; they shape decisions in boardrooms, operating rooms, investment committees, and even casual family gatherings. Understanding them is essential for anyone who works in teams, leads groups, or wants to make better collective decisions.

💡 Key Insight: Social biases often override individual judgment. People who think clearly alone can make disastrous decisions in groups.


Core Concept 1: Conformity Bias - Following the Crowd 👥

Conformity bias is the tendency to align your beliefs, attitudes, and behaviors with those of a group, even when the group is clearly wrong. This isn't about conscious agreement; it's an automatic psychological response to social pressure.

The Asch Conformity Experiments (1951)

Psychologist Solomon Asch conducted one of the most famous experiments in social psychology. He showed participants a simple task:

Standard Line:            Comparison Lines:

     ----------           A.  ------
                          B.  ----------
                          C.  --------------

Which comparison line (A, B, or C) matches the standard line? Obviously, it's B.

But here's the twist: Asch placed one real participant in a room with several confederates (actors pretending to be participants). On critical trials, the confederates deliberately gave the same wrong answer before the real participant responded. Across the series of trials, 75% of participants conformed to the group at least once, choosing an obviously incorrect answer.

Think about that: Three-quarters of people denied the evidence of their own eyes because everyone else disagreed.

Why Conformity Happens

Normative social influence: We want to be liked and accepted by the group. Going against the majority risks rejection, ridicule, or conflict.

Informational social influence: We assume the group knows something we don't. "Maybe I'm the one who's wrong. They all seem so confident."

    Individual sees evidence → Group disagrees → Internal conflict
                                                         ↓
                                          "Trust my judgment" vs "Trust the group"
                                                         ↓
                                              Conformity (75% of people)

💡 Modern Manifestation: Social media amplifies conformity. When you see thousands of people sharing the same opinion, it's hard to voice disagreement, even when you have good evidence they're wrong.

Factors That Increase Conformity

+---------------------------+----------------------------------------+
| Factor                    | Effect                                 |
+---------------------------+----------------------------------------+
| Group size                | Conformity peaks at 3-5 people, then   |
|                           | plateaus (more people add little)      |
+---------------------------+----------------------------------------+
| Unanimity                 | One dissenter breaks the spell;        |
|                           | conformity drops dramatically          |
+---------------------------+----------------------------------------+
| Public vs. private        | Much higher conformity when responses  |
|                           | are public (face-saving)               |
+---------------------------+----------------------------------------+
| Task difficulty           | More conformity when the answer is     |
|                           | ambiguous or complex                   |
+---------------------------+----------------------------------------+
| Status of group           | Higher conformity with high-status     |
|                           | or expert groups                       |
+---------------------------+----------------------------------------+

🔧 Try This: Next time you're in a meeting and disagree with the emerging consensus, speak up early. The first dissenter makes it psychologically easier for others to voice concerns.


Core Concept 2: Authority Bias - Obeying Orders 👮

Authority bias is the tendency to overvalue the opinions and directives of authority figures, even when they contradict evidence or ethics. We're hardwired to respect hierarchies, a trait that helped our ancestors survive but can lead to catastrophic decisions today.

The Milgram Obedience Experiments (1961-1963)

Stanley Milgram conducted perhaps the most disturbing experiments in psychology. Participants were told they were testing the effects of punishment on learning. They were instructed to deliver electric shocks to a "learner" (actually an actor) every time he made a mistake, increasing the voltage with each error.

The shock machine was labeled from 15 volts ("Slight Shock") up through "Danger: Severe Shock," with the final 435- and 450-volt switches marked only "XXX." As the voltage increased, the learner screamed, begged to stop, complained of heart problems, and eventually went silent (suggesting unconsciousness or death).

The experimenter, wearing a lab coat, calmly instructed participants to continue: "The experiment requires that you continue" and "You have no other choice; you must go on."

Results: 65% of participants delivered the maximum 450-volt shock, even while showing extreme stress, trembling, and sweating. They continued despite their moral objections because an authority figure told them to.

Why Authority Bias Is So Powerful

Diffusion of responsibility: "I'm just following orders. The authority figure is responsible for the outcome."

Legitimacy: We trust that authorities (doctors, bosses, experts) have knowledge and good reasons for their instructions.

Social conditioning: From childhood, we're taught to obey parents, teachers, police, and bosses. Defiance feels wrong.

        Authority Issues Command
                  ↓
        Individual Assessment:
        "This seems wrong, but..."
                  ↓
     ┌────────────┴────────────┐
     ↓                         ↓
  "Authority knows         "I'm just
   something I don't"      following orders"
     ↓                         ↓
         Compliance (65%)

🤔 Did You Know? Milgram's experiments were inspired by the Holocaust defense: "I was just following orders." His research showed that ordinary people, not just Nazis, could commit horrific acts under authority pressure.

Authority Bias in Modern Organizations

Medical errors: Nurses and junior doctors often hesitate to question senior physicians, even when they suspect a mistake. Patient-safety research repeatedly identifies this kind of deference as a contributor to preventable deaths.

Aviation disasters: Before "Crew Resource Management" training, co-pilots rarely challenged captains. Multiple crashes resulted from junior officers knowing about problems but not speaking up.

Corporate scandals: Employees at Enron, Theranos, and Wells Fargo ignored ethics because executives ordered them to meet targets "by any means necessary."

💡 Protection Strategy: The best organizations explicitly give permission to question authority. The phrase "psychological safety" means team members can challenge leaders without fear of punishment.


Core Concept 3: Groupthink - The Illusion of Invulnerability 🎭

Groupthink is a psychological phenomenon where the desire for harmony and consensus in a cohesive group leads to irrational decision-making. Group members suppress dissent, ignore alternatives, and fail to critically analyze plans.

Psychologist Irving Janis coined the term in 1972 after studying major policy fiascoes.

Symptoms of Groupthink

+--------------------------------+------------------------------------------+
| Symptom                        | Description                              |
+--------------------------------+------------------------------------------+
| Illusion of invulnerability    | Excessive optimism; belief group        |
|                                | can't fail                               |
+--------------------------------+------------------------------------------+
| Collective rationalization     | Dismissing warnings that challenge      |
|                                | group assumptions                        |
+--------------------------------+------------------------------------------+
| Belief in inherent morality    | "We're the good guys, so our actions    |
|                                | must be right"                           |
+--------------------------------+------------------------------------------+
| Stereotyping out-groups        | Viewing opponents as too evil or        |
|                                | stupid to negotiate with                 |
+--------------------------------+------------------------------------------+
| Self-censorship                | Members suppress doubts to maintain     |
|                                | harmony                                  |
+--------------------------------+------------------------------------------+
| Illusion of unanimity          | Silence interpreted as agreement        |
+--------------------------------+------------------------------------------+
| Direct pressure on dissenters  | Members who question the plan are       |
|                                | criticized or isolated                   |
+--------------------------------+------------------------------------------+
| Self-appointed mindguards      | Some members shield the group from      |
|                                | adverse information                      |
+--------------------------------+------------------------------------------+

Classic Example: The Bay of Pigs Invasion (1961)

President Kennedy's team exhibited every symptom:

  • Illusion of invulnerability: "The CIA plan must be solid. We're the smartest team ever assembled."
  • Collective rationalization: When experts warned the plan was flawed, advisors dismissed them.
  • Self-censorship: Arthur Schlesinger Jr. had serious doubts but stayed silent, later writing, "I bitterly reproached myself for having kept so silent."
  • Illusion of unanimity: Kennedy asked for objections; none came (though many had private concerns).
  • Mindguards: Some advisors actively prevented dissenting voices from reaching Kennedy.

The result? 1,400 Cuban exiles landed at the Bay of Pigs with no air support, facing 20,000 Cuban troops. The invasion was crushed within three days.

💡 The Irony: After this disaster, Kennedy implemented procedures to prevent groupthink during the Cuban Missile Crisis (1962). The result? One of history's most successful foreign policy decisions.

Conditions That Foster Groupthink

               High Cohesion
                     +
            Directive Leadership
                     +
           Lack of Procedures
                     +
          External Pressure
                      ↓
               GROUPTHINK
                      ↓
     ┌────────────────┼────────────────┐
     ↓                ↓                ↓
  Poor           Incomplete      Failure to
  Information    Analysis of     Consider
  Search         Alternatives    Risks

🧠 Mnemonic for Groupthink Symptoms: "I Can't Believe Students Self-Impose Dumb Mistakes"

  • Illusion of invulnerability
  • Collective rationalization
  • Belief in inherent morality
  • Stereotyping out-groups
  • Self-censorship
  • Illusion of unanimity
  • Direct pressure on dissenters
  • Mindguards

Core Concept 4: Herd Behavior - Financial Bubbles and Panics 📈

Herd behavior (or herd mentality) is following the actions of a larger group, often irrationally, because "everyone else is doing it." While related to conformity, herd behavior specifically involves actions (buying, selling, fleeing) rather than just opinions.

How Herds Create Bubbles

   Stage 1: Smart Money        Stage 2: Public Awareness
   Early adopters buy          Media coverage increases
   Price: $10 → $15            Price: $15 → $30
           ↓                            ↓
   Stage 3: Mania              Stage 4: Panic
   "Get rich quick!"           "Get out before it's zero!"
   Everyone buys               Everyone sells
   Price: $30 → $100           Price: $100 → $5

Dotcom Bubble (1995-2000): Companies with no revenue traded at billions because "internet" was in their name. Investors who questioned valuations were mocked as "not getting it." When reality hit, trillions evaporated.

Housing Bubble (2003-2008): "House prices never go down." Millions bought homes they couldn't afford because neighbors were getting rich flipping properties. The collapse triggered the Great Recession.

Crypto/NFT Mania (2021-2022): "Have fun staying poor." Social media amplified FOMO (fear of missing out), driving prices to absurd levels before the inevitable crash.

The Psychology Behind Herds

Information cascade: "If everyone's buying, they must know something I don't."

FOMO (Fear of Missing Out): The pain of watching others profit is intense, overriding rational analysis.

Safety in numbers: "If I'm wrong, at least I'm wrong with everyone else. No one can blame me."
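
The information-cascade logic above can be made concrete with a toy simulation. The sketch below is purely illustrative (the function name, thresholds, and parameters are assumptions, not a description of any real market): each trader gets a noisy private signal about whether the asset is good, but also sees everyone else's earlier buys and sells. Once the visible majority outweighs a single private signal, it becomes "rational" to ignore your own information and copy the crowd.

    import random

    def simulate_cascade(n_traders=30, true_value_good=False, signal_accuracy=0.7, seed=0):
        """Toy information-cascade model (illustrative assumptions only)."""
        random.seed(seed)
        actions = []  # True = buy, False = sell
        for _ in range(n_traders):
            # Private signal: points to the truth with probability `signal_accuracy`
            signal = true_value_good if random.random() < signal_accuracy else not true_value_good
            buys, sells = actions.count(True), actions.count(False)
            if buys - sells > 1:       # visible majority outweighs one private signal
                action = True          # ...so copy the herd and buy
            elif sells - buys > 1:
                action = False         # ...or copy the herd and sell
            else:
                action = signal        # otherwise act on your own signal
            actions.append(action)
        return actions

    # The asset here is actually bad (true_value_good=False), yet with noisy signals
    # an early run of buys can lock every later trader into buying.
    for seed in range(3):
        print("".join("B" if a else "S" for a in simulate_cascade(seed=seed)))

Run it a few times: whenever an early lead of two buys (or two sells) accumulates, every later trader copies the majority regardless of their own signal, which is exactly the "everyone's buying, they must know something I don't" dynamic.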

💡 Warren Buffett's Wisdom: "Be fearful when others are greedy, and greedy when others are fearful." The best investors do the opposite of the herd.


Core Concept 5: Bystander Effect - Diffusion of Responsibility 🚨

The bystander effect is the phenomenon where individuals are less likely to help a victim when other people are present. The more bystanders, the less likely anyone is to help.

The Murder of Kitty Genovese (1964)

Kitty Genovese was stabbed to death outside her apartment building in New York City. The New York Times reported that 38 witnesses watched or heard the attack but did nothing, not even calling the police.

While later reporting revealed this account was exaggerated, the case sparked psychological research into why people fail to act in emergencies.

The Darley and Latané Experiments (1968)

Psychologists John Darley and Bibb Latané tested the bystander effect scientifically:

Setup: A participant hears another "participant" (actually a recording) having a seizure in another room.

Results:

  • When the participant believed they were alone: 85% helped before the staged seizure ended
  • When they believed one other person was present: 62% helped
  • When they believed four others were present: Only 31% helped
      Help Response Rate
           ↑
      85% |●
          |
      62% |    ●
          |
      31% |         ●
          |________________→
          Alone  1 other  4 others
                Bystanders Present
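
To see how strong the effect is, compare the observed rates with a naive baseline. Purely as an illustration, assume each person present would help at the 85% "alone" rate, independently of the others; under that assumption, more bystanders should make help more likely, not less:

    # Naive baseline: if each of n people present helped independently at the
    # "alone" rate, the chance that at least one of them helps would rise with n.
    p_alone = 0.85

    for n_present in (1, 2, 5):  # participant alone, +1 other, +4 others
        p_at_least_one_helps = 1 - (1 - p_alone) ** n_present
        print(f"{n_present} present: naive P(at least one helps) = {p_at_least_one_helps:.3f}")

    # Naive baseline:  1 -> 0.850,  2 -> 0.978,  5 -> ~1.000
    # Observed rate at which the single real participant helped (Darley & Latané):
    #                  alone -> 0.85,  1 other -> 0.62,  4 others -> 0.31
    # The gap is diffusion of responsibility: each person's individual willingness
    # to act collapses as the perceived share of responsibility shrinks.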

Why the Bystander Effect Happens

Diffusion of responsibility: "Someone else will help. It's not my job."

Pluralistic ignorance: Everyone looks calm, so maybe it's not an emergency. (But everyone is staying calm for the same reason.)

Evaluation apprehension: Fear of helping incorrectly and looking foolish.

🔧 How to Overcome It: If you need help in an emergency, point to a specific person: "You in the blue shirt, call 911!" This eliminates diffusion of responsibility.


Core Concept 6: In-Group vs. Out-Group Bias

In-group bias is the tendency to favor people who belong to your group while discriminating against outsiders. Out-group homogeneity bias is viewing out-group members as "all the same" while recognizing diversity within your own group.

These biases are incredibly powerful and automatic.

The Minimal Group Paradigm

Researcher Henri Tajfel showed that in-group bias forms instantly, even with meaningless distinctions:

  1. Participants were randomly assigned to groups based on trivial criteria (coin flips, aesthetic preferences)
  2. They then distributed rewards between group members
  3. Result: People consistently favored their own group, even knowing the assignment was random and they'd never meet group members

Modern Manifestations

Political polarization: "Our side has diverse views; their side are all extremists."

Sports rivalries: Fans riot over arbitrary team affiliations.

Corporate culture: "Our company is innovative; competitors are bureaucratic."

Nationalism: "Our country's actions are justified; theirs are aggression."

          MY GROUP               OTHER GROUP
             ↓                        ↓
    "Smart, diverse,         "Stupid, all the same,
     well-intentioned"        ill-intentioned"
             ↓                        ↓
    Attribute success        Attribute success
    to character             to luck
             ↓                        ↓
    Attribute failure        Attribute failure
    to circumstances         to character
💡 Reality Check: When you find yourself thinking "they're all X," that's your brain's out-group bias talking, not objective reality.


Detailed Examples with Explanations

Example 1: The Challenger Disaster (1986) - Groupthink Kills

Background: NASA planned to launch the Space Shuttle Challenger on January 28, 1986. Engineers at Morton Thiokol (contractor for rocket boosters) discovered that O-rings became brittle in cold temperatures.

The Night Before Launch: Temperatures were forecast to drop to 18°F, far below any previous launch. Engineers emphatically recommended postponing.

What Happened:

  • NASA managers pressured Thiokol executives: "We need data proving it's unsafe, not opinions."
  • Thiokol executives overruled their own engineers
  • Groupthink symptoms present:
    • Illusion of invulnerability: "We've launched 24 shuttles successfully"
    • Pressure on dissenters: Engineers were told to "take off your engineering hat and put on your management hat"
    • Mindguards: Some managers didn't inform higher-ups about engineering concerns
    • Collective rationalization: "The data isn't conclusive"

Result: Challenger launched. Seventy-three seconds after liftoff, the compromised O-ring joint failed completely and the shuttle broke apart, killing all seven crew members, including teacher Christa McAuliffe.

Lesson: Groupthink doesn't just lead to bad business decisions; it kills people. Organizations need processes that encourage dissent, especially from technical experts.


Example 2: The Enron Corporation - Authority and Conformity

Background: Enron was America's seventh-largest company in 2000, praised as the most innovative corporation in America.

The Culture:

  • CEO Jeffrey Skilling created a hyper-competitive environment
  • Employees who questioned accounting practices were fired or marginalized
  • Authority bias: When top executives said "mark-to-market accounting is fine," employees suppressed doubts
  • Conformity: Everyone acted like aggressive accounting was normal, so it must be okay

What Happened:

  • Enron hid billions in debt through complex schemes
  • Executives sold their stock while encouraging employees to buy more
  • When the fraud was exposed in 2001, the company collapsed
  • Thousands lost their jobs and retirement savings

Social Bias Analysis:

  • Authority bias: Junior accountants trusted that executives and Arthur Andersen (auditor) knew what they were doing
  • Conformity: "Everyone here uses aggressive accounting. I must be too conservative."
  • In-group bias: Enron employees saw themselves as smarter than everyone else, dismissing outside criticism

Lesson: When the culture punishes dissent and rewards conformity, even good people participate in fraud.


Example 3: The GameStop Short Squeeze (2021) - Herd Behavior in Real Time

Background: GameStop was a struggling video game retailer with declining revenues. Hedge funds heavily shorted the stock (betting it would fall).

What Happened:

  • Reddit forum r/WallStreetBets noticed the short position
  • Users began buying GameStop stock and options
  • Social media amplified the message: "Hold the line!" "Diamond hands!" "To the moon!"
  • Herd behavior accelerated:
    • FOMO drove millions of new investors to buy
    • Price soared from under $20 to an intraday high of $483 within a few weeks
    • Anyone questioning the rally was attacked: "Paper hands!" "Shill!"

The Aftermath:

  • Price eventually crashed back below $40
  • Early investors made fortunes; latecomers lost billions
  • Classic herd behavior: those who joined at $300+ because "everyone's getting rich" were left holding the bag

Lesson: Social media dramatically amplifies herd behavior. When investment advice comes with peer pressure and identity ("apes together strong"), rationality disappears.


Example 4: Medical Errors from Authority Bias

The Case: A surgeon prepares to operate on a patient's left knee. The nurse notices the consent form says "right knee." The X-ray on the light board shows "RIGHT" marked. But the surgeon insists: "No, I reviewed the chart. It's definitely the left knee. Let's proceed."

Authority Bias in Action:

  • The nurse is 95% sure the surgeon is wrong
  • But the surgeon is confident and experienced
  • Questioning might make her look incompetent
  • "He's the expert; he must know something I don't"
  • She stays silent

What Happens: The surgeon operates on the wrong knee. The patient requires another surgery and sues for malpractice.

Modern Solution: The WHO Surgical Safety Checklist requires every team member to verify the surgical site before incision. This gives explicit permission for nurses to challenge surgeons.

Traditional Model          Modern Model
      ↓                         ↓
   Surgeon                   Team
   decides        →        discusses
      ↓                         ↓
   Nurse                   Everyone
   obeys          →        verifies
      ↓                         ↓
   Errors                  Fewer errors
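
A minimal sketch of the "everyone verifies" idea, assuming a simplified check rather than the actual WHO checklist logic (the function and field names below are illustrative): the case proceeds only if every team member independently names the planned site, so a single dissenting answer stops the procedure regardless of rank.

    def cleared_to_proceed(confirmations, planned_site):
        """Proceed only if every team member independently confirms the planned site."""
        for member, site in confirmations.items():
            if site != planned_site:
                print(f"STOP: {member} says '{site}', plan says '{planned_site}'. Re-verify before incision.")
                return False
        return True

    # The nurse's answer alone is enough to halt the case, whatever the surgeon insists.
    confirmations = {"surgeon": "left knee", "nurse": "right knee", "anesthetist": "right knee"}
    print(cleared_to_proceed(confirmations, planned_site="right knee"))  # False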

Lesson: The best safety systems assume authority bias exists and create procedures to counteract it.


⚠️ Common Mistakes and How to Avoid Them

Mistake 1: Thinking "I'm immune to social biases because I'm an independent thinker"

Why it's wrong: This is overconfidence bias (from Lesson 5) applied to social psychology. Asch's experiments included intelligent, confident people; 75% still conformed.

The truth: Social biases are automatic and unconscious. Recognizing them doesn't eliminate them.

What to do: Assume you're susceptible. Create external checks:

  • Seek out dissenting opinions before deciding
  • Use pre-commitment devices (write down your view before hearing the group's; see the sketch after this list)
  • Actively ask: "What would I think if everyone disagreed with me?"
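
Here is a minimal sketch of such a pre-commitment poll, assuming a simple in-memory tool whose class and method names are illustrative, not taken from any real product: positions are collected privately and the distribution is revealed only after everyone has committed.

    from collections import Counter

    class PreCommitmentPoll:
        """Collect private positions before discussion; reveal them only afterwards."""

        def __init__(self, question):
            self.question = question
            self._responses = {}
            self._revealed = False

        def submit(self, person, position):
            if self._revealed:
                raise RuntimeError("Poll already revealed; late answers would be anchored by the group.")
            self._responses[person] = position   # nobody sees anyone else's answer yet

        def reveal(self):
            self._revealed = True
            return Counter(self._responses.values())

    poll = PreCommitmentPoll("Should we ship the release this Friday?")
    poll.submit("Asha", "no")
    poll.submit("Ben", "yes")
    poll.submit("Chloe", "no")
    print(poll.reveal())   # Counter({'no': 2, 'yes': 1}) -- dissent surfaces before conformity can hide it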

Mistake 2: Confusing consensus with correctness

Why it's wrong: Truth isn't determined by vote. The group can be uniformly wrong (remember the housing bubble?).

Example: In the early weeks of COVID-19, public health officials converged on the position that the general public didn't need masks. That consensus was later reversed; it rested heavily on institutional assumptions and group reasoning rather than strong evidence.

What to do:

  • Distinguish between factual questions (which have right answers) and value questions (which involve preferences)
  • For factual questions, demand evidence, not consensus
  • Ask: "What would prove us wrong?" If the group can't answer, that's a red flag

Mistake 3: Using "team player" as a weapon against dissent

Why it's wrong: This phrase weaponizes conformity. It implies that disagreement equals disloyalty.

How it shows up:

  • "We need team players who buy into the vision"
  • "Are you with us or against us?"
  • "Everyone else is on board; why aren't you?"

What to do as a leader:

  • Explicitly reward dissent: "Thank you for raising that concern"
  • Separate disagreement (which you want) from disloyalty (which you don't)
  • Use phrases like "I want your honest opinion, even if it contradicts mine"

What to do as a team member:

  • Frame disagreement constructively: "I'm raising this because I want the project to succeed"
  • Provide alternatives, not just criticism
  • Recognize when you're being silenced through social pressure

Mistake 4: Believing groupthink only happens in large, formal groups

Why it's wrong: Groupthink happens in families, friend groups, small startups, and even couples making decisions.

Example: A couple decides to buy a house they can't afford. Each has doubts, but neither wants to disappoint the other. They engage in collective rationalization ("Interest rates are low") and ignore risks ("What if one of us loses our job?").

What to do:

  • Even in small groups, appoint a "devil's advocate" to argue against the emerging consensus
  • Take a break: "Let's sleep on this and discuss tomorrow" reduces emotional momentum
  • Separately write down pros and cons before discussing

Mistake 5: Thinking social biases are just about "personality"

Why it's wrong: People blame conformity on weak personality: "I would never go along with the crowd." But Asch's participants included strong-willed individuals who still conformed.

The truth: Social biases are about the situation, not just personality. Put ordinary people into the Milgram setup and most of them will keep delivering shocks.

What to do:

  • Design situations that reduce bias, rather than relying on personality:
    • Anonymous voting before discussion (reduces conformity)
    • Rotating devil's advocate role (normalizes dissent)
    • Explicit "no-retaliation" policies for whistleblowers
  • Recognize that good people make bad decisions in bad situations

Key Takeaways 🎯

  1. Conformity is automatic: 75% of people will deny obvious truths when the group disagrees. You're not exempt.

  2. Authority overrides judgment: 65% of people will harm others when an authority figure instructs them to. Organizational hierarchies create this dynamic daily.

  3. Groupthink kills: Cohesive groups make terrible decisions when they suppress dissent. The illusion of unanimity is dangerous.

  4. Herds create bubbles: When everyone's buying, that's the time to be skeptical. Financial manias are herd behavior at scale.

  5. More bystanders = less help: Don't assume someone else will act. Take personal responsibility.

  6. In-group bias is instant: We favor "our side" automatically, even when group membership is meaningless and arbitrary.

  7. The first dissenter breaks the spell: If you disagree, speak up early. You'll make it easier for others to voice concerns.

  8. Create systems that counter biases: Don't rely on people to "be better." Design decision-making processes that encourage dissent:

    • Anonymous initial opinions
    • Designated devil's advocates
    • Red team/blue team structures
    • Explicit permission to challenge authority
  9. Silence doesn't equal agreement: When no one objects, it might mean everyone's self-censoring, not that everyone agrees.

  10. Social pressure beats evidence: In the moment, the desire to fit in or obey authority is more powerful than facts. Prepare for this in advance.


📋 Quick Reference Card: Social Biases Cheat Sheet

+------------------+-------------------------+----------------------+
| Bias             | Definition              | Counter-Strategy     |
+------------------+-------------------------+----------------------+
| Conformity       | Agreeing with group     | Speak up early;      |
|                  | despite evidence        | seek one dissenter   |
+------------------+-------------------------+----------------------+
| Authority        | Overvaluing expert      | Question credentials |
|                  | opinions; obeying       | and evidence; use    |
|                  | harmful orders          | "respectful dissent" |
+------------------+-------------------------+----------------------+
| Groupthink       | Suppressing dissent     | Assign devil's       |
|                  | for group harmony       | advocate; leader     |
|                  |                         | withholds opinion    |
+------------------+-------------------------+----------------------+
| Herd Behavior    | Following crowd         | Be contrarian when   |
|                  | actions (buying/selling)| everyone agrees      |
+------------------+-------------------------+----------------------+
| Bystander Effect | Not helping when        | Take personal        |
|                  | others are present      | responsibility       |
+------------------+-------------------------+----------------------+
| In-group Bias    | Favoring your group     | Actively seek out-   |
|                  | over outsiders          | group perspectives   |
+------------------+-------------------------+----------------------+

Warning Signs You're in a Biased Group Decision:

  • ✋ No one is voicing concerns
  • ✋ Questioners are labeled "not team players"
  • ✋ The leader's opinion is stated first
  • ✋ The group feels invulnerable or morally superior
  • ✋ Outside experts are dismissed without consideration
  • ✋ There's time pressure and stress ("We must decide now!")
  • ✋ The group is highly cohesive with strong identity

Emergency Response for Better Group Decisions:

  1. 🔴 Leader speaks last, not first
  2. 🔴 Assign someone to argue against the plan
  3. 🔴 Break into subgroups to develop independent recommendations
  4. 🔴 Bring in outside experts with different perspectives
  5. 🔴 Sleep on it; reconvene after 24 hours
  6. 🔴 Anonymous voting before discussion
  7. 🔴 Ask: "What would have to be true for this to be a bad idea?"

📚 Further Study

  1. Irving Janis - "Groupthink" (1982): The definitive book on group decision-making failures, analyzing Bay of Pigs, Pearl Harbor, Vietnam, and Watergate. https://www.amazon.com/Groupthink-Psychological-Studies-Policy-Decisions/dp/0395317045

  2. Solomon Asch Conformity Experiments (Video): Watch actual footage of participants conforming in Asch's line experiments. Powerful and disturbing. https://www.youtube.com/watch?v=TYIh4MkcfJA

  3. Stanley Milgram - "Obedience to Authority" (1974): First-hand account of the obedience experiments, including transcripts of participants' moral struggles. https://www.amazon.com/Obedience-Authority-Experimental-Perennial-Classics/dp/006176521X


"Collective fear stimulates herd instinct, and tends to produce ferocity toward those who are not regarded as members of the herd." โ€” Bertrand Russell

"Whenever you find yourself on the side of the majority, it is time to pause and reflect." โ€” Mark Twain