1) “Average IQ” isn’t the problem—human cognition is the problem
An IQ score near the population average means you’re… normal. It doesn’t mean you’re “bad at thinking,” and it doesn’t mean you can’t reason well. But it does mean you’re operating with the same basic cognitive architecture as everyone else: limited attention, limited working memory, limited time, and a brain that evolved to make decent decisions fast—not to build accurate models of complex systems.
A helpful anchor here is the distinction between intelligence (what IQ tests largely measure) and rationality (forming beliefs and making choices that fit evidence and goals). Keith Stanovich’s work emphasizes that rational thinking skills and “thinking dispositions” are only partly correlated with intelligence—meaning a person can be smart and still be systematically irrational, and a person can be average in IQ and still be impressively rational with the right habits and training.
So the topic isn’t “average people are stupid.” The topic is: what predictable failure modes emerge in normal cognition—especially under complexity, stress, social pressure, and identity commitments?
2) The core bottleneck: you can’t process everything, so you simplify
2.1 Limited bandwidth forces shortcuts
A huge share of everyday reasoning is not careful calculation; it’s compression. We reduce complex realities into manageable stories, categories, and rules of thumb.
This is closely related to the “cognitive miser” idea in social cognition: people conserve mental effort and use shortcuts when possible, because capacity is limited and effort is costly. Contemporary social cognition summaries still describe the field as combining “cognitive economy” with motivation and affect—meaning we don’t just conserve effort; we conserve effort in emotionally strategic ways.
2.2 Dual-process dynamics: fast first answers vs slow checking
Dual-process theories (in their broadest form) describe an interaction between quick intuitive responses and slower deliberative checking. The practical point isn’t “intuition bad, deliberation good.” The point is: deliberation is effortful and often skipped—especially when the stakes feel psychological rather than practical.
Shane Frederick’s Cognitive Reflection Test (CRT) is famous for capturing a specific failure mode: people often blurt a tempting intuitive answer unless they pause to reflect. CRT performance predicts a range of decision-making tendencies because it measures the disposition/ability to resist the first response that comes to mind.
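Frederick’s best-known item makes the failure mode concrete: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball; how much does the ball cost? The intuitive answer is 10 cents, but then the bat would cost $1.10 and the pair $1.20. Writing the ball’s price as x: x + (x + 1.00) = 1.10, so 2x = 0.10 and x = 0.05. Getting to 5 cents requires noticing that the fluent first answer fails a one-line check.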
When issues get complex, the “pause and compute” system becomes more expensive. That’s where comfort and discomfort become central.
3) Thinking has a felt cost: comfort, discomfort, and the flight from complexity
We tend to treat “thinking” like a neutral tool. But thinking has a felt experience:
- confusion feels bad
- ambiguity feels unsafe
- being wrong feels humiliating
- changing your mind feels like losing status
- prolonged complexity feels exhausting
So the mind often selects not the most accurate conclusion, but the most emotionally efficient one.
3.1 Need for closure: the craving for a definite answer
Psychologists call one major driver need for cognitive closure—a desire for firm answers and intolerance of ambiguity. The classic scale introduced by Webster and Kruglanski formalizes this as an individual difference: some people are more motivated to “seize” on an answer and “freeze” it in place.
Under cognitive load, need for closure can shape how much effort people invest and how they handle uncertainty. Research on closure and cognitive inhibition suggests the two interact: when load is high, closure-motivated thinkers are more likely to seize on an early answer and shut down open-ended processing.
When an issue is complex—say, macroeconomics, geopolitics, epidemiology—high ambiguity makes closure-hungry cognition especially vulnerable to simple narratives.
3.2 Need for cognition: some people enjoy effortful thinking, others don’t
“Need for cognition” is essentially the opposite tendency: enjoyment of effortful thought. Cacioppo and Petty’s classic work introduced a scale for this disposition.
A useful way to frame it:
- If thinking feels like pleasure, you keep going.
- If thinking feels like pain, you stop early and accept a story.
Neither disposition is “moral.” But they predict who is likely to tolerate complexity without fleeing into slogans.
4) How typical people gather information: trust heuristics dominate
In real life, we don’t verify most claims by direct evidence. We rely on:
- authority (“experts say…”)
- social proof (“everyone knows…”)
- identity trust (“my side says…”)
- repetition (“I’ve heard it a lot…”)
- coherence (“it fits the story I already believe…”)
These shortcuts are not irrational in principle; they’re necessary when evidence is costly. The weakness is that shortcuts can be exploited by persuasion systems—political messaging, media ecosystems, charismatic leaders, and group norms.
And once complexity rises, the mind becomes more dependent on “trusted interpreters,” which increases vulnerability to ideological capture.
5) Metacognition and miscalibration: Dunning–Kruger, and what’s true (and not true) about it
5.1 The common claim
The popular version of the Dunning–Kruger effect says: “people who are unskilled lack the very skills needed to recognize their incompetence, so they overestimate their competence.” The original paper by Kruger and Dunning (1999) is the classic reference for inflated self-assessment among low performers.
This idea resonates because it matches experience: some people are loudly confident while being wrong.
5.2 The important nuance: part of the pattern can be statistical
There is also a serious methodological debate. Some researchers argue that the classic “bottom quartile overestimates; top quartile underestimates” pattern can arise partly from regression to the mean combined with a general better-than-average bias. For example, Magnus (2022) provides a statistical account of how those two ingredients can generate the asymmetric pattern.
Gignac and Zajenkowski (2020) explicitly argued the Dunning–Kruger effect is “mostly” a statistical artifact in certain individual-differences data analyses.
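To see the artifact argument mechanically, here is a toy simulation (a sketch for intuition, not a reconstruction of any cited analysis; all parameters, including the +10-point inflation, are invented for illustration). It assumes numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

skill = rng.normal(size=n)                        # latent ability
test = skill + rng.normal(size=n)                 # noisy test performance
# self-estimated percentile: tracks skill only weakly, inflated by +10 points
self_est = np.clip(50 + 15 * (0.5 * skill + rng.normal(size=n)) + 10, 0, 100)

actual_pct = test.argsort().argsort() / n * 100   # percentile rank on the test

for q, label in enumerate(["bottom", "second", "third", "top"]):
    mask = (actual_pct >= 25 * q) & (actual_pct < 25 * (q + 1))
    print(f"{label:>6} quartile: actual {actual_pct[mask].mean():5.1f}, "
          f"self-estimate {self_est[mask].mean():5.1f}")
```

By construction, low scorers are no worse at self-assessment than anyone else, yet the printout shows the familiar shape: the bottom quartile “overestimates” by roughly 40 percentile points while the top quartile “underestimates.”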
Other scholars have pushed back, arguing that analysis choices and measurement issues matter and that the “artifact only” account can be overstated.
Also: the effect isn’t universal across all domains. For instance, a 2024 study found no strong support for a Dunning–Kruger effect in creative thinking using newer statistical approaches.
5.3 What survives the debate (and matters for your question)
Even if you throw out the meme version, a robust practical point remains:
- People are often poorly calibrated.
- Self-assessment accuracy varies.
- Low-feedback environments produce confident error.
That’s enough to connect to ideology and religion: these are often feedback-poor systems. If a belief system doesn’t expose you to disconfirming evidence—or reinterprets every disconfirmation as “a test”—then miscalibration can persist indefinitely.
6) Why ideology can amplify overconfidence rather than reduce it
Ideology is not just a set of beliefs; it is a problem-solving style. It tells you:
- what counts as evidence
- who is trustworthy
- which questions are legitimate
- which answers are forbidden
- what uncertainty means (weakness? humility? betrayal?)
That makes ideology a powerful amplifier of overconfidence.
6.1 Motivated reasoning: “smart” people can be better at rationalizing
Dan Kahan’s work on ideology and motivated reasoning is a key warning against the naive view that “bias is for low-IQ people.” His studies suggest that greater cognitive sophistication (e.g., higher cognitive reflection scores) can correlate with greater ideological polarization, because people use their reasoning ability to defend identity-congruent conclusions.
So ideological amplification doesn’t require low ability. It requires:
- identity stakes
- social reward for loyalty
- and a narrative that punishes doubt
6.2 Closure engines: ideology as a relief from ambiguity
Ideologies often function as closure machines:
- they offer complete explanations (“the system is rigged,” “the elites control everything,” “history inevitably moves this way”)
- they simplify moral tradeoffs into purity rules
- they reduce complexity to one causal villain
For a mind that experiences complexity as discomfort, ideology is analgesic. It replaces “I’m not sure” with “I understand.”
That can look like wisdom, but it’s often just certainty substitution.
7) Religion: not “religion” in general, but doctrinal and identity-centered reasoning styles
It’s important to separate two things:
- religion as community, meaning, ritual, and ethical tradition
- religion as an epistemic style that can become rigid, authority-driven, and closed to revision
Jonathan Baron has discussed reported correlations between religious belief, social conservatism, deontological moral judgment, and (negatively) measures of rational thinking style, framing this as a cultural divide in how people think.
You don’t need to turn that into “religion makes people dumb.” A more careful claim is:
Certain religious (and ideological) reasoning patterns can increase confidence while decreasing error-correction.
Here are the specific patterns that matter:
7.1 Revelation/authority overrides uncertainty
If the highest form of truth is revelation or sacred authority, then:
- doubt becomes sin
- ambiguity becomes temptation
- independent inquiry becomes arrogance
That’s a perfect recipe for overconfidence, because the system doesn’t require calibration against reality in the same way empirical inquiry does.
7.2 Sacred values are insulated from tradeoffs
When an issue becomes sacred, compromise becomes betrayal. This pushes cognition toward:
- black-and-white categorization
- purity spirals
- refusal to update even when costs rise
That can make people feel morally strong while becoming informationally brittle.
7.3 Community reinforcement replaces feedback
Communities reward confidence. They often reward confident repetition more than careful qualification. In a tight identity group, the biggest “error” is disloyalty, not inaccuracy.
So the learning signal becomes social, not epistemic.
8) The average-IQ vulnerability profile under complexity
Pull the threads together and you get a predictable profile—not of “dumb people,” but of normal cognition under strain:
- Cognitive load pushes you toward shortcuts (limited bandwidth).
- Discomfort pushes you toward closure (ambiguity feels bad).
- Low-feedback environments allow confident error (metacognitive miscalibration).
- Identity stakes convert reasoning into defense (motivated reasoning).
- Doctrinal systems can lock in certainty (authority and sacred values).
This is why modern information environments are so destabilizing: they combine complexity, speed, tribal identity, and endless low-quality claims—exactly the conditions that reward closure and punish careful calibration.
9) A key twist: average IQ isn’t the main driver—thinking dispositions are
If rationality and intelligence are distinct, then the main weakness isn’t “average IQ.” It’s the common absence of trained dispositions:
- actively open-minded thinking
- comfort with “I don’t know yet”
- deliberate seeking of disconfirming evidence
- calibration habits (probabilities, error bars, prediction tracking; see the sketch after this list)
- separating identity from belief (“If this is false, I’m still me.”)
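On the calibration bullet: a minimal sketch of what prediction tracking can look like, scored with the Brier score (the mean squared gap between stated probability and the 0/1 outcome; 0 is perfect, and always saying 50% scores 0.25). The forecasts are invented examples:

```python
# Minimal prediction log: (claim, stated probability, what actually happened).
# All entries are invented examples.
forecasts = [
    ("It rains on Saturday",         0.70, True),
    ("Team A wins the final",        0.90, False),
    ("The bill passes this session", 0.40, True),
]

# Brier score: mean squared error between probability and the 0/1 outcome.
brier = sum((p - happened) ** 2 for _, p, happened in forecasts) / len(forecasts)
print(f"Brier score: {brier:.3f}")  # lower is better; always saying 50% scores 0.25
```

Reviewing a log like this is a direct feedback channel that everyday belief formation rarely provides.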
Stanovich’s distinction between intelligence and rationality is basically a call to focus on these dispositions rather than IQ alone.
10) What “amplified Dunning–Kruger” looks like inside ideology and religion
If you want a concrete “amplification model,” it’s this:
- Simplified explanatory template (“Everything is caused by X.”)
- Moral certainty (“Doubt is weakness or betrayal.”)
- Social reinforcement (confidence rewarded; nuance punished)
- Evidence gating (only approved sources count)
- Self-sealing logic (counterevidence becomes proof of the plot/test)
- Low accountability (predictions aren’t tracked; failures aren’t admitted)
Under those conditions, miscalibration isn’t corrected—it becomes part of identity.
And because motivated reasoning can be strongest among the cognitively sophisticated, increasing education or raw intelligence alone doesn’t automatically solve it.
Closing thought
The most realistic critique of “average people’s thinking” isn’t contempt. It’s compassion for how harsh the environment is:
- modern problems are complex
- feedback is delayed and noisy
- social identity is constantly activated
- and the mental cost of sustained reasoning is real
Under those pressures, the predictable weakness is that comfort often wins over accuracy.