COGNITIVE BIASES, HEURISTICS AND FALLACIES
COGNITIVE PROCESSES
This is the broadest psychological category. It includes all mental functions involved in acquiring, processing, storing, and using information. Under this, you’ll find:
COGNITIVE BIASES
NEGATIVE SELF BIAS
HEURISTICS
SCHEMATA AND SCHEMA
STEREOTYPES
LOGICAL FALLACIES
SO WHY THE CONFUSION?
Some heuristics (like availability) can lead to cognitive biases (like overestimating risk).
Some biases (like confirmation bias) can contribute to fallacious reasoning (like cherry-picking evidence).
But each category has a different function and must be treated separately for teaching, writing, or cognitive modelling.
HEURISTICS
A heuristic is a mental shortcut that allows us to make decisions quickly without having all the relevant information. Heuristics can be thought of as rules of thumb that yield a decision with a high probability of being correct without having to think everything through.
Heuristics also cover the preferences that help you decide when you lack enough information, or do not care enough, to make an informed decision. For example, when you want to buy yogurt but are not a nutritionist, you might choose by the familiarity of the brand name (you prefer the familiar; this is called the familiarity heuristic) and other aspects that have nothing to do with the yogurt itself.
Availability Heuristic: Judging the likelihood of events based on how easily examples come to mind. For instance, after hearing about a plane crash, one might overestimate the dangers of air travel due to the vividness of the event in memory.
Representativeness Heuristic: Assessing similarity and assuming outcomes based on how closely something matches a prototype. For example, assuming someone is a librarian because they are quiet and reserved, fitting the stereotype, even if statistically improbable.
Anchoring and Adjustment Heuristic: Relying heavily on the first piece of information (the "anchor") when making decisions and making insufficient adjustments from that starting point. For example, if told a car costs £30,000, a subsequent offer of £27,000 may seem reasonable, even if the car's value is much lower.
Simulation Heuristic: Estimating the likelihood of an event based on how easily one can imagine it. For example, feeling that missing a train by one minute is worse than missing it by five, because it's easier to simulate catching it with a slight change.
Substitution Heuristic: When faced with a complex question, substituting it with a simpler one and answering that instead. For instance, answering "Do I like this politician?" instead of "Is this politician competent?"
Affect Heuristic: Making decisions based on emotions. For example, choosing a vacation destination because it "feels" right, despite lacking concrete information.
Base Rate Heuristic: Ignoring general statistical information (base rates) in favor of specific information. For example, assuming someone is a professor because they wear glasses and read a lot, ignoring the fact that professors are rare in the general population.
Hindsight Bias: Believing, after an event has occurred, that one would have predicted or expected the outcome. This can lead to overconfidence in one's ability to predict events.
Confirmation Bias: The tendency to search for, interpret, and remember information that confirms one's preexisting beliefs, while disregarding contradictory evidence.
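The base rate heuristic above can be made concrete with a quick Bayes'-rule calculation. The sketch below uses made-up illustrative numbers (a 1% base rate, a 90% hit rate, a 9% false-positive rate) to show why ignoring the base rate leads intuition badly astray:

```python
# Base-rate neglect, illustrated with Bayes' rule.
# All numbers below are hypothetical, chosen only for illustration.
prevalence = 0.01      # base rate: P(condition)
sensitivity = 0.90     # P(positive test | condition)
false_positive = 0.09  # P(positive test | no condition)

# Total probability of a positive test, then Bayes' theorem
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive
p_condition_given_positive = prevalence * sensitivity / p_positive

print(f"P(condition | positive) = {p_condition_given_positive:.1%}")  # ~9.2%
```

Intuition driven by the 90% hit rate suggests a positive result means the condition is likely, yet the low base rate drags the true probability down to roughly 9%.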
SCHEMATA AND SCHEMA
Schemas are organised mental structures that help us interpret and respond to the world. They shape our expectations, attention, memory, and social interactions.
Schemata (plural) develop from experience and are used to predict what will happen in familiar situations.
For example, a “restaurant schema” tells you to expect a menu, a waiter, and a bill at the end.
Schemas can also be self-referential, e.g., beliefs about competence or worth.
When schemas are rigid, inaccurate, or based on negative experiences, they can contribute to bias and maladaptive thinking (e.g., in depression or prejudice).
STEREOTYPES
Stereotypes are a specific kind of schema: oversimplified generalisations about a group of people. They may be partly informed by social experience but become problematic when applied indiscriminately.
E.g., "Women are more emotional" or "Young people are irresponsible."
Stereotypes can lead to prejudice, discrimination, and biased interpretation of others’ behaviour.
COGNITIVE BIASES
Biases are systematic errors in judgment.
They are often products of heuristics, shaped by schemas, and reinforced by emotion or past learning.
Cognitive biases are systematic patterns of deviation from norm or rationality in judgment. They arise from the brain's attempt to simplify information processing and can lead to perceptual distortion, inaccurate judgment, or illogical interpretation. While these biases can be helpful in making quick decisions, they often lead to errors.
These affect how we perceive and interpret the world.
Some are emotionally driven (e.g. negativity bias)
Some are memory-driven (e.g. hindsight bias)
Some arise from poor logic (e.g. base rate neglect)
COMMON COGNITIVE BIASES
Below is a list of common cognitive biases, each with a brief explanation:
Affect Heuristic: Relying on current emotions to make decisions quickly.
Ambiguity Effect: Preferring known risks over unknown risks.
Anchoring Bias: Over-reliance on the first piece of information encountered.
Attentional Bias: Focusing more on certain stimuli while ignoring others.
Availability Heuristic: Overestimating the likelihood of events based on their availability in memory.
Bandwagon Effect: Adopting beliefs or behaviors because many others do.
Base Rate Fallacy: Ignoring general information in favor of specific information.
Bottom-Dollar Effect: Reduced satisfaction with a purchase when it depletes one's remaining budget.
Bounded Rationality: Making decisions within the limitations of available information and cognitive capacity.
Bundling Bias: Valuing bundled items differently than individual items.
Cashless Effect: Spending more when using non-cash payment methods.
Category Size Bias: Perceiving items in larger categories as more likely.
Choice Overload Bias: Difficulty making decisions when presented with many options.
Cognitive Dissonance: Discomfort from holding conflicting beliefs or behaviors.
Commitment Bias: Continuing a course of action due to prior commitments.
Confirmation Bias: Favoring information that confirms existing beliefs.
Decision Fatigue: Deterioration of decision quality after making many decisions.
Declinism: Belief that society is declining, often romanticizing the past.
Decoy Effect: Preference changes when a third, less attractive option is introduced.
Default Bias: Tendency to stick with pre-set options.
Distinction Bias: Viewing options as more distinct when evaluated simultaneously.
Dunning–Kruger Effect: Overestimating one's abilities due to lack of knowledge.
Empathy Gap: Difficulty understanding others' emotions when not experiencing them.
Endowment Effect: Valuing owned items more than identical non-owned items.
Extrinsic Incentive Bias: Belief that others are more motivated by external rewards than oneself.
Forer Effect: Accepting vague, general statements as personally meaningful.
Framing Effect: Decisions influenced by how information is presented.
Functional Fixedness: Inability to see objects used in non-traditional ways.
Fundamental Attribution Error: Overemphasizing personal traits over situational factors in others' behaviors.
Gambler’s Fallacy: Belief that past random events affect future ones.
Google Effect: Tendency to forget information easily found online.
Halo Effect: Overall impression influences judgments about unrelated traits.
Hard–Easy Effect: Overestimating performance on hard tasks and underestimating on easy ones.
Hindsight Bias: Viewing events as more predictable after they happen.
Hot-hand Fallacy: Belief in streaks in random events.
Hyperbolic Discounting: Preferring smaller, immediate rewards over larger, delayed ones.
Identifiable Victim Effect: Greater empathy for specific individuals than for anonymous groups.
IKEA Effect: Overvaluing things one has helped to create.
Illusion of Control: Overestimating one's influence over external events.
Illusion of Validity: Overconfidence in judgments despite contradictory evidence.
Illusory Correlation: Perceiving relationships where none exist.
Illusory Truth Effect: Believing information is true after repeated exposure.
In-group Bias: Favoring members of one's own group.
Incentivization: Increased motivation due to rewards.
Just-world Hypothesis: Belief that people get what they deserve.
Lag Effect: Improved memory retention when learning is spaced over time.
Law of the Instrument: Over-reliance on familiar tools or methods.
Less-is-better Effect: Preference for smaller sets when evaluated individually.
Levelling and Sharpening: Memory distortion by emphasizing or minimizing details.
Levels-of-processing Effect: Deeper processing leads to better memory retention.
Look-elsewhere Effect: Ignoring broader context when focusing on specific data.
Loss Aversion: Preference to avoid losses over acquiring equivalent gains.
Mental Accounting: Treating money differently based on subjective criteria.
Mere Exposure Effect: Preference for familiar stimuli.
Motivating-Uncertainty Effect: Increased motivation from uncertain rewards.
Naive Allocation: Evenly distributing resources regardless of effectiveness.
Negativity Bias: Greater sensitivity to negative information.
Noble Edge Effect: Favoring brands perceived as socially responsible.
Nostalgia Effect: Preference influenced by sentimental longing for the past.
Observer-expectancy Effect: Researcher's expectations influencing participants' behavior.
Omission Bias: Preferring inaction over action, even if the outcome is worse.
Optimism Bias: Overestimating positive outcomes and underestimating negatives.
Ostrich Effect: Avoiding negative information.
Peak-end Rule: Judging experiences based on their peak and end moments.
Pessimism Bias: Overestimating the likelihood of negative events.
Planning Fallacy: Underestimating time needed to complete tasks.
Primacy Effect: Better recall of items presented first.
Priming Effect: Exposure to one stimulus influencing response to another.
Projection Bias: Assuming others share one's current thoughts or feelings.
Reactive Devaluation: Devaluing proposals from adversaries.
Regret Aversion: Avoiding decisions that could lead to regret.
Response Bias: Tendency to answer questions untruthfully or misleadingly.
Restraint Bias: Overestimating one's ability to control impulses.
Rosy Retrospection: Recalling past events more positively than they were.
Salience Bias: Focusing on prominent or emotionally striking items.
Self-serving Bias: Attributing successes to oneself and failures to external factors.
Sexual Overperception Bias: Misinterpreting friendliness as sexual interest.
Social Norms: Behaviors influenced by societal expectations.
Source Confusion: Misattributing the origin of a memory.
Spacing Effect: Improved learning when study sessions are spaced out.
Spotlight Effect: Overestimating how much others notice one's actions.
Suggestibility: Incorporating misleading information into memory.
Survivorship Bias: Focusing on successful examples while ignoring failures.
Telescoping Effect: Misjudging the timing of past events.
Zero Risk Bias: Preference for absolute certainty over relative risk reduction.
UNCONSCIOUS BIAS
Unconscious bias, also known as implicit bias, refers to attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner. These biases are activated involuntarily and without an individual's awareness or intentional control.
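The Gambler's Fallacy from the list above lends itself to a direct check by simulation. The sketch below flips a fair coin many times and asks what actually follows a streak of five heads; the fallacy predicts tails should be "due", while independence predicts the rate of heads stays near one half:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible
flips = "".join(random.choice("HT") for _ in range(1_000_000))

# Gambler's fallacy check: after five heads in a row, is tails "due"?
next_flips = [flips[i + 5] for i in range(len(flips) - 5)
              if flips[i:i + 5] == "HHHHH"]
heads_rate = next_flips.count("H") / len(next_flips)
print(f"P(heads | five heads just occurred) ≈ {heads_rate:.3f}")  # close to 0.5
```

Each flip is independent, so conditioning on the previous streak does not shift the odds; the simulated rate lands near 0.5 rather than tilting toward tails.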
LOGICAL FALLACIES: DEFINITIONS, EXAMPLES, AND EXPLANATIONS
SLIPPERY SLOPE (DOMINO THEORY)
Definition: Asserts that a relatively small first step leads to a chain of related events culminating in a significant (usually negative) effect, without sufficient evidence for such inevitability.
Example: "If we allow students to redo this test, next they'll want to retake every test, and soon no grades will have any meaning."
Why It's a Fallacy: It assumes a sequence of events without establishing a causal connection between them, often invoking fear rather than logical reasoning.
HASTY GENERALIZATION
Definition: Draws a broad conclusion from a small or unrepresentative sample.
Example: "My two friends got food poisoning at that restaurant; therefore, the restaurant is unsafe."
Why It's a Fallacy: It bases a general rule on insufficient evidence, leading to unreliable conclusions.
SLOTHFUL INDUCTION
Definition: Rejects a reasonable conclusion drawn from evidence, attributing it to coincidence or unrelated factors.
Example: "Even though every time I eat peanuts I get a rash, it's probably just a coincidence."
Why It's a Fallacy: It ignores strong evidence, refusing to acknowledge a likely cause-and-effect relationship.
FALSE CAUSALITY (CORRELATION VS. CAUSATION)
Definition: Assumes that because two events occur together, one causes the other.
Example: "Ice cream sales increase in summer, and so do drowning incidents; therefore, ice cream causes drowning."
Why It's a Fallacy: It confuses correlation with causation, overlooking other variables that may influence both events.
ANECDOTAL EVIDENCE
Definition: Uses personal experience or isolated examples instead of sound arguments or compelling evidence.
Example: "My grandfather smoked his whole life and lived to 97, so smoking can't be that bad."
Why It's a Fallacy: Personal stories are not reliable evidence, as they may not represent typical outcomes and lack scientific rigor.
POST HOC ERGO PROPTER HOC
Definition: Assumes that if one event follows another, the first caused the second.
Example: "I wore my lucky socks and then aced the exam; the socks must be the reason."
Why It's a Fallacy: Temporal succession does not imply causation; other factors likely contributed to the outcome.
GENETIC FALLACY
Definition: Judges something as good or bad based on its origin rather than its current meaning or context.
Example: "That policy originated in ancient Rome; it's outdated and irrelevant today."
Why It's a Fallacy: The origin of an idea does not determine its current validity or applicability.
BEGGING THE QUESTION (PETITIO PRINCIPII)
Definition: The conclusion is assumed in one of the premises; the argument circles back on itself.
Example: "Reading is beneficial because it's good for you."
Why It's a Fallacy: It lacks independent support for the conclusion, making the argument uninformative.
CIRCULAR ARGUMENT
Definition: Restates the premise rather than providing a conclusion.
Example: "I'm trustworthy because I always tell the truth."
Why It's a Fallacy: It doesn't offer new information or evidence to support the claim.
APPEAL TO IGNORANCE
Definition: Claims something is true because it hasn't been proven false, or vice versa.
Example: "No one has proven aliens don't exist; therefore, they must exist."
Why It's a Fallacy: Lack of evidence against a claim doesn't constitute evidence for it.
FALSE DILEMMA (EITHER/OR FALLACY)
Definition: Presents two options as the only possibilities when others exist.
Example: "You're either with us or against us."
Why It's a Fallacy: It oversimplifies complex issues, ignoring alternative perspectives or solutions.
AD HOMINEM
Definition: Attacks the person making the argument rather than the argument itself.
Example: "You can't trust his opinion on climate change; he's not a scientist."
Why It's a Fallacy: Personal characteristics are irrelevant to the validity of an argument.
AD POPULUM (BANDWAGON APPEAL)
Definition: Argues that a claim must be true because many people believe it.
Example: "Everyone uses this diet plan, so it must be effective."
Why It's a Fallacy: Popularity doesn't equate to truth; widespread belief can be misguided.
RED HERRING
Definition: Introduces irrelevant information to distract from the actual issue.
Example: "Why worry about the environment when there are so many unemployed people?"
Why It's a Fallacy: It diverts attention from the topic at hand, preventing a focused discussion.
STRAW MAN
Definition: Misrepresents someone's argument to make it easier to attack.
Example: "She wants to reduce military spending; she must want to leave the country defenseless."
Why It's a Fallacy: It distorts the original argument, leading to refutation of a position not actually held.
MORAL EQUIVALENCE
Definition: Compares minor misdeeds with major atrocities, suggesting they are equally wrong.
Example: "Failing to recycle is just as bad as polluting the ocean."
Why It's a Fallacy: It exaggerates the severity of actions, ignoring significant differences in context and impact.
ALPHABET SOUP
Definition: Overuses acronyms or jargon to appear knowledgeable or to confuse the audience.
Example: "Our Q3 KPIs for the CRM and ERP systems show a positive ROI in the B2B sector."
Why It's a Fallacy: It obscures meaning, potentially misleading or alienating the audience.
TEXAS SHARPSHOOTER
Definition: Cherry-picks data clusters to suit an argument, ignoring data that contradicts it.
Example: "This supplement is effective; five users reported improved health."
Why It's a Fallacy: It focuses on specific data points while disregarding the broader dataset that may not support the conclusion.
MIDDLE GROUND FALLACY
Definition: Assumes that the middle position between two extremes must be correct.
Example: "Some say vaccines are safe; others say they're dangerous. The truth must be somewhere in between."
Why It's a Fallacy: The compromise between two positions isn't necessarily the truth; one side may be entirely correct.
BURDEN OF PROOF
Definition: Places the responsibility of disproving a claim on others rather than providing evidence for it.
Example: "Prove that ghosts don't exist."
Why It's a Fallacy: The person making a claim is responsible for providing evidence; it's not others' job to disprove it.
SENTIMENTAL APPEAL
Definition: Uses emotion to distract from logical reasoning.
Example: "Think of the children; we must ban all video games."
Why It's a Fallacy: Emotional appeals can cloud judgment, leading to decisions not based on facts or logic.
DOGMATISM
Definition: Asserts that one's beliefs are the only acceptable ones, dismissing others without consideration.
Example: "I don't care what the data says; my view is correct."
Why It's a Fallacy: It shuts down critical thinking and dialogue by treating personal belief as unquestionable truth, refusing to consider alternative perspectives or evidence.
SCARE TACTICS
Definition: Uses fear, not evidence or logic, to persuade people to accept a conclusion.
Example: "If we don’t ban immigration, our culture will be destroyed."
Why It's a Fallacy: It bypasses reasoned debate by appealing to irrational fear, often exaggerating risks without offering evidence.
APPEAL TO FALSE AUTHORITY
Definition: Cites an "expert" who lacks relevant expertise to support a claim.
Example: "A famous actor says this supplement works, so it must be effective."
Why It's a Fallacy: Authority is only valid within one's field of expertise. Using irrelevant or fabricated authority misleads the audience.
EQUIVOCATION
Definition: Uses ambiguous language or shifts the meaning of a term mid-argument.
Example: "A feather is light. What is light cannot be dark. Therefore, a feather cannot be dark."
Why It's a Fallacy: It relies on wordplay or shifting definitions rather than consistent logic, leading to false conclusions.
NON SEQUITUR
Definition: Draws a conclusion that does not logically follow from the premises.
Example: "She's wearing red. She must be great at maths."
Why It's a Fallacy: It ignores logic or causality altogether, jumping to unrelated conclusions.
FAULTY ANALOGY
Definition: Assumes that because two things are alike in one respect, they must be alike in all respects.
Example: "Employees are like nails. Just as nails must be hit to work, so must employees."
Why It's a Fallacy: It oversimplifies or misrepresents both elements of the analogy, often ignoring crucial dissimilarities.
APPEALS TO POPULARITY (AD POPULUM)
Definition: Argues that a belief is true because it’s widely held.
Example: "Most people believe in astrology, so it must have some truth."
Why It's a Fallacy: Popularity is not evidence of truth; widespread beliefs can be mistaken or based on flawed reasoning.
WISHFUL THINKING
Definition: Accepts a claim as true simply because it would be pleasant if it were.
Example: "I believe I’ll get the job because I really want it."
Why It's a Fallacy: It replaces rational evaluation with hope or desire, ignoring reality or evidence.
PERFECT SOLUTION FALLACY
Definition: Rejects a solution because it doesn’t solve the problem completely.
Example: “Why bother using seatbelts? People still die in car accidents.”
Why It's a Fallacy: It falsely assumes that unless a solution is perfect it is useless, ignoring incremental or partial benefits.
GLITTERING GENERALITY
Definition: Uses emotionally appealing but vague and unprovable statements.
Example: "We stand for freedom, strength, and a better tomorrow."
Why It's a Fallacy: Such phrases sound good but lack substance or specificity. They’re designed to evoke emotion rather than convey fact.