Logical fallacies

  1. Argument from Incredulity. The argument from incredulity is a logical fallacy that occurs when someone concludes that because they can’t believe something is true, it must be false, and vice versa. Example: I can’t imagine how human beings evolved from simple, single-celled organisms; it just doesn’t make sense. There is no way that the theory of evolution is right.
  2. Ad Hominem Fallacy. A fallacy of relevance in which someone rejects or criticizes another person’s view on the basis of personal characteristics, background, physical appearance, or other features irrelevant to the argument at issue. Example: MacDougal roots for a British football team. Clearly he’s unfit to be a police chief in Ireland.
  3. Strawman Argument. In the strawman argument, someone attacks a position the opponent doesn’t really hold. Instead of contending with the actual argument, he or she attacks the equivalent of a lifeless bundle of straw, an easily defeated effigy, which the opponent never intended to defend anyway. Example: The Senator thinks we can solve all our ecological problems by driving a Prius.
  4. Appeal to Ignorance. Any time ignorance is used as a major premise in support of an argument, it’s liable to be a fallacious appeal to ignorance. Example: We have no evidence that the Illuminati ever existed. They must have been so clever they destroyed all the evidence.
  5. False Dichotomy. This line of reasoning fails by limiting the options to two when there are in fact more options to choose from. Example: Either we go to war, or we appear weak.
  6. Slippery Slope Fallacy. You may have used this fallacy on your parents as a teenager: “But, you have to let me go to the party! If I don’t go to the party, I’ll be a loser with no friends. Next thing you know I’ll end up alone and jobless living in your basement when I’m 30!” The slippery slope fallacy works by moving from a seemingly benign premise or starting point and working through a number of small steps to an improbable extreme.
  7. Circular Argument. When a person’s argument is just repeating what they already assumed beforehand, it’s not arriving at any new conclusion. We call this a circular argument or circular reasoning. If someone says, “The Bible is true; it says so in the Bible”—that’s a circular argument.
  8. Hasty Generalization. A hasty generalization is a general statement without sufficient evidence to support it. Hasty generalization may be the most common logical fallacy because there’s no single agreed-upon measure for “sufficient” evidence. Example: People nowadays only vote with their emotions instead of their brains.
  9. Red Herring Fallacy. A “red herring fallacy” is a distraction from the argument typically with some sentiment that seems to be relevant but isn’t really on-topic. A red herring fallacy can be difficult to identify because it’s not always clear how different topics relate. Example: There is a lot of commotion regarding saving the environment. We cannot make this world an Eden. What will happen if it does become Eden? Adam and Eve got bored there!
    The idea of Adam and Eve getting bored in Eden throws the listeners off the real issue of damaging the environment.
  10. Tu Quoque Fallacy. The “tu quoque,” Latin for “you too,” is also called the “appeal to hypocrisy” because it distracts from the argument by pointing out hypocrisy in the opponent. Example: But, Dad, I know you smoked when you were my age, so how can you tell me not to do it?
  11. Causal Fallacy. The causal fallacy is any logical breakdown when identifying a cause. You can think of the causal fallacy as a parent category for several different fallacies about unproven causes. Example: Jimmy isn’t at school today. He must be on a family trip.
  12. Sunk Costs Fallacy. Sometimes we invest ourselves so thoroughly in a project that we’re reluctant to ever abandon it, even when it turns out to be fruitless and futile. Example: I know this relationship isn’t working anymore and that we’re both miserable. No marriage. No kids. No steady job. But I’ve been with him for seven years, so I’d better stay with him.
  13. Appeal to Authority. This fallacy happens when we misuse an authority. Example: One day robots will enslave us all. It’s true. My computer science teacher says so.
  14. Equivocation. Equivocation happens when a word, phrase, or sentence is used deliberately to confuse, deceive, or mislead by sounding like it’s saying one thing but actually saying something else. Example: His political party wants to spend your precious tax dollars on big government. But my political party is planning strategic federal investment in critical programs.
  15. Appeal to Pity. Argumentum ad misericordiam is Latin for “argument to compassion.” Like the ad hominem fallacy above, it is a fallacy of relevance: personal attacks and emotional appeals aren’t strictly relevant to whether something is true or false. Example: You can’t give me a failing grade; I stayed up all night studying and my dog just died.
  16. Bandwagon Fallacy. The bandwagon fallacy assumes something is true (or right, or good) because other people agree with it. Example: If you want to be like Mike, you’d better eat your Wheaties.
  17. False Equivalence. This fallacy is committed when one shared trait between two subjects is assumed to show equivalence, especially in order of magnitude, when equivalence is not necessarily the logical result. Example: They’re both living animals that metabolize chemical energy. Therefore there’s little difference between having a pet cat and a pet snail.

Sources: The Best Schools, Effectiviology.

Cognitive biases

  1. Causal Reductionism: Things rarely happen for just one reason. Usually, outcomes result from many causes conspiring together. But our minds cannot process such a complex arrangement, so we tend to ascribe outcomes to single causes, reducing the web of causality to a mere thread.
  2. Ergodicity: A die rolled 100 times gives the same probabilities as 100 dice rolled once; rolling a die is ergodic. But if the die gets chipped after 10 throws so it’s likelier to roll a 4, then 1 die rolled 100 times =/= 100 dice rolled once (non-ergodic). Many mistakenly treat non-ergodic systems as ergodic.
  3. Dunning-Kruger Effect: Awareness of the limitations of cognition (thinking) requires a proficiency in metacognition (thinking about thinking). In other words, being stupid makes you too stupid to realize how stupid you are.
  4. Emergence: When many simple objects interact with each other, they can form a system that has qualities that the objects themselves don’t. Examples: neurons creating consciousness, traders creating the stock market, simple mathematical rules creating living patterns.
  5. Cultural Parasitism: An ideology parasitizes the mind, changing the host’s behavior so they spread it to other people. Therefore, a successful ideology (the only kind we hear about) is not configured to be true; it is configured only to be easily transmitted and easily believed.
  6. Cumulative Error: Mistakes grow. Beliefs are built on beliefs, so one wrong thought can snowball into a delusional worldview. Likewise, as an inaccuracy is reposted on the web, more is added to it, creating fake news. In our networked age, cumulative errors are the norm.
  7. Survivorship Bias: We overemphasize the examples that pass a visibility threshold, e.g. our understanding of serial killers is based on the ones who got caught. Equally, news is only news if it’s an exception rather than the rule, but since it’s what we see, we treat it as the rule.
  8. Simpson’s Paradox: A trend can appear in groups of data but disappear when these groups are combined. This effect can easily be exploited by limiting a dataset so that it shows exactly what one wants it to show. Thus: beware of even the strongest correlations.
  9. Condorcet Paradox: A special instance of Simpson’s Paradox applied to elections, in which a populace prefers candidate A to candidate B, candidate B to C, and yet candidate C to A. This occurs because the majority that favors C is misleadingly divided among different groups.
  10. Limited Hangout: A common tactic by journos & politicians of revealing intriguing but relatively innocent info to satisfy curiosity and prevent discovery of more incriminating info. E.g. a politician accused of snorting cocaine may confess to having smoked marijuana at college.
  11. Focusing Illusion: Nothing is ever as important as what you’re thinking about while you’re thinking about it. E.g. worrying about a thing makes the thing being worried about seem worse than it is. As Marcus Aurelius observed, “We suffer more often in imagination than in reality.”
  12. Concept Creep: As a social issue such as racism or sexual harassment becomes rarer, people react by expanding their definition of it, creating the illusion that the issue is actually getting worse. I explain the process in detail here: https://rabbitholemag.com/how-progress-blinds-people-to-progress/
  13. Streetlight Effect: People tend to get their information from where it’s easiest to look. E.g. the majority of research uses only the sources that appear on the first page of Google search results, regardless of how factual they are. Cumulatively, this can skew an entire field.
  14. Belief Bias: Arguments we’d normally reject for being idiotic suddenly seem perfectly logical if they lead to conclusions we approve of. In other words, we judge an argument’s strength not by how strongly it supports the conclusion but by how strongly we support the conclusion.
  15. Pluralistic Ignorance: Phenomenon where a group goes along with a norm, even though all of the group members secretly hate it, because each mistakenly believes that the others approve of it. (See also: Abilene Paradox)
  16. The Petrie Multiplier: In fields in which men outnumber women, such as STEM, women receive a disproportionately high amount of harassment, simply because there are more potential givers than receivers of harassment. (See also: Lotka-Volterra equations)
  17. Woozle Effect: An article makes a claim without evidence, is then cited by another, which is cited by another, and so on, until the range of citations creates the impression that the claim has evidence, when really all articles are citing the same uncorroborated source.
  18. Tocqueville Paradox: As the living standards in a society rise, the people’s expectations of the society rise with them. The rise in expectations eventually surpasses the rise in living standards, inevitably resulting in disaffection (and sometimes populist uprisings).
  19. Ultimate Attribution Error: We tend to attribute good acts by allies to their character, and bad acts by allies to situational factors. For opponents, it’s reversed: good acts are attributed to situational factors, and bad acts to character.
  20. Golden Hammer: When someone, usually an intellectual who has gained a cultish following for popularizing a concept, becomes so drunk with power he thinks he can apply that concept to everything. Every mention of this concept should be accompanied by a picture of @nntaleb.
  21. Pareto Principle: Pattern of nature in which ~80% of effects result from ~20% of causes. E.g. 80% of wealth is held by 20% of people, 80% of computer errors result from 20% of bugs, 80% of crimes are committed by 20% of criminals, 80% of box office revenue comes from 20% of films.
  22. Nirvana Fallacy: When people reject a thing because it compares unfavorably to an ideal that in reality is unattainable. E.g. condemning capitalism due to the superiority of imagined socialism, condemning ruthlessness in war due to imagining humane (but unrealistic) ways to win.
  23. Emotive Conjugation: Synonyms can yield positive or negative impressions without changing the basic meaning of a word. Example: someone who is obstinate (neutral term) can be headstrong (positive) or pig-headed (negative). This is the basis for much bias in journalism.
  24. Enantiodromia: An excess of something can give rise to its opposite. E.g. a society that is too liberal will be tolerant of tyrants, who will eventually make it illiberal. I explain more here: https://quillette.com/2018/09/30/alex-jones-was-victimized-by-one-oligopoly-but-he-perpetuated-another/
  25. Halo Effect: When a person sees an agreeable characteristic in something or someone, they assume other agreeable characteristics. Example: if a Trump supporter sees someone wearing a MAGA cap, he’s likely to think that person is also decent, honest, hard-working, etc.
  26. Outgroup Homogeneity Effect: We tend to view outgroup members as all the same e.g. believing all Trump supporters would see someone wearing a MAGA cap, and think that person is also decent, honest, hard-working, etc.
  27. Matthew Principle: Advantage begets advantage, leading to social, economic, and cultural oligopolies. The richer you are, the easier it is to get even richer; the more recognition a scientist receives for a discovery, the more recognition he’ll receive for future discoveries; etc.
  28. Peter Principle: People in a hierarchy such as a business or government will be promoted until they suck at their jobs, at which point they will remain where they are. As a result, the world is filled with people who suck at their jobs.
  29. Loki’s Wager: Fallacy where someone tries to defend a concept from criticism, or dismiss it as a myth, by unduly claiming it cannot be defined. E.g. “God works in mysterious ways” (god of the gaps), “race is biologically meaningless” (Lewontin’s fallacy). https://twitter.com/G_S_Bhogal/status/1225562651918508041/photo/1
  30. Subselves: We use different mental processes in different situations, so each of us is not a single character but a collection of different characters, who take turns to commandeer the body depending on the situation. There is an office you, a lover you, an online you, etc.
  31. Goodhart’s Law: When a measure becomes a goal, it ceases to be a good measure. E.g. British colonialists tried to control snakes in India. They measured progress by the number of snakes killed, offering money for snake corpses. People responded by breeding snakes and killing them.
  32. Radical Phase Transition: Extremist movements can behave like solids (tyrannies), liquids (insurgencies), and gases (conspiracy theories). Pressuring them causes them to go from solid => liquid => gas. Leaving them alone causes them to go from gas => liquid => solid.
  33. Shifting Baseline Syndrome: Frog says to Fish, “how’s the water?” Fish replies, “what’s water?” We become blind to what we’re familiar with. And since the world is always changing, and we’re always getting used to it, we can even become blind to the slow march of catastrophe.
  34. Availability Cascade: When a new concept enters the arena of ideas, people react to it, thereby amplifying it. The idea thus becomes more popular, causing even more people to amplify it by reacting to it, until everyone feels the need to talk about it.
  35. Reactance Theory: When someone is restricted from expressing a POV, or pressured to adopt a different POV, they usually react by believing their original POV even more. For a detailed example read my piece on my attempt to deradicalize a neo-Nazi: https://areomagazine.com/2017/10/28/how-not-to-de-radicalize-a-twitter-neo-nazi/
  36. Predictive Coding: There is no actual movement on a TV screen; your brain invents it. There are no actual spaces between spoken words; your brain inserts them. Human perception is like predictive text, replacing the unknown with the expected. Predictive Coding leads to…
  37. Apophenia: We impose our imaginations on arrangements of data, seeing patterns where no such patterns exist. A common form of Apophenia is
  38. Narrative Fallacy: When we see a sequence of facts we interpret them as a story by threading them together into an imagined chain of cause & effect. If a drug addict commits suicide we assume the drug habit led to the suicide, even if it didn’t. Another form of Apophenia is
  39. Pareidolia: For aeons predators stalked us in undergrowth & shadow. In such times survival favored the paranoid: those who could discern a wolf from the vaguest of outlines. This paranoia preserved our species, but cursed us with pareidolia, so we now see wolves even in the skies. https://twitter.com/G_S_Bhogal/status/1225562685246492678/photo/1
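The Ergodicity item above can be sketched as a quick simulation. The wear model here (a chipped die that lands on 4 half the time from its 10th throw onward) is an invented stand-in for the chipped die in the text:

```python
import random

random.seed(0)  # reproducible run

def wearing_roll(i):
    """Roll a six-sided die that gets chipped after its 10th throw,
    after which it lands on 4 half the time (an invented wear model)."""
    if i >= 10 and random.random() < 0.5:
        return 4
    return random.randint(1, 6)

N = 100_000

# Time average: ONE die rolled N times, accumulating wear as it goes.
one_die = sum(wearing_roll(i) for i in range(N)) / N

# Ensemble average: N fresh dice rolled ONCE each, so none ever wears.
many_dice = sum(wearing_roll(0) for _ in range(N)) / N

print(f"one die, {N} rolls: {one_die:.2f}")    # drifts toward ~3.75
print(f"{N} dice, one roll: {many_dice:.2f}")  # stays near ~3.50
```

A fair die would give roughly 3.5 either way; the wear makes the time average and ensemble average diverge, which is exactly what non-ergodicity means.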
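Simpson’s Paradox (item 8) can be reproduced with the textbook kidney-stone treatment data: treatment A has the higher success rate within each subgroup, yet the lower rate overall, because A was mostly assigned the easy (small-stone) cases:

```python
# (successes, trials) per treatment, split by case difficulty
groups = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, trials):
    return successes / trials

# A wins inside every subgroup...
for name, arms in groups.items():
    print(name, {t: f"{rate(*arms[t]):.1%}" for t in ("A", "B")})

# ...but pooling the subgroups reverses the ranking.
totals = {t: (sum(arms[t][0] for arms in groups.values()),
              sum(arms[t][1] for arms in groups.values()))
          for t in ("A", "B")}
print("overall", {t: f"{rate(*totals[t]):.1%}" for t in ("A", "B")})
```

The reversal is driven entirely by how the cases were distributed between the treatments, not by either treatment’s effectiveness changing.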
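The Condorcet Paradox (item 9) takes only three equal voter blocs to trigger; a minimal check, with invented blocs and candidates:

```python
# Three equal voter blocs, each with a perfectly transitive ranking
# (best to worst).
blocs = [
    (1, ["A", "B", "C"]),
    (1, ["B", "C", "A"]),
    (1, ["C", "A", "B"]),
]

def majority_prefers(x, y):
    """True if a strict majority of voters rank x above y."""
    votes_for_x = sum(size for size, ranking in blocs
                      if ranking.index(x) < ranking.index(y))
    total = sum(size for size, _ in blocs)
    return votes_for_x > total / 2

# Every pairwise contest is won 2-1, yet the wins form a cycle:
for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"majority prefers {x} over {y}: {majority_prefers(x, y)}")
# A > B > C > A: there is no majority-consistent ordering.
```

Each individual voter is perfectly rational; the intransitivity exists only at the level of the aggregate.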
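The ~80/20 figures in the Pareto Principle item are not arbitrary. If the underlying quantity follows a Pareto distribution with shape parameter alpha, its Lorenz curve implies that the top fraction p holds a p**(1 - 1/alpha) share of the total, and alpha ≈ 1.16 reproduces the classic 80/20 split:

```python
def top_share(p, alpha):
    """Share of the total held by the top fraction p of a
    Pareto(alpha) distribution, from its Lorenz curve."""
    return p ** (1 - 1 / alpha)

alpha = 1.16  # the shape that yields roughly the 80/20 split
print(f"top 20% hold {top_share(0.20, alpha):.0%} of the total")

# A heavier tail concentrates even more at the top:
print(f"alpha=1.05: top 20% hold {top_share(0.20, 1.05):.0%}")
```

Different domains have different alphas, which is why the real-world splits are only approximately 80/20.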

Source:

https://twitter.com/G_S_Bhogal/status/1225561131122597896