The Secrets of Life, the Universe, and Everything

Betteridge’s Law: “Any headline that ends in a question mark can be answered by the word no.”

Brandolini’s Law: “The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.”

Campbell’s Law: “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.” Similar to Goodhart’s Law: a metric that becomes a target ceases to be a useful metric.

Chesterton’s Fence: “Do not remove a fence until you know why it was put up in the first place… Chesterton’s Fence is not an admonishment of anyone who tries to make improvements; it is a call to be aware of second-order thinking before intervening… Unless we know why someone made a decision, we can’t safely change it or conclude that they were wrong.”

Chihuahua Syndrome: messy data from variations in spelling or input. Chris Groskopf quoted in Seeing with Fresh Eyes—Meaning, Space, Data, Truth by Edward Tufte: “There is no worse way to screw up data than to let a single human type it in, without validation. I acquired a complete dog licensing database. Instead of requiring people registering their dog to choose a breed from a list, the system gave dog owners a text field to type into, so this database had 250 spellings of Chihuahua.” Not to be confused with Small Dog Syndrome.
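The problem Groskopf describes can be sketched in a few lines. This is a hypothetical illustration (the breed list and variant spellings are invented, not taken from the actual licensing database): free-text entry accumulates variant spellings, while validating against a controlled vocabulary prevents them.

```python
# Free-text entry: every typist invents a new spelling of the same breed.
raw_entries = ["Chihuahua", "chihuahua", "Chiuaua", "Chihauhua", "CHIHUAHUA"]
distinct = set(raw_entries)  # five distinct strings for one real breed

# The fix: validate input against a fixed list instead of a text field.
VALID_BREEDS = {"Chihuahua", "Beagle", "Poodle"}  # illustrative vocabulary

def register_breed(entry: str) -> str:
    """Accept only exact matches from the controlled vocabulary."""
    if entry not in VALID_BREEDS:
        raise ValueError(f"Unknown breed {entry!r}; choose from {sorted(VALID_BREEDS)}")
    return entry
```

With validation in place, every Chihuahua is stored under one spelling; without it, the database holds one row per typo.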

Cobra Effect: The British Colonial Government came up with a plan to control the population of venomous cobras in Delhi. They offered a bounty for every dead cobra turned in. While many bounties were paid, the program was ineffective at controlling the number of cobras in the city. “Under the new policy, cobras provided a rather stable source of income. In addition, it was much easier to kill captive cobras than to hunt them in the city. So the snake catchers increasingly abandoned their search for wild cobras, and concentrated on their breeding programs.” Once officials discovered what was going on, they stopped the bounty program. “As a final act the breeders, now stuck with nests of worthless cobras, simply released them into the city, making the problem even worse than before! The lesson is that simplistic policies can come back to bite you.” See also Systems Thinking.

Cunningham’s Law: “the best way to get the right answer on the internet is not to ask a question; it’s to post the wrong answer.”

Curley Effect: “in which inefficient redistributive policies are sought not by interest groups protecting their rents, but by incumbent politicians trying to shape the electorate through emigration of their opponents.”

Dunning-Kruger Effect: “a cognitive bias wherein people of low ability have illusory superiority, mistakenly assessing their cognitive ability as greater than it is.”  See also meta-ignorance (ignorance of one’s ignorance).


Enshittification: a term coined by Cory Doctorow, referring to “a three stage process: First, platforms are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die… The services that matter to us, that we rely on, are turning into giant piles of shit.”

Fachidiot (German) or senmon baka 専門バカ (Japanese): A person who has expertise in a narrow field, but is otherwise an idiot. See also Epistemic Trespassing; ultracrepidarian.

Fortune Cookie Effect, aka the Forer Effect, aka the Barnum Effect: “describes when individuals believe that generic information, which could apply to anyone, applies specifically to themselves… This bias takes advantage of our gullibility and well-meaning nature… Commonly seen in marketing and engagement campaigns, elements of the Barnum effect provide customers with the impression of product customization.”

Fredkin’s Paradox: “The more equally attractive two alternatives seem, the harder it can be to choose between them—no matter that, to the same degree, the choice can only matter less.” Similar to Buridan’s Ass: “a hypothetical situation wherein a donkey that is equally hungry and thirsty is placed precisely midway between a stack of hay and a pail of water. Since the paradox assumes the ass will always go to whichever is closer, it dies of both hunger and thirst since it cannot make any rational decision between the hay and water.”

Fundamental Attribution Error: “the tendency people have to overemphasize personal characteristics and ignore situational factors in judging others’ behavior. Because of the fundamental attribution error, we tend to believe that others do bad things because they are bad people. We’re inclined to ignore situational factors that might have played a role.”

Gell-Mann Amnesia Effect: “I believe everything the media tells me except for anything for which I have direct personal knowledge, which they always get wrong.”

Godwin’s Law: “as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1; that is, if an online discussion (regardless of topic or scope) goes on long enough, sooner or later someone will compare someone or something to Adolf Hitler or his deeds, the point at which effectively the discussion or thread often ends.”

Gurwinder’s Third Paradox: “In order for you to beat someone in a debate, your opponent needs to realize they’ve lost. Therefore, it’s easier to win an argument against a genius than an idiot.”

Hanlon’s Razor: “Never attribute to malice that which is adequately explained by stupidity.”

Hedonic Treadmill: “a hypothesis proposing that people’s happiness tends to return to a preexisting baseline level after positive or negative life events have occurred. According to this concept, positive and negative events may produce short-term shifts in mood, but these shifts tend to erode in a relatively brief period of time.”

Hick’s Law, aka Hick-Hyman Law: “the more choices a person is presented with, the longer the person will take to reach a decision.”
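The Hick-Hyman Law is usually stated quantitatively: decision time grows logarithmically, not linearly, with the number of equally likely choices, roughly T = b · log₂(n + 1), where b is an empirically fitted constant. A minimal sketch (the value of b here is illustrative, not an empirical measurement):

```python
import math

def hick_decision_time(n_choices: int, b: float = 0.2) -> float:
    """Hick-Hyman law: decision time in seconds for n equally likely choices.

    b is an empirically fitted constant per person and task; 0.2 s is
    just an illustrative value.
    """
    return b * math.log2(n_choices + 1)
```

The logarithm is the interesting part: going from 1 choice to 3 doubles the decision time, but going from 3 to 7 only adds another 50%, so each doubling of the menu costs a constant increment rather than a proportional one.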

Hofstadter’s Law: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.” — p.152, Gödel, Escher, Bach by Douglas R. Hofstadter

Illusory Truth Effect: “If repeated enough times, the information may be perceived to be true even if sources are not credible.”

Jeane Dixon Effect: “whereby the relatively few correct predictions are heralded and therefore widely remembered, while the much more numerous incorrect predictions are conveniently forgotten or deemphasized.”

Lindy Effect: “the observed lifespan of a non-perishable item like a business is most likely to be at its half-life. So, if a business is 100 years old, [one] should expect it to be around for another 100 years. And a business that has been around for 10 years should be around for another 10 years. Under the Effect, the mortality of a business actually decreases with time.”

Lister’s Law: “People under time pressure don’t think faster.”

Lucretius Problem: “a mental defect where we assume the worst-case event that has happened is the worst-case event that can happen.”

Mandela Effect: “a situation in which a large mass of people believes that an event occurred when it did not.”

McNamara Fallacy: “when a decision is based solely on numbers (e.g. metrics or statistics) and all qualitative factors are ignored. Doing this makes us blind to what is really going on.”

Murphy’s Law: “Anything that can go wrong will go wrong.”

Nut Island Effect: “a phenomenon in organizations whereby teams of talented employees become isolated from managers, making it impossible for the team to perform a key function or task.” Coined by Paul F. Levy, former executive director of the Massachusetts Water Resources Authority, in a Harvard Business Review article.

Occam’s Razor: Short version: “when you have two competing theories that make exactly the same predictions, the simpler one is the better.” Sometimes modified to: “The explanation requiring the fewest assumptions is most likely to be correct.” Longer version:  If two competing theories explain a single phenomenon, and they both generally reach the same conclusion, and they are both equally persuasive and convincing, and they both explain the problem or situation satisfactorily, the logician should always pick the less complex one.

Orgel’s Second Rule: “Evolution is cleverer than you are.” A reference to common assumptions which are not backed by evidence.

The Overton Window: “a model for understanding how ideas in society change over time and influence politics. The core concept is that politicians… generally only pursue policies that are widely accepted throughout society as legitimate policy options. These policies lie inside the Overton Window. Other policy ideas exist, but politicians risk losing popular support if they champion these ideas. These policies lie outside the Overton Window.”

Parkinson’s Law of Triviality aka The Bike Shed Effect: “The amount of time spent discussing an issue in an organization is inversely correlated to its actual importance in the scheme of things. Major, complex issues get the least discussion while simple, minor ones get the most discussion.” Not to be confused with Parkinson’s Law: “Work expands so as to fill the time available for its completion.”

The Peter Principle: “In a hierarchy, everyone tends to rise to his level of incompetence.”

Pfeffer’s Law: “Instead of being interested in what is new, we ought to be interested in what is true.” — page 29, Hard Facts, Dangerous Half-Truths And Total Nonsense by Jeffrey Pfeffer and Robert I. Sutton

Poe’s Law: “Without a clear indication of the author’s intent, it is difficult or impossible to tell the difference between an expression of sincere extremism and a parody of extremism.”

Pournelle’s Iron Law of Bureaucracy: Organizations are made of two types of people: “First, there will be those who are devoted to the goals of the organization… Secondly, there will be those dedicated to the organization itself. The Iron Law states that in every case the second group will gain and keep control of the organization. It will write the rules, and control promotions within the organization.”

Pratfall Effect: a psychological phenomenon whereby competent people appear more likeable and attractive when they make a mistake than when they are perfect.

Pygmalion Effect: “the phenomenon whereby others’ expectations of a target person affect the target person’s performance. The effect is named after the Greek myth of Pygmalion, a sculptor who fell in love with a statue he had carved.” Also known as the Rosenthal Effect. “A corollary of the Pygmalion effect is the Golem Effect, in which low expectations lead to a decrease in performance; both effects are forms of self-fulfilling prophecy.”

Righteousness Fallacy:  “assuming that just because a person’s intentions are good, they have the truth or facts on their side.” Also known as the Fallacy of Good Intentions.

The Ringelmann Effect: “people’s efforts quickly diminish as team size increases.”

Risk Homeostasis: “If people subjectively perceive that the level of risk is low then they modify their behavior to increase their exposure to risk. Conversely, if they perceive a higher than acceptable risk they will compensate by exercising greater caution.” See also the Peltzman Effect:  “In a controversial 1975 research paper… [economist Sam] Peltzman argued that highway safety regulations were not reducing highway deaths… due to risk compensation, where drivers who felt safer made riskier choices that canceled out the safety benefits.”

Searls’ Law: “Logic and reason sit on the mental board of directors, but emotions cast the deciding votes.”

Second Level Thinking: “First level thinking is simplistic and superficial… The second-level thinker takes a great many things into account: What is the range of likely future outcomes? Which outcome do I think will occur? What’s the probability I’m right? What does the consensus think? How does my expectation differ from the consensus?”

Shirky Principle: “Institutions will try to preserve the problem to which they are the solution.”

Simpson’s Paradox: “Every statistical relationship between two variables X and Y has the potential to be reversed when we include a third variable Z into the analysis.”
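The reversal is easy to demonstrate with the well-known kidney-stone treatment data (Charig et al., 1986), the standard textbook example of the paradox. Treatment A has the better success rate for both small and large stones, yet the worse rate overall, because A was disproportionately assigned the harder large-stone cases:

```python
# (treatment, stone size) -> (successes, total patients)
data = {
    ("A", "small"): (81, 87),
    ("A", "large"): (192, 263),
    ("B", "small"): (234, 270),
    ("B", "large"): (55, 80),
}

def rate(*groups):
    """Pooled success rate across one or more (successes, total) pairs."""
    successes = sum(s for s, _ in groups)
    total = sum(t for _, t in groups)
    return successes / total

# Within each stratum, A beats B...
a_small, b_small = rate(data[("A", "small")]), rate(data[("B", "small")])
a_large, b_large = rate(data[("A", "large")]), rate(data[("B", "large")])

# ...but pooled across strata, B beats A: the relationship reverses.
a_overall = rate(data[("A", "small")], data[("A", "large")])  # 273/350
b_overall = rate(data[("B", "small")], data[("B", "large")])  # 289/350
```

Here the third variable Z is stone size: it influences both which treatment was chosen and the chance of success, and ignoring it flips the apparent ranking of the treatments.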

Streisand Effect: “the unintended consequence of further publicizing information by trying to have it censored.”

Stendhal Syndrome: refers to a collection of intense physical and mental symptoms a person may experience during or after viewing a work of art.

Streetlight Effect: “when people only search for something where it is easiest to look.” — “Often cited in science circles, when researchers are cautioned not to pursue their inquiries only in clear, visible areas of study, but to look to hidden, unexplored places for the truth.”

Sturgeon’s Law: 90% of everything is crap.

Sutton’s Law: “If you think you had a new idea, you are probably wrong. Someone else already had it. And this isn’t my idea either; I stole it from someone else!” — page 29, Hard Facts, Dangerous Half-Truths And Total Nonsense by Jeffrey Pfeffer and Robert I. Sutton

Troxler’s Fading: an optical illusion whereby “an unchanging stimulus away from the fixation point will fade away and disappear.”

Twyman’s Law: “The more unusual or interesting the data, the more likely they are to have been the result of an error of one kind or another.”

Useful Idiot: “a naive or credulous person who can be manipulated or exploited to advance a cause or political agenda.”

Von Restorff Effect: a bias for remembering the unusual, also known as the Isolation Effect. “When multiple homogeneous stimuli are presented, the stimulus that differs from the rest is more likely to be remembered.”

Weiler’s Law: Nothing is impossible to the man who doesn’t have to do it himself.

Woods’ Law of Fluency: Expertise hides effort. Skilled people make it look easy. Or as David Woods put it: “Well-adapted work occurs with a facility that belies the difficulty of the demands resolved and the dilemmas balanced.” See also sprezzatura (Italian): “studied nonchalance: graceful conduct or performance without apparent effort.”

Woozle Effect: An article makes a claim without evidence, is then cited by another, which is cited by another, and so on, until the range of citations creates the impression that the claim has evidence, when really all articles are citing the same uncorroborated source. JP Castlin calls this Onion Reasoning.

Zeigarnik Effect: “when the brain more readily recalls an interrupted task than a completed one.”


The title of this page is a reference to The Hitchhiker’s Guide to the Galaxy by Douglas Adams, originally a BBC radio comedy, later a book and a feature film. In the story, the answer to the ultimate question of life, the universe, and everything is 42. But the answer is meaningless because it takes the computer many years to calculate this answer, and nobody remembers what the question was.