The Secrets of Life, the Universe, and Everything
Betteridge’s Law: “Any headline that ends in a question mark can be answered by the word no.”
Brandolini’s Law: “The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.”
Campbell’s Law: “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.” Similar to Goodhart’s Law: a metric that becomes a target ceases to be a useful metric.
Chesterton’s Fence: “Do not remove a fence until you know why it was put up in the first place… Chesterton’s Fence is not an admonishment of anyone who tries to make improvements; it is a call to be aware of second-order thinking before intervening… Unless we know why someone made a decision, we can’t safely change it or conclude that they were wrong.”
Cobra Effect: The British Colonial Government came up with a plan to control the population of venomous cobras in Delhi. They offered a bounty for every dead cobra turned in. While many bounties were paid, the program was ineffective at controlling the number of cobras in the city. “Under the new policy, cobras provided a rather stable source of income. In addition, it was much easier to kill captive cobras than to hunt them in the city. So the snake catchers increasingly abandoned their search for wild cobras, and concentrated on their breeding programs.” Once officials discovered what was going on, they stopped the bounty program. “As a final act the breeders, now stuck with nests of worthless cobras, simply released them into the city, making the problem even worse than before! The lesson is that simplistic policies can come back to bite you.” See also Systems Thinking.
Cunningham’s Law: “the best way to get the right answer on the internet is not to ask a question; it’s to post the wrong answer.”
Dunning–Kruger Effect: “a cognitive bias wherein people of low ability have illusory superiority, mistakenly assessing their cognitive ability as greater than it is.” See also meta-ignorance (ignorance of one’s ignorance).
Fortune Cookie Effect, aka the Forer Effect, aka the Barnum Effect: “describes when individuals believe that generic information, which could apply to anyone, applies specifically to themselves… This bias takes advantage of our gullibility and well-meaning nature… Commonly seen in marketing and engagement campaigns, elements of the Barnum effect provide customers with the impression of product customization.”
Fredkin’s Paradox: “The more equally attractive two alternatives seem, the harder it can be to choose between them—no matter that, to the same degree, the choice can only matter less.” Similar to Buridan’s Ass: “a hypothetical situation wherein a donkey that is equally hungry and thirsty is placed precisely midway between a stack of hay and a pail of water. Since the paradox assumes the ass will always go to whichever is closer, it dies of both hunger and thirst since it cannot make any rational decision between the hay and water.”
Fundamental Attribution Error: “the tendency people have to overemphasize personal characteristics and ignore situational factors in judging others’ behavior. Because of the fundamental attribution error, we tend to believe that others do bad things because they are bad people. We’re inclined to ignore situational factors that might have played a role.”
Gell-Mann Amnesia Effect: “I believe everything the media tells me except for anything for which I have direct personal knowledge, which they always get wrong.”
Godwin’s Law: “as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1; that is, if an online discussion (regardless of topic or scope) goes on long enough, sooner or later someone will compare someone or something to Adolf Hitler or his deeds, the point at which effectively the discussion or thread often ends.”
Hanlon’s Razor: “Never attribute to malice that which is adequately explained by stupidity.”
Hick’s Law, aka Hick-Hyman Law: “the more choices a person is presented with, the longer the person will take to reach a decision.”
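Hick's Law has a standard quantitative form, T = b · log₂(n + 1), where n is the number of equally likely choices and b is a constant fitted from experimental data. A minimal sketch of the relationship (the value of b here is an illustrative assumption, not a measured one):

```python
import math

# Hick's Law: decision time grows logarithmically with the number of
# equally likely choices: T = b * log2(n + 1). The constant b is
# empirical; 0.2 s is an assumed placeholder, not a measured value.
def decision_time(n_choices: int, b: float = 0.2) -> float:
    """Estimated decision time in seconds for n equally likely choices."""
    return b * math.log2(n_choices + 1)

# Doubling the number of options adds time, but each doubling adds
# less and less -- the curve is logarithmic, not linear.
for n in (2, 4, 8, 16):
    print(f"{n:2d} choices -> {decision_time(n):.2f} s")
```

The practical reading for designers: trimming a menu from 16 options to 8 saves less time than trimming it from 4 to 2, but every extra option still costs something.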
Illusory Truth Effect: “If repeated enough times, the information may be perceived to be true even if sources are not credible.”
Jeane Dixon Effect: “whereby the relatively few correct predictions are heralded and therefore widely remembered, while the much more numerous incorrect predictions are conveniently forgotten or deemphasized.”
Lindy Effect: “the observed lifespan of a non-perishable item like a business is most likely to be at its half-life. So, if a business is 100 years old, [one] should expect it to be around for another 100 years. And a business that has been around for 10 years should be around for another 10 years. Under the Effect, the mortality of a business actually decreases with time.”
Lister’s Law: “People under time pressure don’t think faster.”
Lucretius Problem: “a mental defect where we assume the worst-case event that has happened is the worst-case event that can happen.”
Mandela Effect: “a situation in which a large mass of people believes that an event occurred when it did not.”
Murphy’s Law: “Anything that can go wrong will go wrong.”
Nut Island Effect: “a phenomenon in organizations whereby teams of talented employees become isolated from managers, making it impossible for the team to perform a key function or task.” Coined by Paul F. Levy, former executive director of the Massachusetts Water Resources Authority, in a Harvard Business Review article.
Occam’s Razor: Short version: “when you have two competing theories that make exactly the same predictions, the simpler one is the better.” Sometimes modified to: “The explanation requiring the fewest assumptions is most likely to be correct.” Longer version: If two competing theories explain a single phenomenon, and they both generally reach the same conclusion, and they are both equally persuasive and convincing, and they both explain the problem or situation satisfactorily, the logician should always pick the less complex one.
Orgel’s Second Rule: “Evolution is cleverer than you are.” A caution against dismissing an evolved solution just because you can’t see how or why it works.
Parkinson’s Law of Triviality aka The Bike Shed Effect: “The amount of time spent discussing an issue in an organization is inversely correlated to its actual importance in the scheme of things. Major, complex issues get the least discussion while simple, minor ones get the most discussion.” Not to be confused with Parkinson’s Law: “Work expands so as to fill the time available for its completion.”
Poe’s Law: “Without a clear indication of the author’s intent, it is difficult or impossible to tell the difference between an expression of sincere extremism and a parody of extremism.”
Pratfall Effect: the phenomenon whereby competent people appear more likeable and attractive when they make a mistake than when they appear flawless.
Pygmalion Effect: “the phenomenon whereby others’ expectations of a target person affect the target person’s performance. The effect is named after the Greek myth of Pygmalion, a sculptor who fell in love with a statue he had carved.” Also known as the Rosenthal Effect. “A corollary of the Pygmalion effect is the Golem Effect, in which low expectations lead to a decrease in performance; both effects are forms of self-fulfilling prophecy.”
Righteousness Fallacy: “assuming that just because a person’s intentions are good, they have the truth or facts on their side.” Also known as the Fallacy of Good Intentions.
Risk Homeostasis: “If people subjectively perceive that the level of risk is low then they modify their behavior to increase their exposure to risk. Conversely, if they perceive a higher than acceptable risk they will compensate by exercising greater caution.” See also the Peltzman Effect: “In a controversial 1975 research paper… [economist Sam] Peltzman argued that highway safety regulations were not reducing highway deaths… due to risk compensation, where drivers who felt safer made riskier choices that canceled out the safety benefits.”
Searls’ Law: “Logic and reason sit on the mental board of directors, but emotions cast the deciding votes.”
Second Level Thinking: “First level thinking is simplistic and superficial… The second-level thinker takes a great many things into account: What is the range of likely future outcomes? Which outcome do I think will occur? What’s the probability I’m right? What does the consensus think? How does my expectation differ from the consensus?”
Shirky Principle: “Institutions will try to preserve the problem to which they are the solution.”
Streisand Effect: “the unintended consequence of further publicizing information by trying to have it censored.”
Sturgeon’s Law: “Ninety percent of everything is crap.”
Twyman’s Law: “The more unusual or interesting the data, the more likely they are to have been the result of an error of one kind or another.”
Useful Idiot: “a naive or credulous person who can be manipulated or exploited to advance a cause or political agenda.”
Von Restorff Effect: a bias for remembering the unusual, also known as the Isolation Effect. “When multiple homogeneous stimuli are presented, the stimulus that differs from the rest is more likely to be remembered.”
Weiler’s Law: Nothing is impossible to the man who doesn’t have to do it himself.
Woods’ Law of Fluency: Expertise hides effort. Skilled people make it look easy. Or as David Woods put it: “Well-adapted work occurs with a facility that belies the difficulty of the demands resolved and the dilemmas balanced.”
Woozle Effect: An article makes a claim without evidence, is then cited by another, which is cited by another, and so on, until the range of citations creates the impression that the claim has evidence, when really all articles are citing the same uncorroborated source. JP Castlin calls this Onion Reasoning.
The title of this page is a reference to The Hitchhiker’s Guide to the Galaxy by Douglas Adams, originally a BBC radio comedy, later a book and a feature film. In the story, the supercomputer Deep Thought spends 7.5 million years calculating the answer to the ultimate question of life, the universe, and everything: 42. But the answer turns out to be meaningless, because nobody knows what the question actually was.