Cognitive biases:
System-1 thinking: Our rapid source of ‘gut feelings’ and intuitions; wordless, ‘black box’ thinking, and the domain of ‘Aha!’ moments.
System-2 thinking: Slower, more deliberate verbal processes of calculation and deduction.
Heuristics belong especially to System-1 thinking, while System-2 thinking uses more conscious and manipulable ‘mental models’.
Why we struggle to make logical decisions
Our clean, square, and climate-controlled environments may be impairing our ability to interpret certain visual input, such as recognizing that two identically sized lines capped with opposing arrowheads (the Müller-Lyer illusion) are in fact the same length. Alternatively, we can think of this deficit as an adaptation to our environment rather than a loss. This unnatural symmetry has restructured our WEIRD brains.
Lactose is apparently a functional substitute for vitamin D, which is scarce near the poles, in promoting the uptake of calcium. Among desert peoples, being able to digest milk allowed them to avoid dehydration. Lactase persistence was born of particular environmental conditions and then moved onto a genetic layer.
The false nature-versus-nurture dichotomy is disruptive because it interferes with a more nuanced understanding of what we are and the evolutionary forces that have brought us here. The change in susceptibility to optical illusions seen in WEIRD countries is no less evolutionary than the change in ability to digest dairy in European and Bedouin peoples. The latter has a genetic component, while we have no reason to think the former does; yet they are both equally evolutionary.
Carpentered corners create greater susceptibility to certain optical illusions. Overreliance on chairs creates all manner of negative health outcomes. What, then, might deodorants and perfumes have done to our ability to smell the signals emitted by our bodies? What might clocks have done to our sense of time? What have airplanes done to our sense of space, or the internet to our sense of competence? What have maps done to our sense of direction, or schools to our sense of family?
Adaptation and Chesterton’s Fence
Three-part Test of Adaptation
If a trait is complex, carries energetic or material costs, and has persisted over evolutionary time, then it is presumed to be an adaptation.
This test can produce false negatives but not false positives: passing it is sufficient, though not necessary, evidence that something is an adaptation. There will always be exceptions, such as the loss of pigment in a polar bear’s fur or the loss of hair on a naked mole rat, but these are due to energy cost savings.
Let’s try the test on the human appendix.
The hygiene hypothesis posits that because we live in ever-cleaner surroundings, and are therefore exposed to ever fewer microorganisms, our immune systems are inadequately prepared, and so develop regulatory problems, such as allergies, autoimmune disorders, and perhaps even some cancers. Our immune systems are not functioning as they evolved to do because we have cleansed our environments too thoroughly.
If we can’t see the use of something, we should not clear it away until we understand its function.
Trade-offs
Broadly speaking, there are two types of trade-offs:
Humans have avoided the trap of choosing between generalism and specialism by building outside of ourselves: we are a broadly generalist species, with the capacity for individuals and cultures to go deep and specialize in myriad contexts and skill sets.
But for all of our cleverness, we can’t evade all trade-offs. Presuming that we can is one mistake of Cornucopianism, which imagines a world so full of both resources and human ingenuity that, magically, trade-offs no longer rule. Related to Cornucopianism, or perhaps fueling it, is the fact that the Sucker’s Folly can create the illusion that we have conquered trade-offs by blinding us with the richness and opulence of our short-term gains. This is a mirage. The trade-offs are still there, and the cost for all that wealth will be paid, either by those who live elsewhere or by our descendants.
Trade-offs are unavoidable, but this has a remarkable upside: it drives the evolution of diversity.
Everyday Costs and Pleasures
We moderns have a hard time imagining the risks worth taking to find more food, the lengths one might reasonably go to protect what one has, and the value of technological innovations that allow people to stretch the food they have already acquired.
While we tend to think that the goal of cooking is to make food taste better, many of the world’s culinary traditions have more practical goals—detoxifying foods, amplifying their nutritional value, and protecting them from microbial competitors as we carry them across space or preserve them over time. We salt and smoke meats so that microorganisms attempting to steal them will die of dehydration. We make fruit preserves with high concentrations of sugar for much the same reason. We pasteurize and freeze perishable vegetables to kill the microbes already on them and to exclude newcomers.
Milk evolved to nourish babies straight from their mothers’ mammary glands. As such, milk is full of nutrients. But because milk is meant to be consumed immediately, with little or no contact with the outside world, milk has no defense against environmental bacteria, and we moderns must go to extreme lengths—pasteurization, hermetic sealing followed by refrigeration—just to preserve milk for a week or two. Clearly, an ancestor who needed to preserve milk over a long and unproductive winter would have needed a better solution.
We are all born with basic rules of thumb about what we should and shouldn’t eat. A peach smells good. A clam that has been sitting in the sun smells bad. Grilled meat smells good. Carrion smells bad. These rules are an initial guess at the net value of a potential food, but if one stops there, then a lot of nutritious, edible things will be missed.
Our long-evolved warning system—if it smells bad, be wary—is unreliable in two ways: (1) many solvents smell good to some people, and (2) smelling them is sufficient to cause physiological harm.
Our detectors for CO2 are so ancient and deeply wired that even people with damage to the amygdala, who never panic under other fear-inducing circumstances, are still triggered into panic by high concentrations of CO2. Carbon monoxide (CO), by comparison, is extremely dangerous: it binds to hemoglobin, displaces oxygen, and brings on a quiet sleep from which people do not wake. Why, then, do we have an internal detector for CO2, which is dangerous at high concentrations but not itself a toxin, yet no detector for carbon monoxide, a deadly toxin?
Smell is no longer a sufficient early warning system for hazards, because detection and harm are now simultaneous in many cases. We face novel levels of novelty, and selection simply can’t keep up.
The Corrective Lens
Falsification / Confirmation Bias
What a man wishes, he also believes. Similarly, what we believe is what we choose to see. This is commonly referred to as confirmation bias. It is a deeply ingrained mental habit, both energy-conserving and comfortable, to look for confirmations of long-held wisdom rather than violations. Yet the scientific process – including hypothesis generation, blind testing when needed, and objective statistical rigor – is designed to do precisely the opposite: to seek out violations, which is why it works so well when followed.
The modern scientific enterprise operates under the principle of falsification: a claim is scientific only if it can be stated in such a way that a certain defined result would prove it false. Pseudo-knowledge and pseudo-science operate and propagate by being unfalsifiable – as with astrology, we are unable to prove them either correct or incorrect, because the conditions under which they would be shown false are never stated.
Tendency to Stereotype
The tendency to broadly generalize and categorize rather than look for specific nuance. Like the availability heuristic, this is generally a necessary energy-saving shortcut for the brain.
Failure to See False Conjunctions
The conjunction fallacy was most famously demonstrated by the Linda Test, in which the same two psychologists (Kahneman and Tversky) showed that students judged vividly described individuals as more likely to fit a predefined category than individuals with broader, more inclusive, but less vivid descriptions, even when the vivid example was a mere subset of the more inclusive set. The specific, vivid examples are seen as more representative of the category than the broader but vaguer descriptions, in violation of logic and probability.
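The rule being violated here is the conjunction rule of probability: for any two statements A and B, P(A and B) ≤ P(A). However representative the vivid details feel, “Linda is a bank teller and is active in the feminist movement” can never be more probable than “Linda is a bank teller” on its own, because the conjunction is a subset of the broader statement.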
Fear of the unknown, messiah complex
Ignorance
“If you feel overwhelmed and confused by the global predicament, you are on the right track. Global processes have become too complicated for any single person to understand. How then can you know the truth about the world, and avoid falling victim to propaganda and misinformation?”
Democracy was founded on the idea that the voter knows best, free-market capitalism believes the customer is always right, and liberal education teaches students to think for themselves. However, it is a mistake to put too much trust in the “rational individual”. Most human decisions are based on emotions and heuristic shortcuts, which served us well in the Stone Age but not in Silicon Valley.
Individuality is a myth. What gave us the edge over other animals was our ability to operate in groups. We rely on the expertise of others to get through almost all of our daily tasks, and yet we believe we know more than we really do. We have the illusion of knowledge, but it works well for us: it is energy efficient not to have to learn everything. Then again, people have extremely strong views on many things that fall outside their expertise and rarely appreciate their own ignorance. Most Americans with strong, aggressive beliefs about the Middle East can’t even find the countries in question on a map. The power of groupthink is so hard to dispel that you will sooner anger people with opposing facts than sway them. Most people don’t like to feel stupid, especially when it means going against the ideology of a group they belong to.
Great power inevitably distorts the truth. With enough power, every problem appears solvable. Anybody who talks with you will have a conscious or unconscious agenda, so you can never have full faith in what they say. Great power thus acts like a black hole that warps everything around it. Revolutionary knowledge rarely makes its way to the center, because the center is built on pre-existing knowledge. The periphery, however, is also filled with conspiracies and superstitious nonsense, which makes it very hard to distinguish the great ideas from the great loads of shit. Wading through it all to find the gems takes time, and a person in power doesn’t have that kind of luxury.
Anchoring: Just seeing a number will affect how you predict or decide some quantity. By offering an example number, you can prime somebody’s perception of what something is worth.
Fundamental Attribution Error: We tend to ascribe other people’s behavior to their personality, instead of looking at the situation and the context in which their behavior occurs. We might excuse our own actions more easily. In reality, context is everything.
Self-serving Bias: The tendency to believe that if the project succeeds, I’m responsible; if it fails, I’m not.
Need for closure: We are not comfortable with doubt and uncertainty – so much so that we’ll go to great lengths to resolve open issues. Resolving a problem prematurely often masks it rather than solving it.
Confirmation Bias: We all look for choice facts that fit our own preconceptions and theories.
Exposure Effect: We tend to prefer things because they are familiar.
Hawthorne Effect: People change their behavior when they know they are being studied. Discipline is high and the excitement of something new fuels the effort; eventually it wears off.
False Memory: It’s easy for your brain to confuse imagined events with real memories. We’re susceptible to the power of suggestion. Memory may be rewritten and changed with age, experience, worldview, focus, etc.
Symbolic Reduction Fallacy: Reducing something complicated to a quick symbol, which loses the nuance and the truth.
Nominal Fallacy: The belief that labeling a thing means you can explain or understand it. A label is just that; naming something alone does not offer any useful understanding.
How to overcome your biases:
Anything that is in the world when you’re born is normal and ordinary and just part of how the world works. Anything invented between when you’re 15 and 35 is new and exciting and revolutionary, and you can probably get a career in it. Anything invented after you’re 35 is against the natural order of things.
The biases that drive you change over time and differ across generations. Some folks value job stability even at the cost of enduring abuse from their boss, whereas others will leave at the slightest perceived offense. We tend not to wonder why we value the things we do; those values could be instilled by parents, peers, or role models. You are a product of your times: you and the rest of your cohort are united by shared memories, common habits, and popular styles, as well as by age and station in life at the time. Examples are:
Four generational archetypes and their dominant characteristics:
Archetypes produce opposing archetypes in the following generation. Nomad -> Gen X; Prophet -> Boomer; Artist -> Silent (1925) and Homeland; Hero -> GI (1901-24) and Millennial.
Hedge your bets by embracing diversity. This prevents you from being pigeonholed and falling prey to your biases.
Myers-Briggs Type Indicator:
Lizard Logic. How to act like a lizard:
Notice how long it takes you to get over your initial reaction to a perceived threat. How does your reaction change once you “think about it”?
Act on that impulse but not immediately. Plan for it; schedule it. Does it make sense later?
Write a new movie. If you’re troubled by a given film that keeps replaying in your head, sit down and craft a new one with a happy ending.
Smile. There’s evidence to suggest smiling is as effective as antidepressant medication.
“The fact that we live at the bottom of a deep gravity well, on the surface of a gas-covered planet going around a nuclear fireball 90 million miles away and think this to be normal is obviously some indication of how skewed our perspective tends to be.” Douglas Adams, The Salmon of Doubt
Trust intuition, but verify. If your way intuitively feels better, great, but first make sure it isn’t a cognitive bias at play. Get some feedback, create a prototype, run some tests, and chart some benchmarks. Do whatever you need to do to prove your idea is good, because your intuition may be wrong.
If you are dead solid convinced of something, ask yourself why. How do you know? Says who? Compared to whom?
Is there anything you can actually measure or get hard numbers on? Any statistics? What happens when you talk this over with a colleague or someone with a very different perspective from yours? Do they passively agree? Is that a danger sign? Do they violently oppose the idea? Does that give it credibility or not?
If you think you’ve defined something, try to define its opposite. This can help you avoid the nominal fallacy: if all you have is a label, it is hard to pin down its opposite in any detail. Contrast a behavior, an observation, a theory, etc., in detail; this gives you a deeper look at your definition with a more critical and attentive eye. Expectations color reality, which is why certain faux news channels rely on sensationalism.
When in conflict, consider basic personality types, generational values, your own biases, others’ biases, the context, and the environment. Examine your own position carefully.