The Human Operating Manual

Heuristics Basics

Cognitive biases:

  • Selective perception
  • Confirmation bias
  • False-consensus effect
  • Cognitive inertia (the tendency to cling to one’s existing way of thinking and resist other perspectives)
  • Conservatism bias (revising beliefs too little when new evidence arrives)
  • Information bias
  • Recency illusion
  • Clustering illusion
  • Bandwagon effect
  • Bias blind spot (failing to see one’s own biases)
  • Anchoring (attachment to first source of information)
  • Cognitive ease/strain
    • Susceptibility to believing claims expressed in simpler words
  • Priming (earlier exposure to a cue steers how we respond to related information later)
  • Intuitive heuristics (answering an easier question in place of a difficult one)

System-1 thinking: Our rapid source of ‘gut feelings’, intuitions, wordless and ‘black box’ thinking: the domain of ‘Aha!’ moments.
System-2 thinking: Slower, more deliberate verbal processes of calculation and deduction.

Heuristics belong especially to System-1 thinking, while System-2 thinking uses more conscious and manipulable ‘mental models’.

Hunter Gatherer Notes

Why we struggle to make logical decisions

Our clean, square, and climate-controlled environments may be impairing our ability to judge certain visual input, such as recognizing that two lines of identical length capped with opposing arrowheads (the Müller-Lyer illusion) are in fact the same length. Alternatively, we can think of this deficit as an adaptation to our environment rather than a loss. This unnatural symmetry has restructured our WEIRD brains.

Lactose is apparently a functional substitute for vitamin D in promoting the uptake of calcium; vitamin D is scarce near the poles, where sunlight is limited. Among desert peoples, being able to digest milk allowed them to avoid dehydration. Lactase persistence was born of particular environmental conditions, and it eventually moved onto a genetic layer.

The false nature-versus-nurture dichotomy is disruptive, as it interferes with a more nuanced understanding of what we are and the evolutionary forces that have brought us here. The change in susceptibility to optical illusions seen in WEIRD countries is no less evolutionary than the change in ability to digest dairy in European and Bedouin peoples. The latter has a genetic component, and we have no reason to think that the former does. Yet they are both equally evolutionary.

Carpentered corners create greater susceptibility to certain optical illusions. Overreliance on chairs creates all manner of negative health outcomes. What, then, might deodorants and perfumes have done to our ability to smell the signals emitted by our bodies? What might clocks have done to our sense of time? What have airplanes done to our sense of space, or the internet to our sense of competence? What have maps done to our sense of direction, or schools to our sense of family?

Adaptation and Chesterton’s Fence

Three-part Test of Adaptation

If a trait

  • 1. is complex,
  • 2. has energetic or material costs, which vary between individuals, and
  • 3. has persistence over evolutionary time,

then it is presumed to be an adaptation.

This model can produce false negatives, but not false positives: passing the test is sufficient, but not necessary, evidence that something is an adaptation. There will always be exceptions the test misses, such as the loss of pigment in a polar bear’s fur or the loss of hair on a naked mole rat, where the adaptive change is a matter of energy cost saving.

Let’s try the test on the human appendix.

  • The appendix, which is found in a smattering of mammals, including some primates, rodents, and rabbits, is an outpouching from the large intestine, and it harbors intestinal microorganisms with which we are in a mutualistic relationship. From us, those intestinal flora get room and board; from them, we get help repelling infectious disease, digesting our food, and developing our immune systems.
  • Furthermore, the appendix is not made of the same material as the surrounding gut—it contains immune tissue.
  • Is it complex? Check. It also takes energetic and physical resources to grow and maintain, and has variation in size and capacity both between individuals and between species (check). Finally, it has a history, in mammals, that is more than fifty million years old (check).
  • The human appendix, therefore, is presumably an adaptation. We just don’t know what it is an adaptation for.
  • Possibly to repopulate the gut with mutualistic flora after a bout of gastrointestinal illness and diarrhea has purged it. In the non-WEIRD world, diarrhea is common and a significant cause of mortality. Appendicitis, on the other hand, is almost unknown in those non-WEIRD countries; it is a disorder of the WEIRD world, just like many allergies and autoimmune disorders.
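
As a compact way to restate the logic, here is a minimal sketch in Python of the three-part test applied to the appendix (the field names and helper are illustrative, not from the source):

```python
from dataclasses import dataclass

@dataclass
class Trait:
    name: str
    is_complex: bool              # 1. the trait is complex
    costly_and_variable: bool     # 2. energetic/material costs that vary between individuals
    persists_over_evo_time: bool  # 3. persistence over evolutionary time

def presumed_adaptation(trait: Trait) -> bool:
    """Passing all three criteria is sufficient, but not necessary, evidence of adaptation."""
    return trait.is_complex and trait.costly_and_variable and trait.persists_over_evo_time

appendix = Trait("human appendix", is_complex=True,
                 costly_and_variable=True, persists_over_evo_time=True)
print(presumed_adaptation(appendix))  # True, though the test says nothing about what it is *for*
```

A False result would not rule out adaptation (false negatives are allowed); only a True carries evidential weight.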

The hygiene hypothesis posits that because we live in ever-cleaner surroundings, and are therefore exposed to ever fewer microorganisms, our immune systems are inadequately prepared, and so develop regulatory problems, such as allergies, autoimmune disorders, and perhaps even some cancers. Our immune systems are not functioning as they evolved to do because we have cleansed our environments too thoroughly.

If we can’t see the use of something, we should not clear it away until we understand what its function is. This is Chesterton’s fence.

Trade-offs

Broadly speaking, there are two types of trade-offs:

  • Allocation trade-offs: Because many things in biology are zero-sum (resources are finite), something has got to give.
  • Design-constraint trade-offs: These are insensitive to supplementation, meaning you can’t just add more of something to solve the problem. A bat, for example, can specialize in flying fast, specialize in flying with agility, or be a generalist at both; it cannot be best at both at once.

Humans have avoided this trap by building outside of ourselves. We are a broadly generalist species, with the capacity for individuals—and cultures—to go deep and specialize in myriad contexts and skill sets.

But for all of our cleverness, we can’t evade all trade-offs. Presuming that we can is one mistake of Cornucopianism, which imagines a world so full of both resource and human ingenuity that, magically, trade-offs no longer rule. Related to Cornucopianism, or perhaps fueling it, is the fact that the Sucker’s Folly can create the illusion that we have conquered trade-offs by blinding us with the richness and opulence of our short-term gains. This is a mirage. The trade-offs are still there, and the cost for all that wealth will be paid, either by those who live elsewhere or by our descendants.

Trade-offs are unavoidable, but this has a remarkable upside: it drives the evolution of diversity.

  • Photosynthesis, the process by which plants convert sunlight into sugar, occurs in the vast majority of plants in a form known as C3. C3 works best under conditions that are easy for plants—moderate temperatures and sunlight, and ample water. Because C3 photosynthesis requires that the pores on the leaves—the stomata, which allow intake of carbon dioxide—be open at the same time that sunlight is fueling photosynthesis, C3 photosynthesis comes at the cost of substantial water loss through the stomata. C3 plants, therefore, don’t do well where water is limited.
  • As plants began moving into more marginal environments, such as deserts, C3 photosynthesis posed a particular problem, and two new forms of photosynthesis evolved. One of them is CAM photosynthesis, which allows plants to separate in time when they open their stomata to take in carbon dioxide from when sunlight is fueling their photosynthesis. Having their stomata open at night, when temperatures and therefore evaporative loss are lower, allows CAM plants, like cacti and orchids, to conserve water.
  • CAM is more metabolically expensive to accomplish than C3 photosynthesis. But, in environments where sunlight is plentiful but water is not, CAM wins hands down against C3.
  • As an organism decreases its surface area to volume ratio, becoming ever more sphere-like, the amount of water lost from its surface is reduced. More spherical cacti lose less water than long and lean cacti because they have less surface area relative to their volume from which to lose water.
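
A rough back-of-the-envelope comparison (a Python sketch with made-up dimensions, purely to illustrate the geometry) shows how much more surface a columnar shape exposes than a sphere of the same volume:

```python
import math

def sphere_sa_to_v(volume):
    """Surface-area-to-volume ratio of a sphere with the given volume."""
    r = (3 * volume / (4 * math.pi)) ** (1 / 3)
    return (4 * math.pi * r**2) / volume

def cylinder_sa_to_v(volume, radius):
    """Surface-area-to-volume ratio of a closed cylinder with the given volume and radius."""
    h = volume / (math.pi * radius**2)
    area = 2 * math.pi * radius * h + 2 * math.pi * radius**2
    return area / volume

V = 10_000  # cm^3: an arbitrary, cactus-sized volume
print(f"spherical cactus: SA:V = {sphere_sa_to_v(V):.3f} per cm")
print(f"columnar cactus:  SA:V = {cylinder_sa_to_v(V, radius=5):.3f} per cm")
```

At equal volume, the tall column carries nearly twice as much surface per unit volume as the sphere, which is exactly the evaporative penalty that more spherical cacti avoid.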

Everyday Costs and Pleasures

We moderns have a hard time imagining the risks worth taking to find more food, the lengths people might reasonably go to protect what they have, and the value of technological innovations that allow people to stretch the food they have already acquired.

While we tend to think that the goal of cooking is to make food taste better, many of the world’s culinary traditions have more practical goals—detoxifying foods, amplifying their nutritional value, and protecting them from microbial competitors as we carry them across space or preserve them over time. We salt and smoke meats to ensure that microorganisms that attempt to steal them will die of dehydration. We make fruit preserves with high concentrations of sugar for much the same reason. We pasteurize and freeze perishable vegetables to kill the microbes already on them and to exclude all newcomers.

Milk evolved to nourish babies straight from their mothers’ mammary glands. As such, milk is full of nutrients. But because milk is meant to be consumed immediately, with little or no contact with the outside world, milk has no defense against environmental bacteria, and we moderns must go to extreme lengths—pasteurization, hermetic sealing followed by refrigeration—just to preserve milk for a week or two. Clearly, an ancestor who needed to preserve milk over a long and unproductive winter would have needed a better solution.

  • By rotting milk carefully, using specially cultivated bacteria and fungi that are not pathogenic to humans, milk can be preserved indefinitely. Cheese is such an elegant solution to the problem that, once made, even a block of cheese that is colonized by bad bacteria on the outside can have a thin layer of its surface removed to reveal the fresh, untainted cheese beneath.
  • The catch is that humans are programmed to be repulsed by the smell of spoiled milk because, in general, it is a bad idea to consume any substance that has been overrun by microbes.
  • We could tell a similar story about “thousand-year-old eggs,” sauerkraut, kimchi, or myriad other carefully preserved foodstuffs.

We are all born with basic rules of thumb about what we should and shouldn’t eat. A peach smells good. A clam that has been sitting in the sun smells bad. Grilled meat smells good. Carrion smells bad. These rules are an initial guess at the net value of a potential food, but if one stops there, then a lot of nutritious, edible things will be missed.

  • There has evolved, therefore, a secondary system that allows us to remap foods according to empirical information that may be picked up from kin (via culture), or perhaps discovered in hunger-driven desperation (via consciousness). We are constantly remapping foods based on their actual value, rather than on our initial reactions. We may acquire a taste for coffee because it stimulates us, and for beer because it carries the nutrition of bread without the short shelf life.

Our long-evolved warning system—if it smells bad, be wary—is unreliable against modern hazards in two ways: (1) many toxic solvents actually smell good to some people, and (2) merely smelling them can be enough to cause physiological harm.

  • Some truly toxic and otherwise dangerous substances encountered in the modern world have no detectable smell at all. Natural gas and propane have no smell we can detect, and each can concentrate to the point that the tiniest spark creates a massive explosion.
  • Before propane and natural gas are piped into your home, or delivered to a tank outside of it, they have tert-butyl mercaptan added to them. This compound gives these otherwise stealthy gases a distinctive sulfurous smell—like dirty socks or rotten cabbage—that we easily recognize and, with guidance, now find alarming.

Our detectors for CO2 are so ancient and deeply wired that even people with brain damage to the amygdala, such that they never panic under other fear-inducing circumstances, find themselves triggered into panic by high concentrations of CO2. By comparison with CO2, carbon monoxide (CO) is extremely dangerous: it binds to hemoglobin, displaces oxygen, and brings on a quiet sleep from which people do not wake. Why, then, do we have an internal detector for CO2, which is dangerous at high concentrations but nontoxic, yet no detector for carbon monoxide, a deadly toxin?

  • A detector that causes an animal to become antsy, anxious, and in need of going elsewhere as their cave fills with CO2 is essential equipment. While it would be great to have a similar detector for carbon monoxide, that need is primarily modern, a consequence of industrial combustion. There is no reason to think that a CO detector would have been harder for natural selection to create, but the value of it is simply too recent to be in our hardware yet.

Smell is no longer a sufficient early warning system for hazards, because detection and harm are now simultaneous in many cases. We face novel levels of novelty, and selection simply can’t keep up.

The Corrective Lens

  • Become skeptical of novel solutions to ancient problems, especially when that novelty will be difficult to reverse if you change your mind later. New and audacious technologies—from experimental surgery, to the cessation of human development using hormones, to nuclear fission—may look wonderful and risk-free. But chances are, there are hidden (and not-so-hidden) costs.
  • Recognize the logic of trade-offs, and learn how to work with them. Division of labor allows human populations to beat trade-offs that individuals cannot. And by specializing in different habitats and niches, the human species beats trade-offs that no single population can.
  • Become someone who recognizes patterns about yourself. Hack your habits and your physiology. What stimulates you to eat? To exercise? To check social media? Understanding the patterns in your behaviors gives you a better chance of controlling those behaviors.
  • Look out for Chesterton’s fence and invoke the precautionary principle when messing with ancestral systems. Remember this: “just because you can, doesn’t mean you should.”

21 Lessons Notes

Falsification / Confirmation Bias

What a man wishes, he also believes. Similarly, what we believe is what we choose to see. This is commonly referred to as the confirmation bias. It is a deeply ingrained mental habit, both energy-conserving and comfortable, to look for confirmations of long-held wisdom rather than violations of it. Yet the scientific process – including hypothesis generation, blind testing when needed, and objective statistical rigor – is designed to do precisely the opposite: to hunt for violations, which is why it works so well when followed.

The modern scientific enterprise operates under the principle of falsification: a claim is scientific only if it is stated in such a way that some defined result would prove it false. Pseudo-knowledge and pseudo-science operate and propagate by being unfalsifiable – as with astrology, we are unable to prove them either correct or incorrect, because the conditions under which they would be shown false are never stated.

Tendency to Stereotype 

The tendency to broadly generalize and categorize rather than look for specific nuance. Like the availability heuristic, this is generally a necessary energy-saving shortcut for the brain.

Failure to See False Conjunctions

The same two psychologists, most famously with the Linda Test, showed that students judged vividly described individuals as more likely to fit a predefined category than individuals with broader, more inclusive, but less vivid descriptions, even when the vivid example was a mere subset of the more inclusive set. These specific examples are seen as more representative of the category than the broader but vaguer descriptions, in violation of logic and probability.
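
The logic being violated is the conjunction rule: the probability of two conditions holding together can never exceed the probability of either one alone. A minimal sketch in Python, with numbers invented purely for illustration:

```python
# Invented probabilities for a Linda-style judgment, for illustration only.
p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.60  # P(Linda is a feminist, given she is a bank teller)

# The conjunction "bank teller AND feminist" is a subset of "bank teller",
# so its probability can never be larger.
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

assert p_teller_and_feminist <= p_bank_teller
print(f"{p_bank_teller:.2f} vs {p_teller_and_feminist:.2f}")  # 0.05 vs 0.03
```

Yet subjects routinely rank the vivid conjunction as more probable, because it fits the description better.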

Fear of the unknown, messiah complex

Ignorance

“If you feel overwhelmed and confused by the global predicament, you are on the right track. Global processes have become too complicated for any single person to understand. How then can you know the truth about the world, and avoid falling victim to propaganda and misinformation?”

Democracy was founded on the idea that the voter knows best, free-market capitalism believes the customer is always right, and liberal education teaches students to think for themselves. However, it is a mistake to put too much trust in the “rational individual”. Most human decisions are based on emotions and heuristic shortcuts, which served us well in the Stone Age but serve us poorly in Silicon Valley.

Individuality is a myth. What gave us the edge over other animals was our ability to operate in groups. We rely on the expertise of others to get through almost all of our daily tasks, and yet we believe we know more than we really do. We have the illusion of knowledge, but it works well for us: it is energy efficient not to have to learn everything. Then again, people have extremely strong views on many things that fall outside their expertise and rarely appreciate their own ignorance. Most Americans who hold strong, aggressive beliefs about the Middle East can’t even find the countries in question on a map. The power of groupthink is so hard to dispel that you would sooner anger people with opposing facts than sway them. Most people don’t like to feel stupid, especially if it goes against a group ideology they belong to.

Great power inevitably distorts the truth. With enough power, every problem appears solvable. Anybody who talks with you will have a conscious or unconscious agenda, so you can never have full faith in what they say. Great power thus acts like a black hole that warps everything around it. Revolutionary knowledge rarely makes its way to the center, because the center is built on pre-existing knowledge. The periphery, however, is also filled with conspiracies and superstitious nonsense, which makes it very hard to distinguish between great ideas and great loads of shit. Wading through it all to find the gems takes more time than a person of power has to spare.

Pragmatic Thinking Notes

Anchoring: Just seeing a number will affect how you predict or decide some quantity. By offering an example, you can prime somebody’s perception of what something is worth.

Fundamental Attribution Error: We tend to ascribe other people’s behavior to their personality, instead of looking at the situation and the context in which their behavior occurs. We might excuse our own actions more easily. In reality, context is everything.

Self-serving Bias: The tendency to believe that if the project is a success, I’m responsible, but if it fails, I’m not.

Need for Closure: We are not comfortable with doubt and uncertainty – so much so that we’ll go to great lengths to resolve open issues prematurely. Forcing a resolution early often just masks the problem rather than solving it.

Confirmation Bias: We all look for choice facts that fit our own preconceptions and theories.

Exposure Effect: We tend to prefer things because they are familiar.

Hawthorne Effect: People change their behavior when they know they are being studied. Discipline is high and the excitement of something new fuels the effort, but eventually it wears off.

False Memory: It’s easy for your brain to confuse imagined events with real memories. We’re susceptible to the power of suggestion. Memory may be rewritten and changed with age, experience, worldview, focus, etc.

Symbolic Reduction Fallacy: Quick symbolism to represent something complicated, which loses the nuances and truth.

Nominal Fallacy: The belief that labeling a thing means you can explain it or understand it. But a label is just that; naming something alone does not offer any useful understanding.

How to overcome your biases:

  • Do not discount unobserved or rare phenomena as impossible. Like discovering the existence of a black swan. Watch the outliers: “rarely” doesn’t mean “never”.
  • Defer closure. Don’t fixate on a decision prematurely because you will reduce your options, perhaps to the point of eliminating the successful choice. Be comfortable with uncertainty.
  • You don’t remember very well. Memory is unreliable and old memories change over time. The palest ink is better than the best memory.

Anything in the world when you’re born is normal and ordinary and a part of how the world works. Anything invented between the ages of 15 and 35 is new and exciting and revolutionary, and you can probably get a career in it. Anything invented after you’re 35 is against the natural order of things.

The biases that drive you change over time and will be different across generations. Some folks value job stability even at the expense of abuse from their boss, whereas others will leave at the slightest perceived offense. We tend not to wonder why we value the things we do. Those values could be instilled by parents, peers, or role models. You are a product of your times. You and the rest of your cohort are united by shared memories, common habits, and popular styles, as well as by age and station in life at the time. Examples are:

  • Risk taking vs. risk aversion
  • Individualism vs. teamwork
  • Stability vs. freedom
  • Family vs. work

Four generational archetypes and their dominant characteristics:

  • Prophet: Vision, values
  • Nomad: Liberty, survival, honor
  • Hero: Community, affluence
  • Artist: Pluralism, expertise, due process

Archetypes produce opposing archetypes in the following generation. Examples: Nomad -> Gen X; Prophet -> Boomers; Artist -> the Silent Generation (b. 1925) and the Homeland Generation; Hero -> the G.I. Generation (b. 1901–24) and the Millennials.

Hedge your bets by embracing diversity. This prevents being pigeonholed and falling prey to your biases.

Myers Briggs Type Indicator:

  • Extravert (E) vs. Introvert (I): Inward or outward orientation. E types are energized by people and socializing; introverts are territorial and need private mental and environmental space. About 75% of the population are E.
  • Sensing (S) vs. Intuition (N): How you obtain information. S types emphasize practicality and facts; N types are imaginative, appreciate metaphor, and are innovative. N types may skip off to a new activity before the first is over; S types see this as flighty, while N types see S types as plodding. About 75% are S.
  • Thinking (T) vs. Feeling (F): How you make decisions. T types make decisions based on rules; F types weigh the personal and emotional impact in addition to the applicable rules. The T view may seem cold-blooded to most F folks, and T types see F types as bleeding hearts. Roughly 50:50, with more women as F.
  • Judging (J) vs. Perceiving (P): Whether decisions are closed or open-ended: judge quickly, or keep perceiving. J types are uneasy until they make a decision; P types are uneasy once they have made one. Roughly 50:50.

Lizard Logic. How to act like a lizard:

  • Fight, flight, or fright. Being fully aroused immediately. Be ready to start swinging or run. If it is really bad just freeze with fear.
  • Get it now. Everything is immediate and automatic. Don’t think or plan; just follow your impulses and focus on excitement. Use sports metaphors A LOT. Answer emails or surf the web instead of working.
  • Be dominant. Claw and scratch your way to leader of the pack so you can abuse everyone you know.
  • Defend the territory. Never share info or space. Mark your territory and protect your interests. If someone does something without you, cry foul and demand to know why you weren’t included.
  • If it hurts, hiss. Don’t fix the problem; spend your energy pinning the blame on someone else instead. Let everyone know it is not fair.
  • Like me = good; not like me = bad. Your side is good and the other evil. Explain this to your teammates often.

Notice how long it takes you to get over your initial reaction to a perceived threat. How does your reaction change once you “think about it”?

Act on that impulse but not immediately. Plan for it; schedule it. Does it make sense later?

Write a new movie. If you’re troubled by a given film that keeps replaying in your head, sit down and craft a new one with a happy ending.

Smile. There’s evidence to suggest smiling is as effective as antidepressant medication.

“The fact that we live at the bottom of a deep gravity well, on the surface of a gas-covered planet going around a nuclear fireball 90 million miles away and think this to be normal is obviously some indication of how skewed our perspective tends to be.” Douglas Adams, The Salmon of Doubt

Trust intuition, but verify. If your way intuitively feels better, great, but make sure a cognitive bias isn’t at play first. Get some feedback, create a prototype, run some tests, and chart some benchmarks. Do whatever you need to do to prove your idea is good, because your intuition may be wrong.

If you are dead solid convinced of something, ask yourself why:

  • How do you know?
  • Says who?
  • How specifically?
  • How does what I’m doing cause you to…?
  • Compared to what or whom?
  • Does it always happen? Can you think of an exception?
  • What would happen if you did (or didn’t)?
  • What stops you from…?

Is there anything you can actually measure? Get hard numbers on? Any statistics? What happens when you talk this over with a colleague or someone with a very different perspective from yours? Do they passively agree? Is that a danger sign? Do they violently oppose the idea? Does that give it credibility or not?

If you think you’ve defined something, try to define its opposite. This can help you avoid the nominal fallacy: if all you have is a label, it is hard to pin down its opposite in any detail. Contrast a behavior, an observation, a theory, etc. in detail. This gives you a deeper look at your definition with a more critical and attentive eye. Expectations color reality, which is why certain faux news channels use sensationalism.

When in conflict, consider basic personality types, generational values, your own biases, others’ biases, the context, and the environment. Examine your own position carefully.
