The Human Operating Manual

The History of Science

Hunter-Gatherer Notes

We no longer rely on tight-knit communities; we fail to grasp the concept of “local knowledge” or to develop a deeper understanding of our terrain; and we rely on global systems to feel safe. However, that safety can be a façade. Our packaged food, health-care systems, and economy do less for us than they appear to.

Developments in technology, medicine, and education have accelerated dramatically, changing our geographic, social, and personal environments with them. The positive effects they have brought have made it difficult to see the complex problems that arose with their whirlwind entrance. This change is hyper-novel, and the adaptability that helped us develop these tools and take over the world can no longer keep up with the rate of change.

Science is a method that oscillates between induction and deduction. We observe patterns, propose explanations, and test them to see how well they predict. When we do scientific work correctly, our models predict more than before, assume less, and fit with other models.

As cells begin to function in coordinated ways, becoming organisms made up of distinct tissues, the degree of mystery compounds. The unpredictability jumps again in animals, governed by sophisticated neurological feedbacks that themselves investigate and predict the world, and once again as animals become social and begin to pool their understanding and divide their labor.

First principles are those assumptions that cannot be deduced from any other assumption. They are foundational (like axioms, in math), and so thinking from first principles is a powerful mechanism for deducing truth, and a worthy goal if you are interested in fact over fiction.

Adaptation and Lineage

Fitness is about reproduction and persistence. A successful population can ebb and flow, but extinction is failure.

Whether cultural, genetic, or a mixture of the two, sex roles inherited from a long line of ancestors are biological solutions to evolutionary problems. They are, in short, adaptations that function to facilitate and ensure lineage persistence into the future.

Currently, some people, and even some scientists, are in denial about potential adaptations that appear ugly, refusing to investigate anything that might not be positive in the current cultural context.

Against Reductionism

The modern approach to medicine, practically reductionist, reveals itself clearly in scientism. Friedrich Hayek observed that, too often, the methods and language of science are imitated by institutions and systems not engaged in science, such that the resulting efforts are generally not scientific at all. Not only do we see words like theory and analysis wrapped around distinctly untheoretical and unanalyzed (and often unanalyzable) ideas, but—worse—we see the rise of a kind of fake numeracy, in which anything that can be counted is, and once you have a measurement, you tend to forgo all further analysis.

Once we have a proxy for something, we think we know it. This is particularly true if the proxy is quantifiable, no matter how flawed those numbers might be. Furthermore, once we have a category, we often stop looking outside of the categories for meaning, as our formal system of carrots and sticks exists solely within the categories.

This is the engineer’s approach to what humans are, and it vastly underappreciates how complex and variable we are. Everyone is susceptible to this error: we look for metrics, and once we find one that is both measurable and relevant to the system we are trying to affect, we mistake it for the relevant metric. Think of calories, or psychiatric drugs. We also forget that our bodies are variable and that as long as a system works, the particulars don’t matter (blood vessels can travel different paths to the same destination).

Considering the Risks of Reductionism as We Choose What to Put in Our Bodies

We often mistake an effect (e.g., of an action, a treatment, a molecule) for our understanding of the effect. What a thing does, and what we think (or know) that it does, are not the same thing. Examples include believing that vanillin is the same as vanilla, or that THC or CBD is the same as marijuana. The parts are not the same as the whole.

From fluoridated drinking water to shelf-stable foods with unintended consequences, from the myriad issues with sun exposure, to whether GMOs are safe—we are constantly seduced by reductionist thinking, led astray by the fantasy of simplicity where the truth is complex. Reductionism, particularly with respect to our bodies and minds, is harming us. 

Early in the 20th century, fluoride was discovered to be correlated with fewer cavities. So fluoride was put in many municipal water supplies to decrease tooth decay. The fluoride in drinking water is a by-product of industrial processes, though, not a molecular form that appears in nature or has ever been part of our diet. Furthermore, we find neurotoxicity in children who are exposed to fluoridated drinking water; a correlation between hypothyroidism and fluoridated water; and, in salmon, a loss of the ability to navigate back to their home stream after swimming in fluoridated water. The quest for magic bullets, for simple answers that are universally applicable to all humans in all conditions, is misguided. If it were that easy, selection would almost certainly have found a way. Look for the hidden costs.

Bringing Evolution Back to Medicine

Ernst Mayr, one of the 20th century’s great evolutionary biologists, formalized the distinction between proximate and ultimate levels of explanation. In attempting to tease apart cause and effect in biology, he distinguished two branches within the field, a distinction many scientists themselves may not be aware of.

  • Functional biology, Mayr argues, is concerned with how questions: How does an organ function, or a gene, or a wing? The answers to these are proximate levels of explanation.
  • Evolutionary biology is concerned with why questions: Why does an organ persist, why is a gene in this organism but not that one, why is the swallow’s wing shaped the way that it is? The answers to these are ultimate levels of explanation.

Because how questions—that is, proximate levels of analysis, questions of mechanism—are more easily pointed to, observed, and quantified than the underlying question of why, mechanism has become most of what is studied in science and medicine. How questions also tend to be what are reported, in breathless sound bites, by the media. Too often, these proximate questions are imagined to be the level at which the scientific conversation needs to happen. This serves nobody—not those interested in the study of why, nor those interested in the study of how.

Combine a tendency to engage only proximate questions with a bias toward reductionism, and you end up with medicine that has blinders on. Even the great victories of Western medicine—surgery, antibiotics, and vaccines—have been over-extrapolated, applied in many cases where they shouldn’t be.

Those who break bones often have them put in a cast, which immobilizes the limb and causes the muscle to atrophy. For some situations, a splint is all you need: recovery happens much faster, and the pain subsides sooner without the blunting of excessive pain medication. The pain, heat, and swelling of an injury are communication about your progress. As long as you don’t die of infection or by being eaten by carnivores (our ancestors’ typical problems), your bones should heal. If you need surgical intervention to fix the break, that’s advisable. Just don’t expect a reductionist approach to solve the whole issue. Modern medicine can be lifesaving. That doesn’t mean it can completely replace the body’s healing response.

Whom to Believe in the Era of Reductionism and Hyper-Novelty

Relying on cultural rules and reductionist thinking instead of consciousness is much more energy efficient. However, in this time of hyper-novelty, we are required to be more flexible to avoid being led around. Many people’s faith in authority has been shaken by recent events and by authorities’ wavering demands.

We need a less reductionist approach, while still taking advantage of the incredible solutions we have engineered in the past. We need to use the hammer when a nail is the problem, rather than being forced to use a hammer in all situations.

Why we’re not wired to think scientifically (Peter Attia)

Formal logic arrived with Aristotle 2,500 years ago; the scientific method was pioneered by Francis Bacon 400 years ago. Shortly following the codification of the scientific method—which defined exactly what “good” science meant—the Royal Society of London for Improving Natural Knowledge was formed. So not only did we know what “good” science was, but we had an organization that expanded its application, including peer review, and existed to continually ask the question, “Is this good science?”

While the Old Testament makes references to the earliest clinical trial—observing what happened to those who did or did not partake of the “King’s meat”—the process was codified further by 1025 AD in The Canon of Medicine, and formalized in the 18th century by James Lind, the Scottish physician who discovered, using randomization between groups, the curative properties of oranges and lemons—vitamin C, actually—in treating sailors with scurvy. Hence the expression, “Limey.”
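To make the idea of randomization between groups concrete, here is a minimal Python sketch of random assignment to treatment arms. The sailor names, group sizes, and two-arm design are illustrative only; Lind actually compared six different remedies in pairs.

```python
import random

# Hypothetical crew roster; names and group sizes are illustrative only.
sailors = [f"sailor_{i:02d}" for i in range(12)]

# Random assignment breaks the link between treatment and any
# pre-existing differences (age, severity of scurvy, diet, etc.),
# so differences in outcome can be attributed to the treatment.
random.shuffle(sailors)
citrus_group = sailors[:6]   # receive oranges and lemons
control_group = sailors[6:]  # receive the usual ration

print("citrus: ", citrus_group)
print("control:", control_group)
```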

The concept of statistical significance is barely 100 years old, thanks to Ronald Fisher, the British statistician who popularized the use of the p-value and proposed the limits of chance versus significance.
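To make “chance versus significance” concrete, here is a minimal Python sketch of Fisher’s own “lady tasting tea” illustration: with 8 cups, 4 of which had milk poured first, a perfect score by pure guessing has probability 1/70, which falls below the conventional 0.05 cutoff he popularized.

```python
from math import comb

# Fisher's "lady tasting tea": 8 cups, 4 with milk poured first.
# Under the null hypothesis (pure guessing), each way of picking
# 4 of the 8 cups as "milk first" is equally likely.
total_ways = comb(8, 4)   # 70 possible labelings

# p-value: probability of a result at least this extreme by chance.
# A perfect score is the single most extreme outcome.
p_value = 1 / total_ways  # ~0.014

alpha = 0.05              # Fisher's conventional cutoff
print(f"p-value = {p_value:.3f}")
print("significant" if p_value < alpha else "not significant")
```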

For 2 million years we have been evolving—making decisions, surviving, and interacting—but for only the last 2,500 years (0.125% of that time) have we had “access” to formal logic, and for only 400 years (0.02% of that time) have we had “access” to scientific reason and an understanding of scientific methodologies.

Whatever a person was doing before modern science—however clever it may have been—it wasn’t actually science. And in the same vein, how many people were practicing logical thinking before logic itself was invented? Perhaps some were doing so prior to Aristotle, but it was certainly rare compared to the time following its codification.

Options for problem-solving are limited to the tools available. The arrival of logic was a major tool. So, too, was the arrival of the scientific method, clinical trials, and statistical analyses. Yet for the first 99.98% of our existence on this planet as humans—literally—we had to rely on other options—other tools, if you will—for solving problems and making decisions.

So what were they?

We can make educated guesses. If it’s 3000 BC and your tribemate Ugg never gets sick, all you can do to avoid getting sick is hang out where he hangs out, wear similar colors, drink from the same well—replicate his every move. You are not going to figure out anything from first principles, because that isn’t an option, any more than traveling by jet across the Pacific Ocean was an option. Nothing is an option until it has been invented.

So we’ve had millions of years to evolve and refine the practice of:

Step 1: Identify a positive trait (e.g., access to food, access to mates),

Step 2: Mimic the behaviors of those possessing the trait(s),

Step 3: Repeat.

Yet we’ve had only a minute fraction of that time to learn how to apply formal logic and scientific reason to our decision making and problem solving. In other words, evolution has hardwired us to be followers, copycats if you will, so we must go very far out of our way to unlearn those inborn (and highly refined) instincts if we are to think logically and scientifically.

Recently, neuroscientists (thanks to the advent of functional MRI, or fMRI) have been asking questions about the impact of independent thinking (something I think we would all agree is “healthy”) on brain activity. I think this body of research is still in its infancy, but the results are suggestive, if not somewhat provocative.

To quote the authors of this work: “if social conformity resulted from conscious decision-making, this would be associated with functional changes in prefrontal cortex, whereas if social conformity was more perceptually based, then activity changes would be seen in occipital and parietal regions.” Their study suggested that non-conformity produced an associated “pain of independence.” In the study subjects, the amygdala became most active in times of non-conformity, suggesting that non-conformity—doing exactly what we didn’t evolve to do—produced emotional distress.

From an evolutionary perspective, of course, this makes sense. I don’t know enough neuroscience to agree with their suggestion that this phenomenon should be called the “pain of independence,” but the “emotional discomfort” of being different—i.e., not following or conforming—seems to be evolutionarily embedded in our brains.

Good, solid thinking is really hard to do, as you no doubt realize. How much easier is it to economize on all this and just “copy and paste” what seemingly successful people are doing? Furthermore, we may be wired to experience emotional distress when we don’t copy our neighbor! And while there may have been only two or three Uggs in our tribe 5,000 years ago, as our societies evolved, so too did the number of potential Uggs (those worth mimicking). This would be great (more potential good examples to mirror) if we were naturally good at thinking logically and scientifically, but we’ve already established that’s not the case. Amplifying this problem even further, the explosion of mass media has made it virtually, if not entirely, impossible to distinguish those truly worth mimicking from the charlatans, or the simply lucky. Maybe it’s not so surprising that the one group of people we’d all hope could think critically—politicians—seems as useless at it as the rest of us.
