Modern humans dealing with modern problems require modern analysis. And we ain’t so good at it.
To be fair, our Serengeti-honed brains deal well with reactive solutions to obvious, immediate dangers.
When a problem is connecting the dots between an empty stomach and the nutritious plants and animals around us (“yummy!”) or as obvious as a charging tiger (“oh shit!”), we humans are pretty good at finding solutions.
But.
When the problems are modern enough that the cause and effect are separated by distance or time (pandemics, climate change) we have to use more modern techniques (and our intellect) to address them. And that, we aren’t so good at.
Three things that we fail to process intuitively are counterfactuals, denominators, and exponential growth. Each of those failures fucks us up.
Counterfactuals
What if counterfactuals didn’t exist?
A counterfactual is considering what would have happened had a decision been made differently.
Lots of times pissed-off commenters will point to bad outcomes and highlight the decisions that preceded those outcomes. Sometimes these are legitimate criticisms, but all too often the counterfactuals are not taken into account.
What about an NFL coach deciding whether to go for it on fourth and 2 from midfield? If the conversion fails it’s easy to say, afterwards, “they should have punted!” — but the dilemma, at the time of the decision, is comparing the cost/benefit of going for it against the cost/benefit of the counterfactual. Sloppy sports commentators have built careers on the after-the-fact blame game. Taking these counterfactuals into account would make for more boring-ass (but more accurate) commentary.
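That cost/benefit comparison can be made concrete with a little expected-value arithmetic. This is a minimal sketch with made-up probabilities and point values (real NFL win-probability models are far more involved); the point is only that the decision and its counterfactual each get weighed the same way.

```python
# Hypothetical expected-points comparison for fourth-and-2 at midfield.
# Every probability and point value below is an illustration number, not
# real NFL analytics.

def expected_points(p_success: float, value_success: float, value_failure: float) -> float:
    """Probability-weighted value of a decision."""
    return p_success * value_success + (1 - p_success) * value_failure

# Going for it: ~55% conversion chance (hypothetical), drive stays alive.
go_for_it = expected_points(p_success=0.55, value_success=2.5, value_failure=-1.5)

# Punting (the counterfactual): near-certain, modest field-position value.
punt = expected_points(p_success=0.99, value_success=0.5, value_failure=-0.5)

print(f"go for it: {go_for_it:+.2f} expected points")
print(f"punt:      {punt:+.2f} expected points")
```

With these (hypothetical) numbers, going for it comes out ahead even though it fails almost half the time; a commentator judging only the outcome would never see that.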
Counterfactuals In A Medical Context
There are last-ditch medical interventions that carry real risk. If one of them has a bad outcome, blaming the intervention isn’t reasonable unless you also consider the risk of not trying it.
CPR can injure people, and it doesn’t always work; but the counterfactual, not using CPR on someone who’s not breathing, is likely to have a much worse outcome. Over the decades of widespread CPR training, we have internalized this a bit, but you still see the occasional CPR survivor pissed off about a broken rib. That anger is a misunderstanding of counterfactuals in a nutshell.
Putting a severely ill Covid patient on a ventilator is the same sort of last-ditch intervention — risky and prone to failure, but infinitely better than the counterfactual — and yet there are broken people out there who believe that doctors using ventilators are trying to kill patients.
Poor Decision Making
Not understanding counterfactuals leads to poor decision making, in sports and in real life. If you’re a leader and you’re more interested in avoiding criticism than doing a full risk/benefit analysis, you will end up making poor decisions … like going for it on fourth down too rarely, or thinking you can placidly wait out political gridlock while underlying problems continue to worsen.
Denominators
Why anecdotes aren’t data.
When we hear a great story, it can be compelling … but the telling of a story doesn’t tell how common it is. Indeed, “man bites dog” is an idiom for a reason: the uncommon makes for a better story. And if a rare occurrence becomes a persuasive story, that can end up being harmful.
Friends of mine, years ago, had an accident in a pickup truck that threw them clear of the vehicle before the cab was crushed. Thank goodness they were not wearing their seatbelts! But what’s sometimes hard for our lizard brains to absorb is that my friends’ experience doesn’t disprove seatbelts’ safety. Sometimes rare things happen.
And I’ve always tried to remember that the denominator of “people I know” is much smaller than the denominator of “the population of people in vehicles.”
When it comes to policy and politics, kooks can be loud. But the population of loud kooks is lower than that of quiet normal people. Half a century ago, the phrase “the silent majority” was coined (and used, by partisan assholes, disingenuously). But in today’s social-media-driven environment, the loud kooks seem even louder, and the reasonable and vast and silent majority doesn’t get clicks or media attention. But still, the population (or denominator) of a reasonable quiet contingent is often much higher than the denominator of a highly visible outraged minority.
Taking denominators into account is absolutely necessary to understand widespread health risks. Converting health risks, as much as possible, to a single-person equivalent makes them easier to understand at a visceral level.
During the Covid-19 pandemic, immediately after vaccinations were introduced you could say, accurately, that more than 99.9% of Covid deaths were of unvaccinated people. Ignoring the denominator (before vaccines were available, all Covid deaths happened to unvaccinated people!) gave an overly-rosy view of vaccine efficacy. The flip side is after a community has 90% vaccination, half of that community’s Covid deaths might be of vaccinated people. Again, without a denominator, it would be easy to come away with a wrong conclusion.
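The arithmetic behind that flip side is worth seeing once. Here’s a minimal sketch with a hypothetical town and made-up fatality rates (the 10x risk reduction is an illustration number, not a measured vaccine-efficacy figure), showing how raw death counts mislead until you divide by each group’s denominator.

```python
# Hypothetical town of 100,000 people at 90% vaccination. All rates below
# are made up for illustration.

population = 100_000
vaccinated = int(population * 0.90)        # 90,000 people
unvaccinated = population - vaccinated     # 10,000 people

# Assume (hypothetically) vaccination cuts the fatality rate 10x.
deaths_per_person_unvax = 0.0010   # 1 in 1,000
deaths_per_person_vax = 0.0001     # 1 in 10,000

deaths_unvax = unvaccinated * deaths_per_person_unvax  # 10 deaths
deaths_vax = vaccinated * deaths_per_person_vax        # 9 deaths

# Raw counts look nearly even (9 vs 10) because the vaccinated group is
# 9x larger. Dividing by each denominator recovers the 10x risk gap.
print(f"vaccinated deaths:   {deaths_vax:.0f} of {vaccinated}")
print(f"unvaccinated deaths: {deaths_unvax:.0f} of {unvaccinated}")
print(f"per-person risk ratio: {deaths_per_person_unvax / deaths_per_person_vax:.0f}x")
```

Nearly half the deaths are of vaccinated people, yet any individual unvaccinated person is ten times as likely to die. Both statements are true; only the denominators reconcile them.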
That’s why comparing Covid risks using a common denominator is so important: by forcing a consistent denominator you can be sure of comparing apples to apples.
(My personal favorite is to think about the actuarial ramifications. That’s described more in Appendix 2 but we can think about it as converting everything to have a denominator of one. Unfortunately none of the popular media have asked me about my preference so they stick with Covid fatality or case numbers per million or per hundred thousand. 🤷‍♂️)
A corollary of all this is that if someone is distributing information and the denominators are conspicuously missing, chances are good that some charlatan is trying to deceive and mislead their audience. For observant and reasoning information consumers, it’s usually pretty easy to see whether the denominator is being obscured, and to infer that people drawing attention away from denominators are swindlers and bamboozlers.
A deep understanding of probability doesn’t come naturally to most of us, and ignoring denominators is one of the big factors that confounds it.
Exponential Growth
By the time you notice it might be too late.
Many of us have seen brain teasers like this one:
The amount of surface algae in a pond doubles every day. If the algae spread began two months ago and the pond got completely covered with algae today, when was the pond half covered?
Yesterday is not an intuitive answer to that question.
But it’s the right answer, of course. Almost by definition, exponential growth is too small to notice, then suddenly it’s overwhelming.
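The teaser is easy to check by simulation. This sketch treats “two months” as 60 days and walks backwards from full coverage, halving each day:

```python
# Simulating the pond brain teaser: coverage doubles every day for 60 days
# ("two months"), reaching full coverage on the final day.

DAYS = 60
coverage = 1.0  # fully covered today (day 60)

# Walk backwards from today, halving the coverage for each earlier day.
history = {}
for day in range(DAYS, -1, -1):
    history[day] = coverage
    coverage /= 2

print(f"day {DAYS}: {history[DAYS]:.0%} covered")      # 100%
print(f"day {DAYS - 1}: {history[DAYS - 1]:.0%} covered")  # 50% -- yesterday!
print(f"day {DAYS // 2}: {history[DAYS // 2]:.10%} covered")
```

At the halfway point, day 30, the algae covers about a billionth of the pond; anyone glancing at the water would see nothing worth worrying about.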
Wildfires and home fires are examples of dangerous exponential growth. A bucket of rags might smolder for hours before a garage fire erupts, and then it may take just a few minutes for an entire house to be engulfed.
The derivative of an exponential function is also an exponential function. A relevant example of this 🤯 characteristic is the early spread of an unchecked virus. The number of people infected is an exponential function, which, fine. And in the early phase of a pandemic (or a wave of a pandemic), the number of new cases per day is itself an exponential function! If you’re the administrator of a hospital watching your available bed count decrease as a pandemic spreads, exponential growth should scare the shit out of you.
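That 🤯 property is easy to demonstrate numerically. This is a minimal sketch with a hypothetical 30%-per-day growth factor (an illustration number, not a real epidemiological estimate), showing that daily new cases grow by exactly the same factor as total cases:

```python
# Sketch: daily NEW cases grow exponentially too. The 30%/day growth
# factor is hypothetical, not a real epidemiological estimate.

GROWTH = 1.30  # total cases multiply by 1.3 each day

total_cases = [100.0]  # day 0
for _ in range(14):
    total_cases.append(total_cases[-1] * GROWTH)

# New cases on each day = today's total minus yesterday's total.
new_cases = [today - yesterday
             for yesterday, today in zip(total_cases, total_cases[1:])]

# The day-over-day ratio of NEW cases matches the growth of total cases.
ratios = [b / a for a, b in zip(new_cases, new_cases[1:])]
print(f"daily growth of new cases: {ratios[0]:.2f} ... {ratios[-1]:.2f}")
```

Every ratio comes out to 1.30: the hospital admitting patients isn’t just seeing more cases each day, it’s seeing the daily increase itself compound.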
This Will Fuck Us Up
There are costs to innumeracy.
I wouldn’t say that notions like counterfactuals, denominators, and exponential growth are “the three most important topics in math,” but I do think it’s fair to consider them three concepts that are simultaneously unintuitive and important … so maybe “the three math topics that, if not understood, will fuck you up”?
What’s clear is that if we all understood these three things better, we would be less susceptible to the bullshitters and grifters who have an agenda to sow discord and chaos.
We had all of pre-history to develop solutions to hunter-gatherer challenges. We’ve only had a few thousand years since the Agricultural Revolution to suss out how to solve civilization-level problems, and a measly one or two hundred years to internalize germ theory and other modern global patterns. Our lizard brains have a hard time keeping up with current-day reasoning.
Addressing serious problems with modern solutions needs more analysis, more modeling, more strategy. More brainpower. And sometimes that feels counterintuitive to the instincts and intuitions our species built up before civilization was even a thing.
But in a world that includes counterfactuals, denominators, and exponential growth, we’re stuck with tools like thinking, modeling, reasoning, and hypothesizing.
The Good Lord didn’t give us these humongous brains just to keep our skulls from imploding. We can reason.
Let’s.