tigra
About probabilities

Two views on probability.

Original story.

Once upon a time there was a boy who cried "there's a 𝟱% 𝗰𝗵𝗮𝗻𝗰𝗲 𝘁𝗵𝗲𝗿𝗲'𝘀 𝗮 𝘄𝗼𝗹𝗳!"
The villagers came running, saw no wolf, and said "He 𝘴𝘢𝘪𝘥 there was a wolf and there was not. Thus his probabilities are wrong and he's an alarmist."
On the second day, the boy heard some rustling in the bushes and cried "there's a 20% chance there's a wolf!"
Some villagers ran out and some did not.
There was no wolf.
The wolf skeptics who stayed in bed felt smug.
"That boy is always saying there is a wolf but there isn't. He is an alarmist and is too confident in his abilities to predict the future, unlike us sober skeptics."
"I didn't say there 𝘸𝘢𝘴 a wolf!" cried the boy. "I was estimating the probability as low, but high enough to act on.
"A false alarm is much less costly than a missed detection when it comes to dying! The expected value is good!"
The villagers didn't understand the boy and ignored him.
On the third day, the boy heard some sounds he couldn't identify but seemed wolf-y. "There's a 5% chance there's a wolf!" he cried.
No villagers came.
It was a wolf.
They were all eaten.
Because the villagers did not become statistically literate.

The moral of the story is that 𝘄𝗲 𝘀𝗵𝗼𝘂𝗹𝗱 𝗲𝘅𝗽𝗲𝗰𝘁 𝘁𝗼 𝗵𝗮𝘃𝗲 𝗮 𝗺𝗶𝗹𝗹𝗶𝗼𝗻 𝗳𝗮𝗹𝘀𝗲 𝗮𝗹𝗮𝗿𝗺𝘀 𝗯𝗲𝗳𝗼𝗿𝗲 𝗮 𝗰𝗮𝘁𝗮𝘀𝘁𝗿𝗼𝗽𝗵𝗲 𝗵𝗶𝘁𝘀 and 𝘁𝗵𝗮𝘁 𝗶𝘀 𝗻𝗼𝘁 𝗲𝘃𝗶𝗱𝗲𝗻𝗰𝗲 𝗮𝗴𝗮𝗶𝗻𝘀𝘁 𝗶𝗺𝗽𝗲𝗻𝗱𝗶𝗻𝗴 𝗰𝗮𝘁𝗮𝘀𝘁𝗿𝗼𝗽𝗵𝗲.
Each time somebody put a low, but high enough, probability on a pandemic being about to start, they weren't wrong when it didn't pan out. H1N1, SARS, etc. didn't become global pandemics, but they could have. The probability was low, but high enough to raise the alarm.
The problem is that people then thought to themselves "Look! People freaked out about those last ones and it was fine, so people are terrible at predictions and alarmist and we shouldn't worry about pandemics." And then 2020 happened.
This will happen again for other things.
People will be raising the alarm about something, and in the media, the nuanced thinking about probabilities will be washed out.
You'll hear people saying that _______ will definitely fuck everything up very soon.
And it doesn't.
When the catastrophe doesn't happen, don't over-update.
Don't say "They cried wolf before and nothing happened, thus they are no longer credible"
Say "I wonder what probability they or I should put on it? Is that high enough to set up the proper precautions?"
When somebody points out that nuclear war hasn't happened yet despite all the scares,
when somebody reminds you of the AI winters, when nothing came of all the hype,
remember the boy who cried a 5% chance of wolf.
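The boy's expected-value argument can be made concrete with a small sketch. All the numbers below are invented for illustration; the story itself gives no actual costs.

```python
# Expected-cost sketch of the boy's argument.
# Hypothetical numbers: the story specifies no actual costs.

P_WOLF = 0.05           # the boy's stated probability of a wolf
COST_FALSE_ALARM = 1    # inconvenience of running out for nothing
COST_MISSED_WOLF = 100  # being eaten is far worse than a wasted trip

def expected_cost(respond: bool) -> float:
    """Expected cost of responding to the alarm vs. staying in bed."""
    if respond:
        # You pay the trip cost whether or not a wolf shows up.
        return COST_FALSE_ALARM
    # Staying in bed costs nothing unless there actually is a wolf.
    return P_WOLF * COST_MISSED_WOLF

print(expected_cost(respond=True))
print(expected_cost(respond=False))
# Under these numbers, responding is cheaper in expectation
# even at only a 5% chance of a wolf.
```

With these (made-up) stakes, the alarm is worth heeding whenever the wolf probability exceeds 1%, which is why "he cried wolf and there was none" is not by itself evidence that his probabilities were wrong.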

Answer:

You cannot take arbitrarily expensive countermeasures against all the millions of things that could go wrong. That's where this analogy falls flat.
Maybe in the story they should have started a low-cost investigation to get a better estimate, for example scouting the area for tracks. The boy should also have explained how he arrived at his estimate.
We now have an exaggerated perception of the risk of SARS-CoV-2, largely because of 24/7 media hype. Other things are just as risky or worse (like obesity), but don't receive the same attention. That should give us pause to reflect on which risks are actually relevant.
Worse, it is not only the risk of the wolf that is in question here, but also the countermeasures.
What if I told you that wearing a tinfoil hat has a 2% chance of staving off the coronavirus?
Can you even give any rational argument, in your logic, against wearing a tinfoil hat? Since the possible damage is infinite (your death), any cost should be acceptable to avoid it, right? So even if a tinfoil hat is only 0.01% effective, you should absolutely wear it!
I think there is even a real chance that tinfoil hats could work: people might avoid you, which could lead to reduced exposure to the virus.
So we have established, by "tail risk logic", that you should absolutely wear a tinfoil hat at all times!
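The reductio above turns on treating the damage as infinite. Once you assign finite numbers, the decision depends on the measure's cost and effectiveness, and the tinfoil hat stops being a bargain. A minimal sketch, with every figure invented purely for illustration:

```python
# "Tail risk logic" with finite numbers instead of infinite damage.
# All figures below are hypothetical, chosen only to illustrate
# that cost-effectiveness, not possibility, drives the decision.

DEATH_COST = 1_000_000  # finite (not infinite) disvalue of death
BASE_RISK = 0.001       # assumed baseline probability of dying of the virus

def net_benefit(effectiveness: float, measure_cost: float) -> float:
    """Expected harm averted by a measure, minus the measure's cost."""
    return BASE_RISK * effectiveness * DEATH_COST - measure_cost

# A 0.01%-effective tinfoil hat costing 10 units of social capital:
print(net_benefit(effectiveness=0.0001, measure_cost=10))  # negative

# A 50%-effective measure (say, distancing) at the same cost:
print(net_benefit(effectiveness=0.5, measure_cost=10))     # positive
```

With finite stakes the hat's expected benefit is swamped by its cost, while a genuinely effective measure at the same cost comes out well ahead — so the argument for precautions never licenses arbitrary ones.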
