Vague Apprehensions
"There is a possibility of a terrorist attack." When officials make such statements, most people interpret them as meaning "There is a high probability of a terrorist attack." This interpretation is fundamentally wrong, yet it shapes how we think about risk, make decisions, and respond to information.
The distinction matters because possibility and probability are entirely different concepts. A possibility simply means something can happen—it has a non-zero chance. Probability tells us how likely that something is to happen. Yet people routinely conflate the two, almost always overestimating the likelihood of any event once it's framed as "possible."
This confusion stems from a basic psychological tendency: once a possibility is raised, it becomes mentally real regardless of its actual likelihood. The mere act of considering a scenario—a terrorist attack, a market crash, a rare disease—makes it feel more probable than it actually is. Our brains struggle to maintain the distinction between "can happen" and "likely to happen."
Media coverage deliberately amplifies this confusion. News outlets accompany possibility-based warnings with dramatic footage of prior attacks, police sirens, SWAT teams, and helicopters—all designed to make possibilities feel like probabilities. The same outlets that breathlessly cover remote terrorist threats rarely spend equivalent time explaining that your chance of winning the lottery is roughly 1 in 300 million, even though a jackpot win and a remote attack are both, technically, "possibilities."
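That "roughly 1 in 300 million" figure is easy to check with basic combinatorics. A minimal sketch, assuming US Powerball-style rules (match 5 of 69 white balls plus 1 of 26 red balls):

```python
from math import comb

# Assumed Powerball-style rules: choose 5 of 69 white balls and 1 of 26 red balls.
white_combinations = comb(69, 5)        # ways to pick the 5 white balls
jackpot_odds = white_combinations * 26  # times the 26 possible red balls

print(f"1 in {jackpot_odds:,}")  # 1 in 292,201,338 -- roughly 1 in 300 million
```

The point of the exercise is the one the essay makes: attaching an actual number to a "possibility" instantly reveals how small it is.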
Politicians and policymakers exploit this confusion strategically. Speaking in terms of possibilities allows them to sound logically coherent while being deliberately vague. They can invoke the "possibility" of various threats to justify policy positions, knowing audiences will mentally inflate these possibilities into probabilities. This approach offers a crucial advantage: you can never be technically wrong while encouraging others to draw incorrect conclusions.
The same pattern appears in everyday life. People casually exaggerate threats they want to emphasize, like crime, while downplaying risks they prefer to ignore, like speeding and drunk driving. Speaking in possibilities rather than probabilities lets them have it both ways.
Yet there's a simple remedy for this flawed thinking. In conversations with both experts and laypeople, I've found that asking people to assign actual probabilities to events they describe as "possible" immediately changes their reasoning. When pressed for numbers, they almost always provide much lower estimates than their initial language suggested—and their judgments shift accordingly. The act of quantification forces more careful thinking.
This isn't about eliminating uncertainty from our discourse. Many situations genuinely involve unknown probabilities, and acknowledging that uncertainty is important. Rather, it's about being honest about what we know and don't know, and resisting the tendency to treat all possibilities as if they were significant probabilities.
Vagueness in language can mask vagueness in thinking, including our own. When we hear possibility-based claims—or catch ourselves making them—we should ask: What's the actual probability? How do we know? What evidence supports this estimate? These questions cut through strategic ambiguity and reveal whether there's genuine substance behind the rhetoric.
The stakes are higher than mere linguistic precision. Clear thinking about probability versus possibility affects everything from personal decisions to public policy to democratic discourse. When we confuse the two, we become vulnerable to manipulation, make poor choices, and support misguided policies based on inflated fears or deflated concerns.
Avoidable vagueness ought to be avoided. Our reasoning—and our democracy—depends on it.