“Kent was floored. A phrase that looked informative was so vague as to be almost useless. Or perhaps it was worse than useless, as it had created dangerous misunderstandings.” – Superforecasting
In the early part of Superforecasting, the authors talk about the importance of being able to keep score of forecast accuracy. Only by keeping score can one close the feedback loop to assess forecasts (and forecasters) and calibrate predictive models.
But one thing that typically stymies such activities is the ambiguity of language. What does it mean when someone says there is a “fair chance” of something happening in the “near future”?
This topic has importance well beyond the realm of forecasting, of course, and I don’t think we (in general) recognize just how significant the implications are. We’re going through life thinking our words have specific meaning, and that they’re heard by others to mean what we intend, but is that really the case?
It’s probably no surprise that this topic piqued my interest: my academic background is engineering and my professional career has mostly been spent in marketing. Engineering is hugely dependent upon accuracy and precision (and large safety margins when either is lacking), and marketing frequently involves conveying meaning and information.
So I’m coming at this from the perhaps rare perspective of a marketer who’s keenly interested in precision.
Let’s look at what got this whole thing going…
A Serious Possibility
Here’s the example from Superforecasting (p54-55):
In the late 1940s, the Communist government of Yugoslavia broke from the Soviet Union, raising fears the Soviets would invade. In March 1951 National Intelligence Estimate (NIE) 29-51 was published. “Although it is impossible to determine which course of action the Kremlin is likely to adopt,” the report concluded, “we believe the extent of [Eastern European] military and propaganda preparations indicates that an attack on Yugoslavia in 1951 should be considered a serious possibility.” By most standards, that is clear, meaningful language. No one suggested otherwise when the estimate was published and read by top officials throughout the government. But a few days later, [Sherman] Kent was chatting with a senior State Department official who casually asked, “By the way, what did you people mean by the expression ‘serious possibility’? What kind of odds did you have in mind?” Kent said he was pessimistic. He felt the odds were about 65 to 35 in favor of an attack. The official was startled. He and his colleagues had taken “serious possibility” to mean much lower odds.
Disturbed, Kent went back to his team. They had all agreed to use “serious possibility” in the NIE, so Kent asked each person, in turn, what he thought it meant. One analyst said it meant odds of about 80 to 20, or four times more likely than not that there would be an invasion. Another thought it meant odds of 20 to 80 – exactly the opposite. Other answers were scattered between those extremes. Kent was floored. A phrase that looked informative was so vague as to be almost useless. Or perhaps it was worse than useless, as it had created dangerous misunderstandings.
As a solution to this problem, Kent suggested mapping particular words to particular numerical probabilities:
| Certainty | The General Area of Possibility |
| --- | --- |
| 93% (give or take about 6%) | Almost certain |
| 75% (give or take about 12%) | Probable |
| 50% (give or take about 10%) | Chances about even |
| 30% (give or take about 10%) | Probably not |
| 7% (give or take about 5%) | Almost certainly not |
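Kent's table is really just a lookup from phrases to probability bands. As a minimal sketch (the `KENT_SCALE` dict and `phrase_for` function are my own illustration, not anything Kent or the book proposes), here's what checking a stated probability against his scale might look like:

```python
# Kent's proposed mapping of estimative phrases to numerical odds,
# expressed as (central probability, tolerance) pairs from his table.
KENT_SCALE = {
    "almost certain": (0.93, 0.06),
    "probable": (0.75, 0.12),
    "chances about even": (0.50, 0.10),
    "probably not": (0.30, 0.10),
    "almost certainly not": (0.07, 0.05),
}

def phrase_for(probability):
    """Return the Kent phrase whose band contains the given probability,
    or None if it falls into a gap between bands."""
    for phrase, (center, tolerance) in KENT_SCALE.items():
        if abs(probability - center) <= tolerance:
            return phrase
    return None

print(phrase_for(0.65))  # Kent's own 65/35 estimate -> "probable"
print(phrase_for(0.20))  # his colleague's reading -> "probably not"
```

Note what even this toy version makes vivid: under Kent's scale, his own 65/35 estimate and his colleague's ~20% interpretation of the very same phrase land in entirely different bands.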
While doing a bit of research (both in general, and for this post), I came across Words of Estimative Probability, over at the CIA’s website. At the top of the article is this blurb:
This classic piece on the need for precision in intelligence judgments was originally classified Confidential and published in the Fall 1964 number of Studies in Intelligence. Although Sherman Kent’s efforts to quantify what were essentially qualitative judgments did not prevail, the essay’s general theme remains important today.
In it, you can read Kent’s first-hand account of the story presented above and – perhaps more interestingly – his overall thinking about conveying degrees of uncertainty.
It’s really fascinating, and I recommend you pop over and spend a few minutes reading it, as he addresses important topics including alternative language and aesthetic objections (from the “poets” as he disparagingly refers to them).
For instance, here’s what Kent has to say about aesthetic opposition:
What slowed me up in the first instance was the firm and reasoned resistance of some of my colleagues. Quite figuratively I am going to call them the “poets”–as opposed to the “mathematicians”–in my circle of associates, and if the term conveys a modicum of disapprobation on my part, that is what I want it to do. Their attitude toward the problem of communication seems to be fundamentally defeatist. They appear to believe the most a writer can achieve when working in a speculative area of human affairs is communication in only the broadest general sense. If he gets the wrong message across or no message at all–well, that is life.
“They appear to believe the most a writer can achieve when working in a speculative area of human affairs is communication in only the broadest general sense. If he gets the wrong message across or no message at all–well, that is life.” – Sherman Kent
OK, so what?
Well, I guess we should all just be aware that between what we write/say/mean and what others read/hear/interpret can exist a massive range of meaning. Political communication consultants know this; that’s why the subtitle of Frank Luntz’s Words That Work is “It’s Not What You Say, It’s What People Hear”.
Sometimes the stakes are low; other times they are extremely high (for instance, Superforecasting includes examples of the language used when President Kennedy was deciding whether or not to authorize the Bay of Pigs invasion, and when President Obama was weighing military action against the compound in Pakistan that housed Bin Laden).
As a communicator – and indeed as a practiced-and-still-aspiring leader – it’s important to me that people know what I mean, as precisely as possible.