Despite my best intentions to get through an ever-growing stack of books, a brand new one crept into the mix and demanded my immediate attention, so here goes, with a few notes on it:
Superforecasting: The Art and Science of Prediction, by Philip E. Tetlock and Dan Gardner (Crown Publishers: New York, 2015).
In this book, the authors, Tetlock, a professor of psychology, political science, and business, and Gardner, a journalist and author, note that “we are all forecasters,” in the sense that we need to make decisions that involve uncertainty (as when we buy a home, make an investment, decide to relocate, etc.).
When it comes to really big events, like market crashes, wars, etc., however, we expect to turn to “experts.” Unfortunately, according to the authors’ research results, the experts we might most expect to be able to “forecast” events with precision are less able to do so (on certain types of problems) than “ordinary” well-informed people who are not experts in the subject matter.
These “ordinary” people have some extraordinary characteristics, the authors realized when they analyzed their research results. These include an ability to step outside of themselves and get a different view of reality, something the authors note is really hard to do. But the ordinary people who did the best in the forecasting tournaments run by the authors exhibited a remarkable ability to do just this:
“Whether by virtue of temperament or habit or conscious effort, they [the successful forecasters] tend to engage in the hard work of consulting other perspectives.”
In conducting U.S. government-backed research, the authors found that people such as a retired computer programmer with no special expertise in international affairs could successfully answer very specific questions such as “Will the London Gold Market Fixing price of gold (USD per ounce) exceed $1850 on 30 September 2011?” People they worked with, such as this individual, were enabled by the rules of the research project to update their forecasts in real time, incorporating new information in their estimates as they came across it. (The process is explained in detail in the book.) Over time, “superforecasters,” such as this retired computer programmer stood out among the pack. Such people, write the authors:
“…have somehow managed to set the performance bar so high that even the professionals have struggled to get over it…”
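The tournaments scored these rolling forecasts using Brier scores, which the book discusses at length: the mean squared difference between each probability estimate and the eventual 0-or-1 outcome, so that frequent, well-calibrated updating is rewarded. A minimal sketch of the idea (the function name and sample probabilities are illustrative, not taken from the book):

```python
def brier_score(forecasts, outcome):
    """Mean squared error between a series of probability forecasts
    (each between 0 and 1) and the eventual binary outcome (0 or 1).
    Lower scores are better."""
    return sum((p - outcome) ** 2 for p in forecasts) / len(forecasts)

# A forecaster who keeps updating toward the truth as news arrives...
updating = brier_score([0.60, 0.80, 0.95], outcome=1)   # 0.0675
# ...beats one who sets an estimate once and never revisits it.
static = brier_score([0.60, 0.60, 0.60], outcome=1)     # 0.16
```

Note that the score punishes both overconfidence and timidity: a confident wrong answer is costly, but so is hedging at 50% forever.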
The results led the authors to inquire into the reasons for the superforecasters’ better performance. They write that “It’s hard not to suspect that [so-and-so’s] remarkable mind explains his remarkable results.”
Indeed, some of their superforecasters have multiple degrees in various subjects from various top-notch universities, speak several languages, have lived or worked abroad, and are voracious readers. But assuming that knowledge and intelligence drive strong forecasting performance would send us down the wrong path, concluded the researchers. To be a superforecaster “does not require a Harvard PhD and the ability to speak five languages,” they concluded. Many very well-educated and intelligent participants in their study “fell far short of superforecaster accuracy.” They continue: “And history is replete with brilliant people who made forecasts that proved considerably less than prescient” [citing Robert McNamara, defense secretary under Presidents Kennedy and Johnson, as one example]. So, the authors conclude:
“Ultimately, it’s not the [data/brain etc] crunching power that counts. It’s how you use it.”
Well, duh, you might say. Isn’t this obvious? Apparently not.
Dragonfly Forecasting

So how do these superforecasters do it? What do they have in common? The authors survey a number of case studies from their research to provide some insights. What they discovered is a capability they call “dragonfly forecasting.” The researchers observed that the superforecasters, while “ordinary” people, have an ability to synthesize a large number of perspectives and to cope with a lot of “dissonant information.” They have more than two hands, write the authors, because they do not limit themselves to “on the one hand or the other hand” thinking. (Sidebar: I just attended a seminar on energy and climate challenges where one of the speakers, an engaging, colorful, and normally compelling orator, made the comment that “on one hand you have total environmental disaster or, on the other hand, total commercial disaster,” concluding that “we need to get on the right side of this.”
This sort of binary thinking can be quite limiting, particularly when there is no “right side,” as is the case, more often than not, when facing a world of increasingly complex challenges. I heard more examples of this “either-or” thinking problem again just yesterday at an all-day conference, with people literally saying that they don’t see an option beyond the frame they’re in.)
“I’ve Looked At Things From Both Sides Now”
By contrast, the dragonfly eye in operation, according to the authors, is “mentally demanding.” (Already, in this mere statement, we run up against some cultural and cognitive realities in many large organizations, where everyday urgent matters, and matters only perceived as urgent (possibly because of this very binary winners-vs.-losers thinking), take up almost all available bandwidth.)
Superforecasters “often think thrice–and sometimes they are just warming up to do a deeper-dive analysis.” Forecasting is their hobby, write the authors. They do it for fun, and also because they score high in “need-for-cognition” tests. These tests identify people who have a tendency to “engage in and enjoy hard mental slogs.”
There also is an element of personality likely involved, they conclude. The traits involve “openness to experience” which includes “preference for variety and intellectual curiosity.”
The authors conclude, however, that this dragonfly eye capability, which involves synthesizing a growing number of perspectives, has “less to do with the traits someone possesses and more to do with behavior.” These behaviors include “an appetite for questioning basic, emotionally charged beliefs.” Interestingly, the researchers have concluded that, without this behavior, individuals (forecasters or not) “will often be at a disadvantage relative to a less intelligent person who has a greater capacity for self-critical thinking. [emphasis added]”
Those with a dragonfly eye cultivate their ability to encounter different perspectives. They are “actively open-minded,” write the authors, invoking an established concept in psychology. For superforecasters, therefore, “beliefs are hypotheses to be tested, not treasures to be guarded,” conclude the authors.
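Treating beliefs as hypotheses to be tested has a standard formalization in Bayes’ rule, which the authors discuss as the underlying logic of belief updating: a prior view is revised in proportion to how strongly new evidence discriminates between the hypothesis being true and false. A minimal sketch (the prior and likelihood numbers here are made up for illustration):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a hypothesis after one piece of evidence,
    via Bayes' rule: P(H|E) = P(E|H)P(H) / P(E)."""
    joint_true = prior * p_evidence_if_true
    joint_false = (1 - prior) * p_evidence_if_false
    return joint_true / (joint_true + joint_false)

# Start at 30% belief; observe evidence twice as likely if the hypothesis is true.
posterior = bayes_update(0.30, p_evidence_if_true=0.80, p_evidence_if_false=0.40)
# posterior is about 0.46: the belief moves, but only as far as the evidence warrants.
```

The point of the exercise is psychological, not mathematical: the belief is a number you expect to revise, not a position you defend.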
There are too many implications of this work, important implications, to cover in a blogpost. But it must be said that the book implicitly raises at least as many questions about how to proceed in a complex, interconnected world as it attempts to answer. For instance, few enduring problems of real consequence can be addressed with a single forecast, however accurate, within a bounded time frame. Inherently complex “super wicked problems,” discussed earlier on this blog, do not lend themselves to this sort of forecasting. Tougher choices require immersing ourselves in deeper questions of values and longer-term perspectives.
Nonetheless, what the authors have demonstrated with their research offers us the opportunity to pursue these challenges with greater awareness of individuals’ different cognitive and philosophical outlooks, and perhaps, from a corporate human resources point of view, to allocate jobs and tasks to people based on comparative evaluations of their cognitive and behavioral strengths.
As more and more issues require deeper thinking and appreciation of systemic interconnections, it may become ever more important (even if not acknowledged in organizational priorities) to find ways to incorporate “dragonfly eye” sense-making behaviors. The authors have observed that “belief perseverance” can make people “astonishingly intransigent–and capable of rationalizing like crazy to avoid acknowledging new information that upsets their settled beliefs.” When people have a greater investment in their beliefs, it is harder for them to change their views.
There is important stuff in this book that requires a great deal more reflection. So this thread of inquiry will continue in the next post’s look at another new book, Nonsense: The Power of Not Knowing, by Jamie Holmes (Crown Publishers: New York, 2015). Far from “nonsense,” thinking about thinking matters. Even if these books fail to provide us with concrete next steps, the relevance of these works to the current challenges facing decision-makers, and their advisors, in all sectors cannot be overstated.