Can Businesses Learn 'Superforecasting'? Easier Said Than Done
[This post originally appeared on Forbes.com on October 4, 2015.]
Many business decisions hinge on fact-based predictions of the future, and careers are made or broken by the outcomes. No wonder, then, that companies value “expertise” at all levels – after all, more experience and education should lead to more informed guesses, and hence better decisions.
But how reliable are the forecasts of so-called “experts”? Not very, it turns out. About a decade ago, Wharton professor Philip Tetlock analyzed the predictions that almost 300 respected authorities in politics and economics made over two decades – and as The New Yorker’s Louis Menand summed up the results, “Human beings who spend their lives studying the state of the world … are poorer forecasters than dart-throwing monkeys.”
Tetlock spent the next phase of his career questioning that simplistic summary of his earlier work. Even if the “experts” are essentially guessing, are some spear-slinging simians more accurate than others? The title of his latest book, Superforecasting (Crown), which he co-authored with Dan Gardner, foreshadows the answer: “superforecasters” do, in fact, walk among us – and despite their lack of “expertise” in any traditionally defined sense, they consistently outperform at predicting the future.
Tetlock based his conclusion on findings from the Good Judgment Project, a multi-year study in which he and his colleagues asked thousands of crowdsourced subjects to predict the likelihood of a slew of future political and economic events: Would North Korea invade South Korea by January 1? Would gold fall below $1,200 per ounce by September? Would revolution break out in Syria by the end of April? Participants assigned a probability to each event, updated their forecasts over time as new details emerged, and were scored as their predictions came true (or didn’t).
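For the quantitatively inclined: the book explains that these forecasts were graded with Brier scores, which penalize the squared gap between the probability a forecaster assigned and what actually happened. Here is a minimal sketch of that scoring rule in Python – the function and the sample forecasts are my own illustration, not GJP data:

```python
# A minimal sketch of Brier scoring, the accuracy metric Tetlock describes
# in the book (in its original two-category form: 0 is perfect, 2 is
# perfectly wrong, and unwavering 50/50 guessing scores 0.5).
# The forecasts and outcomes below are invented for illustration.

def brier_score(forecasts, outcomes):
    """Average Brier score across a series of binary-event forecasts.

    forecasts: probabilities assigned to each event occurring (0.0 to 1.0)
    outcomes:  1 if the event occurred, 0 if it did not
    """
    total = 0.0
    for p, o in zip(forecasts, outcomes):
        # Squared error on both the "happens" and "doesn't happen" sides,
        # following the classic two-category Brier formulation.
        total += (p - o) ** 2 + ((1 - p) - (1 - o)) ** 2
    return total / len(forecasts)

# A forecaster who leans confidently toward what actually happens...
print(brier_score([0.7, 0.8, 0.9], [1, 1, 1]))  # ~0.093
# ...comfortably beats one who hedges at 50/50 on everything.
print(brier_score([0.5, 0.5, 0.5], [1, 1, 1]))  # 0.5
```

The key property is that the score rewards both getting the direction right and being appropriately confident – which is why the updating behavior described above matters so much.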
Sure enough, among these non-experts, a small cadre significantly out-predicted both their peers and, when parts of the GJP were incorporated into a larger U.S. national intelligence effort, teams of top professional researchers. And their “superforecasting” wasn’t just beginner’s luck – over time, their advantage over their competitors not only persisted, but grew. In analyzing the commonalities of these “superforecasters”, Tetlock found that “[i]t’s not really who they are. It is what they do.” Among other traits, they break complex problems into smaller, more tractable ones, search for comparators to guide their views, and try to avoid over-reacting to particular pieces of evidence. And perhaps most importantly, they rigorously analyze their past performances to figure out how to avoid repeating mistakes or over-interpreting successes.
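To make that measured-updating habit concrete, here is a toy sketch – my own illustration, not an exercise from the book – of how a forecaster might work in Bayesian terms: start from an outside-view base rate drawn from comparable cases, then let each new piece of evidence nudge the odds rather than swing them to extremes. The base rate and likelihood ratios are invented for the example.

```python
# Toy Bayesian updating: begin with a base rate (the "outside view"),
# then revise the odds incrementally as evidence arrives. All numbers
# here are invented for illustration.

def update(prob, likelihood_ratio):
    """Apply Bayes' rule in odds form: posterior odds = prior odds * LR."""
    odds = prob / (1 - prob)
    odds *= likelihood_ratio
    return odds / (1 + odds)

p = 0.15  # base rate from (hypothetical) comparable historical cases
for lr in [2.0, 1.3, 0.8]:  # each new report shifts the odds only modestly
    p = update(p, lr)
    print(f"updated probability: {p:.2f}")  # 0.26, 0.31, 0.27
```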
If these behaviors sound challenging to adopt, that’s because they are – which is why not everyone is a “superforecaster”. But one of Tetlock’s key points is that these aren’t innate skills: they can be both taught and learned. A 60-minute tutorial on the traits of high-performance predictors increased participants’ accuracy by about 10 percent over the course of the entire subsequent year. That may not sound like a big deal, but compounded over time, it would yield a huge impact from a relatively low-effort intervention – one that may be within reach not only of most individuals, but of many organizations as well. In fact, based on these data alone, Tetlock’s “Ten Commandments For Aspiring Superforecasters” should probably have a place of honor in most business meeting rooms, right next to (or in place of?) the ubiquitous “corporate values” posters.
So, companies may be able to nurture more “superforecasters” – but how can they maximize their impact within the organization? One logical strategy might be to assemble these lone-wolf prediction savants into “superteams” – and in fact, coalitions of the highest-performing predictors did outperform individual “superforecasters”. However, this was only true if the groups also had additional attributes, like a “culture of sharing” and diligent attention to avoiding “groupthink” among their members, neither of which can be taken for granted, especially in a large organization. “A busy executive might think ‘I want some of those’ and imagine the recipe is straightforward,” Tetlock wryly observes about these “superteams”. “Sadly, it isn’t that simple.”

A bigger question for companies is whether even individual “superforecasters” could survive the toxic trappings of modern corporate life. The GJP’s experimental bubble lacked the competitive promotion policies, dysfunctional managers, bonus-defining annual reviews, and forced rankings that complicate the pure, single-minded quest for fact-based decision-making in many organizations. All too often, as Tetlock ruefully notes, “the goal of forecasting is not to see what’s coming. It is to advance the interest of the forecaster and the forecaster’s tribe” [original emphasis] – and it’s likely that many would find it difficult to reconcile the key tenets of “superforecasting” with their personal and professional aspirations.
This challenge is perhaps most pronounced at the highest levels of management, where there is a perceived (and possibly quite real) conflict between thoughtful, humble open-mindedness and decisive leadership. In a chapter aptly titled “The Leader’s Dilemma,” Tetlock describes how the German Wehrmacht harmonized those divergent mandates by eschewing rigid top-down control in favor of a system in which senior leaders set the strategic course but empowered troops in the field to use their best powers of fact-based analysis and judgment to make decisions on the fly. Tetlock suggests (with all appropriate apologies and caveats) that this could be a path forward for modern organizations – which is logical, but neither particularly novel nor supported by any new experimental data from the GJP. Although this chapter on reconciling “superforecasting” with the challenges of leadership is pleasing to read, it’s more anecdotal than analytical – and thus somewhat less satisfying than the rest of the book.
But these difficulties don’t make the main findings of Superforecasting any less relevant in the corporate world. To the contrary, Tetlock’s book shows that good analytical judgment depends not on any external trappings of expertise, but on a set of discrete skills and approaches that can be taught and learned – and any business serious about improving the quality of its decision-making should invest in discussing and disseminating the book’s key messages. It isn’t easy to create analytical powerhouses, but Superforecasting could be the foundation that helps individuals and organizations bring more rigor to how they use data to predict the future.
[Disclosure: Thanks to Crown Publishing for generously providing me with a review copy of Superforecasting.]