Everyone probably already knows the clear winner of last week’s US election: it was a resounding victory for analytics over “intuition” and “expertise.” New York Times blogger Nate Silver, who uses statistical models to analyze polling and economic data, correctly projected which candidate would win each of the fifty states (and the District of Columbia). Conversely, not one expert (often referred to as a pundit) came close to predicting the election as accurately. Moreover, many missed by a huge margin while mocking Silver before the election. This is the second consecutive Presidential election in which Silver was uncannily accurate (he predicted 49 states correctly in 2008), showing he was not just lucky. As you may have noticed, I have long been a fan of Silver’s and incorporated his RSS feed into this blog over a year ago.
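Silver’s actual model is far more sophisticated (and proprietary), but the general idea of aggregating state polls and simulating outcomes can be sketched in a few lines. Everything below is a toy illustration: the states, margins, error figures, and electoral-vote threshold are invented for the example, not Silver’s data or method.

```python
import random

# Hypothetical polling averages: candidate A's margin over candidate B
# (in points), an assumed polling error (standard deviation), and the
# state's electoral votes. All numbers are invented for illustration.
state_polls = {
    "StateX": {"margin": 2.0, "error": 3.0, "ev": 18},
    "StateY": {"margin": 0.5, "error": 3.5, "ev": 29},
    "StateZ": {"margin": -1.5, "error": 3.0, "ev": 13},
}

def simulate_once(polls):
    """One simulated election: draw each state's result from a normal
    distribution around its polling average, then tally the electoral
    votes candidate A wins."""
    ev = 0
    for state, p in polls.items():
        if random.gauss(p["margin"], p["error"]) > 0:
            ev += p["ev"]
    return ev

def win_probability(polls, needed, trials=10_000):
    """Fraction of simulated elections in which candidate A reaches
    the `needed` electoral-vote threshold."""
    wins = sum(1 for _ in range(trials) if simulate_once(polls) >= needed)
    return wins / trials

# With only three toy states (60 EV total), treat 31 as a "majority."
print(win_probability(state_polls, needed=31))
```

The point of the sketch is that the output is a probability, not a binary call: a forecast of 70 percent is a statement about uncertainty, which is exactly what the pundits’ gut calls lacked.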
Silver and Moneyball
What happened in the political arena mirrors the lessons from Michael Lewis’ Moneyball that I have written about several times (my original Moneyball post and my follow-on when the Oakland A’s made the playoffs). In this case, everyone had looked to experts (both the professionals running the campaigns and the electorate) to interpret poll results and other factors and project how the election would unfold. The experts allegedly had the intuition to better predict these events than a non-expert. Campaigns also used this expertise to plan their strategy: which states to focus on, where to spend money, which issues to stress, whether to write concession speeches, etc.
The similarities between baseball and politics are uncanny. Moneyball brilliantly portrayed the smoke-filled room where baseball scouts and executives would evaluate talent based on their experience and intuition. They felt they knew what “tools” led to prospects becoming successful and could tell who “looks like a player.” Billy Beane, the Oakland A’s General Manager, however, implemented an analytics-based system that judged players on proprietary metrics tied to winning. The old-school scouts and executives scoffed at Beane, which allowed him to keep generating great results for the A’s (a feat he repeated again this year).
In the election, you also had experts who looked at the candidates, processed their debate performances, “understood” the mood of the country and then used their experience and intuition to project how people would vote. Karl Rove, the architect of George Bush’s victory, was so confident in his expertise that Mitt Romney would win Ohio that even when the television networks called Ohio for President Obama, he refused to accept the call. Joe Scarborough, a very popular commentator and ex-Congressman, was so sure he understood the electorate better than Nate Silver that he repeatedly derided Silver on air and on Twitter. You could easily have swapped Karl Rove for Art Howe (Billy Beane’s first manager with the Oakland A’s), showing how the lessons from Moneyball can be applied anywhere.
Implications for social game companies
Although fun to observe, Silver’s success over the experts has many practical lessons for social game companies. It reinforces that accurately interpreting data is the most effective way to predict results. Thus, you need to build your organization and invest in the tools that generate the appropriate data, and then analyze that data effectively to drive decisions. This process also needs to be central to decision making; it cannot happen in a small room of data scientists who then email their results to management. Instead, it must be embraced by all executives and drive the decision-making process.
You also have to overcome the very strong desire to rely on intuition and experience. Several years ago I committed to analytics-based decision making after reading Thomas Davenport’s seminal work, Competing on Analytics. Even after embracing analytics, I have found myself pulled many times toward decisions based on intuition. It is very difficult in the game industry to abandon intuition when deciding what games to make (e.g., the green-light process). I have sat in many meetings, saying to myself, “That is an awesome concept (or awful concept) that would do great (or fail). We need to do it (or abandon it) immediately.” We all feel we know what is going to work, and like Karl Rove we think we can better predict the market. The reality is none of us can (not me, not Peter Molyneux, not Mark Pincus, and not you); that is why more than 75 percent of game projects fail.
Another area where it is hard to resist relying on intuition is recruiting. A 30-minute interview is incredibly powerful and will often overpower the candidate’s experience, successes, education, etc. The reality is that an interview is a very inexact science. Some great candidates interview poorly, while others, whom I would consider professional interviewers, ace the interview and then fail to live up to expectations. Just as most game projects fail, many hires do not work out, and the reason is probably reliance on intuition over analysis.
The evidence for the strength of analytics over intuition keeps coming in: the Oakland A’s surprising division championship this year, Nate Silver’s flawless prediction of an election considered too close to call, and more. If you are focused on building a successful company, you need to grasp that intuition is not the strategy to rely on; instead, apply the Moneyball approach to your organization’s decisions.