One of the critical mistakes you can make when making analytics-based decisions is mistaking the unfamiliar for the improbable. A few weeks ago, I wrote about how Nate Silver influenced my understanding of how to incorporate uncertainty into your LTV calculation; Silver also did a great job of showing that we must consider contingencies we may not even have thought of. There is a tendency in our planning to confuse the unfamiliar with the improbable: the contingency we have not considered seriously looks strange; what looks strange is thought improbable; what is improbable need not be considered seriously.
Avoid anosognosia
There is a medical condition called “anosognosia,” in which a person who suffers from a disability seems unaware that the disability exists. When a possibility is unfamiliar, we do not even think about it; instead, we develop a sort of mind-blindness to it. The relevant version of this syndrome for professionals in the game industry requires us to do one of the things that goes most against our nature: Admit what we do not know.
In his book, Silver used the attack on Pearl Harbor as a prime example of anosognosia within the US government. He outlined the myriad reasons why the Japanese attack was such a surprise to our military and intelligence officers. Worse than being unprepared, we had mistaken our ignorance for knowledge and made ourselves more vulnerable as a result. In advance of Pearl Harbor, we had a theory that sabotage was the most likely means by which our planes and ships would be attacked, so we stacked our planes wingtip to wingtip and our ships stern to bow, on the theory that it would be easier to monitor one big target than several smaller ones. Meanwhile, we theorized that if Japan seemed to be mobilizing for an attack, it would be against Russia or perhaps against the Asian territorial possessions of the United Kingdom, since Russia and the UK were already involved in the war. We had not seen the conflict through Japan’s eyes.
For a game company, the challenge is to think of possibilities that sit outside the conventional wisdom. What if everyone switches to Google Glass? What if a competitor starts giving away its game and, instead of in-app purchases, makes it ad based? What if Twitter turns into the top gaming platform? I cannot come close to listing all the possibilities; the important point is to start accepting that your world will not necessarily behave the way you expect it to.
Find the unknown unknowns
An unknown unknown is a contingency we have not considered because we never thought to ask the question in the first place. In his book, Silver quotes Donald Rumsfeld: “There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns—there are things we do not know we don’t know.”
Anytime you are able to enumerate an unpredictable element, you are expressing a known unknown. To articulate what you don’t know is a mark of progress. Nor is it a sign of weakness to acknowledge that there are things you and your team do not know; admitting it makes your company stronger.
An example of missing an unknown unknown: The 9/11 attacks
Admittedly, creating games, though sometimes stressful, is not nearly as consequential as many other uses of analytics. Using 9/11 as an example should not in any way equate social gaming with such an event, but 9/11 does show how important it is to think creatively about unknown (but possible) situations.
The 9/11 Commission Report identified four types of systemic failures that contributed to our inability to appreciate the importance of the signals in the data, including failures of policy, capabilities, and management. The most important category was failures of imagination. The data signals were not consistent with the familiar hypothesis about how terrorists behaved, and they went in one ear and out the other without our really registering them. This type of pattern—a very small number of cases causing a very large proportion of the total impact—is characteristic of a power-law or heavy-tail distribution, the type of distribution I wrote about when discussing monetization.
As Silver points out, the data is easier to comprehend when we plot it on a logarithmic scale: what had once seemed chaotic and random is revealed to be rather orderly. When plotted on a double-logarithmic scale, the relationship between the frequency and the severity of terror attacks appears as a straight line. This is, in fact, a fundamental characteristic of power-law relationships: when you plot them on a double-logarithmic scale, the pattern that emerges is as straight as an arrow.
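To make the straight-line effect concrete, here is a minimal sketch in Python (my own illustration, not code from Silver’s book); the Pareto exponent and sample size are arbitrary values chosen for the demonstration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative only: draw synthetic event severities from a Pareto
# (power-law) distribution and plot the empirical survival function,
# P(severity >= x), on double-logarithmic axes.
rng = np.random.default_rng(0)
alpha = 2.0                                      # assumed tail exponent
severity = rng.pareto(alpha, size=10_000) + 1.0  # event sizes >= 1

x = np.sort(severity)
# For each observed value, the fraction of events at least that severe.
survival = np.arange(len(x), 0, -1) / len(x)

plt.loglog(x, survival, marker=".", linestyle="none")
plt.xlabel("Severity (log scale)")
plt.ylabel("Fraction of events >= severity (log scale)")
plt.title("Power-law data falls on a straight line on log-log axes")
plt.show()
```

Rerun this with data that is not heavy-tailed (normally distributed severities, say) and the curve bends sharply downward instead of staying straight, which is exactly the visual contrast Silver describes.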
Heavy-tail distributions have some important properties when it comes to making predictions about the scale of future risks. In particular, they imply that disasters much worse than what society has experienced in the recent past are entirely possible, albeit infrequent.
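A quick back-of-the-envelope calculation shows why. Under a power law, the survival probability takes the form P(X >= x) = (x / x_min)^(-alpha), so events far beyond anything in the historical record still carry small but decidedly nonzero probabilities. The numbers below are purely illustrative assumptions, not figures from Silver’s analysis.

```python
# Hypothetical parameters; in practice x_min and alpha come from fitting
# a power law to your own event data.
x_min = 1.0             # smallest event size the power law covers
alpha = 2.0             # assumed tail exponent
historical_max = 100.0  # largest event observed so far

def survival(x: float) -> float:
    """P(X >= x) under the assumed power law."""
    return (x / x_min) ** -alpha

print(survival(historical_max))       # 1e-04: the scale of events already seen
print(survival(10 * historical_max))  # 1e-06: ten times worse is rare, not impossible
```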
What is particularly valuable about this example is that you have some control over the outcome even when the unknown unknown becomes reality. Because Israel has identified the unknown unknown, it does not tolerate the potential for large-scale terrorism (as might be made more likely, for instance, by one of its neighbors acquiring weapons of mass destruction). If you plot the fatality tolls from terrorist incidents in Israel using the power-law method, you find that there have been significantly fewer large-scale terror attacks than the power law would predict; no incident since 1979 has killed more than two hundred people. The fact that Israel’s power-law graph looks so distinct is evidence that strategic choices do make some difference.
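One way to test a claim like this against your own data is to compare how many events above some threshold a fitted power law predicts with how many you actually observed; far fewer observed extremes than predicted suggests, as in Israel’s case, that deliberate choices are suppressing the tail. A hedged sketch, with made-up numbers standing in for a real fit:

```python
# All values are placeholders; in practice N, x_min, and alpha would come
# from your own event data and a power-law fit (e.g. maximum likelihood).
N = 10_000             # total number of recorded events
x_min = 1.0            # lower cutoff of the fitted power law
alpha = 1.5            # fitted tail exponent
threshold = 100.0      # severity level of interest
observed_extremes = 1  # events that actually exceeded the threshold

# Expected count above the threshold if the power law held all the way out.
expected_extremes = N * (threshold / x_min) ** -alpha
print(f"expected: {expected_extremes:.1f}, observed: {observed_extremes}")
# expected: 10.0, observed: 1 -- a gap this large suggests the tail is
# being suppressed rather than following the power law.
```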
Inverse of blue ocean strategy
Recognizing the unfamiliar and identifying unknown unknowns is very similar to what I consider the best technique for creating strategy: Identifying blue ocean opportunities. In a blue ocean strategy, you identify opportunities in the marketplace that nobody else has identified; successfully applying analytics to scenarios you may not have considered likewise requires looking outside the conventional wisdom, only this time at risks rather than opportunities (hence the inverse). The virtue is that you will force yourself to stop and smell the data: Slow down, and consider the imperfections in your thinking. Over time, you should find that this makes your decision making better.