The Business of Social Games and Casino

How to succeed in the mobile game space by Lloyd Melnick

Tag: decision making

Why I Am Morally Opposed to Intuition

I recently told a colleague I was morally opposed to intuition, and it did more than raise an eyebrow; it genuinely shocked the person I was speaking with. As someone who has been in the game industry for too many years, I am often asked to use my intuition to review game or business ideas, prioritize product features and evaluate potential employees, so it would be easy to move my agenda forward by relying on intuition. Unfortunately, it would also be a mistake.

Intuition is often used as an excuse for not having facts to prove your case. During my first year of university, I learned that I should not rely on intuition. As I developed an appreciation for analytics and statistics, I began to understand the best way to approach making optimal decisions. Bayes' Theorem further exposed the fallacy of making decisions on intuition. Then, as I read more about heuristics and biases in decision-making, my opposition to intuition solidified.

Common Sense and Intuition

In my first year at university, one lesson had a very long-term impact. An English professor, arguing that common sense and intuition should not be used to support one side of a debate, pointed out that in the 1800s one of the primary justifications of slavery was that it was “common sense” that blacks were inferior to whites. He further attacked using common sense as an argument by showing that many of history's great tragedies and mistakes were justified by common sense and intuition: the Inquisition, the Holocaust, the Smoot-Hawley Tariff, etc.

Intuition and common sense are often an excuse for not having facts

If you want a particular decision but do not have strong data to support it, intuition and common sense are ways to get your proposed course of action approved anyway. Many times you are pursuing a fast decision and feel there is insufficient time to collect data. Other times, data is not readily available to analyze the situation. Intuition and common sense provide an argument for moving forward quickly on a decision even when there is no data to support it.

This problem is magnified if you are in a senior position. A direct or indirect subordinate is unlikely to argue that their boss's or the CEO's intuition is flawed. It is particularly dangerous for those in senior positions, who will see their “intuition” supported by subordinates, further confirming to them that it is an appropriate course of action. This confirmation replaces data in driving decisions forward, and the leader is then left to deal with the consequences.

Decision making biases

Reinforcing the issues with relying on intuition and common sense are the many decision-making biases people exhibit. I have written frequently about consumer behavior, particularly Daniel Kahneman's work, which shows how people often make faulty decisions. These biases include:

  • Confirmation bias. Confirmation bias is when you ignore information that conflicts with what you believe and select only the information that confirms your beliefs.
  • The Linda Problem. When given a story about a fictional person and then potential careers for that person, virtually everyone (from students to very successful professionals) judged a persona that was a subset of a broader persona as more likely, which is logically impossible.
  • Status quo bias. People prefer for things to stay the same, doing nothing or sticking to a previous decision even when that decision will lead to a worse outcome.
  • The narrative fallacy. People try to comprehend information as stories; rather than looking at just the facts, they create a story that links them together even if there is no real link.
  • Dunning-Kruger effect. The Dunning-Kruger effect is when incompetent or somewhat unskilled people think they are more skilled than they are. As the article quotes, “incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are.”
  • Backfire effect. The backfire effect is when people react to disconfirming evidence by strengthening their beliefs: if an analysis of something you or your company is doing comes back negative, you or your colleagues refuse to accept the results.
  • Bandwagon effect. The bandwagon effect is what you would assume, the tendency to do things because many other people are doing them. People will rally around a cause, an idea, a candidate, a variation, or a strategy simply because it is popular.
  • Endowment effect. The endowment effect is how people value items they own more than they would if they viewed the item objectively. Somebody might refuse to sell their used car for $10,000, yet would not pay $8,000 if a dealer offered them the same car.

Rather than focusing on any single bias, the key takeaway is that people often do not make optimal or rational decisions. When relying on intuition or common sense, the data that could offset these biases is neglected.

Common sense usually forgets Bayes' Theorem

Related to the biases people experience when relying on intuition is the inability to process statistics well. I have written frequently about Bayes' Theorem and how people often infer the incorrect probability of a certain result. Bayes' Theorem is a rigorous method for interpreting evidence in the context of previous experience or knowledge. Bayes' Theorem transforms probabilities that look useful (but are often not) into probabilities that are useful. It is important to note that it is not a matter of conjecture; by definition, a theorem is a mathematical statement that has been proven true. Denying Bayes' Theorem is like denying the theory of relativity.

By way of an example, I will repeat one I used in 2014. Say you wake up with spots all over your face. You rush to the doctor, and he says that 90 percent of the people who have smallpox have the symptoms you have. Since smallpox is often fatal, your first inclination may be to panic. Rather than freak out, you then ask your doctor what the probability is that you actually have smallpox. He responds 1.1 percent (or 0.011). Although still not great news, that is much better than 90 percent, and more importantly it is useful information. The difference comes from the base rate: smallpox is extremely rare, so even with suggestive symptoms, the probability that you have it remains low.
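
For readers who want to see the arithmetic, here is a minimal sketch of that calculation. The 90 percent likelihood comes from the example; the prior (the base rate of smallpox) and the overall symptom rate are illustrative assumptions chosen to reproduce the 1.1 percent answer, not real epidemiology.

```python
# Bayes' Theorem: P(disease | symptoms) =
#     P(symptoms | disease) * P(disease) / P(symptoms)
# The 0.9 likelihood is from the example; the prior and overall symptom
# rate below are assumed values picked to reproduce the ~1.1% posterior.

p_symptoms_given_smallpox = 0.9  # 90% of smallpox cases show these spots
p_smallpox = 0.001               # assumed prior: 1 in 1,000 people
p_symptoms = 0.081               # assumed overall rate of these symptoms

posterior = p_symptoms_given_smallpox * p_smallpox / p_symptoms
print(f"P(smallpox | symptoms) = {posterior:.3f}")  # ~0.011, i.e., 1.1%
```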

The key here is that people do not understand the actual likelihood that something will occur. In this case, your intuition might say you have smallpox. If this then prompts you to fly to a smallpox clinic, you probably made a bad decision. The same happens professionally, where people make decisions by inferring the wrong probability of a possible outcome.

When to use your intuition?

NEVER. Intuition is an excuse to make decisions without data or without putting in the work to look at the data. If your intuition is actually based on past experience, then that experience is data and you should look at it as a data point (but only one data point, and one that is not more valid simply because you experienced it personally). To make good decisions, you should review the data and make the decision that maximizes the probability of the optimal outcome. You also need to just say no when somebody asks you to make an intuition-based decision; you can help provide data, but you should never make the decision based on what your gut says.

Key takeaways

  • Intuition is a very flawed way of making decisions but is often the default, particularly for leaders with extensive experience. It is actually a way to ignore data (or avoid the work of collecting it) and leads to poor decisions and priorities.
  • Intuition and common sense were often the justification for some of the worst decisions in history, from slavery to the protectionist Smoot-Hawley Tariffs.
  • Intuition should never be used to make decisions. Instead, spend time collecting and analyzing data and deriving the decisions with the highest probability of a positive outcome.

Posted on April 16, 2019 by Lloyd Melnick | Categories: Analytics, General Social Games Business, General Tech Business | Tags: bias, Common Sense, decision making, Intuition

The risk of status quo bias

One of the most dangerous, and common, biases in our decision making is status quo bias, popularized by Nobel Prize winning economist Richard Thaler. This bias, also commonly called inertia, prompts people to prefer that things stay the same, either by doing nothing or by sticking with a previous decision. It becomes a problem when the expected value of a change, one that may only have small transition costs, is higher than the reward for sticking with the status quo. It is also a considerable problem with big decisions, where the benefits of change could be quite substantial.

Why is there a Status Quo Bias

People do not intentionally make sub-optimal decisions, so that leaves the question of why status quo bias is so prevalent. First is the concept of loss aversion: people place a higher value on avoiding a loss than on acquiring a gain. Many people would rather avoid losing $5 than win $10, and would not take a bet with a 50 percent chance of either outcome, even though over time you would be much better off taking the bet. Status quo bias is tied to loss aversion because by diverging from the status quo, you often run the risk of losing something you currently have (even if the expected outcome is better).
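
The arithmetic behind why refusing that bet is sub-optimal is a one-line expected value calculation, sketched below with the stakes from the example:

```python
# Expected value of a 50/50 bet: win $10 or lose $5.
p_win, win_amount = 0.5, 10.0
p_lose, loss_amount = 0.5, 5.0

expected_value = p_win * win_amount - p_lose * loss_amount
print(f"EV per bet: ${expected_value:.2f}")  # +$2.50, so take the bet
```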

Second is the concept of sunk cost. A sunk cost is a cost that has already been incurred and thus cannot be recovered. It should not enter into your decision-making process because this cost will be the same regardless of the outcome. You should only look at the new costs you would incur versus the expected benefit, thus the ROI on the new, not total, costs. Sunk cost is intertwined with status quo bias because changing direction can negate previous investments, even if the expected outcome is better than sticking with past decisions.
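
As a small worked illustration of that ROI logic, with entirely made-up numbers: suppose a project has already consumed $500,000 and needs $100,000 more to finish, with an expected return of $150,000.

```python
# Sunk costs are identical across all choices, so only incremental
# costs and benefits should drive the decision. Numbers are illustrative.
sunk_cost = 500_000       # already spent; unrecoverable either way
remaining_cost = 100_000  # what finishing the project still requires
expected_return = 150_000

# Correct framing: compare expected return with the remaining cost only.
incremental_net = expected_return - remaining_cost  # +50,000 -> finish it

# Faulty framing: judging against total spend makes the project look
# like a loser and invites a bad decision.
total_net = expected_return - (sunk_cost + remaining_cost)  # -450,000

print(f"Incremental net: {incremental_net:+,}")  # the only number that matters
```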

Third is the concept of commitments. Diverging from the status quo could force people to withdraw from previously made commitments. Individuals are likely to keep commitments to avoid reputation damage or cognitive dissonance. In the latter case, breaking with committed strategy would be subconsciously inconsistent with the initial commitment to the strategy or product and the reasoning that drove the commitment.

Let’s not forget politics

One other area that drives status quo bias, particularly in a corporate setting, is politics. People are reluctant to pursue a strategy or product that breaks from existing dogma because they fear that if a change they support fails, it will be blamed on them, while the benefits of a successful shift will not be directly attributed to them. They are thus making a rational, albeit sub-optimal, decision not to support changes to the status quo.

When does it happen

There are many situations where Status Quo Bias leads to sub-optimal decision making. One area is product changes, particularly to a successful product, as the product managers or designers are reluctant to change something that is working even if the new option would be better. A car manufacturer may be reluctant to change the styling on a popular model even if overall tastes are changing. By not making the change, in the long run they will lose market share. A game designer may not want to change the user experience for fear of alienating current players but a new design could make the product much more appealing to new players and also generate more revenue long-term from existing customers.

Another area where Status Quo Bias has a destructive effect is on business models. In the video game industry, many successful game companies rejected the free-to-play model because they made millions, or even billions, of dollars based on their existing business model even though free-to-play was gaining share at a rapid rate. Now companies like THQ no longer exist because of Status Quo Bias.

New products are another area where Status Quo Bias leads to sub-optimal decisions. Companies may not introduce a new product because they fear it will negatively impact their existing products, even though the net impact would be positive. Conversely, a company that has invested significantly in a new product may continue to invest in it even if testing shows it will be a failure because they do not want to change the decision to pursue that product strategy.

How to avoid status quo bias

Admitting you have a problem is the first step in eliminating any bias, including Status Quo Bias. Recognizing there is a bias favoring inertia allows you to look at decisions more objectively. You should then focus on choosing the path that leads to the highest expected value, whether or not it represents a change.

Key takeaways

  • Status Quo Bias is when you make decisions to avoid change even when the change would have a positive expected value.
  • People often prefer the status quo because of an aversion to losses (they weigh losing something they already have more heavily than making an equivalent gain), sunk costs and previous commitments, while internal office politics also drive sub-optimal decisions.
  • When making decisions, you should look objectively at optimizing expected value, whether that value comes from something new or old.

Posted on April 24, 2018 by Lloyd Melnick | Categories: General Social Games Business, General Tech Business | Tags: decision making, Richard Thaler, status quo bias

Thinking, Fast and Slow, Part 3: The Invisible Gorilla

I have written several times about the work of Kahneman and Tversky, highlighted in the book Thinking, Fast and Slow, and how helpful it is in understanding decision-making and consumer behavior. One of the most enlightening experiments highlighted by Kahneman, the Invisible Gorilla experiment (conducted by Christopher Chabris and Daniel Simons), shows the difference between tasks that require mental focus and those we can do in the background.

The Invisible Gorilla experiment

In this experiment, people were asked to watch a video of two teams passing basketballs, one team wearing white shirts and the other wearing black. Viewers were asked to count the number of passes made by members of the white team while ignoring the players wearing black.

This task is difficult and absorbing, forcing participants to focus on it. Halfway through the video, a gorilla appears, crosses the court, thumps its chest and then continues across and off the screen.

The gorilla is in view for nine seconds. Half of the people viewing the video do not notice anything unusual when asked later (that is, they do not notice the gorilla). It is the counting task, and especially the instruction to ignore the black team, that causes the blindness.

While entertaining, the experiment also offers several important insights:

  • One important insight is that nobody would miss the gorilla if they were not doing the task. When you are focusing on a mentally challenging task, whether counting passes, doing math or shooting aliens, you do not notice other actions, nor can you focus on them.
  • A second insight is that we do not realize the limitations we face when focused on one task. People are sure they did not miss the gorilla. As Kahneman writes, “we are blind to our blindness.”

System 1 and System 2

The Invisible Gorilla also serves as a framework for understanding the two systems people use to think. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. An example of System 1 thinking would be taking a shower (for an adult), where you do not even think about what you are doing.

System 2 thinking is deliberate, effortful and orderly: slow thinking. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away.

The automatic operation of System 1 generates surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps.

Implications

Understanding System 1 and System 2 has several implications. First, if you are involved in an activity requiring System 2 thought, do not try to do a second activity requiring System 2 thought. While walking and chewing bubble gum are both System 1 for most people and can be done simultaneously, negotiating a big deal and typing an email are both System 2 and should not be done at the same time.

Second, do not create products that require multiple System 2 actions concurrently. While System 2 is great for getting a player immersed in a game, asking them to perform two System 2 tasks at once will create a poor experience. A third implication is that when onboarding someone to your product, you should only expose them to one System 2 activity at a time.

Example from our world, Urbano’s Failed App

I like to use examples from the game space to illustrate how understanding Kahneman and Tversky's work can impact your business. In this example, Urbano runs product design for a fast-growing app company at the intersection of digital and television. He has built a great sports product that allows players to play a very fun game while watching any sporting event on television. Unfortunately, Urbano's company is running out of funds and the next release needs to be a hit or else they will not survive. Although the product has tested well, Urbano is nervous because of the financial situation and decides to add more to the product, making the game depend on what happened during the past three minutes of the televised match. They launch the app, and although players initially start playing, they never come back and the product fails.

Another company buys the rights to the product and conducts a focus test. They find out users forgot what happened on television because they were focusing on the app, and then could not complete the game. They take out the part requiring attention to the televised match and the product is a huge success. The difference is that the new version did not require multiple System 2 activities simultaneously; it left television watching as a System 1 activity.

Key Takeaways

  1. In a famous experiment, people watching a basketball video who had to count the passes one team made missed the appearance of a gorilla in the video. The experiment showed that when you are focusing on something, you do not notice what else is happening.
  2. We are blind to things in the background. We are blind to our blindness. In the Invisible Gorilla experiment, not only did people not see the gorilla, they refused to believe that they had missed a gorilla.
  3. There are two types of mental activities: System 1, which is automatic and reflexive, and System 2, which requires deliberate, effortful and orderly thinking.

Posted on May 31, 2017 by Lloyd Melnick | Categories: General Social Games Business, General Tech Business, thinking fast and slow | Tags: Amos Tversky, attention span, Daniel Kahneman, decision making, Invisible gorilla, Product design, thinking fast and slow

Thinking, Fast and Slow, Part 2: Why we should care about judgment errors

As promised last month, I will spend a few blog posts summarizing Thinking, Fast and Slow by Daniel Kahneman. Before diving into the heuristics, biases and processes that he and his colleague Amos Tversky identified, it is important to understand why he wrote the book and why it is so useful. Fortunately, he largely explains this in his introduction, so it is a great place to start.

Let’s not beat ourselves up but make ourselves better

First, Kahneman points out that the goal of his research is not to prove we are idiots but to help us minimize bad decisions. Understanding the flaws in human decision-making no more denigrates people than writing about diseases in a medical journal belittles good health. Rather, our decision-making is generally quite good, and most of our judgments are appropriate most of the time, but there are systematic biases that, once we understand them, we can correct to make our decision-making more effective.

By understanding Kahneman’s work, you will be better able to identify and understand errors of judgment and choice, in others and then yourself. As Kahneman points out, “an accurate diagnosis may suggest an intervention to limit the damage that bad judgments and choices often cause.”

Is our judgment flawed?

At its roots, Kahneman began his career trying to determine if people consistently made biased judgments. Long story short, we do.

One example drove this determination home to Kahneman and Tversky and probably will for you as well. Early in their collaboration, both Kahneman and Tversky realized they made the same “silly” prognostications about the careers toddlers would pursue when they became adults. They both knew an argumentative three-year-old and felt it was likely that he would become a lawyer, that the empathetic and mildly intrusive toddler would become a psychotherapist, and that the nerdy kid would become a professor. They, both smart academics, neglected the base rates (very few people become psychotherapists, professors or even lawyers compared with other professions) and instead believed the stories in their heads about who ended up in which careers. This realization drove their ensuing research: we are all biased.

How understanding Kahneman’s work impacts the world

The broad impact of Kahneman and Tversky's work drives home its importance to everyone. When they published their first major paper, it was commonly accepted in academia that:

  1. People are generally rational, and their thinking is normally sound
  2. Emotions such as fear, affection and hatred explain most of the occasions on which people depart from rationality

Not only did these two assumptions drive academia (particularly economics and the social sciences), but their acceptance also often drove business and government decisions. The work laid out in Thinking, Fast and Slow, however, disproved these two assumptions and thus led to entirely different, and better, decisions.

Scholars in a host of disciplines have found the work useful and have leveraged it in fields such as medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics and military strategy. Kahneman cites an example from the field of public policy. His research showed that people generally assess the relative importance of issues by the ease with which they are retrieved from memory, and this is largely determined by the extent of coverage in the media. This insight now drives everything from election strategy to understanding (and countering) how authoritarian regimes manipulate the populace.

Kahneman and Tversky were also careful to ensure the subjects of their experiments were not simply university students. By using scholars and experts as subjects, thought leaders gained an unusual opportunity to observe possible flaws in their own thinking. Having seen themselves fail, they became more likely to question the dogmatic assumption, prevalent at the time, that the human mind is rational and logical. I found the same myself and am confident that you will also. The idea that our minds are susceptible to systematic errors is now generally accepted.

Why it is called Thinking, Fast and Slow

Slide1

While Kahneman and Tversky’s early work focused on our biases in judgment, their later work focused on decision-making under uncertainty. They found systematic biases in our decisions that consistently violated the rules of rational choice.

Again, we should not discount our decision-making skills. There are many examples of experts who can make critical decisions quickly, from a chess master who can identify the strongest next moves on a board as he walks by, to a fireman who knows which areas to avoid in a burning building.

What Kahneman and Tversky identified, though, is that while this expertise is often credited as good decision making, it is more a matter of retrieving information from memory. The situation serves as a cue or trigger for the expert to retrieve the appropriate answer.

This insight helps us avoid a problem where our experience (which we consider intuition) does not actually help but hinders. In easy situations, intuition works. In difficult ones, we often answer the wrong questions. We answer the easier question, often without noticing the substitution.

If we fail to come to an intuitive solution, we switch to a more deliberate and effortful form of thinking. This is the slow thinking of the title. Fast thinking covers both expert intuition and heuristic shortcuts.

Example from our world, The Allan Mistake

Many of my readers are experienced “experts” from the mobile game space, so I will start with a hypothetical example that many of us can relate to. In this example, Allan is the GM of his company’s puzzle game division. He has been in the game industry over twenty years and has seen many successful and failed projects. The CEO, Mark, comes to Allan and says they are about to sign one of three celebrities to build a game around.

Allan knows the demographics of puzzle players intimately and identifies the one celebrity who is most popular with Allan’s target customers. Nine months later they launch the game and it is an abysmal failure. Allan is terminated and wonders what he did wrong.

Allan then looks over his notes from when he read Thinking, Fast and Slow, and realizes his fundamental mistake. When Mark came to him and asked which celebrity to use, Allan took the easy route and analyzed the three celebrity options. He did not tackle the actual question: whether it was beneficial to use a celebrity for a puzzle game at all, and only if the answer was yes, which of the three to pick. If he had answered the more difficult question (difficult also because it would have set him against Mark), he would have found that celebrity puzzle games are never successful, regardless of the celebrity. Although it may have created tension with Mark at the time, he probably would have been given an opportunity to create a game with a higher likelihood of success and would still be in his position.

Key takeaways

  • Our decision making is not bad, but by understanding our systematic biases we can make it more effective.
  • Understanding that people are not always rational, and that this irrationality is not driven by emotion, allows us to make better decisions in fields as diverse as medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics and military strategy.
  • Fast thinking refers to quick decisions and judgments based on our experience while slow thinking is the analysis of difficult questions.

Posted on May 3, 2017 by Lloyd Melnick | Categories: General Social Games Business, General Tech Business, thinking fast and slow | Tags: Amos Tversky, Daniel Kahneman, decision making, thinking fast and slow

Thinking, Fast and Slow, Part 1: The Linda Problem

I recently finished Michael Lewis’ most recent book, The Undoing Project: A Friendship that Changed the World and it motivated me to revisit Daniel Kahneman’s Thinking, Fast and Slow. Lewis’ book describes the relationship between Daniel Kahneman and Amos Tversky, two psychologists whose research gave birth to behavioral economics, modern consumer behavior theory and the practical understanding of people’s decision making. He explains the challenges they faced and the breakthroughs that now seem obvious.

As I mentioned, The Undoing Project reminded me how important Kahneman's book was, probably the most important book I have ever read. It has helped me professionally, both in understanding consumer behavior and in making better business decisions. It has helped me in my personal life, again through better decision making in everything from holiday choices to career moves. It even helps explain the election of Donald Trump or how the situation in North Korea has developed.

In The Undoing Project, two things drove home the importance of Kahneman's work. First, despite being a psychologist, Kahneman won the Nobel Prize in Economics in 2002. It is difficult enough to win a Nobel Prize (I am still waiting for the call), but to do it in a field that is not your own is amazing. The second item that proved the value of Kahneman's (and his colleague Amos Tversky's) work was the Linda Problem. I will discuss this scenario later in this post, but the Linda Problem showed how people, myself included, do not make rational decisions. It convinced the mainstream that people, including doctors and intellectuals, consistently make irrational decisions.

Despite the value I derived from Thinking, Fast and Slow, I never felt I learned all I could from it. I found it very difficult to read, the exact opposite of a Michael Lewis book, and did not digest all the information Kahneman provided. Even when I recommend the book to friends, I often caveat the recommendation with a warning that it will be hard to get through.

Given the importance of Kahneman’s work and the challenge I (and probably others) have had in fully digesting Thinking, Fast and Slow, I will be writing a series of blog posts, each one summarizing one chapter of Kahneman’s book. I hope you find it as useful as I know I will.

The Linda Problem

As discussed above, the Linda Problem is the research by Kahneman and Tversky that largely proved people think irrationally, or at least do not understand logic. While I normally like to paraphrase my learnings or put them into examples relevant for my audience, in this case it is best to show the relevant description from The Undoing Project, as the Linda Problem was a scientific study that I do not want to misrepresent:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Linda was designed to be the stereotype of a feminist. Danny and Amos asked: To what degree does Linda resemble the typical member of each of the following classes?

  1. Linda is a teacher in elementary school.
  2. Linda works in a bookstore and takes Yoga classes.
  3. Linda is active in the feminist movement.
  4. Linda is a psychiatric social worker.
  5. Linda is a member of the League of Women Voters.
  6. Linda is a bank teller.
  7. Linda is an insurance salesperson.
  8. Linda is a bank teller and is active in the feminist movement.

Danny [Kahneman] passed out the Linda vignette to students at the University of British Columbia. In this first experiment, two different groups of students were given four of the eight descriptions and asked to judge the odds that they were true. One of the groups had “Linda is a bank teller” on its list; the other got “Linda is a bank teller and is active in the feminist movement.” Those were the only two descriptions that mattered, though of course the students didn’t know that. The group given “Linda is a bank teller and is active in the feminist movement” judged it more likely than the group assigned “Linda is a bank teller.” That result was all that Danny and Amos [Tversky] needed to make their big point: The rules of thumb people used to evaluate probability led to misjudgments. “Linda is a bank teller and is active in the feminist movement” could never be more probable than “Linda is a bank teller.” “Linda is a bank teller and active in the feminist movement” was just a special case of “Linda is a bank teller.” “Linda is a bank teller” included “Linda is a bank teller and activist in the feminist movement” along with “Linda is a bank teller and likes to walk naked through Serbian forests” and all other bank-telling Lindas.

One description was entirely contained by the other. People were blind to logic. They put the Linda problem in different ways, to make sure that the students who served as their lab rats weren’t misreading its first line as saying “Linda is a bank teller NOT active in the feminist movement.” They put it to graduate students with training in logic and statistics. They put it to doctors, in a complicated medical story, in which lay embedded the opportunity to make a fatal error of logic. In overwhelming numbers doctors made the same mistake as undergraduates.

The fact that almost everyone made the same logic mistake shows how powerful this understanding is. It proves that our judgment, and thus our decision making, is often not logical. This understanding helps explain many things in life and business that sometimes do not seem to make sense.
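
The underlying rule can be stated in one line: for any two events, the probability of both occurring together can never exceed the probability of either one alone. A minimal sketch, with probability values invented purely for illustration:

```python
# Conjunction rule: P(A and B) <= P(A), always.
# The numbers are made up; the point is the inequality, not the values.
p_bank_teller = 0.05             # P(Linda is a bank teller), assumed
p_feminist_given_teller = 0.30   # P(feminist | bank teller), assumed

p_teller_and_feminist = p_bank_teller * p_feminist_given_teller  # 0.015
assert p_teller_and_feminist <= p_bank_teller  # holds for ANY probabilities

print(p_bank_teller, p_teller_and_feminist)  # 0.05 vs 0.015
```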

The implications

Once you understand how our judgment is biased, it can help you make better decisions. It can also provide insights into how your customers view different options and why people behave as they do. In future posts, I will explore all of Kahneman and Tversky’s major findings and how they apply.

Key Takeaways

  • In the Undoing Project, Michael Lewis writes about the relationship and research of Daniel Kahneman and Amos Tversky, two psychologists who changed the way we understand decision making
  • The Linda Problem proved to the non-believers that people make illogical judgments. When given a story about a fictional person and then potential careers for that person, virtually everyone (from students to very successful professionals) judged a persona that was a subset of a broader persona as more likely, which is logically impossible.
  • By understanding how people make judgments and decisions, we can improve our own decision making process and better understand our friends, family and customers.

Posted on April 19, 2017 by Lloyd Melnick | Categories: General Social Games Business, General Tech Business, Lloyd's favorite posts, thinking fast and slow | Tags: Amos Tversky, Daniel Kahneman, decision making, Linda Problem, michael lewis, thinking fast and slow

How to manage your own biases

I have always been interested in decision making and how people often are not logical, not only in their preferences but even in how they remember and look at facts. The most useful book I ever read was Thinking, Fast and Slow by Daniel Kahneman (I highly recommend it if you have not read it yet), and one of my favorite academics is behavioral economist Dan Ariely. Not only does Kahneman's and Ariely's research help you understand consumer behavior, it helps you understand your own decision making and, most importantly, the mistakes most of us make.

A recent guest blog post on the Amplitude Blog, 5 Cognitive Biases Ruining Your Growth, does a great job of describing five biases that can greatly impact your business. While I will try to avoid just repeating the blog post, below are the five biases and some ways they may be impacting you:

  1. Confirmation bias. Confirmation bias is when you interpret or recall information in a way that confirms your preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities. This bias occurs regularly in the game space, especially with free-to-play games.

    A product manager may have driven a new feature, maybe a new price point on the paywall. Rather than running an A/B test (maybe there was insufficient traffic or other changes going on), they review the feature pre and post launch. Game revenue per user increased 10 percent, so they create a PowerPoint and email the CEO that their new feature had a 10 percent impact. Then the company adds this feature to all its games. The reality is that at the same time the feature was released, the marketing team stopped a television campaign that was attracting poorly monetizing players; the latter is actually what caused the change in revenue. As someone who has known a lot of product managers, I can confirm this bias exists in the real world. (A sketch of the kind of A/B test that avoids this trap follows this list.)

  2. The narrative fallacy. People try to comprehend information as stories; rather than looking at just the facts, they create a story that links them together even if there is no real link. If you watch business news, when the stock market goes up 5 points, the narrative may be that the market has rebounded from its Brexit blues. If the market goes down 5 points, the story would be that the market is still suffering from Brexit. The reality is that 5 points is statistically insignificant (the market is an aggregate of multiple stocks), so neither narrative is more likely in either scenario. The key issue here is that we attribute causation where there is none. Here is an example from the game world.

    Two branded games are in the top 5 of new releases. All of the analysis says that branded games are now what customers are looking for. The reality is that the two games, totally unrelated, had strong mechanics and were simply among the lucky 10% of games that succeed. If you allow the narrative fallacy to win, however, you put your resources into branded games, which are no more popular than before the launch of the two successful titles.

  3. Dunning-Kruger Effect. Before the Amplitude post, I had not heard of this bias, at least by this name, but once you read about it I am sure you will recognize cases of it. The Dunning-Kruger Effect is when incompetent or somewhat unskilled people think they are more skilled than they are. As the article quotes, “incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are.”

    Again, an example from the game industry. Let's say you want to port your game to a new VR platform. You go to your development team and they say it won't be a problem. You sign up for the project and give them the specs; six months later they still cannot get the game to run on the VR platform because they have no idea how to develop for VR (this is a nicer example than some others I can remember).

  4. Backfire effect. The backfire effect is after analyzing something that you or your company are doing, if the results are negative and the action was bad, you or your colleagues refuse to accept the results. As they write in the blog post, “the exact definition of the Backfire Effect [is]: ‘When people react to disconfirming evidence by strengthening their beliefs.’”

    As an example, you decide to analyze how your company has been calculating LTV. You look back at the analysis done over the last two years and see how actual LTV tracked against the projections made at the time. You discover that you underestimated actual spend by 50 percent. This should be great news, as it would allow you to ramp up your user acquisition dramatically. Instead, when you present this data to your analytics team, they refuse to accept it, saying your analysis is flawed because you are not looking at the right cohorts.

  5. Bandwagon effect. The bandwagon effect is what you would assume, the tendency to do things because many other people are doing it. People will rally around a cause, an idea, a candidate, a variation, or a strategy simply because it is popular.

    Given that I want to keep this blog post under 500 GB, I will not list all the examples of the bandwagon effect I have seen in the game industry. Product strategy, however, is the most obvious culprit. When the free-to-play game industry started to move to mobile, everyone started porting their Facebook games over to mobile. Since Zynga and the other big companies were doing it, all of the smaller companies as well as newly funded ones also tried to bring the same core mechanics from Facebook over to mobile. Mechanics that worked on Facebook, however, did not work on mobile, but companies continued doing it because everyone else was. Rather than identifying the market need and a potential blue ocean, companies just joined the bandwagon.
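
As referenced in the confirmation bias example above, the antidote to pre/post comparisons is a controlled experiment where both variants run at the same time. Below is a minimal sketch of a two-proportion significance test for a hypothetical paywall price change; the conversion counts are invented for illustration, and in practice you would also want a power calculation before running the test.

```python
# A hypothetical A/B test for a new paywall price point.
# Both arms run simultaneously, so an external shock (like a cancelled
# TV campaign) hits both equally and cannot masquerade as a feature effect.
from statsmodels.stats.proportion import proportions_ztest

conversions = [310, 262]    # payers: [new price, control]; invented numbers
players = [10_000, 10_000]  # users exposed to each arm

z_stat, p_value = proportions_ztest(conversions, players)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# If p is below your significance threshold (e.g., 0.05), the lift is
# unlikely to be noise; a pre/post comparison offers no such protection.
```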

Avoid these biases

The key to making the right decisions is not to assume you are free of biases, but to be diligent in reviewing your decisions and making sure you are thinking rationally. All of these biases can lead to personal or company failure, so the inability to identify them can have extreme consequences.

Key takeaways

  1. Understanding our biases allows us not only to understand our customers but also to make better decisions.
  2. A core bias you see in the game industry is confirmation bias, where someone looks at data to prove their hypothesis (or brilliance), even if the data does not really support it.
  3. Another critical bias is the narrative fallacy, where we create a story to explain an event even if the story is not the cause of the event.

Posted on March 29, 2017 by Lloyd Melnick | Categories: General Social Games Business, General Tech Business, Uncategorized | Tags: bias, decision making

Bayes’ Theorem Part 1: Why Bayes’ Rule is the key to good decision making and success

Making the right decision, in business and in life, is the most important thing you can do. Wrong decisions can haunt you your entire life, while the right decision can mean a company worth billions, years of happiness, etc. Imagine if Travis Kalanick, CEO of Uber, had decided to focus on connecting buses with passengers rather than taxis, or if Trip Hawkins had focused 3DO on creating software rather than a hardware platform. Understanding Bayes' Theorem (also known as Bayes' Rule, two terms I will use interchangeably) increases the chance that you use data the right way to make your decisions.

This post is the first in a series I will be writing on Bayes' Rule. This post, and most of the background I discuss, is based on the best book I have found about Bayes' Rule, Bayes' Rule: A Tutorial Introduction to Bayesian Analysis by James Stone. Last year, I wrote several posts on Lifetime Value (LTV), given how crucial it is to the success of any business, from the newest technology to the oldest brick and mortar enterprise. This year, we will be tackling Bayes' Theorem. As you will see in the next few posts, by understanding Bayes' Theorem you can make optimal decisions about what games or projects to green light, how to staff your company, what to invest in, which technology to use, who to sell your company to, what areas of your company need to be fixed or improved, etc. Bayes' Theorem is the single most important rule for good decision-making, in both your professional and personal life.

What is Bayes’ Theorem?

Bayes’ Theorem is a rigorous method for interpreting evidence in the context of previous experience or knowledge. Bayes’ Theorem transforms probabilities that look useful (but are often not) into probabilities that are useful. It is important to note that it is not a matter of conjecture; by definition, a theorem is a mathematical statement that has been proven true. Denying Bayes’ Theorem is like denying the theory of relativity.

Some examples of Bayes’ Rule

The best way to understand Bayes’ Rule is by example (I will touch on the math later). Again, much of this is based on Stone’s tutorial on Bayesian analysis. First, look at probability as the informal notion based on the frequency with which particular events occur. If a bag has 100 M&Ms, and 60 are green and 40 are red, the probability of reaching into the bag and grabbing a green M&M is the same as the proportion of green M&Ms in the bag (i.e., 60/100 = 0.6). From this, it follows that any event can take a value between zero and one, with zero meaning it definitely will not occur and one meaning it definitely will occur. Thus, given a series of mutually exclusive events, such as the outcomes of choosing an M&M, the probabilities of those events must add up to one.
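
A minimal sketch of that frequency view of probability, using the M&M counts from the example:

```python
import random

# 100 M&Ms: 60 green, 40 red. Probability as long-run frequency.
bag = ["green"] * 60 + ["red"] * 40

p_green = bag.count("green") / len(bag)  # 0.6, by proportion
p_red = bag.count("red") / len(bag)      # 0.4
print(f"P(green) = {p_green}")

# Mutually exclusive outcomes sum to one: P(green) + P(red) = 1.
assert p_green + p_red == 1.0

# Empirically, the draw frequency converges on 0.6.
draws = [random.choice(bag) for _ in range(100_000)]
print(draws.count("green") / len(draws))  # ~0.6
```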

Posted on January 21, 2014 by Lloyd Melnick | Categories: Analytics, Bayes' Theorem, General Social Games Business, Lloyd's favorite posts | Tags: analytics, Bayes' Rule, Bayes' Theorem, decision making

How to know who is good and who is bad

Early in my professional career, a colleague gave me a piece of advice: you see who truly is good and who is not when times are difficult. Of all the “sage wisdom” I have received over the years, this advice is the one that has been proven right again and again. It is an invaluable tool for better understanding your colleagues, investors, Board, business partners, customers and vendors. I have passed this advice on to friends recently and felt it was worth blogging about, as you will not only find it very useful in understanding others, but it will also tell you a lot about yourself.

The underlying principle is that it is easy to do the right thing when everything is going well, but you see a person's true character in how they act in difficult times. Many people seem great when they do not have cash flow issues, when their company is hitting or exceeding its targets, etc. They will often talk about win-win relationships and seem great to work with.

Most of these people, however, show their true colors when the cost of doing good becomes significant. If doing the appropriate thing takes money out of their pocket, hampers their career, potentially risks new investment, makes them look weak, etc., many will take the path of least resistance and do things you would not expect of them. Following are some (but definitely not all) situations that provide insight into the true character of the people you associate with:

Posted on December 4, 2012 by Lloyd Melnick | Categories: General Social Games Business, Lloyd's favorite posts | Tags: business ethics, decision making
