The Business of Social Games and Casino

How to succeed in the mobile game space by Lloyd Melnick

Tag: bias

People Analytics for Online Gaming

Last month, I wrote about applications online gaming companies can borrow from operations analytics, which is used primarily by traditional and retail businesses. A course on People Analytics from Wharton showed how this area of analytics can also improve our businesses. While people analytics is often the domain of HR professionals, it holds valuable lessons for managers across tech businesses (many of whom do not have robust HR teams). Below are the most important takeaways from the course.

Identifying the noise and improving performance evaluations

A critical role for any leader or manager is accurately evaluating the performance of your employees. Accuracy is important because it ensures you provide useful feedback that helps people improve, helps you put the right people in the correct roles, and identifies the skills needed for success in specific functions.

The fundamental challenge in performance evaluation is that performance measures are very noisy; many possible outcomes are outside the employee's control. The challenge is separating skill and effort from luck so that you understand true performance.

In the course, the instructors highlight how often people confuse skill with luck. They start with an example from sports, showing that professional American football teams' ability to draft players (select them out of university) is largely luck. While some teams have had strings of success, success in one year has no predictive power for success in future years. If skill were a key factor, you would expect a team to repeat its success.

It also holds true with investment analysts. An analyst who has a great year is no more likely to have above market results the next year than one of the poorest performing analysts.

There are many reasons we confuse this luck with skill:

  • Interdependence. I have found that a humbling amount of our work depends on other people: if they are great, we look great; if they are not, we look bad. You should not attribute to an individual a result that occurs at the group level; in these cases, performance should be evaluated as a group. Conversely, reliable individual evaluation requires seeing people on other teams (for example, Tom Brady's play on the Buccaneers helps assess whether his earlier performance was due to him or his environment).
  • Outcome bias. We tend to believe good things happen to those who work hard and judge by outcome, not by process.
  • Reverse causality. When we see two correlated factors, we tend to believe one caused the other. In reality, there may be no causality, or it may run in the other direction. This leads us to see things that do not exist and can prompt us to give people undeserved credit or blame. One example cited in the course was research showing that charisma did not affect whether a CEO was successful, but successful leaders were considered more charismatic.
  • Narrative seeking. We want to make sense of the world and tell a causal story.
  • Hindsight bias. Once we have seen something occur, it is hard to admit we did not see it coming. We rewrite in our minds the history of the process.
  • Context. We tend to neglect context when evaluating performance. We over-attribute performance to personal skills and under-attribute it to environmental factors such as the difficulty of the problem the employee faced, the quality of their team, etc. In psychology, this is referred to as the Fundamental Attribution Error: crediting or blaming personality traits for results that were driven by the situation.
  • Self-fulfilling prophecies. People tend to perform consistent with expectations: high expectations increase performance, low expectations decrease performance.
  • Small samples. Small samples lead to greater variation; what we see in a small sample may not be representative of the larger population (the simulation sketch after this list makes this concrete).
  • Omitted variable bias. There could be an additional factor driving both the performance and what we think is causing the performance. For example, we may think higher compensation is leading to better performance. The truth might be that extra effort is causing both the higher compensation and the superior performance, so the key variable (effort) has been omitted.
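
To see how easily small samples turn luck into apparent skill, here is a minimal simulation sketch (all numbers are illustrative assumptions, not from the course): every employee has identical true skill, yet a year of only a few projects still produces apparent stars and laggards.

```python
import random

random.seed(7)

NUM_EMPLOYEES = 20     # illustrative team size
TRUE_SKILL = 0.5       # everyone has the same chance of a project succeeding
PROJECTS_PER_YEAR = 4  # the small sample a yearly review actually sees

# Observed success rate per employee over one review period.
scores = [
    sum(random.random() < TRUE_SKILL for _ in range(PROJECTS_PER_YEAR))
    / PROJECTS_PER_YEAR
    for _ in range(NUM_EMPLOYEES)
]

# Identical skill, yet observed "performance" ranges widely.
print(f"best observed rate:  {max(scores):.2f}")   # often 1.00
print(f"worst observed rate: {min(scores):.2f}")   # often 0.00
```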

When you are evaluating performance, there are several tools to improve your accuracy. You need to focus on the process the employee (or potential employee) followed rather than only the outcome; we normally omit almost 50 percent of the objectives that we later identify as relevant to success, so you should look at a much broader set of objectives that impact the business. This means determining what increases the likelihood of superior performance: beyond the traditional outcomes, there are often four or five things that may not be obvious but contribute to overall success. A few years ago, I wrote how one basketball player (Shane Battier) was much more valuable than many players who scored more points or otherwise had flashier statistics; the same holds true in traditional business.

You need to look carefully at the job and understand what drives success. Define success not only by outcomes but by how well these factors predict other KPIs: attrition, rate of promotion, etc. In the course, they also point out that what works for one role or company does not necessarily work for others. Google found that GPA was an awful predictor of performance, but for Goldman Sachs it is the gold standard for who will be successful.

Additional ways to improve performance evaluation include:

  • Broaden the sample. Add additional opinions, more performance metrics, different projects and assignments. The key is to use diverse, uncorrelated signals.
  • Find and create exogenous variation. The only truly valid way to tease out causation is to control an employee's environment. Have the employee change teams, reporting lines, projects, or offices; the variation will give you a better sense of the employee's ability.
  • Reward in proportion to the signal. Match the duration and complexity of rewards to the duration and complexity of past accomplishments. For short, noisy signals it is better to give bonuses and praise rather than raises and promotions.
  • The wisdom of crowds. The average of many guesses is surprisingly good (even in exercises like guessing the number of jelly beans in a bowl), so get multiple experts to help with your assessment. Ensure, though, that their predictions are independent of each other (they are not talking to each other, they do not share the same background, etc.).
  • Ensure statistical significance. A small sample (one project, one season, etc) is less likely to give you an accurate measure.
  • Use multivariate regression. This analysis allows you to separate out the influence of different characteristics; a minimal sketch follows this list.
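
As an illustration of that last tool, here is a minimal regression sketch (synthetic data; the variable names are my own): by including an environmental factor such as team quality alongside an individual measure, the regression separates their influences instead of crediting everything to the individual.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic data: outcomes driven partly by the individual, mostly by environment.
individual_skill = rng.normal(size=n)  # e.g., a structured-interview score
team_quality = rng.normal(size=n)      # e.g., average quality of the employee's team
outcome = 0.4 * individual_skill + 0.8 * team_quality + rng.normal(size=n)

# Multivariate regression: estimate both coefficients at once.
X = np.column_stack([np.ones(n), individual_skill, team_quality])
coefs, *_ = np.linalg.lstsq(X, outcome, rcond=None)

print(f"individual skill: {coefs[1]:+.2f}  (true effect +0.40)")
print(f"team quality:     {coefs[2]:+.2f}  (true effect +0.80)")
```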

At the end of the day, you need to separate the signal from the noise to evaluate current performance and predict future success. Someone may have had a great performance or year, but because of luck or other environmental factors, they may be a less valuable future employee than someone whose results were worse.

Recruiting the right people

Evaluating performance is not only important for your current team but also for recruiting the best new hires. Hiring the wrong person can have huge consequences, including missed growth opportunities, a damaged culture and decreased output. Yet most companies find consistently recruiting the right people difficult. This is often caused by the Illusion of Validity: we think we know more about people than we actually do. We interview somebody and believe we can judge his or her suitability for a job. This illusion is popped by research showing the correlation of various hiring tools with subsequent performance, ranked here from most effective to least:

  1. Work samples.
  2. Cognitive ability tests (these are general intelligence tests).
  3. Structured interviews.
  4. Job knowledge tests.
  5. Integrity tests.
  6. Unstructured interviews.
  7. Personality tests.
  8. Reference checks.

Several of the low-scoring tools reinforce the Illusion of Validity. Unstructured interviews, where you meet someone and get a sense of their strengths and weaknesses, are often the paramount driver of whether we hire a candidate, but we are not good judges of character. I remember reading that when President Bush first met Russian President Putin in 2001, he said, "I looked the man in the eye. I found him to be very straightforward and trustworthy." We see how well that worked out. As the research above also shows, reference checks are even less effective in the hiring process, for similar reasons.

What does work is seeing examples of a candidate's previous relevant work, intelligence tests and structured interviews. Structured interviews are ones designed to assess specific attributes of a candidate.

Use analysis for internal promotions

As well as improving the hiring process, People Analytics can help move the right people internally into the right roles. Often, people are promoted based on having done a great job in their current role. The course shows, though, that this approach often leads to negative outcomes (both for the employee and the company). The skills needed to succeed in the next job may not be the same skills that led to success in the current job. Performance in one job is not automatically a predictor of performance in a new role.

Just as it is important to understand the key predictors of success when recruiting, you need to do the same with internal promotion. Understand what leads to success in the new role and hire internally (or externally) those most likely to succeed. The good news is that research has shown that people who are promoted perform better overall than new hires into comparable roles.

Reducing employee churn

Attrition is one of the costliest problems companies face, and people analytics can help combat it. The expense of losing an employee includes hiring a replacement, training costs, loss of critical knowledge and the impact on customer relationships. You should start by analyzing the percentage turnover at specific milestones (3 months, 6 months, 1 year, etc.) and evolve into using multivariate regressions to predict who will reach each milestone. As you get more sophisticated, you can build a survival model to understand what proportion of employees will stay with your company over time, and finally a survival/hazard-rate model to test which factors accelerate the risk of exit, as sketched below.
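
Here is a minimal sketch of that progression using Python's lifelines library (the data and column names are invented for illustration): a Kaplan-Meier model estimates what proportion of employees survive to each tenure, and a Cox proportional-hazards model tests whether a candidate factor, here commute length, accelerates exit.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Synthetic HR data: tenure in months, whether the employee has left,
# and one hypothetical driver of exit risk.
df = pd.DataFrame({
    "tenure_months": [3, 6, 8, 12, 14, 18, 24, 24, 30, 36],
    "left_company":  [1, 1, 0, 1, 0, 1, 0, 1, 0, 0],  # 0 = still employed (censored)
    "commute_hours": [2.0, 1.5, 0.5, 1.8, 0.4, 1.2, 0.3, 1.6, 0.5, 0.2],
})

# Survival model: proportion of employees still with the company over time.
kmf = KaplanMeierFitter()
kmf.fit(df["tenure_months"], event_observed=df["left_company"])
print(kmf.survival_function_)

# Hazard-rate model: does a long commute accelerate the risk of exit?
cph = CoxPHFitter()
cph.fit(df, duration_col="tenure_months", event_col="left_company")
cph.print_summary()
```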

During the course, they also provided some interesting data on why people leave. The decision to quit is most commonly driven by external factors, comparing the current job to a new opportunity. This understanding is critical: while internal factors do play a role, internal issues have a relatively small relationship with how likely people are to churn.

To reduce churn over time, the instructors of the course suggest an informed hiring strategy (where predicted churn is integrated into who is hired) and targeted interventions (reduce factors that accelerate the risk of exit, address unmet needs, focus retention efforts, etc.).

Using network analysis to improve collaboration

Another great takeaway from the course was how to use network analysis to understand, improve and incentivize collaboration. Without getting too granular, network analysis involves looking at the informal links between employees: who gets information from whom and in what direction(s) that information flows. Once you draw that map, you can see who is central to communications, who sits outside the network, where there is room for improvement and who should be rewarded for their role in collaboration.

[Image: network map]

While there are many details to creating and analyzing a network, there are five key areas to focus on when looking at individuals (there are no right or wrong answers for each attribute; the optimum depends on the goal and environment). A sketch computing several of these measures follows the list:

  1. Network size. How many people are they connected to?
  2. Network strength. How important and how frequent are the lines of communication?
  3. Network range. How many different groups are they connected to? Range would be small if you are connected to everyone on your team (even a big team), and large if you are connected to one person at every other corporate function (e.g., marketing, accounting, analytics).
  4. Network density. Are their connections connected to different people or to each other?
  5. Network centrality. Is everyone equally central, or are some in the middle and others on the fringes?
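
Here is a minimal sketch of a few of these measures using Python's networkx library (the employees and links are invented for illustration):

```python
import networkx as nx

# Informal communication links between employees, weighted by frequency.
G = nx.Graph()
G.add_weighted_edges_from([
    ("Ana", "Ben", 5), ("Ana", "Cal", 3), ("Ana", "Dee", 1),
    ("Ben", "Cal", 4), ("Dee", "Eli", 2), ("Eli", "Fay", 2),
])

for person in G.nodes:
    size = G.degree(person)                       # network size
    strength = G.degree(person, weight="weight")  # network strength
    print(f"{person}: size={size}, strength={strength}")

print(f"density: {nx.density(G):.2f}")  # network density
# Centrality: who sits on the most shortest paths between colleagues?
print(nx.betweenness_centrality(G))
```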

Understanding how your company's network operates will allow you to understand collaboration patterns. For example, by deconstructing performance, you can see whether collaboration patterns impact performance. If there is a positive causal relationship, you can work to replicate or strengthen these relationships. If there is no relationship, your team might be wasting time on unnecessary collaboration.

You can use this analysis to understand if collaboration is needed and where. Then you can strategically build ties and bridges between different parts of the organization. This result can be achieved with:

  • Cross-functional meetings.
  • Conference calls or video conferences.
  • Job rotations.
  • Site visits.
  • Events.

You should also identify where collaboration is unnecessary or overly burdensome and reduce demands on people. Pair overloaded people with well-regarded but under-utilized employees who can relieve some of the burden. Also identify the small number of new connections that would have the biggest positive impact on team connectivity, and shift responsibilities more evenly across members.

Tying performance evaluation with collaboration

People analytics can be particularly helpful in connecting the performance evaluation methods discussed above with the analysis of collaboration. As I wrote earlier, the key to good performance reviews is understanding what drives the outcomes you are looking for. If collaboration is one of those success drivers, you need to evaluate it thoroughly and incorporate it into performance reviews and internal promotions (you do not want to promote someone weak at collaboration into a role where it is vital to success).

You should revise your evaluation systems to include collaboration. First, this will give employees an incentive to build and use meaningful relationships. Second, it will recognize team members who help others win new clients or serve current customers, even if those direct results accrue to someone else (the basketball player who passes the ball rather than dunks).

To achieve this goal, you need the right measures. If you are assessing individual collaboration, you need to look at elements the individual controls. You then need reliability: the assessments must remain consistent over time and across raters. Third, the measures must have validity (accuracy). There also needs to be comparability; you need to be able to use the measures across everyone you are evaluating. Finally, the measures must be cost effective: collecting the information should not be too expensive.

Key takeaways

  • You need to align performance evaluations with the underlying factors that create success; deconstruct what leads to the outcomes you want and then assess people on those factors.
  • Some common problems when evaluating people include context (attributing results to a person when the environment drove success or failure), interdependence (assessing on an individual level a result that was driven by a team), self-fulfilling prophecies (people perform consistent with expectations) and reverse causality (we attribute causality to correlation, even though the factors may not be related or may be in the other direction).
  • You should assess how your team or company works as a network, looking at the relationships, and then encourage and grow ones that lead to desired outcomes.


How to overcome survivorship bias

A few months ago I shared a story on Facebook about survivorship bias and was amazed at how often it was liked and shared. It also highlights the risk of survivorship bias in the gaming and gambling space. The image and blurb told the story of how the US Navy analyzed aircraft that had been damaged and based future armament decisions on where they had received battle damage: they were going to increase the armor on the wingtips, central body and elevators, the areas that showed the most bullet holes.

[Image: Facebook plane story]

One statistician, Abraham Wald, the founder of statistical sequential analysis, fortunately stopped this misguided effort. According to Wikipedia, “Wald made the assumption that damage must be more uniformly distributed and that the aircraft that did return or show up in the samples were hit in the less vulnerable parts. Wald noted that the study only considered the aircraft that had survived their missions—the bombers that had been shot down were not present for the damage assessment. The holes in the returning aircraft, then, represented areas where a bomber could take damage and still return home safely. Wald proposed that the Navy instead reinforce the areas where the returning aircraft were unscathed, since those were the areas that, if hit, would cause the plane to be lost.”

Survivorship bias is universal

Survivorship bias occurs everywhere. If you are a poker player, you may have a hand of three of clubs, eight of clubs, eight of diamonds, queen of hearts and ace of spades. The odds of that particular configuration are about three million to one, but as economist Gary Smith writes in Standard Deviations, “after I look at the cards, the probability of having these five cards is 1, not 1 in 3 million.”
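
The arithmetic behind Smith's point is easy to check: there are "52 choose 5" possible five-card hands, so before the deal any specific hand, spectacular or mundane, has the same roughly one-in-2.6-million chance.

```python
import math

# Number of distinct 5-card hands from a 52-card deck.
hands = math.comb(52, 5)
print(hands)              # 2598960
print(f"1 in {hands:,}")  # roughly "three million to one" before you look
```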

Another example comes from professional basketball. If you look at the best professional basketball players, a high percentage never went to university for more than one year. From this information, you (or your teenage son) may infer that the best path to the NBA is going to university for one year or less. The reality is that there are millions (if not billions) of people who went to university for less than a year and never played in the NBA (or even the G League). The LeBron Jameses and DeAndre Aytons are in the NBA because of their great skill, not because they skipped years of university.

As an investor, survivorship bias is the tendency to view the fund performance of existing funds in the market as a representative comprehensive sample. Survivorship bias can result in the overestimation of historical performance and general attributes of a fund.

In the business world, you may go to a CrossFit gym that is packed, with the owner making a great living. You decide to leave your day job and replicate his success. What you did not see is the hundreds of CrossFit gyms that were not profitable and have closed.

The problem exists in gaming

You often see survivorship bias in the gaming and gambling space. People will look at a successful product, select a couple of features or mechanics they believe drove the success, try to replicate them and fail miserably, then wonder why the strategy did not work. What they fail to analyze are the many failed games (for every success there are at least 8-10 failures), because they do not even know they exist. The failed games may have had even more of the feature being replicated. Getting a star like Kim Kardashian is a great idea if you only look at Kim Kardashian: Hollywood, but if you look at the hundreds of other IPs that have failed, your course of action might be very different.

Survivorship bias can also rear its ugly head when building a VIP program. You talk to your VIPs and analyze their behavior, then build a program that reinforces what they like about the game. What you neglect, however, is that features you never built might have created even more VIPs.

In the gambling space, you may look at a new blackjack variant that is doing great and build a strategy around creating new variants of classic games. What you did not see is all the games based on new variants that have failed.

Avoiding survivorship bias

Looking only at successes, or even only at failures, leads to bad decision making. When looking at examples in your industry or other industries, you need to seek out both the successes and the failures. With the failures, make sure they are the true failures (not the airplanes that returned shot up, but the ones that were destroyed). You also should not use others' successes or failures as a shortcut to robust strategy decisions. You need to analyze the market, understand your strengths-weaknesses-opportunities-threats (SWOT) and do a blue ocean analysis. Only then will you build a strategy that optimizes your likelihood of success.

Key takeaways

  • In WW2, by analyzing only surviving aircraft, the US Navy almost made a critical mistake in adding armor to future airplanes. The planes that returned were, by definition, survivors; it was the planes that were destroyed that showed where armor was needed most. This phenomenon is called survivorship bias.
  • This bias extends into the gaming and gambling space, as companies analyze what has worked in successful games but do not know if it also failed (perhaps to a greater degree) in products that no longer exist.
  • Rather than just looking at survivors or winners to drive your strategy, you should do a full SWOT and blue ocean analysis; that is the strongest long-term recipe for optimizing your odds of success.


Why I Am Morally Opposed to Intuition

I recently told a colleague I was morally opposed to intuition, and it did more than raise an eyebrow; it largely shocked the person I was speaking with. As someone who has been in the game industry for many years, I am often asked to use my intuition to review game or business ideas, prioritize product features and evaluate potential employees, so it would be easy to move my agenda forward by relying on intuition. Unfortunately, it would also be a mistake.

Intuition is often used as an excuse for not having facts to prove your case. During my first year of university, I learned that I should not rely on intuition. As I developed an appreciation for analytics and statistics, I began to understand the best way to approach making optimal decisions. Bayes' Theorem further exposed the fallacy of taking decisions on intuition. Then, as I read more about heuristics and biases in decision-making, my opposition to intuition solidified.

Common Sense and Intuition

In my first year at university, one lesson had a very long-term impact. An English professor, arguing that common sense and intuition should not be used to support one side in a debate, pointed out that in the 1800s one of the primary justifications of slavery was that it was "common sense" that blacks were inferior to whites. He further attacked using common sense as an argument by showing that many great tragedies and mistakes were justified by common sense and intuition: the Inquisition, the Holocaust, the Smoot-Hawley tariffs, etc.

Intuition and common sense are often an excuse for not having facts

If you are trying to get a desired decision but do not have strong data to support it, intuition and common sense become ways to get your proposed course of action approved anyway. Many times you are pursuing a fast decision and feel there is insufficient time to collect data. Other times data is not easily available to analyze the situation. Intuition and common sense then provide an argument for moving forward quickly on a decision even when there is no data to support it.

This problem is magnified if you are in a senior position. A direct or indirect subordinate is unlikely to argue that their boss's or the CEO's intuition is flawed. It is particularly dangerous for those in the senior position, who will see their "intuition" supported by subordinates, further confirming to them that it is an appropriate course of action. This confirmation replaces data in driving decisions forward, and the leader then has to deal with the consequences.

Decision making biases

Reinforcing the issues with relying on intuition and common sense are the many decision-making biases people exhibit. I have written frequently about consumer behavior, particularly Daniel Kahneman's work, which shows how people often make faulty decisions. These biases include:

  • Confirmation bias. Confirmation bias is when you ignore information that conflicts with what you believe and select only the information that confirms your beliefs.
  • The Linda Problem. When given a story about a fictional person and then potential careers for that person, virtually everyone (from students to very successful professionals) chose a persona that was a subset of a broader persona, even though a subset cannot be more likely than the set that contains it.
  • Status quo bias. People prefer things to stay the same, doing nothing or sticking to a previous decision even if that decision will lead to a worse outcome.
  • The narrative fallacy. People try to comprehend information as stories; rather than looking at just the facts, they create a story that links them together even if there is no real link.
  • Dunning-Kruger effect. The Dunning-Kruger effect is when incompetent or somewhat unskilled people think they are more skilled than they are. As one article puts it, "incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are."
  • Backfire effect. The backfire effect occurs when you analyze something you or your company are doing, the results are negative, and you or your colleagues refuse to accept them, strengthening the original belief instead.
  • Bandwagon effect. The bandwagon effect is what you would assume, the tendency to do things because many other people are doing it. People will rally around a cause, an idea, a candidate, a variation, or a strategy simply because it is popular.
  • Endowment effect. The endowment effect is how people value items they own more than they would if they viewed the item objectively. Somebody might not accept $10,000 for their used car, but if a car dealer offered them the same car, they would not pay $8,000 for it.

The key takeaway is not this specific list of biases but that people often do not make optimal or rational decisions. When we rely on intuition or common sense, we neglect the data that could offset these biases.

Common sense usually forgets Bayes Theorem

Related to the biases people exhibit when relying on intuition is our inability to process statistics well. I have written frequently about Bayes' Theorem and how people often infer an incorrect probability of a certain result. Bayes' Theorem is a rigorous method for interpreting evidence in the context of previous experience or knowledge. It transforms probabilities that look useful (but often are not) into probabilities that are useful. It is important to note that it is not a matter of conjecture; by definition, a theorem is a mathematical statement that has been proven true. Denying Bayes' Theorem is like denying the theory of relativity.

By way of an example, I will repeat one I used in 2014. Say you wake up with spots all over your face. You rush to the doctor, and he says that 90 percent of the people who have smallpox have the symptoms you have. Since smallpox is often fatal, your first inclination may be to panic. Rather than freak out, you ask your doctor what the probability is that you have smallpox. He responds 1.1 percent (or 0.011). Although still not great news, it is much better than 90 percent—but more importantly, it is useful information.
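
Here is that calculation as a short sketch. The 90 percent figure is P(symptoms | smallpox) from the example; the prior and the false-positive rate below are my own illustrative assumptions, chosen to reproduce the roughly 1.1 percent answer.

```python
# Bayes' Theorem:
#   P(disease | symptoms) = P(symptoms | disease) * P(disease) / P(symptoms)

p_symptoms_given_disease = 0.90   # from the example
p_disease = 0.001                 # assumed prior: 1 in 1,000 people infected
p_symptoms_given_healthy = 0.081  # assumed: spots usually have benign causes

p_symptoms = (p_symptoms_given_disease * p_disease
              + p_symptoms_given_healthy * (1 - p_disease))
posterior = p_symptoms_given_disease * p_disease / p_symptoms

print(f"P(smallpox | symptoms) = {posterior:.3f}")  # ~0.011, i.e., 1.1 percent
```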

The key here is that people do not understand the actual likelihood that something will occur. In this case, your intuition might say you have smallpox. If this then prompts you to fly to a smallpox clinic, you probably made a bad decision. The same happens professionally, where people make decisions by inferring the wrong probability of a possible outcome.

When to use your intuition?

NEVER. Intuition is an excuse to make decisions without data or without putting in the work to look at the data. If your intuition is actually based on past experience, then that experience is data, and you should treat it as a data point (but only one data point, and not one that is more valid because you experienced it personally). To make good decisions, you should review the data and make the decisions that increase the probability of the optimal outcome. You also need to just say no when somebody asks you to make an intuition-based decision; you can help provide data, but you should never make the decision based on what your gut says.

Key takeaways

  • Intuition is a very flawed way of making decisions but is often the default, particularly for leaders with extensive experience. It is actually a way to ignore data (or avoid the work of collecting it) and leads to poor decisions and priorities.
  • Intuition and common sense were often the justification for some of the worst decisions in history, from slavery to the protectionist Smoot-Hawley tariffs.
  • Intuition should never be used to make decisions. Instead, spend time collecting and analyzing data and deriving the decisions with the highest probability of a positive outcome.


How to manage your own biases

I have always been interested in decision making and how people are often illogical, not only in their preferences but even in how they remember and look at facts. The most useful book I ever read was Thinking, Fast and Slow by Daniel Kahneman (I highly recommend it if you haven't read it yet), and one of my favorite academics is behavioral economist Dan Ariely. Not only does Kahneman's and Ariely's research help you understand consumer behavior, it helps you understand your own decision making and, most importantly, the mistakes most of us make.

A recent guest blog post on the Amplitude Blog, 5 Cognitive Biases Ruining Your Growth, does a great job of describing five biases that can greatly impact your business. While I will try to avoid just repeating the blog post, below are the five biases and some ways they may be impacting you:

  1. Confirmation bias. Confirmation bias is when you interpret or recall information in a way that confirms your preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities. This bias occurs regularly in the game space, especially with free-to-play games.

    A product manager may have driven a new feature, maybe a new price point on the paywall. Rather than running an A/B test (maybe there was insufficient traffic or other changes were going on), they review the feature pre- and post-launch. Game revenue per user increased 10 percent, so they create a PowerPoint and email the CEO that their new feature had a 10 percent impact. Then the company adds the feature to all its games. The reality is that at the same time the feature was released, the marketing team stopped a television campaign that was attracting poorly monetizing players; the latter is actually what caused the change in revenue. As someone who has known a lot of product managers, I can confirm this bias exists in the real world (the sketch after this list shows the discipline that protects against it).

  2. The narrative fallacy. People try to comprehend information as stories; rather than looking at just the facts, they create a story that links them together even if there is no real link. If you watch business news, when the stock market goes up 5 points, the narrative may be that the market has rebounded from its Brexit blues. If the market goes down 5 points, the story would be that the market is still suffering from Brexit. The reality is that 5 points is statistically insignificant (the market is an aggregate of multiple stocks), so neither narrative is more likely in either scenario. The key issue here is that we attribute causation where there is none. Here is an example from the game world.

    Two branded games are in the top 5 of new releases. All of the analysis says that branded games are now what customers are looking for. The reality is that the two games, totally unrelated, had strong mechanics and were simply among the lucky 10 percent of games that succeed. If you let the narrative fallacy win, however, you put your resources into branded games, which are no more popular than before the launch of the two successful titles.

  3. Dunning-Kruger effect. Before the Amplitude post, I had not heard of this bias, at least by this name, but once you read about it I am sure you will recognize cases of it. The Dunning-Kruger effect is when incompetent or somewhat unskilled people think they are more skilled than they are. As the article quotes, "incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are."

    Again, an example from the game industry. Let's say you want to port your game to a new VR platform. You go to your development team, and they say it won't be a problem. You commit to the project and give them the specs; six months later, they still cannot get the game to run on the VR platform because they have no idea how to develop for VR (this is a nicer example than some others I can remember).

  4. Backfire effect. The backfire effect occurs when you analyze something you or your company are doing, the results are negative, and you or your colleagues refuse to accept them. As they write in the blog post, "the exact definition of the Backfire Effect [is]: 'When people react to disconfirming evidence by strengthening their beliefs.'"

    As an example, you decide to analyze how your company has been calculating LTV. You look back at the analyses done over the last two years and see how actual LTV tracked against the projections made at the time. You discover that you underestimated actual spend by 50 percent. This should be great news, as it would allow you to ramp up your user acquisition dramatically. Instead, when you present the data to your analytics team, they refuse to accept it, saying your analysis is flawed because you are not looking at the right cohorts.

  5. Bandwagon effect. The bandwagon effect is what you would assume, the tendency to do things because many other people are doing it. People will rally around a cause, an idea, a candidate, a variation, or a strategy simply because it is popular.

    Given that I want to keep this blog post under 500 GB, I will not list all the examples of the bandwagon effect I have seen in the game industry. Product strategy, however, is the most obvious culprit. When the free-to-play game industry started to move to mobile, everyone started porting their Facebook games over. Since Zynga and the other big companies were doing it, the smaller companies as well as newly funded ones also tried to bring the same core mechanics from Facebook to mobile. Mechanics that worked on Facebook, however, did not work on mobile, but companies kept doing it because everyone else was. Rather than identify the market need and a potential blue ocean, companies just joined the bandwagon.
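
Returning to the confirmation-bias example in item 1, here is a minimal sketch of the discipline that protects against it (the numbers are synthetic): users are randomly assigned to the old and new paywall at the same time, so a concurrent marketing change affects both groups equally, and a significance test guards against declaring a 10 percent "impact" that is really noise.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic revenue-per-user for users randomly split at the same time,
# so a concurrent marketing change hits both groups equally.
control = rng.exponential(scale=1.00, size=5000)  # old paywall price point
variant = rng.exponential(scale=1.05, size=5000)  # new paywall price point

t_stat, p_value = stats.ttest_ind(variant, control, equal_var=False)
lift = variant.mean() / control.mean() - 1

print(f"observed lift: {lift:+.1%}")
print(f"p-value: {p_value:.3f}")  # only claim an impact if this is small
```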

Avoid these biases

The key to making the right decisions is not to assume you are free of biases, but to be diligent in reviewing your decisions and making sure you are thinking rationally. All of these biases can lead to personal or company failure, so failing to identify them can have extreme consequences.

Key takeaways

  1. Understanding our biases allows us to not only understand our customers but make better decisions.
  2. A core bias you see in the game industry is confirmation bias, where someone looks at data to prove their hypothesis (or brilliance), even if the data does not really support it.
  3. Another critical bias is the narrative fallacy, where we create a story to explain an event even if the story is not the cause of the event.


The heart of good forecasting: Be conservative

I recently read a paper, "The Golden Rule of Forecasting: Be Conservative" by Armstrong, Green and Graefe, that empirically establishes the most important principles for making forecasts and predictions. Given the value of accurate forecasting (e.g., for building your business, making investments, choosing between product strategies), understanding the golden rule will help optimize your decision making.

What is particularly compelling about this paper is that it is based on extensive research and empirical studies. So while many forecasting and decision-making guidelines are based on hypothesis or observation, The Golden Rule of Forecasting is based on data.

At the heart of the golden rule of forecasting is that you should be conservative; forecasters must seek all knowledge relevant to the problem and use methods that have been validated for the situation.

With "be conservative" as the overarching golden rule, the authors performed extensive research to develop a list of guidelines that lead to good forecasts and identified practices that generate poor decisions.
