
The Business of Social Games and Casino

How to succeed in the mobile game space by Lloyd Melnick

Tag: Amos Tversky

Thinking, Fast and Slow, Part 3: The Invisible Gorilla

I have written several times about the work of Kahneman and Tversky, highlighted in the book Thinking, Fast and Slow, and how helpful it is in understanding decision-making and consumer behavior. One of the most enlightening experiments Kahneman highlights, the Invisible Gorilla experiment (conducted by Christopher Chabris and Daniel Simons), shows the difference between tasks that require mental focus and those we can do in the background.


The Invisible Gorilla experiment

In this experiment, people were asked to watch a video of two teams playing basketball, one in white shirts and one in black shirts (click to see the Invisible Gorilla experiment). Viewers of the video need to count the number of passes made by members of the white team and ignore the players wearing black.

This task is difficult and absorbing, forcing participants to focus on it. Halfway through the video, a gorilla appears, crosses the court, thumps its chest, and then moves across and off the screen.

The gorilla is in view for nine seconds, yet about half of the people viewing the video do not notice anything unusual when asked later (that is, they do not notice the gorilla). It is the counting task, and especially the instruction to ignore the black team, that causes this blindness.

While entertaining, the experiment yields several important insights:

  • One important insight is that nobody would miss the gorilla if they were not doing the counting task. When you are focused on a mentally challenging task, whether counting passes, doing math or shooting aliens, you do not notice other events, nor can you attend to them.
  • A second insight is that we do not realize the limitations we face when focused on one task. People are sure they did not miss the gorilla. As Kahneman writes, “we are blind to our blindness.”

System 1 and System 2

The Invisible Gorilla also serves as a framework for understanding the two systems people use to think. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. An example of System 1 thinking would be taking a shower (for an adult), where you do not even think about what you are doing.

System 2 thinking is deliberate, effortful and orderly: slow thinking. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away.

The automatic operation of System 1 generates surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps.

Implications

Understanding System 1 and System 2 has several implications. First, if you are involved in an activity requiring System 2 thought, do not try to do a second activity that also requires System 2 thought. Walking and chewing gum are both System 1 for most people and can be done simultaneously, but negotiating a big deal and typing an email are both System 2 and should not be done at the same time.

Second, do not create products that require multiple System 2 actions concurrently. While a System 2 task is great for getting a player immersed in a game, asking them to perform two at once creates a poor experience. A third implication is that when onboarding someone to your product, you should expose them to only one System 2 activity at a time, as sketched below.
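
To make the onboarding point concrete, here is a minimal sketch of how a product might serialize effortful steps. The step names, the effortful flags and the next_batch helper are all hypothetical; which steps actually count as System 2 is a design judgment, not something the code can know.

    # A minimal sketch (hypothetical names) of an onboarding flow that never
    # presents more than one effortful (System 2) step at a time.

    from dataclasses import dataclass
    from typing import List


    @dataclass
    class OnboardingStep:
        name: str
        effortful: bool  # True means the step demands System 2 attention


    def next_batch(steps: List[OnboardingStep]) -> List[OnboardingStep]:
        """Return the steps to show on the next screen: low-effort (System 1)
        steps may be grouped, but an effortful step is always shown alone."""
        batch: List[OnboardingStep] = []
        for step in steps:
            if step.effortful:
                # If an effortful step comes first, show it by itself;
                # otherwise end the batch just before it.
                return batch if batch else [step]
            batch.append(step)
        return batch


    flow = [
        OnboardingStep("welcome_screen", effortful=False),
        OnboardingStep("choose_avatar", effortful=False),
        OnboardingStep("first_puzzle_tutorial", effortful=True),
        OnboardingStep("link_account", effortful=True),
    ]

    print([s.name for s in next_batch(flow)])  # ['welcome_screen', 'choose_avatar']

The design choice is simply that low-effort steps can be grouped on one screen, while an effortful step always appears on its own.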

Example from our world: Urbano’s Failed App

I like to use examples from the game space to illustrate how understanding Kahneman and Tversky’s work can impact your business. In this example, Urbano runs product design for a fast-growing app company at the intersection of digital and television. He has built a great sports product that allows players to play a very fun game while watching any sporting event on television. Unfortunately, Urbano’s company is running out of funds and the next release needs to be a hit or the company will not survive. Although the product has tested well, Urbano is nervous because of the financial situation and decides to add more to the product: a mechanic based on what happened during the previous three minutes of the televised match. They launch the app, and although players initially start playing, they never come back and the product fails.

Another company buys the rights to the product and conducts a focus test. They find out users forgot what happened on television because they were focusing on the app, and then could not complete the game. They take out the part requiring attention to the televised match and the product is a huge success. The difference was that the new version did not require two System 2 activities simultaneously; it left television watching as a System 1 activity.

Key Takeaways

  1. In a famous experiment, people who were asked to count the passes one team made in a basketball video missed the appearance of a gorilla in that video. The experiment shows that when you are focusing on something, you do not notice what else is happening.
  2. We are blind to things in the background, and we are blind to our blindness. In the Invisible Gorilla experiment, not only did people not see the gorilla, they refused to believe that they had missed it.
  3. There are two types of mental activity: System 1, which is automatic and reflexive, and System 2, which requires deliberate, effortful and orderly thinking.

Author: Lloyd Melnick | Posted on May 31, 2017 | Categories: General Social Games Business, General Tech Business, thinking fast and slow | Tags: Amos Tversky, attention span, Daniel Kahneman, decision making, Invisible gorilla, Product design, thinking fast and slow

Thinking, Fast and Slow, Part 2: Why we should care about judgment errors

As promised last month, I will spend a few blog posts summarizing Thinking, Fast and Slow by Daniel Kahneman. Before diving into the heuristics, biases and processes that he and his colleague Amos Tversky identified, it is important to understand why he wrote the book and why it is so useful. Fortunately, he largely does this in his introduction, so it is a great place to start.

Let’s not beat ourselves up but make ourselves better

First, Kahneman points out that the goal of his research is not to prove we are idiots but to help us minimize bad decisions. Understanding flaws in human decision-making no more denigrates human judgment than writing about diseases in a medical journal belittles good health. Rather, our decision-making is generally quite good and most of our judgments are appropriate most of the time, but there are systematic biases that, once understood, can make our decision-making more effective.

By understanding Kahneman’s work, you will be better able to identify and understand errors of judgment and choice, in others and eventually in yourself. As Kahneman points out, “an accurate diagnosis may suggest an intervention to limit the damage that bad judgments and choices often cause.”

Is our judgment flawed?

At its roots, Kahneman began his career trying to determine if people consistently made biased judgments. Long story short, we do.

One example drove this determination home to Kahneman and Tversky, and it probably will for you as well. Early in their collaboration, both realized they made the same “silly” prognostications about the careers toddlers would pursue as adults. They both knew an argumentative three-year-old and felt it was likely he would become a lawyer, that the empathetic and mildly intrusive toddler would become a psychotherapist, and that the nerdy kid would become a professor. Both smart academics, they neglected the base-rate data (very few people become psychotherapists, professors or even lawyers compared with other professions) and instead trusted the stories in their heads about who ends up in which career. This realization, that we are all biased, drove their ensuing research.
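
To see why ignoring the base rate matters, here is a minimal sketch of Bayes’ rule applied to the argumentative-toddler story. The probabilities are purely illustrative assumptions, not real statistics; only the shape of the calculation matters.

    # Base-rate neglect, illustrated with Bayes' rule.
    # All numbers below are illustrative assumptions, not real statistics.

    def posterior(prior, p_trait_given_class, p_trait_overall):
        """P(class | trait) = P(trait | class) * P(class) / P(trait)."""
        return p_trait_given_class * prior / p_trait_overall

    p_lawyer = 0.004                     # assumed base rate of becoming a lawyer
    p_argumentative_given_lawyer = 0.9   # assumed: most lawyers were argumentative kids
    p_argumentative = 0.30               # assumed: many kids are argumentative

    p = posterior(p_lawyer, p_argumentative_given_lawyer, p_argumentative)
    print(f"P(lawyer | argumentative toddler) = {p:.3f}")  # roughly 0.012

    # Even with a strong resemblance to the "future lawyer" stereotype,
    # the low base rate keeps the probability of that career very small.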

How understanding Kahneman’s work impacts the world

The broad impact of Kahneman and Tversky’s work drives home its importance to everyone. When they published their first major paper, it was commonly accepted in academia that:

  1. People are generally rational, and their thinking is normally sound
  2. Emotions such as fear, affection and hatred explain most of the occasions on which people depart from rationality

Not only did these two assumptions drive academia (particularly economics and the social sciences), but their acceptance also often drove business and government decisions. The work laid out in Thinking, Fast and Slow, however, disproved both assumptions and thus pointed toward entirely different, and more effective, decisions.

Scholars in a host of disciplines have found the work useful and have applied it in fields such as medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics and military strategy. Kahneman cites an example from public policy. His research showed that people generally assess the relative importance of issues by the ease with which those issues are retrieved from memory, which is largely determined by the extent of media coverage. This insight now drives everything from election strategy to understanding (and countering) how authoritarian regimes manipulate the populace.

Kahneman and Tversky were also careful to ensure the subjects of their experiments were not simply university students. By using scholars and experts as subjects, thought leaders gained an unusual opportunity to observe possible flaws in their own thinking. Having seen themselves fail, they became more likely to question the dogmatic assumption, prevalent at the time, that the human mind is rational and logical. I have found the same myself and am confident that you will also. The idea that our minds are susceptible to systematic errors is now generally accepted.

Why it is called Thinking, Fast and Slow


While Kahneman and Tversky’s early work focused on our biases in judgment, their later work focused on decision-making under uncertainty. They found systematic biases in our decisions that consistently violated the rules of rational choice.

Again, we should not discount our decision-making skills. There are many examples of experts who make critical decisions quickly, from a chess master who can identify the top 20 next moves on a board as he walks by, to a firefighter who knows which areas to avoid in a burning building.

What Kahneman and Tversky identified, though, is that while this expertise is often credited to good decision-making, it is really a matter of retrieving information from memory. The situation serves as a cue or trigger for the expert to retrieve the appropriate answer.

This insight helps us avoid a problem where our experience (which we consider intuition) does not actually help but hinders us. In easy situations, intuition works. In difficult ones, we often answer the wrong question: we substitute an easier question, often without noticing the substitution.

If we fail to come to an intuitive solution, we switch to a more deliberate and effortful form of thinking. This is the slow thinking of the title. Fast thinking covers both expert intuition and these heuristic shortcuts.

Example from our world: The Allan Mistake

Many of my readers are experienced “experts” from the mobile game space, so I will start with a hypothetical example that many of us can relate to. In this example, Allan is the GM of his company’s puzzle game division. He has been in the game industry for over twenty years and has seen many successful and failed projects. The CEO, Mark, comes to Allan and says they are about to sign one of three celebrities to build a game around.

Allan knows the demographics of puzzle players intimately and identifies the one celebrity who is most popular with Allan’s target customers. Nine months later they launch the game and it is an abysmal failure. Allan is terminated and wonders what he did wrong.

Allan then looks over his notes from when he read Thinking, Fast and Slow, and realizes his fundamental mistake. When Mark came to him and asked which celebrity to use, Allan took the easy route and analyzed the three celebrity options. He did not tackle the actual question: whether it was beneficial to use a celebrity for a puzzle game at all, and only if the answer was yes, which of the three to pick. If he had answered the more difficult question (difficult also because it would have set him against Mark), he would have found that celebrity puzzle games are never successful, regardless of the celebrity. Although it may have created tension with Mark at the time, he probably would have been given an opportunity to create a game with a higher likelihood of success and would still be in his position.

Key takeaways

  • Our decision-making is not bad, but by understanding our systematic biases we can make it more effective.
  • Understanding that people are not consistently rational, and that this irrationality is not driven by emotion, allows us to make better decisions in fields as diverse as medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics and military strategy.
  • Fast thinking refers to quick decisions and judgments based on our experience, while slow thinking is the deliberate analysis of difficult questions.

Author: Lloyd Melnick | Posted on May 3, 2017 | Categories: General Social Games Business, General Tech Business, thinking fast and slow | Tags: Amos Tversky, Daniel Kahneman, decision making, thinking fast and slow

Thinking, Fast and Slow, Part 1: The Linda Problem

I recently finished Michael Lewis’ most recent book, The Undoing Project: A Friendship That Changed Our Minds, and it motivated me to revisit Daniel Kahneman’s Thinking, Fast and Slow. Lewis’ book describes the relationship between Daniel Kahneman and Amos Tversky, two psychologists whose research gave birth to behavioral economics, modern consumer behavior theory and the practical understanding of people’s decision-making. He explains the challenges they faced and the breakthroughs that now seem obvious.

As I mentioned, The Undoing Project reminded me how important Kahneman’s book is; it is probably the most important book I have ever read. It has helped me professionally, both to understand consumer behavior and to make better business decisions. It has helped me in my personal life, again with better decision-making in everything from holiday choices to career moves. It even helps explain the election of Donald Trump and how the situation in North Korea has developed.

In The Undoing Project, two things drove home the importance of Kahneman’s work. First, despite being a psychologist, Kahneman won the Nobel Prize in Economics in 2002. It is difficult enough to win a Nobel Prize (I’m still waiting for the call), but to do it in a field that is not your own is amazing. The second item that proved the value of Kahneman’s (and his colleague Amos Tversky’s) work was the Linda Problem. I will discuss this scenario later in this post, but the Linda Problem showed that people, myself included, do not make rational decisions. It convinced the mainstream that people, including doctors and intellectuals, consistently make irrational decisions.

Despite the value I derived from Thinking, Fast and Slow, I never felt I learned all I could from it. I found it very difficult to read, the exact opposite of a Michael Lewis book, and did not digest all the information Kahneman provided. Even when I recommended the book to friends, I often caveated the recommendation with a warning that it would be hard to get through.

Given the importance of Kahneman’s work and the challenge I (and probably others) have had in fully digesting Thinking, Fast and Slow, I will be writing a series of blog posts, each one summarizing one chapter of Kahneman’s book. I hope you find it as useful as I know I will.

The Linda Problem

As discussed above, the Linda Problem is the research by Kahneman and Tversky that largely proved people think irrationally, or at least do not apply logic consistently. While I normally like to paraphrase my learnings or put them into examples relevant to my audience, in this case it is best to show the relevant description from The Undoing Project, as the Linda Problem was a scientific study that I do not want to misrepresent:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Linda was designed to be the stereotype of a feminist. Danny and Amos asked: To what degree does Linda resemble the typical member of each of the following classes?

  1. Linda is a teacher in elementary school.
  2. Linda works in a bookstore and takes Yoga classes.
  3. Linda is active in the feminist movement.
  4. Linda is a psychiatric social worker.
  5. Linda is a member of the League of Women voters.
  6. Linda is a bank teller.
  7. Linda is an insurance salesperson.
  8. Linda is a bank teller and is active in the feminist movement.

Danny [Kahneman] passed out the Linda vignette to students at the University of British Columbia. In this first experiment, two different groups of students were given four of the eight descriptions and asked to judge the odds that they were true. One of the groups had “Linda is a bank teller” on its list; the other got “Linda is a bank teller and is active in the feminist movement.” Those were the only two descriptions that mattered, though of course the students didn’t know that. The group given “Linda is a bank teller and is active in the feminist movement” judged it more likely than the group assigned “Linda is a bank teller.” That result was all that Danny and Amos [Tversky] needed to make their big point: The rules of thumb people used to evaluate probability led to misjudgments. “Linda is a bank teller and is active in the feminist movement” could never be more probable than “Linda is a bank teller.” “Linda is a bank teller and active in the feminist movement” was just a special case of “Linda is a bank teller.” “Linda is a bank teller” included “Linda is a bank teller and activist in the feminist movement” along with “Linda is a bank teller and likes to walk naked through Serbian forests” and all other bank-telling Lindas.

One description was entirely contained by the other. People were blind to logic. They put the Linda problem in different ways, to make sure that the students who served as their lab rats weren’t misreading its first line as saying “Linda is a bank teller NOT active in the feminist movement.” They put it to graduate students with training in logic and statistics. They put it to doctors, in a complicated medical story, in which lay embedded the opportunity to make a fatal error of logic. In overwhelming numbers doctors made the same mistake as undergraduates.

The fact that almost everyone made the same logical mistake shows how powerful this understanding is. It proves that our judgment, and thus our decision-making, often is not logical and does contain flaws. This understanding helps explain many things in life and business that sometimes do not seem to make sense.
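
To make the conjunction logic concrete, here is a minimal counting sketch in Python. The simulated population and its rates are hypothetical assumptions; the only point is the inequality itself: every feminist bank teller is also a bank teller, so the conjunction can never be more probable.

    # The conjunction rule: P(A and B) can never exceed P(A).
    # The simulated population below is purely hypothetical.

    import random

    random.seed(0)

    # Each simulated "Linda" either is or is not a bank teller, and either is
    # or is not active in the feminist movement (rates are assumptions).
    population = [
        {
            "bank_teller": random.random() < 0.05,
            "feminist": random.random() < 0.60,
        }
        for _ in range(100_000)
    ]

    n = len(population)
    p_teller = sum(p["bank_teller"] for p in population) / n
    p_teller_and_feminist = sum(p["bank_teller"] and p["feminist"] for p in population) / n

    print(f"P(bank teller)              = {p_teller:.4f}")
    print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.4f}")

    # The conjunction describes a subset of the single event, so its
    # probability cannot exceed it, however representative it feels.
    assert p_teller_and_feminist <= p_teller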


The implications

Once you understand how our judgment is biased, you can make better decisions. Understanding these biases can also provide insights into how your customers view different options and why people behave as they do. In future posts, I will explore all of Kahneman and Tversky’s major findings and how they apply.

Key Takeaways

  • In The Undoing Project, Michael Lewis writes about the relationship and research of Daniel Kahneman and Amos Tversky, two psychologists who changed the way we understand decision-making.
  • The Linda Problem proved to the non-believers that people make illogical judgments. When given a story about a fictional person and then potential descriptions of that person, virtually everyone (from students to very successful professionals) rated a description that was a subset of a broader description as more likely, which is logically impossible.
  • By understanding how people make judgments and decisions, we can improve our own decision-making process and better understand our friends, family and customers.

Author: Lloyd Melnick | Posted on April 19, 2017 | Categories: General Social Games Business, General Tech Business, Lloyd's favorite posts, thinking fast and slow | Tags: Amos Tversky, Daniel Kahneman, decision making, Linda Problem, michael lewis, thinking fast and slow
