
The Business of Social Games and Casino

How to succeed in the mobile game space by Lloyd Melnick

Category: thinking fast and slow

The ten most valuable business books I have ever read

A colleague recently asked me what ten books I would recommend to him, and it turned into a much more difficult question than I expected. While it is relatively easy to rank the books you have read in the last few months or even the last year, picking the ten most useful of all time is very hard; a lot of books have contributed to my growth. After much thought, I came up with my top ten, and after going through the exercise I felt the list could be useful to everyone.

While I did not initially rank the books, given everyone’s limited time, I have now ranked them from one to ten based on how much value I derived from the book. Below are my top-10, with the most valuable one first (given that this is not Miss Universe, I did not think creating suspense by starting at number 10 made sense):

  1. Thinking, Fast and Slow by Daniel Kahneman. Kahneman’s book about human behavior and decision making has influenced me more than any other work. It has helped me understand what drives others and the mistakes I commonly make. The book will help you make better decisions, understand your customers better, be a superior leader and create more compelling products.
  2. Blue Ocean Strategy: How to Create Uncontested Market Space and Make the Competition Irrelevant. Blue Ocean Strategy drives how I develop strategy everywhere I have had the opportunity. It starts by showing the superior results in creating a new market space rather than competing directly in an existing space and this leads to a framework for building long term competitive advantage.
  3. Predictably Irrational by Dan Ariely. Predictably Irrational is my favorite book on the list; reading Dan Ariely is comparable to reading a Michael Lewis or even Tom Clancy book, a true page turner. Ariely’s work is in the same space as Kahneman’s, behavioral economics, or why people make the decisions that they do. In effect, people are not rational (which undercuts traditional economics), but their irrationality is not haphazard; it is predictable. Like Kahneman’s work, reading Predictably Irrational will improve your decision making, leadership and ability to interact with your customers.
  4. Hooked: How to Build Habit-Forming Products by Nir Eyal. Hooked is the best book I have read about how to create truly compelling product. It provides a framework for building something that customers will engage with regularly, thus having a high lifetime value.
  5. Smart Customers, Stupid Companies: Why Only Intelligent Companies Will Thrive, and How To Be One of Them. This book highlighted the value of personalization before it was cool. It anticipated the trend of customers expecting an experience tailored to them before everyone gave it lip service and still provides compelling evidence on the value of personalization.
  6. The Signal and the Noise: Why So Many Predictions Fail–but Some Don’t by Nate Silver. For those of you who are not familiar with Nate Silver, he is probably the best-known US statistician because of his success predicting election results (though he did miss on Trump) and his high-profile sports analytics site. The Signal and the Noise is fantastic at explaining, in a very easy to understand way, how analytics work, why they sometimes do not, and how you can apply them.
  7. The Ultimate Question. The Ultimate Question is effectively an explanation of NPS (Net Promoter Score) and a framework for how to apply it. I find NPS the most useful KPI after LTV (and a key driver of it), and this book helps you understand how to apply it correctly, as it is often the most misused KPI.
  8. Collaboration: How Leaders Avoid the Traps, Build Common Ground, and Reap Big Results by Morten Hansen. Hansen’s book made it into my top-10 largely because collaboration is so often misused to justify more meetings and design by committee, which destroys value. Hansen, instead, shows you how to collaborate to create increased efficiency and better results.
  9. Contagious: Why Things Catch On by Jonah Berger. Contagious is in the top-10 because it is the only work I have ever read that really shows you how to make a product or marketing viral. Given the value of virality in LTV, this book provides core knowledge that will help your marketing, CRM and product decisions.
  10. Moneyball by Michael Lewis. The tenth spot in this list was actually the hardest to fill, as I had to drop many other great books. Moneyball, however, changed the way I looked at the video game and digital entertainment ecosystems. It highlighted similar opportunities as Billy Beane saw in building a baseball club. And, like with Dan Ariely’s books, it was a lot of fun to read.
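As a side note on item 7, the arithmetic behind NPS is simple and worth internalizing. Below is a minimal sketch using the standard definition (the percentage of promoters, scores 9–10, minus the percentage of detractors, scores 0–6); the helper name and sample data are mine, not from the book:

```python
def net_promoter_score(responses):
    """Compute NPS from a list of 0-10 survey responses.

    Promoters score 9-10, detractors 0-6; passives (7-8) count toward
    the total but toward neither bucket. Result ranges from -100 to 100.
    """
    if not responses:
        raise ValueError("no survey responses")
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100.0 * (promoters - detractors) / len(responses)

scores = [10, 9, 9, 8, 7, 6, 3, 10]   # 4 promoters, 2 detractors, 2 passives
print(net_promoter_score(scores))      # 100 * (4 - 2) / 8 = 25.0
```

Note that the passives still sit in the denominator, which is one reason teams who only track "percent of 9s and 10s" end up misreading the metric.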

 

As I just wrote, there were a lot of contenders for the top ten and a lot of valuable books I would love to include. If you are looking for other great books to read, I also recommend (this time not in order):

  • Grit: The Power of Passion and Perseverance
  • The High Roller Experience: How Caesars and Other World-Class Companies Are Using Data to Create an Unforgettable Customer Experience.
  • Blue Ocean Shift: Beyond Competing – Proven Steps to Inspire Confidence and Seize New Growth.
  • 10% Happier: How I Tamed the Voice in My Head, Reduced Stress Without Losing My Edge, and Found Self-Help That Actually Works–A True Story.
  • Whale Hunt in the Desert: Secrets of a Vegas Superhost.
  • Essentialism: The Disciplined Pursuit of Less.
  • The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail (Management of Innovation and Change).
  • Toughness: Developing True Strength On and Off the Court by Jay Bilas.
  • The Success Matrix: Winning in Business and in Life.

Happy reading, and please post your own suggestions.

Author: Lloyd Melnick | Posted on February 20, 2018 (last modified April 11, 2020) | Categories: blue ocean strategy, General Social Games Business, General Tech Business, thinking fast and slow | Tags: blue ocean, dan ariely, Daniel Kahneman, Jonah Berger, Nate Silver, Reading List, thinking fast and slow

Thinking, Fast and Slow, Part 4: The Yom Kippur War

When reading Michael Lewis’s great book about Daniel Kahneman and Amos Tversky, The Undoing Project, Lewis references the Yom Kippur War several times. The war had a big influence on the thinking of Kahneman and Tversky.

The references particularly piqued my interest because I was too young to understand what was happening during the conflict, and it did not make its way into most history texts when I was in school. It was also interesting because, in a matter of days, it went from a war that looked like it could destroy Israel (there were rumors Israel was even considering the nuclear option) to a war in which the entire Egyptian Third Army was encircled.

With changes on the battlefield that dramatic, there had to be fantastic lessons in decision making, so I decided to learn more about the conflict. By reading The Yom Kippur War: The Epic Encounter That Transformed the Middle East by Abraham Rabinovich, I learned how the Yom Kippur War is a great case study in the biases and paradigms that form the foundation of Kahneman’s Thinking, Fast and Slow.


The danger of overconfidence

The Yom Kippur War highlighted one of the biggest errors in decision-making: overconfidence. If Israel had not mobilized its reserves shortly before the war started, the odds at the beginning of the war would have been overwhelmingly in the Arabs’ favor. The 100,000 Egyptian soldiers and 1,350 tanks west of the Suez Canal faced 450 Israeli soldiers in makeshift forts and 91 Israeli tanks in the canal zone. On the northern front, where Israel faced Syria, the Syrians enjoyed 8-to-1 superiority in tanks and far greater superiority in infantry and artillery.

The limited forces Israel deployed on both the Syrian and Egyptian fronts opposite vastly larger enemy armies reflected a self-assurance induced by the country’s stunning victory in the Six Day War. Israel believed it had attained a military superiority that no Arab nation or combination of nations could challenge.

Even when war appeared likely, the Israelis moved only a small number of forces to face the Syrians. Rabinovich quotes the Israeli Chief of Staff, Dado Elazar: “‘We’ll have one hundred tanks against their eight hundred, that ought to be enough.’ In that sentence, Elazar summed up official Israel’s attitude towards the Arab military threat.”

This overconfidence almost led to the collapse of the Israeli military. Rabinovich wrote, “a common factor behind all these failings was the contempt for Arab arms born of that earlier war, a contempt that spawned indolent thinking.”

The reality was that the Egyptian and Syrian forces were not like their predecessors in earlier conflicts, but instead had the most modern Soviet weapons and a more disciplined and professional military. The overconfidence that prompted the Israeli military to not take seriously its opponents put its soldiers in an untenable position that led them initially to be overwhelmed.

Impact on your life

Given that you probably do not lead an organization with tanks and artillery, you may ask why you should care that the Israeli military was overconfident. The pertinent lesson is that underestimating your competition can be disastrous. Just because your competitor has not been able to develop a product in the past that is of comparable quality to yours does not mean it will never have that capability. You may dominate the market, but your competition is working on ways to leapfrog you.

You also may underestimate their willingness to compete in certain market sectors. You may have gained 80 percent of the racing game market after pushing your top competitors away, so you move your development to sports games because you now own racing games. Do not assume they do not have a secret project to create a new racing game that will suddenly make your product obsolete.

Confirmation bias

The Yom Kippur War highlighted one of the biases that Kahneman and Tversky regularly wrote about: confirmation bias. Confirmation bias is when you ignore information that conflicts with what you believe and select only the information that confirms your beliefs.

In the Yom Kippur War, Egypt and Syria were able to almost overwhelm the Israelis because the Israelis did not expect to be attacked by overwhelming force. Although the Arab states did launch a surprise attack, it should not have been a surprise. Both Egypt and Syria mobilized huge numbers of forces (which was visible to the Israelis), while multiple intelligence sources and even the leader of Jordan warned the Israelis an attack was imminent. It was confirmation bias, however, that kept the Israelis from believing they would be attacked and preparing for it (until the last minute).

First, the Israelis ignored any information that did not support their theory that they would not be attacked. Rabinovich writes, “Eleven warnings of war were received by Israel during September from well-placed sources. But [Head of Military Intelligence] Zeira continued to insist that war was not an Arab option. Not even [Jordan’s King] Hussein’s desperate warning succeeded in stirring doubts.”

Explaining away every piece of information that conflicted with their thesis, they embraced any wisp that seemed to confirm it. The Egyptians claimed they were just conducting exercises, while the Syrian maneuvers were discounted as defensive measures. Fed by this double illusion (an Egyptian exercise in the south and Syrian nervousness in the north), Israel looked on unperturbed as its two enemies prepared their armies for war in full view. Rabinovich writes, “the deception succeeded beyond even Egypt’s expectations because it triggered within Israel’s intelligence arm and senior command a monumental capacity for self-deception. ‘We simply didn’t feel them capable [of war].’”

As I mentioned above, examples of decision-making flaws were abundant on both sides, and Egypt also suffered greatly because of confirmation bias. When Israel began the counterattack that eventually led to the encirclement of the Third Army, Egyptian President Sadat only looked at data that supported his hypothesis. Given the blow the Israelis had received at the start of the war and the fact that they were heavily engaged on the Syrian front, the Egyptians were thinking in terms of a raid, not a major canal crossing. An early acknowledgement of the Israeli activity could have stemmed the attack and possibly left the Egyptians in the superior position, but they only saw what they wanted to see.

Impact on your life

I come across confirmation bias almost weekly in the business world. One example you often see in the game space is when a product team is looking to explain either a boost in performance or a setback. If the numbers look good, they will often focus on internal factors, such as a new feature, and “confirm” that this development has driven KPIs. If metrics deteriorate, they will often focus on external factors, maybe a higher share of Brazilian players, that confirm the problem is outside of their control. These examples of confirmation bias often lead to long delays in identifying and dealing with problems, or to shifting too many resources to reinforce features that do not have an impact.
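A standard antidote to this kind of post-hoc storytelling is to hold back a cohort that never saw the feature and compare movement in the two groups (a difference-in-differences check). Here is a minimal sketch of that idea; the function name and the retention numbers are illustrative, not from any real product:

```python
def feature_lift(treated_before, treated_after, holdout_before, holdout_after):
    """Difference-in-differences: how much of the treated cohort's KPI
    change is NOT explained by whatever also moved the holdout cohort."""
    treated_change = treated_after - treated_before
    holdout_change = holdout_after - holdout_before
    return treated_change - holdout_change

# Day-7 retention (%) before and after a hypothetical feature shipped.
# The treated group improved by 4 points, but the holdout (which never
# saw the feature) improved by 3.5 points on its own.
lift = feature_lift(treated_before=20.0, treated_after=24.0,
                    holdout_before=20.0, holdout_after=23.5)
print(lift)  # 0.5 -- most of the movement happened in the holdout too
```

If the holdout moved by roughly the same amount as the treated group, the “lift” was probably an external factor (seasonality, a marketing push, a shift in player mix), and confirming the feature story would be exactly the bias described above.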

Not acknowledging or seeking reality

Another major decision making flaw that the Yom Kippur War highlights is avoiding reality. One of the leading Israeli commanders did not venture out of his bunker and relied on his own pre-conceptions of what was going on rather than the actual situation. Rabinovich writes that “although he was only a short helicopter trip from the front, [General] Gonen remained in his command bunker at Umm Hashiba, oblivious to the true situation in the field and the perceptions of his field commanders. As an Israeli analyst would put it, Gonen was commanding from a bunker, rather than from the saddle.”

On the Egyptian side, to avoid panic, the Egyptian command had refrained from issuing an alert about the Israeli incursion. Thus, the Israeli forces were able to pounce on unsuspecting convoys and bases. There had been a number of clashes involving Israeli tanks and the paratroopers but no one in Cairo—or Second Army headquarters—was fitting the pieces together.

Thus, rather than successfully defending against the Israelis, the Egyptians left their troops blind to what was happening.

Impact on your life

If your game or product is not performing, you need to understand what is really happening. I have often seen products soft launched in tier-three markets that show poor KPIs. Rather than reporting these KPIs to leadership, the team will proceed with the real launch in tier-one markets. This prevents the product team from fixing the product and also wastes money on a failed launch.

Assuming the past is the same as the present

Another decision-making bias demonstrated in the Yom Kippur War was assuming the past would repeat. As I wrote earlier, the Israelis assumed the Arabs would fight poorly because they had in previous wars, including the Six-Day War in 1967, when Israel routed the Arab states. They thus did not prepare their forces for a different type of opponent or different weaponry.

This bias also contributed to their failure to realize they would be attacked imminently. When General Shalev, assistant to Israel’s Commander in Chief, was warned of a likely attack, he reminded the so-called alarmist that he had said the same thing during a previous alert in the spring: “you’re wrong this time too.” Because a previous alert had been wrong, the Israeli high command discounted a clear danger.

Impact on your life

In the game space, you frequently see decisions made by looking in the rear-view mirror. I have seen many executives decide to make a type of game – first person shooter, invest express sim, tower defense, etc. – because these are the hot genres. Then, when their game comes to market and fails, they do not understand why they always seem to be behind the trends.

Key takeaways

  • The Yom Kippur War provides examples of key errors in decision-making, by both sides, that can be leveraged in business.
  • One of the key learnings is that over-confidence can be fatal. Underestimating your competition because you have dominated them can allow them to gain a superior position.
  • Another key error in decision making is confirmation bias, picking out the information that confirms what you want to believe and disregarding the data that conflicts with your hypothesis.

Author: Lloyd Melnick | Posted on June 14, 2017 (last modified June 6, 2017) | Categories: General Social Games Business, General Tech Business, thinking fast and slow | Tags: confirmation bias, over-confidence, thinking fast and slow, Yom Kippur War

Thinking, Fast and Slow, Part 3: The Invisible Gorilla

I have written several times about the work of Kahneman and Tversky, highlighted in the book Thinking, Fast and Slow, and how helpful it is in understanding decision-making and consumer behavior. One of the most enlightening experiments Kahneman describes, the Invisible Gorilla experiment (conducted by Christopher Chabris and Daniel Simons), shows the difference between tasks that require mental focus and those we can do in the background.


The Invisible Gorilla experiment

In this experiment, people were asked to watch a video of two teams playing basketball, one in white shirts and one in black shirts (click to see the Invisible Gorilla experiment). The viewers of the film had to count the number of passes made by members of the white team while ignoring the players wearing black.

This task is difficult and absorbing, forcing participants to focus on it. Halfway through the video, a gorilla appears, crosses the court, thumps its chest and then continues across and off the screen.

The gorilla is in view for nine seconds. Half of the people viewing the video do not notice anything unusual when asked later (that is, they do not notice the gorilla). It is the counting task, and especially the instruction to ignore the black team, that causes the blindness.

While entertaining, this experiment yields several important insights:

  • One important insight is that nobody would miss the gorilla if they were not doing the task. When you are focusing on a mentally challenging task, which can be counting passes or doing math or shooting aliens, you do not notice other actions nor can you focus on them.
  • A second insight is that we do not realize the limitations we face when focused on one task. People are sure they did not miss the gorilla. As Kahneman writes, we are “blind to our blindness.”

System 1 and System 2

The Invisible Gorilla also serves as a framework to understand the two systems people use to think. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. An example of System 1 thinking would be taking a shower (for an adult), where you do not even think about what you are doing.

System 2 thinking is deliberate, effortful and orderly: slow thinking. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away.

The automatic operation of System 1 generates surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps.

Implications

Understanding System 1 and System 2 has several implications. First, if you are involved in an activity requiring System 2 thought, do not try to do a second activity requiring System 2 thought. While walking and chewing bubble gum are both System 1 for most people and can be done simultaneously, negotiating a big deal while typing an email are both System 2 and should not be done at the same time.

Second, do not create products that require multiple System 2 actions concurrently. While System 2 is great for getting a player immersed in a game, asking them to do two concurrently will create a poor experience. A third implication is when onboarding someone to your product, only expose them to one System 2 activity at a time.

Example from our world, Urbano’s Failed App

I like to use examples from the game space to illustrate how understanding Kahneman and Tversky’s work can impact your business. In this example, Urbano runs product design for a fast-growing app company at the intersection of digital and television. He has built a great sports product that allows players to play a very fun game while watching any sporting event on television. Unfortunately, Urbano’s company is running out of funds, and the next release needs to be a hit or the company will not survive. Although the product has tested well, Urbano is nervous because of the financial situation and decides to add more to the product, making gameplay depend on what happened during the past three minutes of the televised match. They launch the app, and although players initially start playing, they never come back and the product fails.

Another company buys the rights to the product and conducts a focus test. It finds that users forgot what happened on television because they were focusing on the app and then could not complete the game. The new owner takes out the part requiring attention to the televised match, and the product is a huge success. The difference was that the latter version did not require multiple System 2 activities simultaneously; it left television watching as a System 1 activity.

Key Takeaways

  1. In a famous experiment, people watching a basketball game who had to count passes one team made missed the appearance of a gorilla on the video. The experiment showed when you are focusing on something, you do not notice what else is happening.
  2. We are blind to things in the background. We are blind to our blindness. In the Invisible Gorilla experiment, not only did people not see the gorilla, they refused to believe that they missed a gorilla.
  3. There are two types of mental activities: System 1, which is automatic and reflexive, and System 2, which requires deliberate, effortful and orderly thinking.

Author: Lloyd Melnick | Posted on May 31, 2017 (last modified May 29, 2017) | Categories: General Social Games Business, General Tech Business, thinking fast and slow | Tags: Amos Tversky, attention span, Daniel Kahneman, decision making, Invisible gorilla, Product design, thinking fast and slow

Thinking, Fast and Slow, Part 2: Why we should care about judgment errors

As promised last month, I will spend a few blog posts summarizing Thinking, Fast and Slow by Daniel Kahneman. Before diving into the heuristics, biases and processes that he and his colleague Amos Tversky identified, it is important to understand why he wrote the book and why it is so useful. Fortunately, he largely does this in his introduction so it is a great place to start.

Let’s not beat ourselves up but make ourselves better

First, Kahneman points out that the goal of his research is not to prove we are idiots but to help us minimize bad decisions. Understanding flaws in human decision-making is no more insulting or denigrating than writing about diseases in a medical journal belittles good health. Rather, our decision-making is generally quite good, most of our judgments are appropriate most of the time, but there are systemic biases that if we understand can make our decision-making more effective.

By understanding Kahneman’s work, you will be better able to identify and understand errors of judgment and choice, in others and then yourself. As Kahneman points out, “an accurate diagnosis may suggest an intervention to limit the damage that bad judgments and choices often cause.”

Is our judgment flawed?

At its roots, Kahneman began his career trying to determine if people consistently made biased judgments. Long story short, we do.

One example drove this determination home to Kahneman and Tversky, and it probably will for you also. Early in their collaboration, both Kahneman and Tversky realized they made the same “silly” prognostications about the careers that toddlers would pursue when they became adults. They both knew an argumentative three-year-old and felt it was likely he would become a lawyer, that the empathetic and mildly intrusive toddler would become a psychotherapist, and that the nerdy kid would become a professor. They, both smart academics, neglected the baseline data (very few people become psychotherapists, professors or even lawyers compared with other professions) and instead believed the stories in their heads about who ended up in what careers were accurate. This realization, that we are all biased, drove their ensuing research.

How understanding Kahneman’s work impacts the world

The broad impact of Kahneman and Tversky’s work drives home its importance to everyone. When they published their first major paper, it was commonly accepted in academia that:

  1. People are generally rational, and their thinking is normally sound
  2. Emotions such as fear, affection and hatred explain most of the occasions on which people depart from rationality

Not only did these two assumptions drive academia (particularly economics and social sciences) but also their acceptance often drove business and government decisions. The work laid out in Thinking, Fast and Slow, however, disproved these two assumptions and thus drove entirely different decisions to generate strong results.

Scholars in a host of disciplines have found it useful and have leveraged it in other fields, such as medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics and military strategy. Kahneman cites an example from the field of Public Policy. His research showed that people generally assess the relative importance of issues by the ease with which they are retrieved from memory, and this is largely determined by the extent of coverage in the media. This insight now drives everything from election strategy to understanding (and countering) how authoritarian regimes manipulate the populace.

Kahneman and Tversky were also careful to ensure the subjects of their experiments were not simply university students. By using scholars and experts as subjects, they gave thought leaders an unusual opportunity to observe possible flaws in their own thinking. Having seen themselves fail, these experts became more likely to question the dogmatic assumption, prevalent at the time, that the human mind is rational and logical. I found the same myself and am confident that you will also. The idea that our minds are susceptible to systematic errors is now generally accepted.

Why it is called Thinking, Fast and Slow


While Kahneman and Tversky’s early work focused on our biases in judgment, their later work focused on decision-making under uncertainty. They found systemic biases in our decisions that consistently violated the rules of rational choice.

Again, we should not discount our decision-making skills. There are many examples of experts who can quickly make critical decisions, from a chess master who can identify the top 20 next moves on a board as he walks by, to a fireman knowing what areas to avoid in a burning building.

What Kahneman and Tversky identified, though, is that while this expertise is often credited to good decision making, it is more a matter of retrieving information from memory. The situation serves as a cue or trigger for the expert to retrieve the appropriate answer.

This insight helps us avoid a problem where our experience (which we consider intuition) does not actually help but hinders. In easy situations, intuition works. In difficult ones, we often answer the wrong questions. We answer the easier question, often without noticing the substitution.

If we fail to come to an intuitive solution, we switch to a more deliberate and effortful form of thinking. This is the slow thinking of the title. Fast thinking covers both expert intuition and heuristics.

Example from our world, The Allan Mistake

Many of my readers are experienced “experts” from the mobile game space, so I will start with a hypothetical example that many of us can relate to. In this example, Allan is the GM of his company’s puzzle game division. He has been in the game industry over twenty years and has seen many successful and failed projects. The CEO, Mark, comes to Allan and says they are about to sign one of three celebrities to build a game around.

Allan knows the demographics of puzzle players intimately and identifies the one celebrity who is most popular with Allan’s target customers. Nine months later they launch the game and it is an abysmal failure. Allan is terminated and wonders what he did wrong.

Allan then looks over his notes from when he read Thinking, Fast and Slow and realizes his fundamental mistake. When Mark came to him and asked which celebrity to use, Allan took the easy route and analyzed the three celebrity options. He did not tackle the actual question: whether it was beneficial to use a celebrity for a puzzle game at all, and only if the answer was yes, to pick among the three. If he had answered the more difficult question (difficult also because it would have set him against Mark), he would have found that celebrity puzzle games are never successful, regardless of the celebrity. Although it may have created tension with Mark at the time, he probably would have been given an opportunity to create a game with a higher likelihood of success and would still be in his position.

Key takeaways

  • Our decision making is not bad, but by understanding our systemic biases we can make it more effective.
  • Understanding that people are not regularly rational and that this irrationality is not driven by emotion allows us to make better decisions in fields as diverse as medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics and military strategy.
  • Fast thinking refers to quick decisions and judgments based on our experience while slow thinking is the analysis of difficult questions.

Author: Lloyd Melnick | Posted on May 3, 2017 (last modified May 11, 2017) | Categories: General Social Games Business, General Tech Business, thinking fast and slow | Tags: Amos Tversky, Daniel Kahneman, decision making, thinking fast and slow

Thinking, Fast and Slow, Part 1: The Linda Problem

I recently finished Michael Lewis’ most recent book, The Undoing Project: A Friendship that Changed the World and it motivated me to revisit Daniel Kahneman’s Thinking, Fast and Slow. Lewis’ book describes the relationship between Daniel Kahneman and Amos Tversky, two psychologists whose research gave birth to behavioral economics, modern consumer behavior theory and the practical understanding of people’s decision making. He explains the challenges they faced and the breakthroughs that now seem obvious.

As I mentioned, The Undoing Project reminded me how important Kahneman’s book was; it is probably the most important book I have ever read. It has helped me professionally, both in understanding consumer behavior and in making better business decisions. It has helped me in my personal life, again through better decision making in everything from holiday choices to career moves. It even helps explain the election of Donald Trump or how the situation in North Korea has developed.

In The Undoing Project, two things drove home the importance of Kahneman’s work. First, despite being a psychologist, Kahneman won the Nobel Prize for Economics in 2002. It is difficult enough to win a Nobel Prize (I’m still waiting for the call), but to do it in a field that is not your practice is amazing. The second item that proved the value of Kahneman’s (and his colleague Amos Tversky’s) work was the Linda Problem. I will discuss this scenario later in this post, but the Linda Problem showed how people, myself included, do not make rational decisions. It convinced the mainstream that people, including doctors and intellectuals, consistently made irrational decisions.

Despite the value I derived from Thinking, Fast and Slow, I never felt I learned all I could from it. I found it very difficult to read, the exact opposite of a Michael Lewis book, and did not digest all the information Kahneman provided. Even when I recommended the book to friends, I often caveated the recommendation with a warning that it would be hard to get through.

Given the importance of Kahneman’s work and the challenge I (and probably others) have had in fully digesting Thinking, Fast and Slow, I will be writing a series of blog posts, each one summarizing one chapter of Kahneman’s book. I hope you find it as useful as I know I will.

The Linda Problem

As discussed above, the Linda Problem is the research by Kahneman and Tversky that largely proved people thought irrationally, or at least did not understand logic. While I normally like to paraphrase my learnings or put them into examples relevant for my audience, in this case it is best to show the relevant description from The Undoing Project, as the Linda Problem was a scientific study that I do not want to misrepresent:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Linda was designed to be the stereotype of a feminist. Danny and Amos asked: To what degree does Linda resemble the typical member of each of the following classes?

  1. Linda is a teacher in elementary school.
  2. Linda works in a bookstore and takes Yoga classes.
  3. Linda is active in the feminist movement.
  4. Linda is a psychiatric social worker.
  5. Linda is a member of the League of Women Voters.
  6. Linda is a bank teller.
  7. Linda is an insurance salesperson.
  8. Linda is a bank teller and is active in the feminist movement.

Danny [Kahneman] passed out the Linda vignette to students at the University of British Columbia. In this first experiment, two different groups of students were given four of the eight descriptions and asked to judge the odds that they were true. One of the groups had “Linda is a bank teller” on its list; the other got “Linda is a bank teller and is active in the feminist movement.” Those were the only two descriptions that mattered, though of course the students didn’t know that. The group given “Linda is a bank teller and is active in the feminist movement” judged it more likely than the group assigned “Linda is a bank teller.” That result was all that Danny and Amos [Tversky] needed to make their big point: The rules of thumb people used to evaluate probability led to misjudgments. “Linda is a bank teller and is active in the feminist movement” could never be more probable than “Linda is a bank teller.” “Linda is a bank teller and active in the feminist movement” was just a special case of “Linda is a bank teller.” “Linda is a bank teller” included “Linda is a bank teller and activist in the feminist movement” along with “Linda is a bank teller and likes to walk naked through Serbian forests” and all other bank-telling Lindas.

One description was entirely contained by the other. People were blind to logic. They put the Linda problem in different ways, to make sure that the students who served as their lab rats weren’t misreading its first line as saying “Linda is a bank teller NOT active in the feminist movement.” They put it to graduate students with training in logic and statistics. They put it to doctors, in a complicated medical story, in which lay embedded the opportunity to make a fatal error of logic. In overwhelming numbers doctors made the same mistake as undergraduates.

The fact that almost everyone made the same logical mistake shows how powerful this insight is. It proves that our judgment, and thus our decision making, is often not logical and contains systematic flaws. This understanding helps explain many things in life and business that otherwise do not seem to make sense.
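The logical rule the students and doctors violated is the conjunction rule: a joint event can never be more probable than either of its components. A quick simulation makes this concrete (a hypothetical sketch for illustration only; the trait probabilities are made up and are not from Kahneman and Tversky's study):

```python
import random

random.seed(0)

# Simulate many hypothetical people, each with two independent traits.
# The probabilities below are invented purely for illustration.
N = 100_000
teller = 0                  # count of "bank teller" Lindas
teller_and_feminist = 0     # count of "bank teller AND feminist" Lindas

for _ in range(N):
    is_teller = random.random() < 0.05      # assumed P(bank teller)
    is_feminist = random.random() < 0.60    # assumed P(feminist)
    if is_teller:
        teller += 1
        if is_feminist:
            teller_and_feminist += 1

# Conjunction rule: the joint event is a subset of each component,
# so its frequency can never exceed either component's frequency.
assert teller_and_feminist <= teller
print(f"P(teller & feminist) ~ {teller_and_feminist / N:.4f} "
      f"<= P(teller) ~ {teller / N:.4f}")
```

However you set the two probabilities, the assertion always holds, because every "bank teller and feminist" is, by definition, also counted among the bank tellers, which is exactly why the typical answer to the Linda Problem is impossible.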


The implications

Once you understand how our judgment is biased, it can help you make better decisions. It can also provide insights into how your customers view different options and why people behave as they do. In future posts, I will explore all of Kahneman and Tversky’s major findings and how they apply.

Key Takeaways

  • In The Undoing Project, Michael Lewis writes about the relationship and research of Daniel Kahneman and Amos Tversky, two psychologists who changed the way we understand decision making.
  • The Linda Problem proved to skeptics that people make illogical judgments. When given a story about a fictional person and then potential careers for that person, virtually everyone (from students to very successful professionals) judged a description that was a subset of a broader description as more likely, which is logically impossible.
  • By understanding how people make judgments and decisions, we can improve our own decision making process and better understand our friends, family and customers.

Author Lloyd Melnick | Posted on April 19, 2017 (updated April 20, 2017) | Categories: General Social Games Business, General Tech Business, Lloyd's favorite posts, thinking fast and slow | Tags: Amos Tversky, Daniel Kahneman, decision making, Linda Problem, michael lewis, thinking fast and slow

