
The Business of Social Games and Casino

How to succeed in the mobile game space by Lloyd Melnick

Tag: goals

How Operations Analytics can help online gaming companies

I recently took an online course from Wharton on Operations Analytics, and it was valuable precisely because it was aimed at traditional businesses, not game companies. Social game companies are great at player, marketing and customer analytics, to the point where it is hard to eke out a competitive advantage using analytics; everyone is so good that it is largely the cost of doing business. To gain an edge, it is sometimes useful to look outside the space, learn best practices from other industries, even ones considered less sophisticated than online gaming, and apply those insights to our business.


While Operations Analytics is a focus for retailers and bricks-and-mortar businesses, it often does not make it onto the radar of social, mobile and iGaming companies. The foundational problem in the field is even called the Newsvendor Problem, a name clearly rooted in a very traditional business. After taking the Operations Analytics course, I realized this field has many useful applications for gaming companies.

Forecasting demand and allocating resources

I have worked at many large gaming companies, including several that were public, and, despite these companies' reliance on analytics, financial forecasting was at best a guess and at worst a hope. A retailer, especially a land-based one, lives or dies by its forecasts. Order too much and you are left with potentially worthless inventory. Order too little and you miss out on profits that could be used to market, grow or sustain the business through seasonal periods.

Accurate forecasting is critical to making valid decisions today (hiring, investing, M&A, etc.), and it is as critical for online gaming companies as it is for bricks-and-mortar businesses. For example, a good forecast will help you hire the right-sized Customer Support team. It will also help you optimize cash flow, manage your marketing expense, seek financing preemptively, determine when to launch new content and features, and more.

The first key lesson from Operations Analytics related to forecasting is that point forecasts are usually wrong. Point forecasts are unreliable not only in business but in all areas of life: the chance of accurately predicting the amount of rainfall next month is close to zero.

Instead, a good forecast should be a range showing the likely outcomes. When modeling an uncertain future, you build probability distributions based on past data (possibly supplemented by expert estimates). You can then put likelihoods on different scenarios, such as low, normal and high. When looking at potential sales for an online game (or your portfolio of games), which is effectively a continuous distribution of outcomes (like the amount of rainfall), you want to group the scenarios into a few ranges rather than try to enumerate every individual one.

One of the core ways of using existing data is to base forecasts on moving averages, a practice I recently implemented and have found quite useful. A moving average is a forecast based on the average of the n most recent observations. In my case, we have created much more accurate predictions by looking at the past 28 days of data (so n = 28) and taking both the mean and the standard deviation. It is not a perfect tool, though: it misses trend breaks (e.g., a pandemic) and does not show causation.
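
To make this concrete, here is a minimal sketch of such a forecast in Python with pandas. The synthetic revenue series and the roughly ±2 standard deviation range (which treats the window as approximately normal) are my own illustrative assumptions, not something prescribed by the course.

```python
# A minimal sketch of a 28-day moving-average forecast expressed as a range.
import pandas as pd

def moving_average_forecast(daily: pd.Series, n: int = 28):
    """Forecast the next day as the mean of the last n observations,
    with a standard deviation so the forecast is a range, not a point."""
    window = daily.tail(n)
    return window.mean(), window.std()

daily_revenue = pd.Series(
    [100, 103, 98, 110, 95, 102, 107] * 5,                 # 35 days of made-up revenue
    index=pd.date_range("2020-06-01", periods=35, freq="D"),
)
mean, std = moving_average_forecast(daily_revenue, n=28)
print(f"Forecast: {mean:.1f} +/- {2 * std:.1f} per day (~95% range)")
```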

While finding the mean and standard deviation of past data is a good foundation for your forecast, sophisticated companies integrate additional sources of information. These may include an aggregation of forecasts (for example, one from each of your product managers), customer surveys (how much customers expect to change their purchasing behavior), a jury of executive opinion (what your leadership team thinks) or the Delphi Method (individual opinions that are compiled and reconsidered repeatedly until there is consensus).

Smoothing data for seasonality

Another common practice in operations analytics that is not always used in online gaming is smoothing for seasonality. While game companies are keenly aware that sales (and other KPIs) are affected by the day of the week or the hour of the day (referred to as seasonality), some companies are better than others at adjusting for it.


Most game companies realize these differences and will not compare Saturday revenue with Sunday revenue. Instead, they look week on week (WoW) and compare Saturday with the previous Saturday. The weakness in this approach is that a lot happens in seven days, so you lose very useful information: rather than seeing trends immediately, you are dealing with a seven-day delay.

An approach that avoids this problem is to remove seasonality from the numbers. You do this by looking at the historical numbers, creating a sample mean, finding the seasonal averages (if you are looking at day of the week, you would have a mean for Mondays, a mean for Tuesdays, etc.), dividing each seasonal average by the sample mean to create a seasonal factor, and then dividing each observation by its seasonal factor. By using seasonal factors, you can identify any big deviation immediately and not have to wait a week to learn whether your game is broken.
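
Here is a minimal sketch of that day-of-week adjustment in pandas; the synthetic numbers and the weekly pattern are assumptions for illustration only.

```python
# A minimal sketch of day-of-week deseasonalization.
import pandas as pd

def deseasonalize(daily: pd.Series) -> pd.Series:
    """Divide each observation by its day-of-week seasonal factor."""
    sample_mean = daily.mean()
    weekday = daily.index.dayofweek
    # Seasonal factor = (mean for that weekday) / (overall sample mean)
    seasonal_factor = daily.groupby(weekday).transform("mean") / sample_mean
    return daily / seasonal_factor

daily_revenue = pd.Series(
    [90, 95, 92, 100, 130, 180, 170] * 4,   # weekends run much higher (made up)
    index=pd.date_range("2020-06-01", periods=28, freq="D"),
)
adjusted = deseasonalize(daily_revenue)
# Now Saturday can be compared directly with Friday: a big move in the
# adjusted series flags a real change, not normal weekly seasonality.
print(adjusted.round(1).tail(7))
```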

Projecting new game launches and other high uncertainty situations

While online game companies are not always sophisticated in modeling their forecasts, things get much worse when projecting for new games. Some companies rely on the wishful-thinking strategy, projecting what revenue will be if the game is a hit (despite the fact that fewer than 20 percent of new games are successful). Others take the opposite tack, assuming no revenue from new projects until they already have data. Neither approach provides useful information for planning. They either muddy the water so that people do not take the financials seriously, or provide misleading data that leads to allocating too many or too few resources (such as hiring support staff or preserving marketing budget).

While there is significant variance in potential outcomes for new games, there are methods to create a usable range for forecasting purposes. Given that you have limited actual demand data (perhaps from a soft or beta launch) or none at all, you need to augment it with other sources:

  • Sales of similar games. How comparable products have performed at launch and over time.
  • Composites. Estimates from the product team or marketing.
  • Customer surveys. Responses from target customers to market research.
  • Jury of executive opinion. Estimates from your executive or leadership team.
  • Delphi method. Individual opinions are compiled and reconsidered, repeatedly, until overall group consensus is (hopefully) reached.

You should then refine your estimate based on past experience. Look at previous estimates and compare them to actuals. If you have historically been 5x too optimistic or 20 percent too pessimistic, adjust your estimate to reflect that bias. Also use the historical data to calculate a standard deviation, so you can create a forecast range of potential values rather than a specific number. Then incorporate this range into your planning.
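
A small sketch of that calibration step, assuming you have a handful of past launch forecasts and their actual results (all numbers below are invented):

```python
# Calibrate a new estimate against historical forecast error.
import statistics

past_estimates = [500, 1200, 300, 800]   # forecasted first-year revenue ($k)
past_actuals   = [150, 400, 240, 260]    # what those launches actually delivered ($k)

# Ratio of actual to estimate shows how optimistic or pessimistic you have been.
ratios = [a / e for a, e in zip(past_actuals, past_estimates)]
bias = statistics.mean(ratios)      # ~0.44 here, i.e. historically ~2.3x too optimistic
spread = statistics.stdev(ratios)

new_estimate = 1000                  # raw estimate for the next launch ($k)
adjusted = new_estimate * bias
low, high = new_estimate * (bias - spread), new_estimate * (bias + spread)
print(f"Adjusted forecast: {adjusted:.0f}k, range {low:.0f}k to {high:.0f}k")
```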

A more advanced method for estimating a new product launch, or anything else with a high level of uncertainty, is simulation. You can build a simulation of a new product launch using a third-party tool (StatPlus is one such tool) coupled with the projections above. You can then see 100, 1,000 or more likely outcomes, drawn from a distribution similar to past product launches. The simulation will not only give you a range of likely outcomes but also help you identify best- and worst-case scenarios, so you can accurately gauge risk and reward. You can then optimize your internal resource allocation (e.g., hiring new support agents or performance marketers) by optimizing for reward while using your risk measures as a constraint.
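
If you prefer code to a spreadsheet add-in, a rough Monte Carlo sketch in Python might look like the following; the lognormal shape and its parameters are assumptions you would replace with values fitted to your own past launches.

```python
# A rough Monte Carlo simulation of first-year revenue for a new game launch.
import numpy as np

rng = np.random.default_rng(seed=7)

n_runs = 10_000
# Assumption: outcomes ($k) are lognormally distributed around the adjusted
# estimate, with the spread taken from historical launch results.
simulated = rng.lognormal(mean=np.log(400), sigma=0.8, size=n_runs)

p10, p50, p90 = np.percentile(simulated, [10, 50, 90])
print(f"Median outcome: {p50:.0f}k")
print(f"Downside (P10): {p10:.0f}k, upside (P90): {p90:.0f}k")
print(f"Chance of missing 200k: {(simulated < 200).mean():.0%}")
```

The P10/P90 spread is the risk measure you can then use as a constraint when deciding how much to invest behind the launch.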

Focused goals


For me, the most valuable insight from operations analytics is that you should only have one goal. When creating an optimization model, you optimize a single objective (revenue, profit, cost, etc.) by changing multiple decision variables within constraints. It ends up being a, sometimes complex, algebraic model with a solution. Mathematics like this, however, can also have a broader application.


I will never forget an executive committee meeting several years ago where a new CEO said we had two goals: grow top-line revenue like a growth company and grow margin. At the time, several people asked which was more important, and the CEO responded that both were; we needed to optimize for both. Fast forward several years and the company wallowed in mediocrity, without significant organic revenue or margin growth.

So what does that story have to do with the math behind optimization models? In an optimization model you can only optimize for one objective. You can optimize for profitability or revenue, but if you try to optimize for both, Excel will not let you; its Solver accepts only a single objective. Instead, you optimize for one and set the other as a constraint (optimize for growth with margin >= 20%). This mathematical reality is also a strategic reality: when setting your objective, do not set multiple objectives. Make the hard decision about what is most important and then treat the others as constraints when building your strategy.
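
To make the idea concrete, here is a minimal sketch using scipy.optimize.linprog: maximize revenue across two hypothetical product lines while treating a 20 percent blended margin as a constraint. The lines, margins and revenue caps are invented so that the constraint actually binds.

```python
# Optimize one objective (revenue), treat the other goal (margin) as a constraint.
from scipy.optimize import linprog

# Decision variables: revenue (in $k) from product line 1 and product line 2.
# Assumed margins: line 1 earns 10%, line 2 earns 30%.
# Objective: maximize total revenue r1 + r2 (linprog minimizes, so negate).
c = [-1.0, -1.0]

# Constraint: blended margin >= 20%
#   0.10*r1 + 0.30*r2 >= 0.20*(r1 + r2)  ->  0.10*r1 - 0.10*r2 <= 0
A_ub = [[0.10, -0.10]]
b_ub = [0.0]

# Hypothetical revenue caps per line (market size / capacity).
bounds = [(0, 800), (0, 500)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
r1, r2 = res.x
print(f"Line 1: {r1:.0f}k, Line 2: {r2:.0f}k, total: {r1 + r2:.0f}k, "
      f"blended margin: {(0.10 * r1 + 0.30 * r2) / (r1 + r2):.0%}")
```

Without the margin constraint, the solver would max out the low-margin line and blended margin would fall below 20 percent; with the constraint, the low-margin line is held back until the high-margin line balances it out.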

Key takeaways

  • While Operations Analytics is primarily a focus in retail and traditional businesses, there are many best practices that iGaming and social game companies can leverage.
  • Forecasting is central to generating and earmarking resources but is often a challenge for game companies. Rather than trying to create a point forecast, create a range based on moving averages and the standard deviation. For new products, create a simulation that shows the distribution of potential outcomes and the possible risks and rewards.
  • You need one, and only one, distinct goal and then optimize your strategy around that goal; it’s impossible to optimize for multiple goals. Use constraints to incorporate what used to be additional goals.

Author: Lloyd Melnick | Posted on August 12, 2020 | Categories: Analytics, General Social Games Business, General Tech Business, Social Games Marketing | Tags: Forecasting, goals, Operations Analytics, seasonality

The lesson that Roger Bannister taught us

Roger Bannister’s sub-4-minute mile in 1954 is not only inspiring but a critical lesson for business people. Until Bannister, nobody had been able to run the mile in less than 4 minutes, despite recorded attempts going back over 1,000 years and very serious efforts starting in 1886. Before Bannister’s feat, many argued and believed that the human body was simply unable to run a mile in less than 4 minutes, that it was physically impossible. It was considered the Holy Grail of sports, with the media and crowds constantly looking for someone who could achieve this superhuman feat.


The lesson comes not from how Bannister achieved this apparently miraculous accomplishment, but from what happened next. While nobody else had been able to break the 4-minute mark despite hundreds of years of effort, just 46 days after Bannister’s feat John Landy ran the mile in 3 minutes 58 seconds. About a year later, three more runners broke the 4-minute threshold in the same race.

The lesson

The lesson to draw from Bannister’s achievement, and what followed, is that what you consider impossible may not be. Most importantly, once you are able to achieve the impossible, it becomes the new baseline and even more is possible. The key is breaking the barrier, overcoming the impossible.

Recently, I saw a successful company break the 4-minute mile and reach a new level of performance. It hit what had for years been an impossible milestone. Breaking its personal four-minute mile increased long-term profitability by 30 percent.

The company achieved this result by focusing the efforts of multiple teams on creating a super-revenue day. All parts of the company designed a plan to create one huge day, the day that was effectively their 4-minute mile (though they had not come as close to it as Bannister’s competitors had). The result was not only the highest revenue day in the company’s history, but revenue previously considered unattainable.

Like the 4-minute mile, once the company had broken through this barrier, it was amazed to see the goal surpassed repeatedly, without even the need for special initiatives. What was once a ceiling is now turning into a floor.

Do the impossible, and then more

What the 4-minute mile and this company’s success show is that there is tremendous value in tackling what looks impossible. The ceiling could be 100,000 daily active users, a 5 percent conversion rate or a $10 million month: a goal that would impact you significantly and set your company up for a brighter future. If you find a way to overcome it, not only do you derive immediate value but, more importantly, you can change your long-term trajectory.

Key takeaways

  • In 1954, Roger Bannister ran the first sub-4-minute mile, a goal many had considered humanly impossible for hundreds of years.
  • Once Bannister broke the barrier, somebody else broke it less than two months later and then three runners achieved the feat in the same race about a year later.
  • In business, too, once you find a way to overcome an impossible barrier you are likely to find that you can do it repeatedly and create a new floor.

Author: Lloyd Melnick | Posted on February 25, 2020 | Categories: General Social Games Business, General Tech Business, Growth | Tags: 4-minute mile, goals

How to build a Leaderboard that actually works and drives KPIs

Leaderboards are a common feature in games, but developers are often surprised when they prove ineffectual or quickly lose impact. The problem is not the underlying value of leaderboards but how they are often designed. A recent blog post by Omar Ganai and Steven Ledbetter, How to Motivate with Leaderboards, does a great job of presenting the psychology driving leaderboards and the associated best practices.

What makes leaderboards work

The key principle behind leaderboards is that people want to win, and winning improves status. What is often neglected, however, is that some players do not want to win so much as they want to avoid losing. The distinction matters because players who are focused on avoiding a loss perform worse when competing. Competition is good for motivation and achievement only when it helps users feel competent, so you need to design your leaderboards so they do not make players feel incompetent.

Ganai and Ledbetter point out that self-determination theory shows people seek out and engage in undertakings that fulfill three basic needs. A well-designed leaderboard is thus consistent with these three needs:

  1. Competence. The feeling a player gets when they successfully complete a challenging goal. The opposite is feeling ineffective or helpless.
  2. Relatedness. The feeling a player has when they are understood and liked by other players. The opposite is rejection and disconnection.
  3. Autonomy. The satisfaction a player gets from personal commitment and choice. The opposite is coercion and manipulation.

An effective leaderboard will combine competence, relatedness and autonomy while not making the player feel helpless, disconnected or manipulated.

Best practices in designing leaderboards

To design a leaderboard that drives behavior and incorporates the three needs above, the authors point to four “ingredients”:


  • Goal-setting. Goal-setting involves giving or guiding a user toward a goal, and it is recommended as an effective building block for behavior change. The implicit goal of most leaderboards is to be number one. You need to go beyond this implicit goal and guide your player toward a goal. Effective goals include having fun, learning and exercising autonomy. The authors also recommend nesting intrinsic goals within extrinsic goals, like making yourself a better poker player by competing on the leaderboard. Finally, an effective technique nests individual goals inside team goals, so the leaderboard becomes more about playing with others than about being number one.

    There are also some goals you should avoid because they prove demotivating. These include meaningless rewards (get more worthless points by finishing number one), emphasizing outcomes players cannot control, and appealing to pride (e.g., you should win because only the smartest win).

  • Feedback. A strong feedback mechanic can promote feelings of mastery and competence. You should give players feedback on how they are progressing against the goals above. The authors also suggest providing juicy feedback: “juicy feedback is varied, unexpectedly excessive sensual positive feedback on small user actions and achievements.”
  • Social comparison. Social comparison helps players understand how they are doing compared with others, and rankings are inherently a form of social comparison. The trick is doing it right, because social comparison can make people feel ineffective and disconnected. People tend to compare themselves with those above them, so it is easy for them to end up feeling incompetent.

    There are techniques to mitigate the risks of social comparison. First, you can tell players they have achieved a standard, even if they did not finish first. Second, explain why players got the score they did and how they can do better. Third, give players the choice of playing more or stopping (putting them in control). Finally, acknowledge that losing is not fun. If you keep players focused on improving and playing well, they are likely to stay engaged. And if they lose as part of a team, the impact of the loss is not as great, which is why it is critical to emphasize connections and relationships.

  • Social rewards. Just as Facebook uses the Like button, let other players reward a player for their activity. You can achieve this by letting them follow the player or simply send a virtual high-five. It also helps to make the rewards surprising, as predictable rewards undermine intrinsic motivation.

What to do

Rather than avoiding leaderboards, build them, but build them correctly. If you take a lazy approach and just rank players from 1 to one million, the leaderboard will not work well and its impact will diminish over time. If you take the time, however, to set up effective goal-setting, provide good feedback, employ social comparison thoughtfully and offer strong social rewards, you will have a winning feature and move up the App Store leaderboard yourself. A rough sketch of what the non-lazy version might look like in code follows below.
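
As a purely illustrative sketch (the bracket size, the Player fields and the feedback copy are my assumptions, not something from Ganai and Ledbetter), a leaderboard built on these principles might group players into small, similar-skill brackets and frame feedback around progress rather than absolute rank:

```python
# Bracketed leaderboard sketch: small peer groups plus progress-framed feedback.
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    skill: float    # e.g. trophies, rating, or last week's score
    score: int = 0  # this week's score

def build_brackets(players, bracket_size=50):
    """Group players of similar skill into small brackets instead of one global list."""
    ranked = sorted(players, key=lambda p: p.skill, reverse=True)
    return [ranked[i:i + bracket_size] for i in range(0, len(ranked), bracket_size)]

def feedback(player, bracket):
    """Frame results as progress against a small peer group, not a global rank."""
    beaten = sum(1 for p in bracket if p.score < player.score)
    pct = 100 * beaten // max(len(bracket) - 1, 1)
    return f"{player.name}, you outscored {pct}% of your group this week - keep it up!"

players = [Player(f"p{i}", skill=i * 10.0, score=i % 7) for i in range(120)]
brackets = build_brackets(players)
print(len(brackets), "brackets;", feedback(players[5], brackets[-1]))
```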

Key takeaways

  1. While leaderboards are a central feature in many games, they must be built properly or they will be ineffective.
  2. A key to good design is keeping players from feeling incompetent or inferior.
  3. The other critical components of powerful leaderboards are clear goals, a strong feedback loop, social comparison and rewards that are social.

Author: Lloyd Melnick | Posted on June 19, 2018 | Categories: General Social Games Business, General Tech Business, Social Casino | Tags: Feedback, game design, goals, leaderboards

How to manage your algorithms

While everyone is focused on creating the most advanced algorithms for predictive analytics, and on optimizing their teams’ performance, I have not seen much written on how to manage your algorithms. A great article in Harvard Business Review – Algorithms Need Managers, Too by Michael Luca, Jon Kleinberg and Sendhil Mullainathan – combines the two issues and provides a solution.

The authors begin by pointing out that most businesses rely on predictions throughout their organization. These decisions range from predicting a candidate’s performance and whether to hire them, to which initiatives will have the highest ROI, to which distribution channels will yield the most sales. Companies are increasingly using computational algorithms to make these predictions more accurate.

The issue is that if the predictions are inaccurate (and although they are computer generated, they are still predictions, not facts), they can lead you into bad decisions. Netflix learned this the hard way when its algorithms for recommending movies to DVD customers did not hold up once its users moved to streaming. More relevant to digital marketers, algorithms that generate high click-through rates may actually bring in poor users with no interest in your underlying game or product. As the authors write, “to avoid missteps, managers need to understand what algorithms do well – what questions they answer and what questions they do not.”

How algorithms can lead you astray

An underlying issue when using algorithms is that they are different from people. They behave differently in two key ways:

  • Algorithms are extremely literal: they do exactly what they are told and ignore any other information. While a human would quickly understand that acquiring users who generate no revenue is useless, an algorithm built simply to maximize the number of users acquired will keep attracting worthless users.
  • Algorithms are often black boxes: they may predict accurately, but they do not tell you what is causing the outcome or why. The problem is that you do not know when information is incomplete or what information may be missing.

Once you realize these two limitations of algorithms, you can develop strategies to combat them. The authors provide a plan for managing algorithms better.


Be explicit about all of your goals

When initiating the creation of an algorithm, you need to understand and state everything you want it to achieve. Unlike people, algorithms do not understand the implicit needs and trade-offs that are often necessary to optimize performance. People understand the end goal and work backward to figure out how best to achieve it. There are also soft goals attached to most initiatives, and these goals are often difficult to measure (and thus to feed into your algorithms). There could also be a fairness goal; for example, a bank using an algorithm to optimize loan decisions may still want to provide enough loans in areas where it feels a moral obligation to do so. Another example is when you want to optimize your business unit’s sales but the resulting behavior could negatively impact the overall sales of your company.

The key is to be explicit about everything you hope to achieve. Ask everyone involved to list their soft goals as well as the primary objective. Ask people to be candid and up-front. With a core objective and a list of concerns in front of them, the algorithm’s designer can then build trade-offs into the algorithm. This process may entail extending the objective to include multiple outcomes, weighted by importance.
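
One hypothetical way to express “multiple outcomes, weighted by importance” in code; the metric names, weights and candidate values are invented, and the metrics are assumed to already be normalized to comparable scales:

```python
# Extend a single objective with weighted soft goals (illustrative only).
WEIGHTS = {
    "revenue_lift": 1.0,        # primary objective
    "retention_lift": 0.3,      # soft goal: long-term engagement
    "fairness_penalty": -0.5,   # soft goal: penalize outcomes the team flagged as unfair
}

def objective(outcome: dict) -> float:
    """Single score for the algorithm to optimize, with soft goals built in as weights."""
    return sum(weight * outcome.get(metric, 0.0) for metric, weight in WEIGHTS.items())

candidates = {
    "variant_a": {"revenue_lift": 0.9, "retention_lift": 0.1, "fairness_penalty": 0.4},
    "variant_b": {"revenue_lift": 0.7, "retention_lift": 0.6, "fairness_penalty": 0.0},
}
print(max(candidates, key=lambda name: objective(candidates[name])))  # -> variant_b
```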

Minimize myopia

Algorithms tend to be myopic: they focus on the data at hand, and that data often pertains to short-term outcomes. There can be a tension between short-term success and long-term profits and broader corporate goals. People understand this; computer algorithms do not.

The authors use the example of a consumer goods company that used an algorithm to decide to sell a fast-moving product from China in the US. Initial sales were great, but the company ended up suffering a high level of returns and negative customer satisfaction, which hurt the brand and overall company sales. I often see this problem in the game industry, where product managers find a way to increase in-app purchases in the short term but break players’ connection with the game, resulting in long-term losses.

The authors suggest that this problem can be solved at the objective-setting phase by identifying and specifying long-term goals. But when acting on an algorithm’s predictions, managers should also adjust for the extent to which the algorithm is consistent with long-term aims.

I recommend using NPS to balance short-term objectives against the long-term health of the product and company. I have written before about Net Promoter Score (NPS), which is probably the most powerful tool for measuring customer satisfaction; it is also highly correlated with growth and success. By keeping your NPS high, you give yourself a great way to look holistically at the success of specific initiatives.
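
For reference, NPS is straightforward to compute from the standard 0-10 “how likely are you to recommend us” question; a minimal sketch with made-up responses:

```python
# NPS = % promoters (9-10) minus % detractors (0-6), on a -100 to 100 scale.
def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

responses = [10, 9, 9, 8, 7, 7, 6, 5, 10, 9]   # sample survey answers
print(f"NPS: {net_promoter_score(responses):.0f}")   # -> NPS: 30
```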

Choose the right data inputs

Using the right data can make your algorithms much more effective. In a game like Candy Crush, you can tune levels by looking at when people abandon the game and decomposing the levels played just before abandonment. By adding social media posts to your data, however, you could get a more holistic view of which levels players are enjoying and thus build a more compelling product.

The authors also point to an example from the City of Boston. By adding Yelp reviews to the data health inspectors use to decide which restaurants to inspect, the city was able to maintain the same performance with 40 percent fewer inspectors. The new data source had a huge impact on productivity.

The authors point to two areas of data that can improve your algorithms:

    • Wider is better. Rather than focusing on more data points, width is about how much you know about each customer. Leveraging comprehensive data is at the heart of prediction. As the authors write, “every additional detail you learn about an outcome is like one more clue, and it can be combined with clues you’ve already collected. Text documents are a great source of wide data, for instance; each word is a clue.”
    • Diversity matters. As with your investment strategy, you should use data sources that are largely uncorrelated. If you add data that moves in lockstep with your existing sources, you will have the illusion of using multiple data sources but really be looking at just one angle of the data. If each data set offers a unique perspective, it creates much more value and accuracy (see the sketch below).
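
A quick, hypothetical way to sanity-check that diversity before adding a source (column names and numbers are invented):

```python
# Check how correlated a candidate data source is with what you already use.
import pandas as pd

signals = pd.DataFrame({
    "installs":        [120, 135, 128, 150, 160, 155],
    "store_pageviews": [118, 134, 130, 149, 158, 154],   # nearly the same signal
    "social_mentions": [40, 22, 55, 31, 80, 26],          # a genuinely different angle
})

print(signals.corr().round(2))
# A candidate source that is >0.9 correlated with an existing one adds little
# beyond the illusion of more data.
```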

Understand the limitations

As with anything, it is critical to understand the limitations of algorithms. Knowing what your algorithm will not do is just as important as understanding what it helps with. Algorithms use existing data to make predictions about what might happen in a slightly different setting, population, time or question. “In essence, you are transferring an insight from one context to another. It’s a wise practice, therefore, to list the reasons why the algorithm might not be transferable to a new problem and assess their significance,” according to the authors.

As the authors point out, “remember that correlation still doesn’t mean causation. Suppose that an algorithm predicts that short tweets will get retweeted more often than longer ones. This does not in any way suggest that you should shorten your tweets. This is a prediction, not advice. It works as a prediction because there are many other factors that correlate with short tweets that make them effective. This is also why it fails as advice: Shortening your tweets will not necessarily change those other factors.”

Use algorithms, just use them smartly

This post is not meant to discourage you from using algorithms; quite the opposite. Algorithms are increasingly powerful and central to business success. Whether you are predicting how consumers will react to a feature, where to launch your product or who to hire, algorithms are necessary to get great results. Given their central importance, however, it is even more crucial to use them correctly and optimize their benefit to your company.

Key takeaways

  1. Algorithms are increasingly powerful and central to business success. Given their central importance, it is even more crucial to use them correctly and optimize their benefit to your company.
  2. Problems with algorithms stem from the fact that they are literal (they do exactly what you ask) and largely black boxes (they do not explain why they are making certain recommendations).
  3. When building algorithms, be explicit about all of your goals, consider the long-term implications and make sure you are using data that is as broad and diverse as possible.

Author: Lloyd Melnick | Posted on March 23, 2016 | Categories: Analytics, General Social Games Business, General Tech Business, Machine Learning | Tags: algorithms, analytics, goals, Machine learning, Net Promoter Score, NPS

My takeaways from BDRM 2012, day 2

Friday offered some more great sessions on consumers’ decision making that are relevant to the social games industry. For those who did not see my post on Friday, I spent last week at the Behavioral Decision Research in Management Conference (BDRM) in Boulder. My major takeaways from Friday were as follows:

Leeds School of Business at the University of Colorado

Continue reading “My takeaways from BDRM 2012, day 2”

Author: Lloyd Melnick | Posted on July 2, 2012 | Categories: Analytics, General Social Games Business, Social Games Marketing | Tags: BDRM 2012, collections, customization, goals, social games, statistics

Get my book on LTV

The definitive book on customer lifetime value, Understanding the Predictable, is now available in both print and Kindle formats on Amazon.

Understanding the Predictable delves into the world of Customer Lifetime Value (LTV), a metric that shows how much each customer is worth to your business. By understanding this metric, you can predict how changes to your product will impact the value of each customer. You will also learn how to apply this simple yet powerful method of predictive analytics to optimize your marketing and user acquisition.

For more information, click here


Lloyd Melnick

This is Lloyd Melnick’s personal blog.  All views and opinions expressed on this website are mine alone and do not represent those of people, institutions or organizations that I may or may not be associated with in professional or personal capacity.

I am a serial builder of businesses (senior leadership on three exits worth over $700 million), successful in big (Disney, Stars Group/PokerStars, Zynga) and small companies (Merscom, Spooky Cool Labs), with over 20 years' experience in the gaming and casino space. Currently, I am on the Board of Directors of Murka and GM of VGW's Chumba Casino.
