On Reddit, I've seen a lot of discussions, comments, and questions about what determines a game's success. How important is quality? Is a game's pre-release popularity really the only thing that matters? Do demos help or hurt? If a game's launch goes poorly, how likely is it to recover? Is it possible to predict a game's sales, at least approximately, before release?
While preparing to release my own game, I spent a lot of time tracking new releases in an attempt to find answers to these questions. I compiled a spreadsheet, recording follower counts, whether a game had been in early access, and the number of reviews after the first week, month, and quarter.
Now I've decided to share this data in the hope that it will help other developers understand and predict the sales of their own games.
A few notes on the data first:
- One of the most important data sources is the number of reviews on Steam. There is solid evidence that it correlates strongly with the number of copies sold; the ratio of "50 sales per Steam review" is often cited, but the range of values is quite wide. Most Steam games seem to fall between 25 and 120 sales per review, though there are outliers, and games with very few reviews are more likely to be outliers in this respect. My own game is the only one for which I have exact sales figures: you can read my long post about its launch on Reddit, but the key numbers here are that I sold 1,587 copies in the first week and 3,580 copies in the first quarter. (A rough review-to-sales conversion sketch follows after this list.)
- Total number of games in the sample: 115.
- I chose the games semi-randomly from the Popular Upcoming and All Upcoming sections. This skews the sample toward popular games, and I did that on purpose: I wanted a diverse selection, but one that games with zero sales would not completely dominate.
- Games are ordered by release date, ranging from 10/26/18 to 12/20/18.
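Below is a minimal sketch of how such a review-to-sales conversion might look, using only the heuristic ratios mentioned above (25 to 120 sales per review, with 50 as the commonly quoted midpoint); the function name and the example review count are purely illustrative, not part of my spreadsheet.

```python
def estimate_sales_from_reviews(review_count: int) -> dict:
    """Convert a Steam review count into a rough sales range,
    using the heuristic of 25-120 sales per review, ~50 typical."""
    return {
        "low": review_count * 25,      # conservative end of the range
        "typical": review_count * 50,  # the often-quoted 50:1 ratio
        "high": review_count * 120,    # optimistic end of the range
    }

# Hypothetical game with 100 reviews:
print(estimate_sales_from_reviews(100))
# -> {'low': 2500, 'typical': 5000, 'high': 12000}
```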
Column explanations:
- Release discount: the first-week discount; 0.25 = a 25% discount.
- Week, guess: my guess, made before the game's release, of how many Steam reviews it would have exactly one week after launch.
- Week, actual: the number of reviews the game actually had after 1 week.
- 3 months: the number of reviews the game had after 3 months.
- Followers: the number of followers of the game's Steam community hub before release. In some cases this was recorded right before release, in others a week earlier.
- Review score: the percentage of positive Steam reviews after one month. A game needs at least 20 reviews to receive a score.
Question 1: Can quality predict success?
I recently read a post claiming that the main driver of an indie game's success is its quality.
Quality, of course, is a subjective measure. The most obvious way to measure quality objectively for Steam games is the percentage of positive reviews, i.e. the share of buyers who reviewed the game and rated it positively. I excluded all games that did not have at least 20 reviews in the first month, which reduced the sample to 56 games.
The Pearson correlation between a game's review score and its number of reviews three months after release was -0.2. A correlation of 0.2 (plus or minus) is not very strong, and, more importantly, Pearson correlation can swing widely when the data contain large outliers. Looking at the games themselves, you can see that the result is an outlier artifact. Literally: Valve's Artifact had the most reviews after three months and one of the lowest scores (53% at the time). When I removed this game from the data, the correlation dropped to essentially zero.
An alternative measure, the Spearman coefficient, which is a rank correlation and minimizes the effect of large outliers, gave a similar result.
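For reference, here is a minimal sketch of these checks, assuming the spreadsheet has been exported to a CSV with hypothetical column names (name, review_score, reviews_3mo); it illustrates the method, it is not the original analysis script.

```python
import pandas as pd
from scipy.stats import pearsonr, spearmanr

df = pd.read_csv("steam_releases.csv")      # hypothetical export of the spreadsheet
rated = df.dropna(subset=["review_score"])  # keep only games that have a review score

# Pearson correlation: sensitive to large outliers such as Valve's Artifact.
r, _ = pearsonr(rated["review_score"], rated["reviews_3mo"])

# Spearman rank correlation: much less affected by a single huge outlier.
rho, _ = spearmanr(rated["review_score"], rated["reviews_3mo"])

# Repeat the Pearson check with the biggest outlier removed.
no_outlier = rated[rated["name"] != "Artifact"]
r2, _ = pearsonr(no_outlier["review_score"], no_outlier["reviews_3mo"])

print(f"Pearson: {r:.2f}, Spearman: {rho:.2f}, Pearson without Artifact: {r2:.2f}")
```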
Conclusion: if there is a correlation between a game's quality (measured by its Steam review score) and its first-quarter sales (measured by total review count), it is too small to be detected in this data.
Question 2: Do demos, early access, or launch discounts affect success or failure at release?
Unfortunately, so few games had a demo before release (10) that only a very strong correlation could tell us anything, and no significant correlation was found.
There were more games with early access (28), but the correlation was again too small to be significant.
More than half of the games had a discount in their release week, and there is in fact a moderate negative correlation of -0.3 between the discount and the number of first-week reviews. However, this seems mostly to reflect the tendency of AAA developers (who sell the most copies) not to discount at launch. If we remove the games that most likely earned more than $1 million in their first week, the correlation drops to almost zero.
Conclusion: not enough data. No clear correlations were found between demos, early access, or launch discounts and the number of reviews: even if they help or hurt sales, the effect is not consistent enough to show up in a sample of this size.
Question 3: Does a game's pre-release popularity (for example, its number of Steam followers) predict success?
The number of followers of any game on Steam can be found on its automatically created community hub page. Before release, this is a good rough indicator of the game's popularity in the market.
The correlation between followers shortly before release and the number of reviews after 3 months was 0.89, a very strong positive correlation. The rank correlation was also high (0.85), which tells us the result is not driven by just a few highly anticipated games.
With the exception of a single outlier (described below), the ratio of 3-month reviews to pre-release followers ranged from 0 (for several games that did not receive a single review) to 1.8, with a median of 0.1. If you have 1,000 followers right before release, you should expect "around" 100 reviews by the end of the first quarter.
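A minimal sketch of that estimate, using only the ratios reported above (median 0.1, maximum observed 1.8); the function name is illustrative.

```python
def estimate_quarter_reviews(followers: int) -> dict:
    """Rough 3-month review estimate from the pre-release follower count."""
    return {
        "typical": round(followers * 0.1),  # median ratio in this sample
        "ceiling": round(followers * 1.8),  # highest ratio observed (excluding one outlier)
    }

print(estimate_quarter_reviews(1000))
# -> {'typical': 100, 'ceiling': 1800}
```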
I noticed several games whose follower counts seemed too high compared with secondary indicators of market interest, such as forum discussion threads and attention on Twitter. After digging into it, I concluded that Steam counts people who activate a key before release as followers. So if a developer handed out a lot of Steam keys before launch (for example, as Kickstarter rewards or as part of a beta test), the game appears to have attracted more followers than it gained "organically".
Conclusion: organic followers gathered before release are a strong indicator of future success.
Question 4: What about price?
The correlation between price and the number of reviews after 3 months is 0.36, a moderate correlation. I'm not entirely sure how useful this is: it's fairly obvious that games with a big development budget also have a big marketing budget.
The correlation between price and review score is -0.41. It seems likely that players factor the price into their reviews, and that expectations for a $60 game are higher than for a $10 game.
Question 5: Do first-week sales predict first-quarter results?
The correlation between the number of reviews after 1 week and after 3 months is 0.99; the Spearman correlation is 0.97. This is the strongest correlation I found in this data.
If we exclude games that sold very few copies (fewer than 5 reviews in the first week), most games have about twice as many reviews after 3 months as after 1 week. This suggests that roughly as many copies are sold in the first week as in the following 12 weeks combined. The vast majority of games have a tail ratio (reviews after 3 months divided by reviews after 1 week) between 1.3 and 3.2.
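As a sketch, projecting the first quarter from the first week under those observed multipliers (1.3 low, roughly 2.0 typical, 3.2 high) looks like this; the cutoff of 5 reviews mirrors the exclusion described above.

```python
def project_quarter_reviews(week1_reviews: int) -> dict:
    """Project the 3-month review count from the 1-week count using observed tail ratios."""
    if week1_reviews < 5:
        raise ValueError("Too few first-week reviews for a reliable projection")
    return {
        "low": round(week1_reviews * 1.3),
        "typical": round(week1_reviews * 2.0),
        "high": round(week1_reviews * 3.2),
    }

print(project_quarter_reviews(40))
# -> {'low': 52, 'typical': 80, 'high': 128}
```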
I have often seen questions from developers whose Steam launches went poorly, asking what they could do to improve sales. I'm sure that post-release marketing can affect future sales, but it seems the first week still largely sets the trajectory.
Conclusion: everything indicates that the connection exists.
Question 6: Does quality help a game's tail sales?
In the previous question we noted that, despite the strong correlation between first-week and first-quarter sales, the ratio still varies over a fairly wide range. Let's call the ratio of reviews after 3 months to reviews after 1 week the tail coefficient. The lowest value is 0.95, for Pro Fishing Simulator, which even managed to lose a review. The highest is 6.9; we'll look at that extreme outlier later. The worst "tail" belongs to a game with a 22% score, and the best to a game with a 96% score, which is most likely not a coincidence.
The overall correlation between the tail coefficient and the Steam review score is 0.42.
Conclusion: even though there is no clear correlation between quality and total reviews/sales, there is a moderate correlation between a game's review score and its "tail". This hints that "good games" do better over the long run than "bad games", though the effect is small compared with the more important factor of pre-release market popularity.
Question 7: Can a game's success be predicted before release, without wishlist data?
As I collected data for each game, some time before its planned release date, I made a forecast of how many reviews it would receive in the first week and entered that forecast into the spreadsheet.
The main factor behind my forecasts was the follower count. Sometimes I adjusted a forecast when the number felt off, drawing on secondary sources such as Steam forum activity and attention on Twitter.
The correlation between my guesses and the actual values is 0.96, a very strong correlation. As you can see from the data, my forecasts were mostly roughly right, apart from a few cases where I was badly off.
In my experience, multiplying the follower count by 0.1 gives, in most cases, a rough estimate of the number of reviews after the first quarter (and, given the roughly 2x tail from Question 5, about half of that for the first week). If a game does not have at least one discussion topic on its forum for every 100 followers, that may indicate a large number of "inorganic" followers, and the estimate should be adjusted downward.
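Putting the heuristic together as a sketch (the 0.1 multiplier, the roughly 2x tail, and the one-topic-per-100-followers sanity check); the function name, thresholds, and example numbers are illustrative, not part of the original spreadsheet.

```python
def predict_reviews(followers: int, forum_topics: int) -> dict:
    """Pre-release review forecast from follower count, with a forum-activity sanity check."""
    quarter_estimate = followers * 0.1    # median reviews-to-followers ratio
    week_estimate = quarter_estimate / 2  # first week is roughly half the first quarter
    # Fewer than one forum topic per 100 followers suggests "inorganic" followers
    # (e.g. pre-release key activations), so the estimate is probably too high.
    inorganic_warning = forum_topics < followers / 100
    return {
        "week1_reviews": round(week_estimate),
        "quarter_reviews": round(quarter_estimate),
        "followers_look_inorganic": inorganic_warning,
    }

print(predict_reviews(followers=2000, forum_topics=8))
# -> {'week1_reviews': 100, 'quarter_reviews': 200, 'followers_look_inorganic': True}
```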
Conclusion: yes, with some exceptions, follower counts and other indicators let you roughly predict first-week results. Given the strong correlation between first-week and first-quarter sales, you can get an approximate sense of first-quarter results even before release.
Last question: what about those outliers I mentioned?
There were several games in the data that stood out in one way or another.
Outlier 1: Overdungeon. Shortly before release, the game had 77 followers, quite a small number; based on that alone I would have expected fewer than a dozen reviews in the first week. It ended up with 86. And that wasn't all: the game had a strong tail and finished the first quarter with 572 reviews. By a wide margin, it has the highest ratio of reviews to followers in the sample.
Judging by the reviews, it is a Slay the Spire-like game that is very popular in Asia; roughly 90% of the reviews appear to be written in Japanese or Chinese. If anyone has an idea about the reasons for this game's unusual apparent success, I would be interested to hear it.
This seems to be the only clear example in the data of a game with a minimal pre-release follower count that went on to solid first-quarter sales.
Outlier 2: 11-11 Memories Retold. Just before release, this game had 767 followers, ten times more than Overdungeon, but still not many even for a small indie game. Yet it had a lot going for it: it was directed by Yoan Fanise, co-director of the popular and thematically similar Valiant Hearts. It was animated by Aardman, the studio famous for Wallace and Gromit. The publisher was Bandai Namco Europe, hardly an inexperienced company. The voice cast included Sebastian Koch and Elijah Wood. The game received plenty of good reviews in both the gaming and mainstream press, and it currently has a 95% positive rating on Steam.
And despite all this, nobody bought it. Twenty-four hours after release it literally had zero Steam reviews. A week later, there were only 10. Three months later, it showed the largest "tail" in the data but still reached only 69 reviews. It now has about 100 of them, an incredible tail coefficient, but the game was most likely a commercial failure.
This is a great example of how a good game with high production values does not always mean good sales.
The most important takeaways from this analysis:
- A game's success on Steam depends heavily on its first-week performance.
- First-week performance is strongly correlated with the game's pre-release market popularity.
- Quality has little effect on first-week performance, but it can positively affect the tail of a game's sales.
- All conclusions about sales depend on the assumed relationship between review counts and sales.