At GDC 2023, Steam hosted roundtable sessions for developers to provide feedback. I was able to attend one, and it was an enlightening experience.
Steam recently followed up by posting a summary of the biggest topics discussed. No real details were shared there beyond references to existing documentation and, of course, their updated traffic reporting as a replacement for Google Analytics.
So, I thought it made sense to share a bit more of what was discussed in the session I attended, why those conversations matter, and some of my own thoughts.
Steam UTMs, Tracking and GA
There was a good amount of discussion on this topic. What was most enlightening was getting a better understanding of Steam's priorities when it comes to reporting information.
User privacy is obviously a big one, so tracking pixels from third-party platforms (e.g., Facebook) were a hard no. That makes sense for lots of reasons.
With user privacy in mind, there were discussions about what reporting information could be shared: obviously nothing that could clearly identify a user, but rather what kinds of thresholds could be put in place to provide helpful data insights without singling anyone out.
For example, if you're getting a nominal amount of traffic from a particular source, it will likely be bucketed into the "other" category. This protects users by preventing the data from being cross-referenced with other information that could "single them out." The exact threshold for what gets reported as a named source versus "other" is unknown, but whatever it is, it likely errs on the side of protecting users' privacy.
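To make the idea concrete, here's a minimal sketch of how that kind of privacy bucketing might work. The threshold is entirely made up; Steam hasn't published the real number or logic.

```python
# Hypothetical sketch of privacy-threshold bucketing; MIN_VISITS is a
# made-up number, since Steam's actual threshold is unknown.
MIN_VISITS = 50

def bucket_traffic(sources: dict[str, int]) -> dict[str, int]:
    """Roll any source below the threshold into an 'other' bucket."""
    bucketed = {"other": 0}
    for source, visits in sources.items():
        if visits >= MIN_VISITS:
            bucketed[source] = visits
        else:
            bucketed["other"] += visits
    return bucketed

print(bucket_traffic({
    "reddit.com": 740,
    "twitter.com": 212,
    "smallblog.example": 12,
    "tinyforum.example": 7,
}))
# {'other': 19, 'reddit.com': 740, 'twitter.com': 212}
```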
This can make things tricky, as those thresholds limit transparency and the overall value of the insights, especially for Steam pages that don't receive much external traffic. It doesn't change much of what was already in place, but it's really helpful context to have.
One thing that really stood out to me on this topic was the discussion around the reporting windows used to attribute wishlists and sales to traffic sources, and how those windows affect the accuracy and timeliness of the data.
As many of you may have seen in the past, some UTM conversion data wouldn't post until after roughly a seven-day period. This was because Steam wanted the information to be as accurate as possible, and that delay accounted for things like refunds or wishlist removals.
If anyone can recall, this window was much shorter when Steam first rolled out UTM tracking. You may have also noticed you could see 100 wishlists one day, then check again to find only 90, or as many as 110.
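A rough sketch of why those counts move around, assuming (and this is purely my assumption about the mechanics) that Steam reports a net figure within the attribution window:

```python
# Purely illustrative: the reported number as a net figure within the
# attribution window. Steam has not published the exact mechanics.
def reported_wishlists(additions: int, removals: int = 0,
                       late_attributions: int = 0) -> int:
    """Net count: new wishlists, minus removals within the window,
    plus any conversions that only get attributed later."""
    return additions - removals + late_attributions

print(reported_wishlists(100))          # day 1: 100
print(reported_wishlists(100, 10))      # removals later drop it to 90
print(reported_wishlists(100, 0, 10))   # late attributions raise it to 110
```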
While the accuracy of the information is important, so is its timeliness. When running certain types of campaigns (like Facebook ads), receiving data as quickly as possible lets you make effective decisions within those campaigns in a timely manner.
This seems to be addressed in their recent announcement and update; however, it remains to be seen what changes will actually come to how this information is reported.
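As a quick aside for anyone who hasn't set this up yet: UTM tracking works by appending standard UTM parameters to your store page link. Here's a minimal sketch; the app ID and parameter values are placeholders, not a recommended naming scheme.

```python
from urllib.parse import urlencode

# Placeholder app ID and made-up campaign values; utm_source, utm_medium,
# and utm_campaign are the standard UTM parameters.
BASE_URL = "https://store.steampowered.com/app/000000/"

params = {
    "utm_source": "facebook",
    "utm_medium": "paid",
    "utm_campaign": "launch_week",
}
print(f"{BASE_URL}?{urlencode(params)}")
# https://store.steampowered.com/app/000000/?utm_source=facebook&utm_medium=paid&utm_campaign=launch_week
```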
Steam’s Best Practices for Marketing Games
Steam's summary of this put communication with audiences at the forefront, which likely condenses a TON of discussion across the various topics that fit into this category. Marketing is basically a form of communication, after all.
Before sharing some of the topics that were discussed, I think it's important to note how Steam may be approaching these conversations. There were lots of great ideas and helpful feedback; however, Steam needs to serve its own interests and prioritize things according to its goals and focus.
Sometimes these align with developers' interests. When they don't, it's not that the feedback isn't important; Steam may simply be focused on other priorities.
A/B Testing Steam Pages
One conversation that came up was the value A/B testing Steam pages could have. Between swapping out capsules, testing copy, and trying different page elements, this would offer very direct and valuable insight into optimizing a Steam page.
I got the impression that implementing this feature would be quite an undertaking for Steam, so I wouldn't plan on it any time soon.
Monitoring Steam Forums
One discussion went down a rabbit hole around Steam forums. To summarize, it basically concluded that negative forum conversations can easily turn into negative reviews, so it's important to monitor them.
Ironically enough, Chris Hanney gave a GDC presentation on this very topic, and several of the points from his presentation were also raised by the Steam team members. One of them was the importance of responding to user feedback, which then led to some very interesting conversations.
Steam Reviews
This was not one of the biggest discussions during the Steam roundtables, but it was definitely a notable one.
In most cases, responding to user feedback is essential and highly recommended. However, this conversation eventually revealed that users are not notified when developers respond to a review.
There are lots of reasons why this can be frustrating, whether you're trying to provide great customer service or taking action in an effort to flip a negative review. In either case, the majority of participants felt that a feature notifying users when a developer responds to their review would provide a better experience for both users and devs.
Since the conversation migrated to reviews, lots of other interesting topics started to surface.
Steam Review Ranking
One topic was how reviews are displayed relative to a game's overall Steam rating. For example, let's say your game has a Steam rating of 90%: 90% of the reviews are positive and 10% are negative.
However, when looking at the reviews on a Steam page, it's possible that the first review shown is negative. This is something I have seen firsthand, and when evaluating Steam page reviews further, the "Was this review helpful?" rating didn't appear to be a factor in whether a negative review ranked above a positive one, or vice versa.
When we asked Steam, I believe they mentioned that the above-the-fold reviews (the first ten or so listed, not counting the right-margin "recently reviewed" section) would roughly reflect the game's rating. So, from our previous example, nine of those reviews would be positive and one would be negative.
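If that's accurate, the expected mix is simple proportional math, something like the sketch below. This is my interpretation of what was described, not confirmed Steam logic.

```python
# My interpretation of the described behavior, not confirmed Steam logic:
# the above-the-fold slots roughly mirror the overall rating.
def above_the_fold_mix(rating: float, slots: int = 10) -> tuple[int, int]:
    """Return (positive, negative) review counts for the top slots."""
    positive = round(rating * slots)
    return positive, slots - positive

print(above_the_fold_mix(0.90))  # (9, 1)
```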
However, no information was shared about what ranking factors exist or how the placement of negative versus positive reviews within that top list is decided.
This was somewhat concerning, as the placement of reviews could have a significant impact on sales, even for titles with a 90% rating. As the discussion continued, I asked Steam if they had any insight into user behavior, specifically how far users scroll when reading reviews.
That isn't information I'd expect anyone to have handy, but it raises a very interesting scenario. If Steam users behave anything like people using search engines, they may not read past the first three to five reviews, especially if a negative review makes a convincing point.
That means negative reviews listed at the top could hurt sales, despite an overall positive rating.
Unfortunately, there is no Review Ranking Optimization (RRO) you can perform, other than doing everything possible to avoid negative reviews in the first place.
However, I suggested Steam consider how Xbox addresses this issue: randomizing the order in which reviews are shown to users, which helps eliminate the first-impression bias created by a negative review that always sits at the top.
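As a sketch, the randomization I had in mind is as simple as shuffling the candidate reviews before display. This is hypothetical; it's not necessarily how Steam or Xbox would implement it.

```python
import random

# Hypothetical sketch of the suggestion: shuffle the candidate reviews so
# no single negative review is pinned to the first slot for every visitor.
def randomized_review_order(reviews):
    shuffled = list(reviews)  # don't mutate the caller's list
    random.shuffle(shuffled)
    return shuffled

reviews = ["positive #1", "negative #1", "positive #2", "positive #3"]
print(randomized_review_order(reviews))
```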
Of course, this whole argument can be flipped, as Steam users DO have the ability to filter review results. Even so, the default settings still leave some concern about how the first ten reviews are chosen. It's still the luck of the draw, and it could make or break your sales.
Poor Quality and Meme Reviews
Lastly, there was some additional discussion around the validity of meme and otherwise unhelpful reviews. Meme reviews are where users post ASCII-art memes, or just textual memes, in place of an actual review. These obviously provide no value beyond being marked positive or negative.
This, of course, opened up further discussion around the validity of some reviews, as not all users know how to provide constructive criticism or helpful feedback.
That last point can be very subjective, but it's still a pain point for most developers. With that, a solution was proposed: Steam could provide a suggested review format for players and offer context on the impact reviews can have on developers.
My hope is that, based on the notes they jotted down, Steam sees this as an "easy" thing to address by updating their reviews announcement post or creating other documentation for users.
The above speaks volumes, well beyond meme reviews of Goat Simulator. Unfortunately, there may not be any way to reduce the occasional Heisenberg meme review.
Overall, it was a great experience, and the opportunity to speak with members of the Steam team was invaluable. The insight they provided allowed for a better understanding of the platform from their perspective. More importantly, it showed that they were listening.