Google Play Experiments
What Are Google Play Experiments
Widely known as Google Play Store listing experiments, or GEx for short, this is the A/B testing functionality provided by Google, who describe it as a tool to find "..the best app icon, images, video, and descriptions for your app on Google Play." (You can't currently test an app's title.) The tool enables testing of these elements on live Google Play Store traffic, comparing two or more variants with different creatives / messaging to see which versions convert better. App developers / marketers decide what share of traffic is diverted to each variation.
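To illustrate the idea of diverting a share of traffic to each variation, here is a minimal sketch of deterministic traffic bucketing. The variant names, weights, and hashing approach are assumptions for illustration, not Google's actual implementation.

```python
import hashlib

# Hypothetical traffic split: 50% see the current listing,
# 25% each see one of two test variants (weights are made up).
VARIANTS = {"current": 0.50, "variant_a": 0.25, "variant_b": 0.25}

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a visitor into a variant by hashing their ID."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    point = int(digest, 16) / 16 ** len(digest)  # uniform value in [0, 1)
    cumulative = 0.0
    for name, weight in VARIANTS.items():
        cumulative += weight
        if point < cumulative:
            return name
    return "current"  # fallback for floating-point edge cases

print(assign_variant("visitor-123"))
```

Hashing the visitor ID (rather than assigning randomly on each visit) ensures the same user always sees the same variation, which keeps the experiment's measurements consistent.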
Why Google Play Experiments are Important
A/B testing is vital because conversion rate (CVR), especially for first-time installs, is one of the factors that most influences mobile growth, and organic growth in particular, as it signals to the platform to surface the app through charts and search results.
The insights from each experiment give a general sense of whether certain creatives work better than others, which is an important aspect of optimization.
A/B testing increases growth by making a Google Play Store page more efficient at capturing installs from every X number of impressions an app developer / marketer is able to drive through different channels and sources.
Google Play Experiments and ASO
Google allows several types of experiments to run: one global experiment on the main default graphics and up to five localized experiments by language, covering the elements highlighted in the opening paragraph. By conducting experiments successfully, it's possible to build a strong ASO strategy that gives an app / game a competitive advantage over others in a similar category.
For ASO, A/B testing in the Google Play Console is one of the tools available that helps improve CVR, CVR being a pillar of the ASO industry. Unfortunately, a significant drawback to GEx is that on-page behavioral analytics (such as video watch rates or screenshot gallery engagement rates) are missing.
So yes, it's possible to know which variation won, but it's hard to understand why using GEx alone. Why did one variation perform better than another? What specific creative / message convinced visitors to install? It's difficult to draw data-driven conclusions about the winning variant (and invest effort in recreating iterations) to drive additional CVR improvements without truly understanding the reasoning behind each win. A third-party A/B testing platform like Storemaven, however, can provide those insights and more. Storemaven uses its tech to run complementary tests exploring why a specific variation performs better and provides analysis on how users respond to and engage with app store creatives and messages.