On June 7th, 2021, Apple revealed two new App Store features at its annual Worldwide Developers Conference (WWDC), both arriving with the new operating system, iOS 15, sometime in the fall of 2021.
These features, Product Page Optimization and Custom Product Pages, will change the lives of mobile marketers, turning App Store Optimization (ASO) into a powerful, not-to-be-ignored aspect of any app or game developer’s strategy.
In this quick guide, we’ll discuss one of these features – Product Page Optimization, also known as PPO. You can also check out our second guide, which looks into iOS 15 Custom Product Pages (CPP).
What is iOS 15 Product Page Optimization?
Product Page Optimization in iOS 15 is a brand new feature that lets you test your App Store product page creatives with up to three different treatments (variations) against your control App Store page.
In each PPO treatment, you’ll be able to choose a different icon, screenshots, and app preview video. According to Apple, all creative assets will need to go through the normal App Store review process before you can use them. For each test, you’ll decide what portion of traffic is allocated to each treatment; for example, 70% to the control and 10% to each of the three treatments.
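As a rough sketch of how such a traffic split might be represented and sanity-checked (the treatment names and percentages are purely illustrative, not an Apple API):

```python
# Hypothetical traffic split for a PPO test. The names and shares are
# illustrative only; the shares must cover 100% of product page traffic.
allocation = {
    "control": 70,
    "treatment_a": 10,  # e.g. alternate icon
    "treatment_b": 10,  # e.g. reordered screenshots
    "treatment_c": 10,  # e.g. new preview video
}

assert sum(allocation.values()) == 100, "traffic shares must sum to 100%"

# Expected impressions per variant out of, say, 50,000 daily impressions
daily_impressions = 50_000
expected = {name: daily_impressions * share // 100
            for name, share in allocation.items()}
print(expected)
```

A split like this also shows the trade-off: the smaller a treatment’s share, the longer it takes to accumulate enough impressions to judge it.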
Once your test is live, you’ll be able to compare the performance of each variant by examining the impression volume and the install volume (and see the conversion rate of each App Store page). Each PPO test can run within a specific localization, meaning you can test specific treatments for specific countries.
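Conversion rate here is simply installs divided by impressions. Since Apple exposes the raw volumes but not the statistics, a minimal stdlib-only sketch of comparing a treatment against the control with a two-proportion z-test might look like this (all numbers are made up):

```python
import math

def conversion_rate(installs: int, impressions: int) -> float:
    """Install conversion rate of one App Store page variant."""
    return installs / impressions

def two_proportion_z(installs_a: int, impr_a: int,
                     installs_b: int, impr_b: int):
    """Two-sided z statistic and p-value for the difference
    between two conversion rates (pooled standard error)."""
    p_a = installs_a / impr_a
    p_b = installs_b / impr_b
    pooled = (installs_a + installs_b) / (impr_a + impr_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / impr_a + 1 / impr_b))
    z = (p_a - p_b) / se
    # Normal-tail p-value via the error function (no SciPy needed)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers, not real App Store data
z, p = two_proportion_z(installs_a=1_050, impr_a=30_000,   # treatment
                        installs_b=900,   impr_b=30_000)   # control
print(f"z = {z:.2f}, p = {p:.4f}")
```

The point of the sketch is that the raw impression and install counts Apple reports are enough input for a proper significance test; the test itself is something you have to bring yourself.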
Apple makes it clear that the intention is for developers to use this new mechanism to optimize for Search and Browse traffic (aka “Organic”), while Paid UA (referral traffic) has a dedicated solution in Custom Product Pages – CPP (For more information, check out our full guide to iOS 15 Custom Product Pages).
You’ll also have the ability to run tests in only specific localizations.
For example, if your product page is currently localized in English, Japanese and Korean, you can choose to run your tests in Japanese only.
This means that any users that see the English or Korean localized versions of your product page will not be included in the test. This gives you the flexibility to focus your test if your treatment is only relevant for a subset of your localizations.
Let’s break things down further:
- What does Product Page Optimization mean for ASO?
Only Organic. If you fully adopt Apple’s new features and use Custom Product Pages for all your paid UA traffic, the only traffic that will reach your default App Store product page is organic traffic, composed of search and browse traffic.
This means that setting up three variations on your default App Store product page and comparing their performance vs. the control variation will actually allow you to optimize your conversion rates for real organic traffic.
- What are the challenges of App Store Product Page Optimization?
A testing tool alone isn’t enough; you need a decision-making protocol.
In-Depth Analysis: Apple doesn’t seem to provide any real analysis functionality. Yes, you’ll be able to compare the performance of each variation, but Apple won’t give you statistical insights into which variation is really performing better.
Here at Storemaven, we know from six years of testing experience that the testing method itself, i.e., the way in which you decide which variation is really performing better, is 10x more important than the testing technology.
In other words, the fact you have access to run a basic test doesn’t mean you are going to be able to positively increase your conversion rates.
- Will the Google Experiments conundrum repeat itself?
Google Experiments – Google’s own tool for A/B testing Play Store listings – has been available for years. But from heavy research we did into the platform, scanning global Google Experiments, we found that fewer than 15% of tests conclude with a winner other than the control.
Why? There are a few reasons. One is that most test owners under-invest in the research & hypothesizing phase, the creative design phase, and the analysis phase. Whether it’s lack of time, resources, or understanding, this is the norm.
A similar pattern shows up in web A/B testing, which has been around for about two decades: according to research from AppSumo, fewer than 12.5% of tests produce significant results.
Another issue is called “Peeking”. Let’s explain: Within Apple’s new iOS 15 app A/B testing feature, you have the ability to run a test for up to 90 days. During this time you will be able to compare the performance of each variant and you’re supposed to decide when to stop the test and implement the winner so that all your (organic) traffic will go to it.
This opens up a huge issue that Evan Miller, one of the world’s top experts on A/B testing and statistics, has written about extensively: peeking. Essentially, someone starts a test, looks at the results every few hours or every day, and as soon as they see the result they want (one of the variants beating the control), they conclude and stop the test.
But once the “winner” is called and applied to 100% of the traffic, our “peeker” is usually disappointed to find that, at best, no real improvement in conversion rates occurred, and at worst, conversion rates dropped. That’s because the test result had very low statistical significance.
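To see why peeking is so dangerous, here is a small simulation of A/A tests, where both “variants” have the exact same true conversion rate, so any declared winner is a false positive. The traffic numbers, 3% conversion rate, and naive |z| > 1.96 stopping rule are all assumptions for illustration:

```python
import math
import random

def z_stat(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z statistic (0 when there is nothing to compare)."""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (conv_a / n_a - conv_b / n_b) / se if se else 0.0

def aa_test(days: int, daily_users: int, cvr: float,
            rng: random.Random, peek_daily: bool) -> bool:
    """Run one A/A test; both variants share the same true CVR.

    Returns True if a naive |z| > 1.96 rule ever declares a 'winner'
    (daily peeking) or does so at the single final look.
    """
    conv_a = conv_b = 0
    for day in range(1, days + 1):
        conv_a += sum(rng.random() < cvr for _ in range(daily_users))
        conv_b += sum(rng.random() < cvr for _ in range(daily_users))
        n = day * daily_users
        if peek_daily and abs(z_stat(conv_a, n, conv_b, n)) > 1.96:
            return True        # the peeker stops the test right here
    n = days * daily_users
    return abs(z_stat(conv_a, n, conv_b, n)) > 1.96

rng = random.Random(7)
runs = 200
peeking = sum(aa_test(30, 300, 0.03, rng, peek_daily=True) for _ in range(runs))
single = sum(aa_test(30, 300, 0.03, rng, peek_daily=False) for _ in range(runs))
print(f"false positives with daily peeking:  {peeking / runs:.0%}")
print(f"false positives with one final look: {single / runs:.0%}")
```

With one look at the end, roughly 5% of these A/A tests come up “significant”, which is exactly the error rate the 1.96 threshold promises; with a peek every day over 30 days, the false-positive rate climbs several times higher, because every extra look is another chance for noise to cross the threshold.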
Although Product Page Optimization (PPO) is a nice-to-have testing tool, any testing tool – whether it runs natively on the App Store or not – needs a great decision-making protocol to enable the deployment of better-converting App Store product pages. Get early access to our Growth Intelligence Platform.