ASO Best Practices: How to Test Your Mobile App Icon


Your App Icon plays a significant role in your overall App Store Optimization (ASO) and user acquisition (UA) efforts as it’s the first visual element app store visitors see and is the only consistent graphic asset that appears throughout the rest of their journey.

Testing your App Icon is worth the effort given the impact it has not only on organic growth and conversion but on the full user funnel. It must drive UA ad clicks and then effectively differentiate your app from competitors, prompt downloads, and encourage visitors to re-open the app once it’s on their devices.

However, even when developers or mobile marketers take the time to test their Icons, more often than not the tests don’t reveal actionable insights or lead to useful results, thus costing their companies valuable time and money.

To help guide your Icon tests and ensure you’re getting impactful results that can lead to sustainable success, we compiled this resource that covers the common Icon testing mistakes you should avoid and the factors that lead to valuable tests.

 

The Importance of App Icon Testing

As mentioned before, your App Icon has a profound impact on visitors throughout all areas of the funnel: from having a prominent place in UA (most likely Facebook) ads, the Search Results Page, top charts, and niche featured placements on the Apple App Store and Google Play Store, to eventually being displayed on your full Product Page. Plus, it’s the only app store element that has a lasting impact on re-engagement since it reminds users to open the app when they see it on their smartphone home screens.

Given its importance and effect on conversion and retention, you want to ensure you’re displaying an Icon that gives your app or game the best chance to succeed. Testing is one of the most effective, data-driven ways of doing this. In fact, through our work with leading mobile publishers, we’ve found that an optimized Icon has the potential to boost conversion rates (CVR) by up to 30%.

Since Icons appear in a variety of areas, it’s not possible to test them cohesively throughout the funnel. However, this downside doesn’t eliminate the benefits of testing. There are still ways to measure each step of the funnel and improve your likelihood of long-term success.

Before getting into the effective ways to test your Icon, let’s start with why many Icon tests fail…

 

The App Icon Testing Mistakes to Avoid

After tracking thousands of Icon tests that leading mobile publishers have run on Google Experiments, we found that on average, only 20% of tests actually succeed in finding an Icon that converts better than the control. In fact, out of all possible elements to A/B test, Icon tests fail the most.

Why?

1) Lack of Clear Hypotheses

One of the major reasons most Icon tests fail is related to hypotheses, or lack thereof.

Hypotheses in the context of app store testing are precise statements that can be proven or disproven and should be used as a starting point for further investigation. They drive the creative design and direction of the test so it leads to actionable results. However, people tend to think of Icon A/B tests the way they think of web and landing page A/B tests—where you can change a single button color and suddenly see an uptick in clicks.

It’s important to understand that mobile is not like that, both because of the nature of the platforms and because of the unique challenges of app store testing in general (e.g., 100% of users are sent to the same place, every app store page has the same layout, visitors engage in different ways, etc.). In general, when developers or mobile marketers run their own Icon tests, they usually test elements that don’t actually create an impact.

For example…

Weak Hypothesis

  • Visitors prefer to see an Icon with a blue background instead of a red background.

The issue with this hypothesis is that either users won’t notice the change or the result won’t lead to significant learnings about your app store visitors. Say the test shows users prefer one background color over the other: now what?

When weak hypotheses are used to drive tests, the changes are too subtle to make a significant impact on the performance or CVR of the app store page. The key is to understand how to drive conversion, which means understanding what visitors respond to and what aspects of the app or game are most appealing to them. It’s important to develop a long-term strategy rather than continuously running multiple, unrelated Icon tests that don’t lead to valuable insights.

Brothers in Arms 3: Sons of War Mobile App Icon Test

It’s also possible to develop hypotheses that don’t actually lead to any beneficial understandings. For example, while the Icon test example above is a vast improvement over simply changing the background color, we still don’t consider it to be as effective as it could be. The developer uses the same character (and same facial expression) in similar positions with slight adjustments to the weapons he holds. The design directions of each Icon are too similar to generate useful results even though a “winning” variation was found.

Now let’s see a hypothesis from the other end of the spectrum…

Strong Hypothesis

  • Gamers coming to my app need to build an emotional connection to a character before delving into the specifics of gameplay.

This hypothesis, on the other hand, allows you to dig deeper into why a certain Icon won and helps you identify the main selling point of your game (e.g., characters or specific gameplay items) that will drive the most conversion. You can also use these insights when designing your other app store creatives.

Strong Mobile App Icon Test

Above is an example of a strong hypothesis being put to the test. Typically, testing characters against each other won’t lead to meaningful results on its own, but the key here is that the characters being tested have strong brand recognition. This is a powerful test to conduct in order to determine which of these widely known characters potential installers respond to best and which is most likely to convince them to install the game (in this case, it’s The Flash).

2) Relying Solely on Google Experiments Results

One of the significant drawbacks of Google Experiments is that you don’t receive data on visitor behavior or engagement. This means you’re blind to the subtle positive or negative impact that Icon changes could have on visitors’ interaction with your page and their decision to install. This also means you lack insights that should be used to form the foundation and hypotheses of subsequent tests.

Google Experiments Icon Test

For example, in the Google Experiment test above, the developer tested a variety of Icons that showcased different gameplay environments. Although a winning variation was found, meaning that one Icon variation began to receive 100% of traffic, it’s hard to pinpoint what insight can be pulled from the test.

Why did this Icon perform better? The winning Icon looks similar to the Video thumbnail—did it win because it caused more people to watch the Video? Is there a correlation between the Icon and Video plays? If your Icon is increasing Video views and then you change your Video or Poster Frame (Video thumbnail), how will that impact your overall conversion? If you’re unable to answer these questions, you could potentially (and unintentionally) damage CVR. For instance, if the Icon worked because of the correlation with the Video, your upcoming Video release may harm CVR significantly, even if it’s a better Video on its own.

Overall, you need more information on actionable items you can implement post-test, not just an intellectual exercise that gives you a shallow understanding of what Icon had the highest conversion.

3) Using the Same Creatives on the Apple App Store and Google Play

Through testing, you will also gain valuable insight into the differences between Google Play and the App Store. A major issue is that many developers assume the Icons they test in Google Experiments can also be applied to iOS and have similar results. This couldn’t be further from the truth.

For example, in some of our tests with leading game developers, gameplay-focused icons led to higher CVR on Google Play, while character-focused icons performed better on iOS. In this case, CVR would’ve been harmed had the developer pushed the same Icon to both platforms.

Different Stores, Different Strategies

Overall, the App Store and Google Play are fundamentally different platforms, and they should be treated as such in the area of ASO. This is because:

  • The overall design of the stores is still not the same (e.g., no autoplay feature on Google Play videos, image resolutions are different, etc.)
  • Developers often drive different traffic to each store (i.e., different sources, campaigns, and ad banners)
  • Different apps are popular on each platform, so competition varies
  • The user base for Google Play is not the same as for iOS; user mindsets and preferences are different.

 

How to Maximize Success for Your App Icon Test

Upon analyzing successful Icon tests, we’ve found specific factors that are consistent across all of them.

1) Develop Strong Hypotheses

As we discussed in the previous section, it’s imperative that your hypotheses are precise and framed in a way that will advance your understanding of your app store visitors. It’s not enough just to change the background color or test different characters against each other.

Strong Mobile App Icon Test: Clear Hypothesis

In the Google Experiments test above, Gameloft creatively tested distinct combinations of both character placement and gameplay elements. Each variation is different enough in its messaging and design to isolate the reason why a specific Icon won and to provide a foundation that can guide the next test. In this case, the winning Icon shows that visitors don’t respond well to a character in the Icon, implying that gameplay mechanics are the main appeal over environmental and storytelling aspects.

2) Know Which Metrics Matter

Once you’ve implemented strong hypotheses, you’ll need to know how to effectively measure the test results. As we mentioned previously, Icon testing is never fully pure since you can’t change and test your Icon in every step of the funnel. The insights you can receive, though, are significant enough to create actionable next steps.
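Before diving into data sources, it helps to be clear on what “measuring” a result actually involves. Google Experiments reports its own outcome per variation, but as a rough illustration of the underlying comparison (not Google’s or anyone’s actual methodology), the minimal sketch below checks whether a variant Icon’s CVR beats the control’s with statistical confidence, assuming you can export raw visitor and install counts per variation:

```python
# A minimal, illustrative comparison of a variant Icon vs. the control.
# The counts below are hypothetical; in practice you would pull them from
# your own test results export.
from math import sqrt
from statistics import NormalDist

def cvr_lift_significance(control_visitors, control_installs,
                          variant_visitors, variant_installs):
    """Two-proportion z-test: is the variant's CVR higher than the control's?"""
    p_control = control_installs / control_visitors
    p_variant = variant_installs / variant_visitors

    # Pooled proportion under the null hypothesis (no real difference in CVR)
    p_pool = (control_installs + variant_installs) / (control_visitors + variant_visitors)
    std_err = sqrt(p_pool * (1 - p_pool) * (1 / control_visitors + 1 / variant_visitors))

    z_score = (p_variant - p_control) / std_err
    # One-sided p-value: chance of seeing this lift if the variant were no better
    p_value = 1 - NormalDist().cdf(z_score)

    relative_lift = (p_variant - p_control) / p_control
    return relative_lift, p_value

# Example: 10,000 visitors per arm, 2,000 installs (control) vs. 2,180 (variant)
lift, p = cvr_lift_significance(10_000, 2_000, 10_000, 2_180)
print(f"Relative CVR lift: {lift:.1%}, one-sided p-value: {p:.4f}")
```

The same check can be run separately for each traffic segment described below, which is where the more actionable insights tend to come from.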

In addition to analyzing your iTunes Connect data and the Google Play Developer Console (despite its limitations), you should be monitoring:

App Store Engagement Data

One of the most important metrics to use when measuring the impact of your ASO efforts is app store engagement data. This information is key to understanding your different audiences and identifying exactly which messages and creatives drive their actions.

Plus, it’s important to be aware that your Icon functions in parallel with other app store assets; it shouldn’t be looked at alone. For example, in cases where the Icon matches exactly what’s included in the Video thumbnail or Screenshots, the repetition could potentially harm CVR and cause visitors to drop off. In other cases, your Icon may clash with your Gallery and discourage visitors from exploring the rest of your product page.

Monitoring app store engagement uncovers the nuances of visitor behavior and reveals how different creatives impact their decisions. The insights you find here enable you to create a more holistic ASO strategy that takes into account the combined effect that app store assets have on engagement and conversion.

Impact on Different Traffic Sources

Once a test is complete, you should be checking:

  • Overall CVR
  • Browse CVR (conversion of users who discover your app through top charts, featured app pages, or navigation tabs)
  • Search CVR (conversion of users who discover your page by directly searching for your brand, or by searching relevant keywords and seeing your app in the Search Results Page)
  • Referral CVR (conversion of users who arrive from paid campaigns through sources such as Facebook, Google, or network traffic)
  • Click-through rate (CTR) of search and organic traffic
  • Impression-to-install (ITI) rate of paid referral traffic

These metrics also provide insight into the possible correlation between different traffic sources. For example, if your Icon worked well with paid traffic and had a positive CTR and CVR impact on search, then you know for the next test that search traffic behaves similarly to UA traffic when it comes to Icon preferences. Once you understand how the funnel can be impacted, subsequent tests become less of a gamble, and you don’t have to worry as much about damaging search CVR while running them.
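As a minimal sketch of this kind of segmentation, assuming you can export per-visitor records tagged with a traffic source (the field names below, such as "source" and "installed", are hypothetical and would need to map onto whatever your analytics or attribution export actually provides), you could aggregate CVR per source like this:

```python
# Aggregate conversion rate per traffic source (browse, search, referral, ...).
# Each record is a hypothetical per-visitor export row: where the visitor came
# from and whether they installed.
from collections import defaultdict

def cvr_by_source(visits):
    counts = defaultdict(lambda: {"visitors": 0, "installs": 0})
    for visit in visits:
        bucket = counts[visit["source"]]
        bucket["visitors"] += 1
        bucket["installs"] += int(visit["installed"])
    return {
        source: bucket["installs"] / bucket["visitors"]
        for source, bucket in counts.items()
    }

# Toy example
visits = [
    {"source": "search", "installed": True},
    {"source": "search", "installed": False},
    {"source": "browse", "installed": True},
    {"source": "referral", "installed": False},
]
print(cvr_by_source(visits))  # {'search': 0.5, 'browse': 1.0, 'referral': 0.0}
```

Comparing these per-source rates for the control and each variant is what lets you spot, for example, an Icon that lifts referral CVR while quietly dragging down search CVR.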

Re-engagement Data

Depending on your company’s KPIs, looking at re-engagement data (e.g., open rate, number of sign-ups or registrations, in-app purchases, etc.) will also give you a deeper understanding of the impact your Icon has on the bottom of the funnel.

 

Why You Shouldn’t Stop at Icon Tests

There’s no denying that Icon testing should be a critical part of your ASO strategy. Since an optimized Icon can have such a positive effect on your installs, it stands to reason that a non-optimized one can harm conversion by a similar amount.

However, your Icon is not the only test-worthy app store creative. On average, elements such as your Video or Screenshots can yield higher returns—in some cases, up to 40% higher CVR.

Based on our analyses of over 500M users, we’ve identified the most impactful app store product page marketing assets by CVR lift:

Impact of ASO

*Please note that Google is testing a new Store Listing design which impacts the assets that are visible above the fold. This infographic is based on data from the current layout.

At the end of the day, an Icon test is a great place to start, and the results can give you unique insights into your users and the best way to showcase your brand through your app store creatives.

But when you’re designing Icons, it’s equally as important to look at the other assets at your disposal, such as Screenshots or Video, so you can create a long-term ASO strategy that incorporates all of these visual elements. This is what can truly set you apart and positively impact the return on investment (ROI) of your mobile app marketing efforts.

 

————

StoreMaven is the leading ASO platform that helps global mobile publishers like Google, Zynga, and Uber test their App Store and Google Play creative assets and understand visitor behavior. If you’re interested in testing your Icon or other app store creatives, schedule a demo with us.
