— 19 Jan 2018

App Store A/B Testing Timeline: from Research to Follow-Up Experiments

Liza Knotko

Intelligent ASO can increase conversion without extra traffic expenses, and app A/B testing is its core ingredient. Once you’ve decided to optimize your product page, there’s no point in applying random changes in the app stores and expecting phenomenal results straight away.

The truth is, you never know what will work and what will hurt your app’s performance. That’s why app A/B testing is a must when it comes to ASO.

Basically, A/B testing is a method of hypothesis testing. You distribute traffic equally among two or more variants of an app store page element (icon, screenshots, etc.). Each group represents the whole audience and behaves the way an average user would, so you can identify the variation with the best performance.
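
If you’re curious how an equal split works under the hood, here’s a minimal Python sketch of deterministic visitor bucketing. The assign_variation helper and the user IDs are purely illustrative assumptions, not part of any specific testing platform.

    import hashlib

    VARIATIONS = ["A", "B"]  # e.g. current icon vs. candidate icon

    def assign_variation(user_id: str) -> str:
        """Deterministically bucket a visitor into one of the variations.

        Hashing keeps the split roughly 50/50 and ensures a returning
        visitor always sees the same variation.
        """
        bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % len(VARIATIONS)
        return VARIATIONS[bucket]

    variant = assign_variation("user-42")
    assert variant == assign_variation("user-42")  # same visitor, same variation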

Download a FREE app A/B testing workflow template

Split-testing doesn’t only improve your app conversion rate; it also helps validate audience segments, traffic channels, and product positioning. Pre-launch experiments may also help with product evaluation and idea qualification.

Source: splitmetrics.com

When you’re an A/B testing newbie, it might be hard to know where to begin. Indeed, A/B testing is not as straightforward and easy as it may seem at first. However, certain steps are integral parts of any A/B testing activity.

Let’s review each step of a classic A/B testing workflow. Any A/B testing timeline should include the following action points:

  1. research and analysis;
  2. variations brainstorming;
  3. designing of variations;
  4. running your A/B experiment;
  5. results evaluation;
  6. results implementation and tracking;
  7. follow-up experiments.

Research and Analysis for A/B Testing

Duration: 2 days – 1 week

Let’s make it clear – leading companies don’t A/B test random changes these days. Instead, they do research and formulate strong hypotheses based on the collected data. And, as we know, a solid hypothesis is the cornerstone of any A/B testing success.

Here’s a list of core research activities compiled by Gabe Kwakyi from Incipia that can help you embark upon this highly important phase.

  1. Compare your screenshots, feature graphic, videos, and icon designs with the styles of other apps that appear near yours in keyword search results. Your app has to stand out among these apps in order to earn a user’s initial install.

It’s preferable to focus on the keywords that have the highest volume and for which your app ranks best, as these keywords contribute the most to your organic downloads.

Source: App Store
  2. Compare the messaging found in your captions, videos, and descriptions against that of your competitors. Come up with messaging that gives your app a competitive advantage.

Think about which statements you can make that your competitors cannot. Analyze the advantages your rivals possess and how you can overcome them in order to convince users to download your app.

  3. Read your app’s reviews to determine the language and key features that users care about, and integrate them into your messaging. Mark the selling points that don’t appeal to your users and are basically dead weight on your product page.

Try to focus on the features users mark as most useful, and inspect the most recent reviews for better results.

  4. Review your app’s usage data to figure out which features or configurations are most-used or most-purchased. It makes sense to use these insights to increase your app listing’s appeal to new users.

Brainstorming Variations for A/B Testing

Duration: a couple of hours

Your variations for A/B testing should always reflect a quality hypothesis. Form your hypothesis and create variants after researching how your app can best be positioned relative to your competition.

The ASO guide assists in setting your goals and choosing the type of experiment you want to launch.

Keep in mind that a hypothesis is not a question; it’s a statement which suggests a solution, its expected effect, and its rationale. For example, Prisma decided to test the following presumption, based on best practices for the app’s category:

Using a bolder font in captions placed at the top of screenshots triggers conversion growth due to better readability.

Prisma tests screenshots with SplitMetrics
Source: Prisma case study

Thus, at this step, you’d also want to think over the layout of variations based on the hypothesis you plan to test and prepare a technical design specification.

Designing Variations for A/B Testing

Duration: 1 day – a few weeks

This is the phase where designers create layouts for the upcoming A/B test. The designs should reflect the assumptions under test and correspond to the technical design specification.

The duration of this phase depends on the workload of your in-house designers. Companies that don’t have a staff designer have to resort to third-party solutions, which naturally takes more time.

Designing variations for SplitMetrics experiments
Source: theapplaunchpad.com

Running Your A/B Experiment

Duration: 7-14 days

You can proceed with the A/B test itself once the first three steps are completed. It’s vital to choose a traffic source and define a target audience before launching an experiment. Remember that users are to be split equally among the variations.

The A/B test itself won’t even bother you if you decide to test with SplitMetrics: the tool distributes audience members automatically, landing these users on the different variations.

All you have to do is fill your experiment with quality visitors and the platform will do everything else for you:

  • collect powerful insights into visitor behavior on your app page;
  • filter fraudulent traffic to ensure the trustworthiness of results;
  • calculate a confidence level sufficient for determining the winner of your experiment.

It’s recommended to run split-tests for at least 7 days to keep track of user behavior on all days of the week.

 
Source: splitmetrics.com

Evaluation of A/B Testing Results

Duration: a few hours

You can finish your test and start analyzing its results once a statistically significant number of users have visited the experiment. If you run your test with SplitMetrics, the system identifies the winner automatically when the experiment reaches 85% confidence.
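
To get a feel for what such a confidence figure means, here’s an illustrative Python sketch of a one-sided two-proportion z-test on made-up numbers. It shows the general statistics behind comparing two conversion rates and is not a description of SplitMetrics’ internal algorithm.

    from math import sqrt
    from statistics import NormalDist

    def confidence_b_beats_a(conv_a, visitors_a, conv_b, visitors_b):
        """One-sided two-proportion z-test: confidence that variation B
        truly outperforms variation A on conversion rate."""
        p_a = conv_a / visitors_a
        p_b = conv_b / visitors_b
        # pooled conversion rate under the "no difference" null hypothesis
        p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
        z = (p_b - p_a) / se
        return NormalDist().cdf(z)

    # Hypothetical experiment: A converts 300/5000 (6.0%), B converts 345/5000 (6.9%)
    confidence = confidence_b_beats_a(300, 5000, 345, 5000)
    print(f"{confidence:.1%}")  # about 96-97%, comfortably above an 85% threshold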

Once a trustworthy confidence level is reached, your original hypothesis will be either confirmed or refuted. It’s worth mentioning that a disproved hypothesis doesn’t mean your A/B test ended in a fiasco. Quite the opposite.

Source: SplitMetrics dashboard

Negative results prevent you from making changes that could cost thousands of lost installs.

Analyzing conversion changes is a must, there’s no denying that. Yet don’t forget to spare some time to explore other metrics which will help you understand your users better:

  • video metrics;
  • scroll heatmaps;
  • time on page;
  • interaction funnel;
  • screenshot stats;
  • behavioral segments.
 
Source: SplitMetrics dashboard

Results Implementation and Tracking

Duration: 1-2 months

Provided you find a clear winner, you can implement the results in the App Store straight away. It goes without saying that you should track conversion changes after uploading the optimized product page elements.

If you stay consistent with your A/B testing activity, the results will speak for themselves. Nevertheless, your conversion won’t change overnight. It normally takes 1-2 months to see a new trend gain ground.

Follow-up Experiments

Duration: 3-4 months

It’s crucial to turn A/B testing into an essential part of your ASO strategy. The truth is that app stores are ever-changing systems.

It’s critically important to run follow-up experiments to scale results.

If you want to reap the maximum benefits from A/B testing your product page, mind the following golden rules:

  • Your variations should look like real store pages to ensure natural user behavior;
  • Only one hypothesis should be tested within one experiment;
  • Statistically significant results can be attained only if you fill the test with enough traffic (the higher the conversion rate, the fewer users you need – see the sketch after this list);
  • It’s a good idea to run an A/B experiment for 7-14 days to track user behavior on all weekdays.
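
As a rough illustration of the traffic rule above, the hypothetical Python sketch below estimates how many visitors each variation needs to detect a 10% relative lift at different baseline conversion rates. It uses a standard two-proportion approximation with illustrative thresholds (85% one-sided confidence, 80% power) and is not SplitMetrics’ internal math.

    from statistics import NormalDist

    def visitors_per_variation(baseline_cr, relative_lift, alpha=0.15, power=0.80):
        """Rough sample size per variation for a two-proportion z-test."""
        p1 = baseline_cr
        p2 = baseline_cr * (1 + relative_lift)
        z_alpha = NormalDist().inv_cdf(1 - alpha)  # one-sided confidence threshold
        z_power = NormalDist().inv_cdf(power)
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return round((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

    # The higher the baseline conversion, the fewer visitors you need
    # to detect the same 10% relative lift:
    for cr in (0.02, 0.05, 0.10):
        print(f"{cr:.0%} baseline -> {visitors_per_variation(cr, 0.10)} visitors per variation")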

When it comes to app conversion, there is always room for improvement, and a well-structured approach to A/B testing can help you become an App Store highflyer. However, as in all things, success requires time, dedication, and persistence.

