How Not To A/B Test: 7 Fails of App Marketers


Split tests have become so mainstream that people are now A/B testing everything from educational videos to dating profiles (yes, this is happening). Yet the commoditization of the tactic means people too often fall into the trap of overlooking some basic rules. Let’s look at some common A/B testing fails in the context of app store pages.

1. You got the wrong sample size.

It’s hardly possible to derive conclusions from an experiment with fewer than 100 installs per variation. A smaller sample size will get you results that are more folk wisdom than science. Obviously, a larger representative sample will yield more accurate data. At the same time, you don’t want an excessive sample either, because driving traffic to the test can be pricey.

You can use standard statistical formulas to calculate sample size based on population size, margin of error, confidence level, and standard deviation. Statistical packages such as Stata, nQuery Advisor, and UnifyPow can do the math for you, but the simplest way is probably one of the following online calculators (a quick code sketch of the underlying formula follows the list):

Evan’s Sample Size Calculator

SurveySystem Sample Size Calculator
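
If you’d rather see the math, here is a rough Python sketch of the classic two-proportion sample-size formula such calculators are built on. The 4.47% baseline and 10% target are just illustrative numbers, borrowed from the hypothesis example later in this article; 95% confidence and 80% power are common defaults, not requirements.

```python
# A rough sketch of the two-proportion sample-size formula (not SplitMetrics code).
from scipy.stats import norm

def sample_size_per_variation(p1, p2, alpha=0.05, power=0.80):
    """Approximate visitors needed per variation to detect a change from p1 to p2."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for two-sided significance
    z_beta = norm.ppf(power)            # critical value for the desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int(round((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2))

# Baseline 4.47%, target 10% -> roughly 341 visitors per variation
print(sample_size_per_variation(0.0447, 0.10))
```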


When you run A/B tests with SplitMetrics, the software will track your experiment to ensure statistically significant results.


2. You don’t have a hypothesis.

When you run a test on your app page, it helps to be strategic about how you design the experiment. A solid plan for what you’re going to test will keep you on track. A clear understanding of the desired outcome will get you to the goal faster and with fewer resources. But before you formulate your A/B test hypothesis, ask yourself the following two questions:

1) What’s your goal?

2) What could be a potential problem with your app page?

With app page A/B testing, the goal is usually to improve the visitor-to-install conversion rate, and the problem is usually a design element (or several) that could be hindering it.

Now you can create a hypothesis. It should contain two things: 1) your target improvement in conversion, and 2) what you’re going to change to fix the problem you identified in the second question.

Let’s create a sample hypothesis for Rovio’s Angry Birds 2 app page to illustrate.

Given the ever-increasing cost per click of mobile app install campaigns, even an incremental increase in conversion rate will save a lot of money. The median conversion rate for games, according to SplitMetrics’ comprehensive 2015 study of 10M users, is 4.47%. Let’s take that as our baseline conversion; and we’re feeling adventurous enough to try to get it to 10%, an increase of 5.53 percentage points (+123.71% relative).

Here’s one idea: most game developers publish app screenshots in landscape orientation. For the games category it’s almost the norm, so – taking a guess here – if we switch to portrait mode, will our app page stand out?

As a result of this split test, Rovio’s Angry Birds 2 increased page conversion by 13% (portrait orientation won)

Conversion goal: app page visitor-to-install conversion of 10%

Problem statement: “Many games use landscape orientation screenshots, and ours don’t stand out”.

Hypothesis: “By switching our screenshots from landscape orientation to portrait, we can achieve a higher conversion rate”.

Now, with this hypothesis in mind, we can run a test on two sets of screenshots: one in portrait orientation and one in landscape.
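
Here is a minimal sketch of how the two variations could then be compared with a standard two-proportion z-test once the experiment has collected traffic. The visitor and install counts below are hypothetical, not Rovio’s actual numbers.

```python
# Illustrative comparison of two screenshot variations with a two-proportion z-test.
from math import sqrt
from scipy.stats import norm

def two_proportion_z_test(installs_a, visitors_a, installs_b, visitors_b):
    """Return both conversion rates and the two-sided p-value for their difference."""
    p_a = installs_a / visitors_a
    p_b = installs_b / visitors_b
    p_pool = (installs_a + installs_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    p_value = 2 * (1 - norm.cdf(abs(p_b - p_a) / se))
    return p_a, p_b, p_value

# Hypothetical counts: variation A = landscape screenshots, variation B = portrait
p_a, p_b, p_value = two_proportion_z_test(450, 10000, 520, 10000)
print(f"landscape {p_a:.2%}, portrait {p_b:.2%}, lift {p_b / p_a - 1:+.1%}, p = {p_value:.3f}")
```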

3. You don’t integrate with install tracking tools.

If you use install tracking tools such as Tune, AppsFlyer, or Adjust, you can integrate them with your A/B testing software and attribute your app’s new users to your testing campaigns. This will allow you to get rich data on the kind of users you acquire.

SplitMetrics works with any mobile tracking or analytics software. The custom parameter is sent via the {click_id} variable, which you can add to your custom install URL.
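
As a hedged illustration of the idea (the base URL and parameter name below are placeholders, so check your tracker’s documentation for the exact format it expects), appending the click id macro to a custom install URL could look like this:

```python
# Placeholder example: attach the {click_id} macro to a custom install URL.
def build_install_url(base_tracking_url: str, click_id_macro: str = "{click_id}") -> str:
    """Append the experiment's click id as a custom parameter.

    The macro is left un-encoded so the A/B testing platform can substitute
    the real click id for each visitor before the link is used.
    """
    return f"{base_tracking_url}?click_id={click_id_macro}"

# The tracking domain and campaign path are made up for this example.
print(build_install_url("https://example-tracker.com/my-app-campaign"))
# -> https://example-tracker.com/my-app-campaign?click_id={click_id}
```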


4. You don’t run follow-up tests.

When you run your first test, external factors may skew your experimental data. Iterative testing comes in handy not only to question and confirm overall conversion changes, but also to double-check which page elements triggered more install events. You can focus on one element at a time (e.g., the first screenshot, the third screenshot, or the read-more description) and see whether it was driving your campaign’s results.


5. You test too many variations at once.

The problem with multivariate tests is that they are difficult to set up. They require extensive planning and don’t really prove that a certain variable was responsible for the increase or decrease in conversion. With too many changes on your app page at once, it’s hard to tell exactly what’s working and what’s not. Simple changes, like testing the background color, the orientation, a different first screenshot, or a different order of screenshots, are much easier to set up and produce data that makes sense. If you don’t have the luxury of unlimited time and traffic, make one change at a time and then iterate.
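
A quick back-of-the-envelope sketch shows why: the number of page variations is the product of the options per element, and each variation needs its own statistically meaningful sample. The element counts below are made up for illustration.

```python
# Illustrative math: multivariate tests multiply the traffic you need.
from math import prod

elements = {
    "background color": 3,
    "screenshot orientation": 2,
    "first screenshot": 4,
    "screenshot order": 2,
}

variations = prod(elements.values())   # 3 * 2 * 4 * 2 = 48 combinations
visitors_per_variation = 341           # e.g., from the sample-size sketch above
print(f"{variations} variations -> roughly {variations * visitors_per_variation:,} visitors")
```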


Paper by FiftyThree’s marketing team saw an increase in app store page conversion after testing a different color for each screenshot

6. You stop experiments too early.

Commit to a sample size before you start the experiment and wait until the experiment is over. I highly recommend reading Evan Miller’s article from back in 2010, as it’s one of the best online resources on the topic. Evan argues that stopping an A/B test the moment it appears to reach significance, whether through a manual or an automatic “stopping rule”, produces invalid results. This mistake is the result of what is known in statistics as “repeated significance testing errors”.
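
If you want to see the effect yourself, here is a small simulation (my own illustration, not code from Evan’s article). Both variations share the same true conversion rate, yet stopping the moment p < 0.05 shows up declares a “winner” far more often than the nominal 5% of the time.

```python
# Simulating the "repeated significance testing" problem caused by peeking.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def peeking_false_positive_rate(p=0.05, n_max=2000, check_every=100, runs=2000):
    """Fraction of A/A tests wrongly stopped early when checking after every batch."""
    false_positives = 0
    for _ in range(runs):
        a = rng.random(n_max) < p          # variation A installs (true rate p)
        b = rng.random(n_max) < p          # variation B installs (same true rate)
        for n in range(check_every, n_max + 1, check_every):
            pool = (a[:n].sum() + b[:n].sum()) / (2 * n)
            se = np.sqrt(pool * (1 - pool) * 2 / n)
            if se > 0 and 2 * (1 - norm.cdf(abs(b[:n].mean() - a[:n].mean()) / se)) < 0.05:
                false_positives += 1       # stopped early on a statistical fluke
                break
    return false_positives / runs

print(f"false positive rate with peeking: {peeking_false_positive_rate():.0%}")  # well above 5%
```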

7. You don’t keep testing.

Seasonal buying habits, demographic shifts, market changes, and product and competitor changes can all drive different results at different points in time. To stay ahead of the competition on the app market, you want to keep moving fast and optimize as you go. Top mobile publishers often take advantage of the multiple sales and promotions running on app stores. Such opportunities include Halloween-themed design changes in October, Christmas app store “decorations”, or even extreme brand makeovers.


Seasonal icons from Rovio’s Angry Birds: Easter eggs, a Santa hat, Valentine’s hearts, and St. Patrick’s green vibes


Here’s a list of additional online resources on A/B testing if you want to dig deeper into the topic:

1. Statistical Analysis and A/B Testing by Jesse Farmer. An excellent read on how to tell whether the variations in visitor behavior you see are due to random chance. Farmer covers the null hypothesis, Z-scores, and other cool things. You’ll also find examples of landing page conversion testing.

2. The Ultimate Guide to A/B Testing by Paras Chopra. A comprehensive article from Smashing Magazine covering the basics, like what you can A/B test. It also has some good links to classic A/B testing case studies.

3. Naked Statistics by Charles Wheelan. “A lifesaver for those who slept through Stats 101” is a pretty accurate definition of this Amazon bestseller.


App store page A/B testing offers mobile marketers a bunch of benefits, including a higher conversion rate and an increase in organic app downloads. The beauty of A/B testing is that it’s like going straight to your customers and asking them what they like and don’t like about your app page design. However, with the rise of the A/B testing cult comes the risk of overlooking the simple premises this scientific method is based on. We’ve covered 7 common mistakes marketers make when they conduct app page split testing. By being strategic and smart about your ASO experiments, you’ll gain better and more reliable insights.