10 Ways Marketers Use App Store Page Analytics To Increase Conversion


What happens when people come to your app store page? Specifically, what are visitors doing from the moment they arrive via a campaign link to the moment they drop off or hit the “get” button?

Knowing how visitors move through your app store page and interact with its content helps you optimize it to increase conversion.

We launched advanced visitor behavior analytics for app store A/B testing two months ago. In this article, I’ll share how our clients have been using the new features to run better A/B tests, assess page performance, and optimize their app store visual assets.

Note: the first free experiment comes with all the advanced analytics features. Sign up and try.

1. Find out if users actually “read more”



Most description tests focus on the first paragraph, or even the very first line of text. As many marketers suspect, visitors rarely go beyond the “read more” break: people read the first few words and then make a decision. Only about 1% of all page visitors take the time and effort to read the full description.
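
If you log visitor events yourself, the “read more” rate is a simple ratio. Here’s a minimal Python sketch, assuming per-session event logs; the event names are made up for illustration and aren’t part of any analytics API:

```python
# Minimal sketch: share of visitors who tap "read more" and who reach the
# end of the description. Event names are hypothetical; your analytics
# export will use its own schema.
sessions = [
    ["page_view", "install"],
    ["page_view", "read_more_tap", "install"],
    ["page_view", "read_more_tap", "description_end"],
    ["page_view"],
]

total = len(sessions)
tapped = sum("read_more_tap" in s for s in sessions)
read_full = sum("description_end" in s for s in sessions)

print(f"Tapped 'read more': {tapped / total:.0%}")        # 50% in this toy data
print(f"Read the full description: {read_full / total:.0%}")
```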

Below are some of the common one-liner description tests marketers run:

  • Pricing (“Limited time offer”)
  • Raving user and media reviews (“As Seen on Forbes”)
  • Social proof (“Enjoyed by X million fans”)
  • App awards and achievements (“Featured by Apple in 10+ countries”)

Sometimes, even one word can make a difference. A user acquisition manager at a health app shared a story: he removed “please” from an app description (the message was “please be safe with prescription drugs”) and saw a 31% lift in conversion after running an experiment on Google Play. “Stop being so polite” was his advice to the mobile marketing community.

Across different app categories, description A/B tests have produced conversion lifts of up to 59.6%.

2. Know exactly when users click “Install”


“Conversion is the only metric I care about.” I hear that a lot from user acquisition managers who run A/B tests. By following a visitor’s path on the app page, they can see which events triggered the install. You can compare how many people downloaded the app as soon as they landed on the page (direct installs) with how many scrolled and tapped “read more” first.

Direct Installs

A download isn’t a commitment to use the app or even open it; many people will install the app after spending zero time on the page. Popular sources of direct installs include the publisher’s website, word of mouth, and traditional media campaigns. Machine Zone, Supercell, King, Wargaming, Big Fish, Good Games Studios, and other big players that use TV as one of their marketing channels enjoy a high volume of direct traffic and instant installs.

Installs after Scrolling

On average, 15 to 20% of users scroll past the first screenshot. Obviously, that first screenshot should be designed for maximum conversion. To increase the scroll rate and page engagement, marketers experiment with different colors for each screenshot. Another popular way to boost engagement is to design screenshots as frames of a larger canvas.
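
To make the comparison from the previous point concrete, here’s a rough Python sketch that computes the scroll rate and splits installs into direct vs. after-scrolling. The session fields are assumptions for illustration, not an actual analytics export format:

```python
# Sketch: classify installs by whether the visitor scrolled past the
# first screenshot. Field names are illustrative.
sessions = [
    {"screenshots_viewed": 1, "installed": True},   # direct install
    {"screenshots_viewed": 4, "installed": True},   # install after scrolling
    {"screenshots_viewed": 2, "installed": False},
    {"screenshots_viewed": 1, "installed": False},
]

total = len(sessions)
scrollers = [s for s in sessions if s["screenshots_viewed"] > 1]
direct = sum(s["installed"] for s in sessions if s["screenshots_viewed"] == 1)
after_scroll = sum(s["installed"] for s in scrollers)

print(f"Scroll rate: {len(scrollers) / total:.0%}")   # compare to the 15-20% average
print(f"Direct installs: {direct}, installs after scrolling: {after_scroll}")
```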

Related: Read more about how to design high-converting screenshots here.

3. Use competitive intelligence to set goals


How do you know if your baseline or post-experiment app page conversion is any good? When you run A/B tests, industry benchmarks let you set realistic goals for your app store optimization activities. For each test in the Games, Entertainment, Social Networking, Education, and Health and Fitness categories, the software shows how you measure up to your category competitors.

Related: Check out the app page conversion benchmarks report for 2015 here.

4. Rate visitors’ interest


Engagement rate is something of a barometer of a page design’s appeal; it helps marketers figure out whether the creatives need more design and editorial work. As an app store performance metric, engagement covers any interaction with the page content: high-impact creatives make visitors click more, and you can see it in the proportion of “explored” vs. “bounced” visitors. Marketers often treat each screenshot as a separate ad unit and measure the impact of creatives both individually and as a whole.
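
If you export per-visitor interaction counts (the field name below is hypothetical), the explored-vs.-bounced split boils down to a few lines of Python:

```python
# Engagement rate: share of visitors with at least one interaction
# (tap, scroll, screenshot view) vs. those who bounced without any.
visitors = [
    {"interactions": 3},
    {"interactions": 0},
    {"interactions": 1},
    {"interactions": 0},
]

explored = sum(v["interactions"] > 0 for v in visitors)
engagement_rate = explored / len(visitors)
print(f"Explored: {engagement_rate:.0%}, bounced: {1 - engagement_rate:.0%}")
```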

5. Rank the performance of each screenshot


For each screenshot, marketers get a breakdown of how many people viewed it, for how long, and how many installs it brought. This data can signal that screenshots need reordering. Most often, we see that few visitors scroll to the last screenshot, so the main design and editorial focus should really be on the first two.
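
Here’s a small Python sketch of that kind of per-screenshot breakdown, assuming you have raw view events to aggregate; the event layout is hypothetical:

```python
from collections import defaultdict

# Hypothetical view events: (screenshot index, seconds viewed, led to install).
views = [
    (1, 2.5, True), (1, 1.0, False), (2, 3.0, True),
    (1, 0.8, False), (3, 1.2, False),
]

stats = defaultdict(lambda: {"views": 0, "seconds": 0.0, "installs": 0})
for index, seconds, installed in views:
    stats[index]["views"] += 1
    stats[index]["seconds"] += seconds
    stats[index]["installs"] += installed

# Rank screenshots by attributed installs, then by views.
ranked = sorted(stats.items(), key=lambda kv: (-kv[1]["installs"], -kv[1]["views"]))
for index, s in ranked:
    avg = s["seconds"] / s["views"]
    print(f"Screenshot {index}: {s['views']} views, {avg:.1f}s avg, {s['installs']} installs")
```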

Another telling metric is the number of views in full-screen mode. Few people expand screenshots, so it’s critical to make sure your caption fonts are large enough to read while browsing the store without going full screen. Very few visitors, if any, will zoom in on a screenshot, which means sophisticated details in your creatives will go unnoticed 99% of the time.

Related: Read how SongPop2 increased its conversion with larger fonts and bolder colors here.

6. Test relevance of campaign traffic


App marketers can use average session duration not only to gauge engagement, but also to identify potential issues with campaign targeting. The metric is especially telling when you segment users and measure session duration for a specific traffic source, region, device, OS, or demographic. Detailed reports give you a more accurate representation of your app store page performance.

There’s one catch with average time on page: if a visitor leaves the alternative page and returns a couple of hours later to install the app, the system counts the whole interval as time on page. This is rare, but worth knowing about, since it can skew your experimental data. It’s best to pull the full dataset and single out the outliers.
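
One way to single out those outliers, sketched in Python with the common 1.5× IQR rule (a general statistical convention, not something the product prescribes):

```python
import statistics

# Time on page in seconds; 7200 simulates a visitor who came back
# two hours later and inflated the average.
times = [12, 18, 25, 9, 31, 14, 7200]

q1, _, q3 = statistics.quantiles(times, n=4)
cutoff = q3 + 1.5 * (q3 - q1)
clean = [t for t in times if t <= cutoff]

print(f"Raw average: {statistics.mean(times):.0f}s")            # wildly inflated
print(f"Average without outliers: {statistics.mean(clean):.0f}s")
```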

7. Filter visitors


It’s generally a good idea to filter out yourself, your user acquisition team, designers, clients, and anyone else you don’t want in your experiment results. Use custom filters to exclude specific IP addresses, operating systems, and other attributes.

Note: you can find your public IP address by googling “what’s my IP address”, or ask the IT department for the company’s IP addresses and subnets.
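
If you prefer to pre-filter an exported dataset yourself, Python’s standard ipaddress module handles subnet checks. The ranges below are documentation placeholders; substitute your own office IPs:

```python
import ipaddress

# Placeholder office ranges; substitute your company's subnets.
excluded = [ipaddress.ip_network("203.0.113.0/24"),
            ipaddress.ip_network("198.51.100.0/24")]

visitors = [{"ip": "203.0.113.7"}, {"ip": "93.184.216.34"}]

def is_internal(ip):
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in excluded)

external = [v for v in visitors if not is_internal(v["ip"])]
print(f"Kept {len(external)} of {len(visitors)} visitors")
```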

8. Rule out random chance in your results


Smart software can help strike a balance between having a large, representative sample and overspending on paid traffic for the test. As you run your experiment, the SplitMetrics smart assistant shows how many installs per variation you need to reach statistically significant results. Usually it’s about 100 installs per variation.
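
For the curious, the math behind “statistically significant” is typically a two-proportion z-test. Here’s a Python sketch with illustrative numbers; this is the standard formula, not necessarily the assistant’s exact internals:

```python
import math

# Two-proportion z-test comparing install rates of control (A) and variation (B).
installs_a, visitors_a = 100, 1000   # 10.0% conversion
installs_b, visitors_b = 130, 1000   # 13.0% conversion

p_a, p_b = installs_a / visitors_a, installs_b / visitors_b
pooled = (installs_a + installs_b) / (visitors_a + visitors_b)
se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se

# Two-sided p-value from the normal distribution.
p_value = math.erfc(abs(z) / math.sqrt(2))
print(f"z = {z:.2f}, p = {p_value:.3f}")   # p < 0.05: the lift is significant
```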

Related: Learn more tips on how to run statistically significant tests and avoid common marketer mistakes here.

9. Spy on your users’ scroll journey


The scroll depth report shows how far visitors scroll and where most of them stop. It helps you understand visitor behavior and optimize the page for better engagement. For many marketers, it also makes a great case for prioritizing an app page redesign; some use it to show their boss how design changes improved app page performance beyond raw conversion numbers.
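
As a sketch, here’s how raw scroll-depth readings (fraction of page height, made-up data) translate into the drop-off percentages a scroll report visualizes:

```python
from collections import Counter

# Max scroll depth per visitor as a fraction of page height (made-up data).
depths = [0.2, 0.4, 0.4, 0.6, 1.0, 0.2, 0.8, 0.4]

# Bucket into 25% bands and report how many visitors reached each band.
buckets = Counter(int(d * 100) // 25 * 25 for d in depths)
total = len(depths)
for band in sorted(buckets, reverse=True):
    reached = sum(c for b, c in buckets.items() if b >= band)
    print(f"Reached {band}% of the page or deeper: {reached / total:.0%}")
```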

10. Segment users



The advanced segmentation option helps you single out subsets of data and spot demographic trends. For example, marketers can track the impact of localization on a new market. Other use cases include comparing how women and men interact with campaign creatives, or looking at the results for different age groups to validate campaign targeting.
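
A toy Python sketch of the underlying idea, with made-up visitor records and field names:

```python
from collections import defaultdict

# Segment conversion by any attribute, e.g. country (localization impact)
# or gender (creative resonance). Records are illustrative.
visitors = [
    {"country": "DE", "gender": "f", "installed": True},
    {"country": "DE", "gender": "m", "installed": False},
    {"country": "US", "gender": "f", "installed": True},
    {"country": "US", "gender": "m", "installed": True},
    {"country": "DE", "gender": "f", "installed": False},
]

def conversion_by(attribute):
    seen, installs = defaultdict(int), defaultdict(int)
    for v in visitors:
        seen[v[attribute]] += 1
        installs[v[attribute]] += v["installed"]
    return {key: installs[key] / seen[key] for key in seen}

print("By country:", conversion_by("country"))
print("By gender:", conversion_by("gender"))
```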

Optimize Beyond Conversion

Looking at conversion numbers only scratches the surface of what you can do with app store optimization. With advanced analytics, you get rich data about your app page visitors: a breakdown of installs by the event that triggered them, your visitors’ scroll journeys, filters for the traffic that matters, and benchmarks that show how you measure up to the competition.

New to SplitMetrics? Create your first test in less than five minutes.