Prisma: How We A/B Tested and Optimized 2016’s App of the Year


App Store and Google Play don’t always agree on things, but this time they unanimously picked Prisma as the app of 2016. MSQRD was named runner-up for best app of the year for iPhone and a top trending app of 2016 for Android.

What do Prisma and MSQRD have in common (apart from both being photo/video apps, obviously)? …They both used SplitMetrics to A/B test and optimize their store pages and improve conversion rates. Now, of course, we’re not saying that we’re the reason behind the immense success (because we’re too modest and humble :). But we’re happy and proud to have been part of these great stories and would like to share our chapter with you.

In case you missed it, catch up on how MSQRD used SplitMetrics for their ASO efforts. And today let’s take a behind-the-scenes peek at…

How we optimized Prisma – the Best App of 2016

When Prisma’s viral growth started to explode, the team behind the app quickly realized that at this scale even a slight increase in conversion rate would result in millions of additional installs. SplitMetrics was chosen to optimize Prisma’s app page on the App Store, and we worked closely with the Prisma team to make sure we leveraged the media and viral spotlight to the fullest.
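To put that in perspective, here’s a quick back-of-the-envelope calculation. The numbers below are purely hypothetical (they are not Prisma’s actual traffic figures) and only illustrate how a small conversion-rate bump compounds at viral scale:

```python
# Hypothetical, illustrative numbers only -- not Prisma's actual traffic.
monthly_page_views = 50_000_000   # App Store product-page visits during a viral spike
conversion_uplift = 0.02          # a modest 2-percentage-point improvement in install rate

extra_installs = monthly_page_views * conversion_uplift
print(f"{extra_installs:,.0f} additional installs per month")  # -> 1,000,000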

Now we’re happy to share some of the successful experiments that we ran as well as the ones that didn’t turn out as we expected.

Experiment #1 (Screenshots)

When we started out, Prisma was already a hit. It was all over the news, so we didn’t want to mess with anything that could hinder the app’s recognizability (so icon tests were out of the question). We went after the low-hanging fruit: optimizing the screenshots, as they have the biggest impact on conversions. For our first experiment we chose a low-risk/high-reward strategy: tweaking the screenshots according to best practices.

Original Screenshots (Control)

[Image: Prisma’s original App Store screenshots]

Optimized Variation

[Image: the optimized screenshot variation]
  • Moved caption text to the top of the images
  • Bolded the font a bit and dialed up the contrast
  • Removed repetition from caption texts and highlighted more features
  • Added comments and likes to the third screenshot to make it more dynamic
  • Added a 5th screenshot

Note: we also experimented with screenshot order, but the original positioning was actually the most effective.

The result? 12.3% uplift in conversion rate for the optimized variation!


Experiment #2 (Description)

We didn’t sit idly by waiting for the results of our first experiment. Instead, we started optimizing Prisma’s description. We tested a number of approaches: opening with a call to action, bullet lists, showing off cool press reviews, bragging about top chart positions, etc.


Adding bullet lists and using a prominent call to action both seemed to help conversion rates. However, while we were running the experiment, Prisma became an App Store Editor’s pick. This also meant that the app now had an Editor’s Notes description on top, which pushed the regular description way down below the fold, rendering its impact on conversion insignificant.


So we decided to stop experimenting with the description, but you shouldn’t! Unless you have Editor’s Notes, that is :) We, however, focused on the screenshots.

Experiment #3

For those unfamiliar with how the SplitMetrics A/B testing platform works: it creates web pages emulating the App Store where you can change and edit any element of the page (screenshots, icons, etc.). You then drive traffic to these pages (usually from ad networks, cross-promotion or your website) to see what works and what doesn’t.
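To make the comparison concrete, here is a minimal sketch in plain Python of the kind of arithmetic behind a split-test readout: conversion rate per variation, relative uplift, and a two-proportion z-test to check that a difference isn’t just noise. This is not SplitMetrics’ actual implementation, and the traffic numbers at the bottom are hypothetical, purely for illustration:

```python
import math

def split_test_readout(installs_a, visitors_a, installs_b, visitors_b):
    """Compare control (A) against a variation (B): relative uplift plus a
    two-proportion z-test to see whether the difference is beyond chance."""
    cr_a = installs_a / visitors_a          # control conversion rate
    cr_b = installs_b / visitors_b          # variation conversion rate
    uplift = (cr_b - cr_a) / cr_a           # relative uplift, e.g. 0.123 ~ "12.3% uplift"

    pooled = (installs_a + installs_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (cr_b - cr_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return uplift, z, p_value

# Hypothetical traffic numbers, for illustration only:
uplift, z, p = split_test_readout(installs_a=900, visitors_a=10_000,
                                  installs_b=1_010, visitors_b=10_000)
print(f"uplift: {uplift:.1%}, z = {z:.2f}, p = {p:.3f}")
```

The bigger the sample you push through each variation, the smaller an uplift you can detect with confidence before calling a winner.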

Now, when running our first experiment for Prisma, we decided to kill two birds with one stone. We were running Facebook ads to fuel the experiments, so we first created a set of banners with different images and ran them against each other to:

  1. Select well-performing banners to drive traffic to the experiments
  2. Get ideas for screenshot imagery

Given the nature of the app, we were particularly interested in two things: which images to feature and which Prisma filters to apply to them. So we created a set of images of people, plus some inanimate objects, and processed them with the most popular Prisma filters.


After that we launched a series of Facebook app install ads for Prisma featuring these images as banners to shortlist the ones that performed best. We also made sure to throw the girl from the first screenshot into the mix to use as a performance benchmark.


We targeted the ads at users broadly interested in photography, selfies, photo apps and art to get faster results and better conversion rates, while not going too narrow since Prisma is really a mass market app.

Once we had our shortlist of the better-performing creatives, it was time to see how they would behave as screenshots. We set up a split test with 4 different variations:

[Image: the screenshot variations tested against the control]

When the results were in, however, it turned out that none of the contending images beat the control by any significant margin. Variation C performed ever so slightly better, but we decided the difference wasn’t enough to take action on. Moreover, the image of the girl in the first screenshot had been picked up by the media, so she was already recognizable and was adding to the app’s credibility, which was critical since the App Store was swimming in copycat ‘prismas’.

So even though we didn’t beat the control in this experiment, it was still a success for a number of reasons:

  1. We validated our assumptions
  2. We saved a lot of time, money and users: if we were to try all the screenshot ideas on the actual App Store, it would’ve required far more resources and still left us wondering about the validity of the results
  3. We were actually onto something here for a future experiment (so read on…)

Experiment #4

As the Prisma team was getting ready for a major update introducing video filtering, we were tasked with reflecting this in a new screenshot. It was a big deal and we needed to draw attention to the new feature. And we had just the guy for the job…

Remember this fella?

[Image: the ‘conversion guy’ creative from the Experiment #3 banner tests]

This guy performed really well both in the split test (even though he didn’t win) and as a Facebook banner, so we decided to give him another shot and placed him in the second screenshot. We also added photo and video symbols to further convey the idea of the update and altered the first caption to say ‘Turn every photo and video into art’.

The result? 19.7% increase in conversion rate!


And the lesson here? Don’t give up after the first try: analyze the outcome, tweak your ideas, make adjustments, and have another go at it.

And always, always repurpose things that work. Have a really well-performing banner? See how it performs in App Store screenshots, in-app purchase dialogs, etc. Did your last newsletter subject line score amazing open rates? Try it as an app description, screenshot caption, or banner text. And naturally, run A/B tests to check your assumptions.

And we would be happy to help you get your app on the ‘Best of 2017’ list :)

Start A/B Testing

  • Mike Chernetsov

    lots of work that brought impressive results. nice!

  • John Wegner

    Wow, impressive results… well done.

    • Thanks John, we’ll keep the good stuff coming)

  • Dmitriy Perelstein

    How do you guys measure conversion uplift that came naturally from higher recognition of the App by the users rather than from the tweaks that you’ve done? In other words, how do you eliminate selection bias?

    • Good question, Dmitriy, this was rather tricky, especially with an app like Prisma that was growing virally and had a lot of press coverage. To make things even more complicated, Apple was featuring it on and off, which added a lot of noise to the analytical data. So first and foremost we relied on the results of the A/B tests that we conducted using the SplitMetrics platform. Because of the controlled and clean environment, we could get very fast initial results that minimized external factors and then follow up over a longer period of time.

      But you’re right, some of the uplift could probably be credited to the app picking up more steam, and that’s why we made sure not to tinker with any elements that were already highly recognizable.