Mallzee: Learning the Value of Negative Results in Store Optimization and Achieving a 10% Conversion Improvement

Mallzee, the UK’s top non-retailer shopping app, discovered the true value of negative results within its store optimization activity. A/B testing prevented disastrous icon changes that could have cost the publisher a 25% decline in conversion. Eventually, a series of experiments identified an icon design that delivered a 10% boost.
Mallzee is a data and technology company that offers a range of solutions for both consumers and retailers. It stands behind the original multi-retailer fashion shopping app, which lets users shop hundreds of brands from their phones.


When you get down to store optimization via split testing, there is one thing you should remember: not all app A/B experiments show a boost in app conversions.

The two other types of results app developers often get from mobile A/B testing in app store optimization are:

  • No difference between optimized and control results.
  • Optimized results perform worse than control ones.

App developers often interpret such zero results as proof that A/B testing offers them no benefit. In reality, if a hypothesis performs worse or shows no difference, A/B testing lets you see it in time and saves the money you would have spent implementing those changes on the live version of the app page.
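To illustrate how the three outcomes above can be told apart, here is a minimal sketch of a two-proportion z-test in Python. This is not Mallzee's or SplitMetrics' actual methodology, and all counts in the example are invented; it simply shows the standard statistical check behind labeling a variant "better", "worse", or "no difference".

```python
from math import erf, sqrt

def classify_variant(control_conv, control_n, variant_conv, variant_n, alpha=0.05):
    """Two-proportion z-test: label the variant 'better', 'worse',
    or 'no difference' relative to the control."""
    p1 = control_conv / control_n
    p2 = variant_conv / variant_n
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (control_conv + variant_conv) / (control_n + variant_n)
    se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
    z = (p2 - p1) / se if se > 0 else 0.0
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    if p_value >= alpha:
        return "no difference"
    return "better" if p2 > p1 else "worse"

# Hypothetical counts: 300/1000 installs for control vs 258/1000 for the
# variant (roughly the 14% relative drop described below)
print(classify_variant(300, 1000, 258, 1000))  # → worse
```

With small samples the test would report "no difference" even for a real drop, which is why split-testing tools keep experiments running until enough traffic has accumulated.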

It also helps you understand which direction leads to your final goal – conversion improvement via store optimization. Based on the data they get from unsuccessful tests and App Store analytics, app developers can keep developing ideas that will ultimately bring significant results.

Mallzee is the UK’s top non-retailer shopping app for iOS and Android, helping users quickly find and buy clothes from 100s of high street brands. After overwhelming user growth in 2016, Q1 2017 at Mallzee was about further refining each area of their marketing strategy.

With good organic visibility and very strong paid acquisition channels, they identified store conversion optimization as a key area of focus that would have an impact across their whole acquisition strategy.

Rachelle Garnham, digital marketing manager at Mallzee, shares how they gained valuable audience behavior insights after running a series of tests with negative results.


We formed a thorough plan for our tests, keeping each change minimal so its impact could be clearly understood, and expecting to iterate and adapt the plan after each result. We tested multiple areas, from screenshots to the description, icon, and titles.

For our experiments, we chose the SplitMetrics platform, as it allowed us to build tests very quickly and iterate on our results. It made our store optimization efforts easier.

Our icon tests have been particularly interesting. We inverted the colors of our logo for a more vibrant and appealing look, assuming this would improve conversion.

Did we see the results we expected? In fact, this variation was 14% less effective than the original – quite a significant difference for such a small change.

Similarly, we trialed a more fashion-focused icon within our store optimization plan. This time the original proved to be 28% more effective.

Whilst it was initially disappointing to see that our new ideas didn’t have a positive impact, testing on elements like this has been invaluable, even when the results are negative.

If not for app store optimization, this type of change would previously have been made without testing. Assuming it wouldn’t have such an impact, we would have lost valuable conversions to download.

Mallzee optimizes icon with SplitMetrics

This taught us the importance of testing any assumptions before implementing them live on the App Store.

We were keen to implement new app store optimization options, so we also tested title ideas. However, the two title options that fit our store optimization plans were not that effective: once again, we saw 25% and 17.5% lower effectiveness compared to the existing text.

This was an important lesson in balancing the value of app store optimization rankings against on-page conversion optimization. On-page conversion also affects our paid acquisition channels, so it’s usually most important to prioritize it over app store optimization elements.

Whilst we’ve run some tests that had no significant winner, we now test all App Store changes as standard. We are constantly working to improve our overall conversion via store optimization.


We ultimately iterated through enough versions to find success with our brand-name logo, which is now live on the App Store and has improved conversion by 10%.

app icon optimized with SplitMetrics

It’s vital to prepare yourself for one important store optimization truth: your assumptions will usually be wrong! Test rather than assuming your opinion extends across your entire potential user base.

Small changes can have a big impact. Even if you think a change isn’t significant enough to test within your store optimization strategy, it could have a substantial effect on conversion. You don’t want to make that mistake live on the App Store when thousands of impressions are impacted every day.
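To put that risk in numbers, here is a back-of-the-envelope sketch. All figures are hypothetical (Mallzee does not publish its traffic or conversion rates); only the 14% relative drop echoes the icon test described earlier.

```python
# Estimate installs lost per day by shipping an untested change live.
# All numbers below are hypothetical, for illustration only.
daily_impressions = 5_000
baseline_cvr = 0.30      # assumed page-view-to-install conversion rate
relative_drop = 0.14     # relative decline a test could have caught

lost_per_day = daily_impressions * baseline_cvr * relative_drop
print(f"~{lost_per_day:.0f} installs lost per day")  # → ~210 installs lost per day
```

At those assumed volumes, a single untested icon swap would quietly cost a couple of hundred installs every day it stays live.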

Don’t base your App Store decisions entirely on branding rules. You want to keep your branding consistent, of course, but when small design changes can help convert users, it’s very important to be flexible. Testing can even inform wider design decisions away from the App Store.

When you get down to store optimization, remember to test individual changes independently and thoroughly, so you know exactly which change had an impact.
