Hobnob Growth Engineer: SplitMetrics Completely Altered Our Company’s Trajectory


Over the course of 5 months, Hobnob (an app that helps people create professional-looking event invitations and distribute them via text message) used SplitMetrics to restore and improve their app store conversion rate, which had been declining since a recent rebrand. Hobnob was then able to apply these learnings across their inbound traffic channels, which led to an increase in top-level app growth.

Furthermore, Hobnob used SplitMetrics’ email collection feature to test each portion of their Android listing while the app was still in development. This ensured an optimized Android listing on launch day and an email list of highly qualified users, which translated into a 60% conversion rate to download in the Android app’s first month in the Play Store.

Ashwin Kiran, Growth Engineer at Hobnob, shares these and other insights with us in our interview.

— Ashwin, what stage was your company at when you started using SplitMetrics?

When we first started using SplitMetrics, we had an iOS app that had been in the store for less than a year, and we had just started developing our Android app. We had also just rolled out a major rebrand, and our updated look was hurting the conversion rate to download of our iOS app. We needed insight into how to fix this, and we needed to take action quickly. Incidentally, this is what first led me to SplitMetrics.

How icon optimization can bring unexpected additional benefits

— What were your first actions? What was the first thing that you decided to test?

Can you guess which icon delivered a 64% increase in conversions?

The first thing I did was start an icon test. Apple wasn’t reporting the number of search impressions Hobnob was getting yet, so I wasn’t certain which step in the search -> product page -> download funnel was affected the most by our rebrand. I decided on an icon test because it’s the only graphic asset that’s shown in both the search results and on the product page, so optimizing this would have a positive effect on both steps.
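To make the funnel reasoning concrete, here is a minimal sketch of how one might locate a conversion drop by comparing per-step rates. The numbers and the `step_rates` function are hypothetical illustrations, not Hobnob’s data or SplitMetrics’ tooling:

```python
# Hypothetical funnel counts -- illustrative only, not Hobnob's data.
funnel = {
    "search_impressions": 50_000,
    "product_page_views": 6_000,
    "downloads": 1_500,
}

def step_rates(funnel: dict) -> dict:
    """Conversion rate of each funnel step relative to the previous one."""
    steps = list(funnel.items())
    return {
        f"{prev_name} -> {name}": count / prev_count
        for (prev_name, prev_count), (name, count) in zip(steps, steps[1:])
    }

for step, rate in step_rates(funnel).items():
    print(f"{step}: {rate:.1%}")

# Comparing these per-step rates before and after a change (such as a
# rebrand) shows which step of the funnel absorbed the hit.
```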

At the end of this test, we found an icon that performed 64% better than the rebranded icon, and we immediately pushed the winning icon live with the next release. This icon was much brighter and had far greater depth than the rebranded icon, and it was clearly more conspicuous and dynamic.

After our new icon went live, we immediately saw our product page -> download conversion rate jump back to where it was before the rebrand. Inspired by this gain on the App Store, we adopted the new icon and color scheme and retroactively rebranded ourselves all over again across the web.

This had an unexpected benefit: conversion rates for other inbound tools using our app’s icon increased, and engagement on social media rose as well. This had a large impact on the overall health of our business. In fact, this test alone had the single greatest effect on the trajectory of our business because it improved the efficiency of so many of our tools. It’s amazing what just an app icon can do!

Why you should never assume what works for Android will work just as well on iOS (and vice versa)

— What were the next steps in your strategy?

After testing the iOS icon, we tested our screenshots. We were running tests on both iOS and Android at the time, and, because of our aggressive release cycle and focus on growth, I generalized the results of tests on one platform to the other. Our screenshot tests are a good example of why it’s better to test before implementing.

We tested a set of 5 screenshots with the same background against a set of screenshots with different backgrounds. On Android, the set with the same background led to an increase in conversion, so we rolled this style out to iOS as well. Instead of the increase we expected, we saw a decrease in conversion rate, which prompted me to take the time to test these screenshots on iOS. We found that these screenshots actually reduced our conversion rate by 20%, after which we restored the screenshots to the way they were. This wasn’t in line with our new branding; however, you can’t argue with the numbers!

Once we implemented the final results from our icon and screenshot tests, our app conversion rate jumped straight back to where it was before the rebrand and kept climbing from there.

How to benefit from a test with no winners

— It seems that these two tests brought you real success. Which tests do you consider less successful?

Not every test has a winner. In fact, most of my experience with SplitMetrics was running tests that didn’t result in a statistically significant change but provided valuable insight nonetheless. One such test was on our iOS title. iOS titles serve competing purposes: as a ranking signal for App Store Optimization (ASO), and as a human-readable title that influences users’ decisions to download the app. The perfect app title serves both of these objectives, and that’s what I was striving towards.

Through multiple title tests, I found that the length and wording of the title had no significant effect on the conversion rate to download. This indicated to me that we could use this field mainly as a tool for ASO, rather than one that needed to elegantly convey the name and purpose of the app. This gave us the green light to stick our most important keywords in the title, which had a significant effect on our App Store search visibility.
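As a side note on what “statistically significant” means here: a common way to judge an A/B result like these title tests is a two-proportion z-test. The sketch below uses made-up numbers and is a generic illustration, not SplitMetrics’ actual methodology:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in conversion rates.

    conv_a/n_a: conversions and visitors for variant A (control);
    conv_b/n_b: the same for variant B.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Made-up example: 120/1000 conversions vs. 135/1000 conversions.
p = two_proportion_z(120, 1000, 135, 1000)
print(f"p-value: {p:.3f}")  # > 0.05 here, so not statistically significant
```

A test like the title experiments above can come back non-significant exactly this way: the observed difference is too small, relative to the sample size, to rule out chance.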

— At the pre-launch stage you were using the email collection option. Did you manage to find the best way to make use of those emails?

SplitMetrics’ email collection feature was integral to our Android launch success. Not only did we use SplitMetrics to test each element of our Android listing before launch, we were able to collect the email addresses of users who clicked the “Install” button after seeing the app listing. This allowed us to build up a mailing list of our most qualified Android users before we even launched.

By collecting these highly qualified Android users and keeping them warm with an email campaign, we were able to launch our Android app with a 60% conversion rate from product page view -> download, which is far above what we’d expect given the need-based nature of Hobnob. This initial boost of traffic, downloads, and accompanying reviews and ratings gave us an extremely strong first week in the Play Store. As a result, we were able to shave a couple of months off our growth projections thanks to the head start this email list afforded us.

Why understanding the user’s pain is vital for testing descriptions

— You also ran a description test twice, and each time you achieved a solid improvement in conversion rate of approximately 13%. Are there any recommendations you could share?

App descriptions are heavily dependent on the product, growth strategy, and competitors. Hobnob is a need-based app; most people who download and use Hobnob find the app by searching for “invitations” or “invitation app” on the app store. As such, Hobnob’s app store listing is often the first touchpoint that users have with the brand.

Hobnob faces a unique challenge in the app store. We compete in a space dominated by two well-known players, and we offer a service that most people don’t even know exists (Hobnob sends invites by text message instead of by email. It’s a really cool way to send and receive party invites, but this isn’t something that people are searching for.)

Within our text metadata, we have to orient potential users to our product offering (Send invites by text message? Okay, I understand what that service is.), and then educate them about why they’d want this (Why is this better than the other services out there? Oh, I get more RSVPs faster? Okay, that’s compelling). This differs from other apps that provide a service people are already familiar with; there’s far less education involved in “selling” an app like that.

Because of our title test (see above), we had already learned that we could put “text message” within our title, which piques the user’s interest when it appears in the search results and contextualizes the rest of the app store description on the product page. We’re then free to use our above-the-fold description to speak directly to the top three pain points we know Hobnob solves for event hosts, educating them on the benefits of a service like ours and making the “sale”.

[Screenshots: description variants and results for Test 1 and Test 2]

Overall, my advice is to fanatically study and internalize your user’s journey to finding your app: their motivations, their options, their pain, and what they’re looking for you to deliver. When you understand the road they’ve taken to get to your app’s description, you’re much better able to sell them the new bliss of what life will be like after downloading your app.

Key takeaways:

  1. If you don’t know at which exact step of the user journey you’re losing users, test every element until you find the one with the biggest impact on your app’s conversions.
  2. Don’t roll Android test results out to iOS (or vice versa). On Android, the screenshot set with one background led to an increase in conversion; on iOS the same background actually reduced our conversion rate by 20%. Always test before implementing.
  3. Even a test with no clear winner can provide invaluable insights. If you learn that some element doesn’t affect conversions, you now know to focus your efforts elsewhere.
  4. Before testing descriptions, carefully study your product strategy and competitors, and decide which direction your messaging should take: educating users, highlighting your product offering, or selling the product. Then decide which problems you’re solving with the title, and let the rest unfold within the description.