A/B testing guide for mobile marketing message

Feb 13


You know that some experiments just won’t work:

The double toilets at Sochi: take your pick

Right Time Marketing, though, is about finding the perfect balance of person, time, intention and location to deliver the most effective offer or message. So what is that perfect balance? And how do you match it with the most effective call to action?

These questions are never answered just once. To keep learning and improving conversion rates, you need to combine experimentation with effective tracking.

The simplest method for doing this is A/B testing.

What is A/B testing?

A/B testing, or split testing, is a simple experiment to determine which option, A or B, produces a better outcome. It observes the effect of changing a single element, such as the colour of the call to action button.
A/B testing of a call to action button
When testing is ongoing, the process is known as champion/challenger testing: the current champion is tested against new challengers to continuously improve the outcome.
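
As a minimal sketch of how the split itself can be implemented (the hashing approach and function name are illustrative, not StreetHawk's API), hashing a stable user ID keeps each user in the same variant across sessions:

    import hashlib

    def assign_variant(user_id: str) -> str:
        """Deterministically assign a user to variant 'A' or 'B'."""
        # Hash a stable user ID so the same user always lands in the same
        # variant; re-rolling randomly on every session would let users
        # see both variants and contaminate the results.
        digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
        return "A" if int(digest, 16) % 2 == 0 else "B"

    # Example: split a few users between champion (A) and challenger (B)
    for uid in ["user-1", "user-2", "user-3"]:
        print(uid, assign_variant(uid))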

A/B testing vs multivariate testing

Multivariate testing is a more complicated form of experimentation that tests changes to several elements of a single page or action at the same time. One example would be testing changes to the colour scheme, picture used and the title font of a landing page.

The main advantage is being able to see how changes to different elements interact with each other, which makes it easier to determine the most effective combination. This whole-picture view also makes it possible to test smaller elements than A/B testing can, since these are more likely to be affected by the other components around them.

However, since testing multiple variables at once splits up the traffic stream, only sites with substantial amounts of daily traffic are able to conduct meaningful multivariate testing within a reasonable time frame. Each combination of variables must be separated out. For example, if you are testing changes to the colour, font and shape of a call to action button at the same time, each with two options, this results in 8 combinations (2 x 2 x 2) that must be tested at the same time.
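
To make the arithmetic concrete, here is a small sketch (the option names are invented for illustration) that enumerates the 2 x 2 x 2 = 8 button variants from the example above:

    from itertools import product

    colours = ["red", "green"]
    fonts = ["serif", "sans-serif"]
    shapes = ["rounded", "square"]

    # Every combination must receive its own share of traffic.
    combinations = list(product(colours, fonts, shapes))
    for combo in combinations:
        print(combo)
    print(len(combinations), "variants to test simultaneously")  # -> 8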

Generally, A/B testing is a better option because of its simplicity in design, implementation and analysis.


A/B testing for mobile marketing campaigns

A mobile marketing campaign is composed of a powerful combination of elements. When these elements are all aligned, they create a user experience where your ad is welcomed instead of being seen as spam.
Some variables include:

  • User segment: who is the most relevant segment to respond to your campaign? Power users? Those that have not returned for a month? Those that have rated your app highly?
  • Timing: is this the “right-time”?
  • Location: should your campaign be triggered by a geofence? If so, where should it be viewed to provide the most relevance?
  • Message: the wording of the notification, including tone, length and clarity. Test different degrees of urgency, alternative headlines and anchoring. For example, if you are trying to re-engage inactive users: do you tell them about new features in the App or entice them with a special offer?
  • Notification type: in-app, push notification, SMS or email?
  • Call to action format: direct offer in message, launch conversion webpage, launch video?
  • Offers: varying lengths for a free trial, different specials
  • Sign up forms: different combinations of fields

Some different variables to test in a campaign
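
One way to keep a test of these variables honest is to encode each variant explicitly and change a single variable at a time. A sketch, with hypothetical field names rather than StreetHawk's actual campaign schema:

    # Two push-notification variants for a re-engagement campaign.
    # Only the message changes; segment, timing and channel stay fixed,
    # so a difference in conversions can be attributed to the wording.
    variant_a = {
        "segment": "inactive_7_days",
        "channel": "push",
        "send_hour_local": 18,
        "message": "We've added new features since your last visit!",
    }
    variant_b = {**variant_a, "message": "Come back this week and get 20% off."}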

Steps to carrying out a mobile A/B test

Let's take an example using StreetHawk's campaign console: a campaign to re-engage lost users.

Example with a campaign from the StreetHawk console

Step 1: Form a hypothesis around a question you would like to test. The question here is how best to attract lost users back to re-engage with your App. The hypothesis might be that contacting them 7 days after their last use is more effective than waiting until 10 days.
Step 2: Decide on a level of statistical confidence. This is how certain you want to be that the outcome of your test reflects a real difference rather than chance. Do not simply compare absolute numbers unless the two are so different that you can be sure just by looking at them, such as a difference in conversion rate between 20% and 35%.
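
To illustrate, here is a sketch of a standard two-proportion z-test applied to that 20% vs 35% example (the sample sizes are invented, and scipy is assumed to be available):

    from math import sqrt
    from scipy.stats import norm

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Two-sided p-value for a difference in conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        return 2 * (1 - norm.cdf(abs(z)))

    # 200 users per variant: 20% vs 35% conversion, as in the example above
    p = two_proportion_z_test(conv_a=40, n_a=200, conv_b=70, n_b=200)
    print(f"p-value: {p:.4f}")  # far below 0.05: the difference is real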
Step 3: Collect enough data to test your hypothesis. The subtler the variation under test, the more data you need to collect to reach the level of statistical confidence decided in Step 2.
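
How much data is "enough"? A standard two-proportion sample-size formula gives a rough answer; the sketch below (baseline rates and lifts are illustrative) shows how quickly the requirement grows as the effect shrinks:

    from math import ceil, sqrt
    from scipy.stats import norm

    def sample_size_per_variant(p_base, p_test, alpha=0.05, power=0.8):
        """Users needed in each variant to detect p_base -> p_test."""
        z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance
        z_beta = norm.ppf(power)            # statistical power
        p_bar = (p_base + p_test) / 2
        n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
              + z_beta * sqrt(p_base * (1 - p_base) + p_test * (1 - p_test))) ** 2
             / (p_base - p_test) ** 2)
        return ceil(n)

    # A 20% -> 35% jump is cheap to detect; a 20% -> 22% lift is not.
    print(sample_size_per_variant(0.20, 0.35))  # ~140 users per variant
    print(sample_size_per_variant(0.20, 0.22))  # ~6,500 users per variant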
Step 4: Analyse the data to draw conclusions. StreetHawk provides you with the engagement statistics for every campaign, including:

  • the total number of users that fall into your selected segment
  • the number of people that saw your campaign
  • the number that clicked through

Step 5: Build from the conclusions to continue further testing. Sometimes this means crowning a new champion and picking a fresh challenger; sometimes it means scrapping the hypothesis altogether. The most impressive results come from a culture of ongoing testing.

3 essential tips for testing your mobile marketing message

  1. Start with a clear purpose for the campaign. Because the many variables of segment, time and location are likely to be interdependent, the way to ensure all the testing works together is by aligning everything to a strong purpose. For example, the campaign above is clearly designed to bring back users that have dropped out. This gives a good idea of the target segment and a starting point for an initial campaign design. A/B testing will allow you to improve on this campaign, but it won't fix a campaign without a clear purpose.
  2. Define your desired outcome carefully and track the appropriate metrics. A successful conversion for these campaigns may not always mean a straightforward purchase, so be specific about what you are tracking. In the campaign above, a successful conversion is a customer who comes back to the app through the campaign. On a wider scale, your desired outcome might be an improvement in the App re-engagement rate. For a campaign to encourage user ratings, success would mean an increase in both the number and quality of ratings.
  3. The results of each A/B test relate to a specific segment: the audience of your message. A campaign designed to encourage power users to subscribe is not the same as a campaign asking new users to subscribe. You may draw some overall lessons from the insights of your A/B testing, but be sure to conduct separate A/B tests for each segment. Re-engaging users that have been lost for more than a month requires a very different message than reaching those who last used your app a week ago.

Let us know if you'd like more explanation in the comments, or just tweet us at @StreetHawkApp.
