Eight Things To Split-Test In Your Email Campaigns

Continuous improvement in any field of work is driven by regular testing and evaluations. Email marketing is no different, and testing is essential in order to measure and cater to your database’s behaviours and preferences. Split-testing is a method that will enable you to compare the impact of different elements of your emails, which in turn will help you to improve your email marketing over time.

However much you might feel you “know” your average email database subscriber, the only way to truly find out how they respond to your content is to test, test, and keep on testing – because not only can people’s preferences change, but as the recent situation with COVID-19 has demonstrated, their circumstances do too.

If done consistently, split-testing will enable you to gradually improve your email marketing performance over time.

So what is email split-testing? And what should you test?

What is split-testing?

Split-testing, sometimes referred to as A/B testing, is the practice of creating different variants of an email campaign, splitting your database so that different groups receive different variants, and measuring the results of each variant in terms of open rate, click-through rate, and engagement.

For example, you might send an email to your database and choose to split-test the effect of image type on how your database engages with the email. In this instance, you would create your email, duplicate it, then change that one factor – the image. Keeping all the other elements the same is essential in order to pinpoint the impact of the image alone.

Two of the most popular split-testing methods are as follows:

  1. Splitting the whole database between the variants, and measuring the results across those large segments
  2. Sending the variants to a small sample of the database first, reviewing the results, then sending the best-performing variant to the remainder of the database
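The second method can be sketched in a few lines of code. This is a minimal illustration, not a production tool: the function name, sample fraction, and addresses are all hypothetical, and the split is a simple random shuffle.

```python
import random

def split_for_test(subscribers, sample_fraction=0.2, seed=42):
    """Randomly split a subscriber list for an A/B test.

    A `sample_fraction` share of the list is divided evenly between
    variants A and B; the remainder is held back to receive the
    winning variant later. (All names here are illustrative.)
    """
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # fixed seed so the split is repeatable
    sample_size = int(len(pool) * sample_fraction)
    half = sample_size // 2
    variant_a = pool[:half]
    variant_b = pool[half:sample_size]
    remainder = pool[sample_size:]
    return variant_a, variant_b, remainder

groups = split_for_test([f"user{i}@example.com" for i in range(1000)])
print([len(g) for g in groups])  # [100, 100, 800]
```

Shuffling before slicing matters: if your list is sorted by signup date or engagement, taking the first 20% without shuffling would bias the test sample.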

Eight things to split-test in your email campaigns

There are lots of things you can split-test, but here are a few ideas to start with:

  1. Subject lines. Test the impact of different subject lines on the open rate. For example, you might wish to test the impact of asking a question in your subject line vs using a statement.
  2. Preview text. As above, try different kinds of preview text to see which resonates more with the recipients.
  3. Emoji use. You can test the impact of content with or without emojis, type of emojis, or number of emojis.
  4. Images. Test different images to measure the impact on click-through rate.
  5. Video vs. static image. See if a video or a GIF gets a different response from your database.
  6. Time of sending. See if the time of day at which you send your campaigns has an impact on open rate and click-through rate, by sending your email out at different times.
  7. Length, tone, and style of your written content. See if your audience responds differently to different types of content. For example, you could split-test a short, light-hearted email against a longer, more formal one, or an image-heavy email against a text-heavy one.
  8. Layout. You could split-test different layouts, such as the positioning of your headline, images, or call to action.

Each time you perform a split test, review the results to discover which version performed best, and make a note of it for your next email. Your main key performance indicators (KPIs) will likely be open rate, click-through rate, and conversion to revenue.

In addition to comparing your split-test results between campaigns, also keep in mind the standard industry benchmark for these KPIs. If your industry’s benchmark for open rates is 25%, and neither of your test results meets or exceeds this, it suggests there is more work to be done.
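The KPI comparison above can be sketched as a short calculation. The figures and variant names below are invented for illustration, and the 25% benchmark is simply the assumed industry figure from the example.

```python
def open_rate(opens, delivered):
    """Open rate as a percentage of delivered emails."""
    return 100 * opens / delivered

# Hypothetical results for two subject-line variants.
results = {
    "A (question)":  {"delivered": 500, "opens": 110, "clicks": 40},
    "B (statement)": {"delivered": 500, "opens": 95,  "clicks": 38},
}

BENCHMARK = 25.0  # assumed industry open-rate benchmark (%)

for name, r in results.items():
    rate = open_rate(r["opens"], r["delivered"])
    ctr = 100 * r["clicks"] / r["delivered"]
    flag = "meets benchmark" if rate >= BENCHMARK else "below benchmark"
    print(f"{name}: open rate {rate:.1f}% ({flag}), CTR {ctr:.1f}%")
```

In this invented example variant A wins (22% vs 19% open rate), but both fall short of the 25% benchmark, which is exactly the "more work to be done" situation described above.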

Do you already do split testing in your email campaigns? What have you tested – and how did it go for you?

