
A/B testing your newsletters

In the whole cycle of optimizing your marketing strategies, don’t forget your newsletter. Make sure your newsletter offers real value to your audience and is of high quality. Of course, there’s always something to improve. You could make improvements based on intuition alone, but why not put that intuition to the test by A/B testing your newsletters?

In this post, I’ll dive into newsletter A/B tests by explaining what you can test. I won’t walk through example tests, but I will tell you which aspects to pay attention to when testing.

Subject line

Most email campaign tools let you A/B test the subject line. That means you can give your newsletter a number of different subject lines. If you test two different subject lines, ordinarily 50% of your subscribers receive the first variation and the other 50% receive the second.
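Your campaign tool handles this split for you, but the idea is simple. A minimal sketch of a random 50/50 split (the subscriber addresses here are made up for illustration):

```python
import random

def split_subscribers(subscribers, seed=42):
    """Randomly split a subscriber list into two equal-sized A/B groups."""
    shuffled = subscribers[:]              # copy, so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # shuffle to avoid any ordering bias
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

subscribers = [f"user{i}@example.com" for i in range(10)]
group_a, group_b = split_subscribers(subscribers)
# group_a gets subject line A, group_b gets subject line B
```

The shuffle matters: subscriber lists are often sorted by sign-up date, so splitting without shuffling would compare old subscribers against new ones rather than variation A against variation B.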

A/B testing your subject lines is only relevant for your open rate, not your click rate. The open rate is the percentage of successfully delivered newsletters that were opened by your subscribers. The click rate, on the other hand, is the percentage of successfully delivered newsletters that registered at least one click. The subject line won’t make a difference for your click rate, since it doesn’t affect anything within the body of the email you’re sending. That being said, testing your subject lines is still very important: you want as many people as possible to read what you have to say, so you want your subscribers to open your newsletter, right?
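In code, those two definitions come down to this (the counts below are invented for the example):

```python
def open_rate(opens, delivered):
    """Open rate: share of successfully delivered emails that were opened."""
    return opens / delivered

def click_rate(clicks, delivered):
    """Click rate: share of delivered emails that registered at least one click."""
    return clicks / delivered

# Example: 10,000 delivered, 2,400 unique opens, 360 unique clicks
print(f"open rate:  {open_rate(2400, 10000):.1%}")   # 24.0%
print(f"click rate: {click_rate(360, 10000):.1%}")   # 3.6%
```

Note that both rates are measured against *delivered* emails, not sent emails, so bounces don’t drag the numbers down.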

One set of rules that has greatly helped us, taught to us by our friend Jordie van Rijn (a great email marketer), is C.U.R.V.E.:

  • Curiosity: pique the readers’ interest, for example by asking them a question.
  • Urgency: create urgency with limited-time offers or things that need to be done now.
  • Relevance: make sure the content that’s most relevant to your audience is in your subject line.
  • Value: convey the value of the newsletter by offering something exclusive (this can be an exclusive product offer, but also exclusive content).
  • Emotion: use punctuation, such as exclamation marks, to elicit emotional responses from your readers.

From name

Another thing you can almost always test is your from name. This is exactly what it sounds like: the name that shows from whom the emails are coming:

from name: newsletter@yoast.com

This, again, only affects your open rate. It’s an aspect people tend to forget about, because it seems such a small thing to change. However, the from name can be pretty important: it’s the first thing people see when your email arrives, so it had better be good. Testing this will make sure it is.

Send time

I’m not sure whether all email campaign tools offer this A/B testing option, but MailChimp does. You can test which send time (MailChimp calls this “delivery time”) works best for your audience. You do need to do some work beforehand, though, because you’ll have to decide yourself at what times the variations go out.

So, try to find out when most of your emails are opened or at least when the majority of your audience is awake. Especially if your emails go to an international group of people, like ours, this might be a good thing to test. Sending your emails at the right time can result in more people seeing your newsletter and getting invested.

Content

Content is the big one. This is where you can go all-out and test anything you like. Everything within the content section of your email can be tested, and that’s a lot. You have to think about what you want to test and treat these A/B tests as you would any other. We’ve written a post that explains this: Hypothesize first, then test. In any case, it’s crucial that you test only one aspect at a time. Otherwise, you can’t tell which part of your A/B test caused a higher click rate.

I always prefer to begin with this one, because it’s furthest into the subscribers’ process of receiving, opening and reading a newsletter. I test content first, because I don’t want to optimize a part of my email (say, the subject), while what the readers see next (like the email’s content) could undo all the optimization I did before.

Just a few ideas of what you could think about when you want to test your email’s content:

  • Your email’s header;
  • An index summarizing your email;
  • More (or less) images;
  • Different tone of voice;
  • More buttons instead of text links;
  • More ideas on Jordie’s blog.

Before testing

When you start testing, most email campaign tools will offer you two options:

  • send your variations to your complete list, or
  • send your variations to a percentage of that list, declare a winner and then send the winner to the remaining people who haven’t received a newsletter yet.

I’d strongly urge you to use the first option. Let me tell you why. First of all, sending multiple variations to just a sample of your list means you’re cutting down on ‘respondents’. You’ll have more data when you send to the complete list, and that means your results will be more reliable.

However, if your list is big enough, this probably won’t matter much. The reason I’d still choose the first option is that, using the second option, the winning variation gets sent out hours (or even days) later. Especially for newsletters, this can be problematic, because, well, at this point it’s not really “news” anymore. Using the second option also means you can’t determine the exact time the email will be sent. And, as I’ve already said: send time can be quite important.

If timing is less important to the emails you’re sending out, and you have a large list of subscribers, you could go for the second option. In that case, the remaining people in your list will always get the winner, which could be beneficial.

Results

So you’ve thought up some brilliant variations of your newsletter’s content, its subject, from name or send time. Time to send out that newsletter! Once you’ve sent it, there’s nothing more you can do. You just have to wait until the first results come trickling (or flooding) in. Make sure you take notice of the differences in results. Which version got the highest open rate? Which version had the best click rate?
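When the rates of the two variations are close, the gap may just be noise. The post doesn’t go into statistics, but a standard two-proportion z-test is a quick way to check whether a difference in open (or click) rates is likely to be real. A minimal sketch; the counts below are made up:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test: is the difference between two rates
    (e.g. open rates of variations A and B) bigger than chance would explain?"""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variation A: 520 opens out of 2,000 delivered (26.0%)
# Variation B: 460 opens out of 2,000 delivered (23.0%)
z, p = two_proportion_z(520, 2000, 460, 2000)
```

With these numbers the p-value comes out below 0.05, so you could reasonably call A the winner; with a much smaller list, the same 3-point gap could easily be chance.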

When comparing results, click rate always takes priority for me. After all, a high click rate means your readers will probably end up on your site, where you have a lot more opportunities to sell, for example. However, we also always use custom campaigns on all the links in our newsletter. And since we’ve set up eCommerce tracking in Google Analytics, we can see which version of our newsletter generated the most revenue. If you have a business to run, that’s probably the metric you want to see increasing.

Unless you’ve set up some kind of eCommerce tracking within your email campaign tool, this metric won’t be available in their results. So don’t value the results of these tools too much. Make sure you focus on what’s important for your business and check those metrics.

Also: don’t be too quick to judge. I usually wait a few days, up to a week, before I draw my conclusions, because a lot of people will still be opening and engaging with your email after a few days.

Happy testing!

What do you think of the steps and rules we’ve set for ourselves? Do you have similar ideas that you follow? Or maybe something completely different? Let us know in the comments!

Read more: How to analyze your audience »
