After pouring time into designing your email, writing a compelling subject line, and determining the right time to send it, you want your email campaign to succeed. You want to see open and click rates well above industry average and conversions flowing in.
To maximize the results of your email campaigns, you want to make sure you’re sending the best possible version of each email. That’s where A/B testing comes in. You’ll create two versions of an email by varying one particular detail — like the subject line — and run a test to find out which is more effective.
What to Test
Email can seem really simple, but there are actually a lot of elements that go into designing the perfect email. Each of these elements can affect your email’s performance, so it’s important to test several of them. Here are a few you should definitely play with:
- Call-to-action (CTA) — Color, placement, copy — your CTA should be super optimized for it all.
- Subject line — Emoji or no emoji? Ask a question or say what’s inside?
- Layout — One column or two? Where will images go?
- Personalization — Will using the recipient’s name boost engagement?
- Images — Will you include images? (Hint: YES) What will the image be? Where in the email will it go?
How to Test
If you’re using an email campaign tool like MailChimp or Campaign Monitor, it’ll do most of the work for you — randomly segmenting your subscribers into two lists and comparing the results from each version of the email.
Otherwise, you’ll need to create two lists by hand and manually analyze the resulting metrics. If you’re doing this, make sure your lists are randomly selected to ensure the most accurate and representative results. Exporting your reports to Excel can make it easier to analyze the data, too.
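If you’re splitting a list by hand, a short script can handle the random selection for you. Here’s a minimal Python sketch; the addresses are placeholders standing in for your real subscriber export:

```python
import random

# Hypothetical subscriber export -- replace with your real list of addresses
subscribers = [
    "ana@example.com", "ben@example.com", "cam@example.com",
    "dee@example.com", "eli@example.com", "fay@example.com",
]

random.shuffle(subscribers)         # shuffle so neither list is biased by signup order
midpoint = len(subscribers) // 2
list_a = subscribers[:midpoint]     # recipients of version A
list_b = subscribers[midpoint:]     # recipients of version B

print(len(list_a), "recipients for A,", len(list_b), "for B")
```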
Best Practices
A/B testing doesn’t have to be complicated, but there are some simple guidelines you should follow to ensure the tests run smoothly and the data is actually reliable.
- Test one variable at a time. Changing more than one aspect makes it impossible to determine which variable actually caused any change in open, click, or conversion rates.
- Use a large sample. It’s easy for biases to emerge when you’re testing a very small sample, and that means the data you collect isn’t necessarily representative of the list as a whole — making it essentially useless.
- Randomly segment your lists. If your lists aren’t randomly selected, the data can easily skew toward one version or the other. Both lists should start from historical open and click rates that are as close to equal as possible.
- Send both versions at the same time to limit time as a factor. This goes along with altering only one variable per test — the time and day of the week can have a big impact on open and click rates, so keep them constant when testing.
- Listen to the resulting data. We’re always going to have a hunch or a favorite version, but if the data says something else, listen to the data. The whole point of A/B testing is to create the highest-performing email, so don’t disregard what the results tell you. A quick way to check whether a difference is more than luck is sketched just after this list.
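How do you know whether the winner really won, rather than just getting lucky? A rough way to check is a two-proportion z-test on the open (or click) counts. Here’s a minimal Python sketch — the send and open numbers are made up for illustration, and nothing here is tied to any particular email tool:

```python
from math import sqrt
from statistics import NormalDist

def open_rate_z_test(opens_a, sent_a, opens_b, sent_b):
    """Two-sided z-test for a difference in open rate between versions A and B."""
    rate_a = opens_a / sent_a
    rate_b = opens_b / sent_b
    # Pooled rate under the assumption that both versions perform the same
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_a - rate_b) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value
    return rate_a, rate_b, p_value

# Illustrative numbers: A opened by 210 of 1,000 recipients, B by 260 of 1,000
rate_a, rate_b, p_value = open_rate_z_test(210, 1000, 260, 1000)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p-value: {p_value:.3f}")
```

If the p-value is small (a common cutoff is 0.05), the difference is probably real. If it’s large, the gap could easily be noise, which usually means the sample was too small or the variable you changed just doesn’t move the needle.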
Defining Success
When it comes to A/B testing email campaigns, there’s no single definition of a successful test. That’s why it’s important to define your goals up front, so you have something concrete to measure the results against.
The first step is to determine which metric you want to improve; usually that’s open rate, click rate, or conversions. Once you know which one, look back at your historical reports and take note of the averages. What does your current open rate typically look like?
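If you’ve exported those historical reports to a CSV (or saved your Excel export as one), a few lines of Python can pull out the baseline. This is just a sketch — the file name and the sent/opens/clicks column names are assumptions, so adjust them to match what your email tool actually exports:

```python
import csv

# Hypothetical export of past campaign reports with sent/opens/clicks columns
total_sent = total_opens = total_clicks = 0
with open("campaign_history.csv", newline="") as report:
    for row in csv.DictReader(report):
        total_sent += int(row["sent"])
        total_opens += int(row["opens"])
        total_clicks += int(row["clicks"])

print(f"Baseline open rate:  {total_opens / total_sent:.1%}")
print(f"Baseline click rate: {total_clicks / total_sent:.1%}")
```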
Once you know where you’re starting from, decide where you want to be. If your click rate is typically around three percent, you might aim to raise it to five. If your numbers are regularly below industry average, a good first goal is simply to reach that average.
Now that you’ve defined your focus and your goals, you’re in a much better position to determine whether your tests are successful.
Test and Test Again
Now that you have the process down, it’s time to start testing away. While A/B testing does take a little extra time and effort, the boosted performance of such an efficient marketing channel is more than worth the investment.
Have you found any elements that make a huge difference in your results?
Keep an eye out for the rest of our Perfect Emails series: