Instincts are great and often accurate. But the real marketing magic happens when you can put your instincts to the test and generate a data point to back your gut.
Email A/B testing lets you do just that in real time. Even with all the predictive power of AI at our fingertips, it remains one of the best ways to learn which buttons, messages, and creative will generate the best results for your company.
Introduction to email A/B testing
What is email A/B testing?
A/B testing, sometimes referred to as split testing, is an experimentation process where two versions of a variable are shown to a randomized group of people to determine which performs best. In email A/B testing, this could be two versions of an email in its complete form, or two versions of an element of that email — like the subject line, a call-to-action button, messaging, or images.
Most email marketing automation platforms have A/B testing functionality built in, which makes setting up a test and reporting on the results incredibly simple. Typically, the test is sent to a portion of your total audience, which is split into a control group and a variation group; once a winner is determined, the winning variant is automatically sent to the remainder of your list.
It is also possible to spin up an A/B test on your own, which can be useful in specific cases, such as when a test needs to run for days or even weeks to get a reliable read on performance.
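If you do run a test outside your email platform, the core mechanic is simply a fair, random split of your list: a test group divided evenly between the two variants, plus a holdout that later receives the winner. Here is a minimal sketch of that split in Python; the list structure, test fraction, and function name are illustrative assumptions, not a prescribed implementation.

```python
import random

def split_for_ab_test(contacts, test_fraction=0.2, seed=42):
    """Randomly split a contact list into variant A, variant B,
    and a holdout that receives the winning variant later.

    `contacts` is assumed to be a list of email addresses (or dicts);
    `test_fraction` is the share of the list used for the test itself.
    """
    shuffled = contacts[:]              # copy so the original order is untouched
    random.Random(seed).shuffle(shuffled)

    test_size = int(len(shuffled) * test_fraction)
    test_group = shuffled[:test_size]
    holdout = shuffled[test_size:]      # gets the winner once results are in

    midpoint = test_size // 2
    return {
        "variant_a": test_group[:midpoint],
        "variant_b": test_group[midpoint:],
        "holdout": holdout,
    }

# Example: a 1,000-contact list with a 20% test send -> roughly 100 contacts per variant
groups = split_for_ab_test([f"user{i}@example.com" for i in range(1000)])
print({name: len(members) for name, members in groups.items()})
```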
What are the key benefits of A/B testing emails?
When A/B tests are conducted strategically and regularly, marketers have a lot to gain. And if you are using email automation software to run split tests before sending the winning variant to the rest of your list, there is almost nothing to lose.
Understand which messaging and creative resonates with your audience
With each test, you will learn a little more about the messaging and creative that resonates with your audience. You can then apply this insight to the rest of your email marketing, or test it out on other marketing channels, including paid social, SMS, or website copy.
Increase engagement
When you are testing the subject line of an email you send regularly and frequently (a weekly newsletter, for example), even a 1% increase in opens can add up to a meaningful cumulative lift in engagement. That increase could be the difference between meeting and missing your KPIs, and, perhaps even more importantly, it sends positive signals to mailbox providers like Gmail, which helps your overall email deliverability.
Increase conversion rates
Something as simple as a new call-to-action button can significantly increase click-throughs on your email and, ultimately, conversions. Given how simple A/B testing is, even an incremental improvement is worth the time it takes to test, especially when the learnings can be applied to other emails.
Back your hunches with data
We are living in the most data-driven era of marketing yet, and leadership has become accustomed to seeing the numbers behind every decision marketers make. With A/B testing, you can prove (or disprove) your instincts and hunches and make a stronger case.
Getting started with email A/B testing
If you’ve never run an A/B test on your emails or you’re in the habit of doing random acts of A/B testing, the first step is to commit to a few simple tests to experiment with the process and see what you can learn.
What elements should you test?
When you break it down, there are a lot of variables in an email. To make sure you actually learn something from the test, don’t change too many of them at once. Instead, choose the variables that could have the biggest impact on your results and go from there.
Here are some examples of email elements to A/B test, based on the metric you are looking to improve.
If you want to learn how to improve your open rate, test:
Subject line: Let two subject lines battle it out, and your email marketing software will send the winning variant to the rest of your list.
Tip: Rather than just testing two random subject lines, think about how you can frame up the two variations to generate a learning that can be applied across all your email marketing. For example, does a time-based benefit (“Our latest feature will save you 1 hour, every day”) or a monetary benefit (“Our latest feature will save you $10, every day”) generate more opens?
Sender name: The sender name or from name can have a surprisingly large impact on your open rate. Test variants like a team member’s name, company name, team name, or, for major announcements, even your CEO’s name.
Tip: This is a test that might be worth running on more than one email, especially if you are testing a person’s name. The sheer novelty of seeing a CEO or team lead’s name in the inbox can increase opens initially, but the effect may drop off over time as subscribers get used to it.
Message preview: That smattering of words after the subject line is easy to ignore, and testing it generally requires a manual test rather than the automated tests your email marketing software can run. But in cases where you have a very short subject line or you send the email over and over again (e.g. a welcome email), it could be worth it.
Tip: Like with the subject line, rather than testing at random, aim to test ‘themes’. For example, using an offer to entice (“Plus, your free marketing eBook”) or a teaser (“Plus, a freebie that will transform your marketing”).
If you want to learn how to improve your click-through rate and overall email engagement, test:
Personalization: Use the subscriber’s name, add dynamic content fields to recommend products, features, or blogs, or test out different segmentation strategies to see which audiences respond best to which messages.
Tip: Don’t test all of these at once, as the data will be too noisy to interpret. Instead, test one variable at a time and build on your learnings as you go (a minimal sketch of the personalization idea follows this list).
Plain text vs. HTML: Test a plain text version of your email against a graphics-based HTML email to see which performs best — the answer might surprise you!
Tip: Plain text emails make the most sense contextually when sent from an individual rather than a company. Consider this in your test, but make sure you apply the same ‘from’ rule to the HTML-rich email, too.
Email layout: Different spacing, button placement, image placement, headers, and even footers can all play a role in your click-through rate.
Tip: Avoid testing versions that are wildly different from one another, as you will struggle to identify exactly why one outperformed the other.
Email content: Variations in messaging and word count can have a huge impact on the click performance of your emails, just like they can in subject lines.
Tip: As with subject lines, aim to frame the two variations so they generate an insight that can be applied to future emails. For example, a time-based benefit vs. a monetary one, or framing in the negative (“put an end to being bogged down in process”) vs. the positive (“your processes never looked so slick”).
Visuals: Test different images and graphic types (like GIFs) to see which generates the most attention from your audience. You can also test different color variations for headers, footers, and the body of the email.
Tip: If you’re using text in your visuals, be sure to use a color contrast checker to ensure your text is legible. If you’re testing a color variant with low readability vs. one with strong readability, your results will be skewed.
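To make the personalization test mentioned above more concrete, here is a small, purely hypothetical sketch of rendering a personalized variant and a generic variant of the same email body so each can go to its own test group. The templates, field names, and subscriber records are illustrative assumptions, not the syntax of any particular email platform.

```python
# Hypothetical sketch: variant A uses dynamic fields, variant B is generic.
# All field names and content below are made up for illustration.

PERSONALIZED_TEMPLATE = (
    "Hi {first_name},\n\n"
    "Based on what you've been reading, we thought you'd like {recommended_post}.\n"
)

GENERIC_TEMPLATE = (
    "Hi there,\n\n"
    "Here's a post our readers are loving this week: {featured_post}.\n"
)

def render_variant(subscriber, variant):
    """Return the email body for one subscriber, depending on their test group."""
    if variant == "A":
        return PERSONALIZED_TEMPLATE.format(
            first_name=subscriber.get("first_name", "there"),
            recommended_post=subscriber.get("top_interest", "our latest guide"),
        )
    return GENERIC_TEMPLATE.format(featured_post="our latest guide")

# Example subscriber records (hypothetical)
subscribers = [
    {"email": "ana@example.com", "first_name": "Ana",
     "top_interest": "the email deliverability checklist", "variant": "A"},
    {"email": "ben@example.com", "first_name": "Ben", "variant": "B"},
]

for s in subscribers:
    print(f"--- {s['email']} (variant {s['variant']}) ---")
    print(render_variant(s, s["variant"]))
```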
Four things to consider when running an A/B test
Once you’ve identified what you want to test, you will want to use your email marketing software to set up the test and roll it out. To get the most accurate results, you will need to consider the following:
Sample size: Aim to send your A/B tested email to a list of at least 1,000 contacts. That way, when the test is sent to 20% of your audience, each variant goes to at least 100 people, which gives you a large enough sample to draw a meaningful conclusion.
Time frame: Around 23% of email opens occur during the first hour of delivery, so you will want to give your test around 4 hours to run its course before the winner is chosen and sent to the remainder of your list. More time can be better, but after 24 hours an email’s chance of being opened drops to 1%.
How you’re measuring success: You will need to define what ‘winning’ the test looks like. Is it opens, click-throughs in general, or click-throughs on a specific link? Whatever it is, make sure your reports are set up before you start the test so results are generated in real time. In email marketing automation software, this is done for you within the campaign.
Statistical significance: Be realistic about how much the change is likely to move your results, but also set a clear threshold for how large (and how reliable) the difference between variants needs to be before you roll the change out permanently. A simple significance check is sketched below.
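As a rough illustration of that last point, the sketch below runs a standard two-proportion z-test on open counts. The numbers are invented, and in practice most email marketing platforms perform an equivalent check for you before declaring a winner.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
    """Return the z-score and two-sided p-value for the difference
    in open rates between variant A and variant B."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)          # combined open rate
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Illustrative numbers only: 100 sends per variant, 28 vs. 19 opens
z, p = two_proportion_z_test(opens_a=28, sent_a=100, opens_b=19, sent_b=100)
print(f"z = {z:.2f}, p = {p:.3f}")
# Roll out the winner only if p falls below the threshold you set (e.g. 0.05).
```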
Examples of A/B tests from real marketers
We asked marketers to share what they’ve learned from A/B testing in emails. The biggest learning? It really is always worth running a test before making a change; the results could take you by surprise.
Reengagement campaign: Embedded form vs. click-through form
Sometimes you need to run a few tests to figure out how you can break down the barriers that prevent users from taking action on your email.
Tetiana Melnychenko, marketing team lead at SE Ranking, did just this for a reengagement campaign and saw incredible results.
“We have an email for inactive users where we ask why they stopped using the platform and offer to schedule a call to discuss the problems and how we can possibly solve them. However, the conversion and reply rates were low, and we wanted to increase them. Our goal with this email is to understand why users decide not to continue using our platform and see if we can fix it for them and other potential users.”
“We used to send it as plain text, and we had a hypothesis that a better-structured and more visually appealing email could make it easier for users to get to the call-to-action. So, we changed the template from plain text to an HTML-based email and embedded a Typeform with an evaluation of the platform’s user experience directly in the body of the email.”
“Here’s what we’ve got in an A/B test (users were split evenly): the old plain-text email had a conversion rate of 1.4%, while the new email (HTML-based) had a conversion rate of 8.3%. Now we continue working on this particular email in order to improve its performance even further. But the new template is definitely a winner.”
Seasonal B2B email: Plain-text vs. HTML
Team Building held the same assumption many marketers do: the more beautifully designed an email, the better it will perform, especially when the content is holiday-themed. But they weren’t willing to act on a sweeping assumption and risk performance suffering, so they started with an A/B test.
CEO Michael Alexis shared, “We A/B tested our B2B plain-text email versus graphics-based HTML email. While our team loved the polished and professional look of the HTML format, ultimately it underperformed with half the click-through rate of the all-text version. Thus, we reverted back to our original method.”
This is a perfect example of where A/B testing can help you check your gut instincts before you make major changes to your approach.
“A/B testing is great because it informs and affirms decisions with data and helps you adjust your approach so you're getting the right results.”
Sales-driven emails: Subject line testing
As we’ve mentioned, testing ‘themed’ or ‘motivation-based’ subject lines against one another tends to generate more widely applicable results than testing subjects at random.
At digital marketplace DesignRush, General Manager Gianluca Feruggia did just this with incredible results. “A/B testing email subject lines has provided invaluable insight into resonating better with our audience of business decision-makers. We tested contextual subject lines like ‘New website design trends’ against urgency cues like ‘Expiring: Access top web design agencies.’ The scarcity-driven headlines consistently lifted open rates by an average of 22%.”
DesignRush also tests from/sender names and visual styles regularly to ensure they’re always learning and improving. “The key is never to assume you know what works best. Let rigorous testing reveal insights about your subscribers' preferences. The data has repeatedly shown us that effective B2B email marketing isn't necessarily intuitive. Aligning content with real behaviors drives results.”
B2B email newsletters: Subject line testing
Sometimes further investigation is required to identify which test variant should be rolled out to your wider audience.
Magee Clegg, founder and CEO of Cleartail Marketing, was working with a B2B client who wanted to increase their open rates. Naturally, they started with the subject line.
“We decided to A/B test the subject lines of their email newsletters. Group A received a subject line that was benefit-driven, ‘Get X results in Y time frame’, while group B received a curiosity-driven subject line, ‘This trick will change the way you do Z.’
“The results were revealing. The curiosity-driven subject line resulted in a 20% higher open rate than the benefit-driven one. However, the benefit-driven subject line led to a higher click-through rate and more conversions. After further analysis, we realized that while the curiosity-driven subject line spiked interest, it didn't always attract the right audience. The benefit-driven subject line, on the other hand, attracted those genuinely interested in the product, leading to more conversions.”
“This experiment taught us the importance of identifying and focusing on the right metrics. While open rates are significant, they aren't the be-all and end-all of email marketing success. The end goal should always be conversions and ROI.”
Sales and lead gen emails: From line and length of content
Testing variants one by one can help you piece together the optimal email for your needs. Growth marketer and automation expert Abhi Bavish discovered two interesting insights from different tests.
“We once tested the 'From' line in a Fintech client's email. Version A used 'Company Name'. Version B used 'Company Representative’s Name + Company Name'. We thought personalization might boost open rates and we were proved right. Version B had an 18% higher open rate.”
Content length is another variant worth testing and the results might take you by surprise. “In an example from the B2B lead gen space, we tested long-form versus short-form content in emails. Version A was a short summary of a blog post. Version B was a detailed overview of the blog post. Version B had a 27% higher click-through rate, showing this audience preferred more detailed content.”
A/B testing: Final word
If you have a goal to improve open rates or click-throughs, A/B testing can help you home in on what needs to change to generate the results you want. Tempting as it may be to test everything at once, it’s important to focus on one element at a time, generating learnings that can be layered on top of one another to help you build high-performing emails, every time.