A/B testing, also known as split testing, is not as complicated as it sounds. A/B tests drive growth, but only if you do them right. Qubit found that poorly conducted A/B tests can actually hurt revenue because they lead to the wrong changes being made.
In an A/B test, you change a single element of your email (version A) and test it against another version of that same element in the same email (version B). A subset of your test group gets version A and a different subset gets version B. The best-performing version becomes your new email and is deployed to the remainder of your list.
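To make the mechanics concrete, here is a minimal sketch (in Python) of how a subscriber list might be split into a version A group, a version B group, and the remainder that later receives the winner. The group sizes, seed, and email addresses are illustrative assumptions, not a prescription:

```python
import random

def split_for_ab_test(subscribers, test_fraction=0.2, seed=42):
    """Randomly split a subscriber list into version A, version B,
    and the remainder, which later receives the winning email."""
    random.seed(seed)                   # reproducible assignment
    shuffled = list(subscribers)        # don't mutate the original list
    random.shuffle(shuffled)

    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2

    group_a = shuffled[:half]           # receives version A
    group_b = shuffled[half:test_size]  # receives version B
    remainder = shuffled[test_size:]    # receives the winner after the test
    return group_a, group_b, remainder

# Example: a 1,000-person list, with 20% used for the test (10% per version)
subscribers = [f"guest{i}@example.com" for i in range(1000)]
a, b, rest = split_for_ab_test(subscribers)
print(len(a), len(b), len(rest))        # 100 100 800
```

The key point is that assignment is random, so the only systematic difference between the two groups is the element you are testing.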
Benefits of A/B Testing
It’s cheap. You can test as many times as you want, whenever you want. And we encourage you to never stop testing.
It’s easy. As long as you do it correctly (and we’ll show you how below), the execution and evaluation processes are simple and accurate.
It’s honest. You’re using your own database of real customers, so you’ll get the best gauge of how your market will react.
A/B Testing Best Practices
1. What can you A/B test? You can test just about any variable, but here are some big ones for email:
- Call to action (Example: “Book Now” vs. “Book My Room”)
- Subject line (Example: “Here’s 15% off your next stay” vs. “Want 15% off your next stay?”)
- Personalization (Example: “Mrs. Smith” vs. “Amy”)
- Images (Example: destination vs. people)
- The specific offer (Example: “Save 20%” vs. “Book 2 nights, get 3rd free”)
- Design (Example: colors, fonts, font sizes, bullet points vs. paragraphs, etc.)
2. Set up a control and variation. The purpose of A/B testing is to see if a variable change will improve email click and conversion rates. Test the control (the original email) against a variation (an otherwise identical email with the single element changed).
3. Only test one variable at a time. One of the most important rules is to only change one element at a time. For example, you don’t want to test a new subject line and a new CTA at the same time. If your conversion rate spikes, you won’t know which new element influenced that change.
4. Don’t make changes mid-test. Don’t make any changes to the test until it is finished. If you interrupt the test before it ends, your results are no longer reliable. That includes adding new variables, changing your test group, or altering the number of people who see the control or variation. Changing your goal mid-test will also invalidate your results.
5. Figure out your test group. Two important components make up your test group: the number of people and the type of people.
- Your sample size needs to be a sufficient representation of your email list; otherwise, any decisions you make based on the results may be flawed. The Optimizely sample size calculator is a handy tool for this: enter your current conversion rate and the improvement you would like to detect, and it returns the test group size you’ll need for your A/B test. A rough sketch of the math behind such calculators appears after this list.
- Segment your test groups. A/B testing needs to be done on like-minded subscribers. Otherwise, your results may be skewed since the engagement rate may differ based on segments.
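For readers curious what is happening under the hood of calculators like Optimizely's, here is a rough sketch of the standard two-proportion sample size formula. The 3% baseline conversion rate and 20% relative lift below are illustrative assumptions; a real test should use your own numbers:

```python
from math import sqrt, ceil

def sample_size_per_variant(baseline_rate, relative_lift):
    """Approximate subscribers needed in EACH group (A and B) to detect
    a relative lift in conversion rate at 95% confidence and 80% power."""
    z_alpha = 1.96   # two-sided z-score for a 5% significance level
    z_beta = 0.84    # z-score for 80% statistical power

    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2

    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Example: 3% baseline conversion rate, hoping to detect a 20% relative lift
print(sample_size_per_variant(0.03, 0.20))  # about 13,900 per variant
```

The example also shows why testing on a handful of subscribers is rarely enough: detecting a modest lift at a low baseline rate can require thousands of recipients per variant.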
6. Create a schedule. Determine a test period, from email send time to analysis time, that is long enough to produce reliable, accurate results. Run your tests at consistent times and days, and keep seasonality in mind.
How long depends on how many variations you are testing: the more variations, the longer the test should run. If your emails don’t get many opens right now, let the test run a little longer so you gather enough data.
7. Analyze the right metrics the right way. Look beyond the one variable you’re changing. When your email open rate goes up, what happens to the number of direct bookings? Visits to your spa or restaurant? An accurate analysis is a comprehensive analysis.
Measure as far down the marketing funnel as possible. Ensure that you use a reliable tool to read your results, such as Google Analytics or campaign performance reports from Revinate Marketing.
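As an illustration of reading results rather than eyeballing raw click counts, here is a rough sketch of a two-proportion z-test on made-up send and conversion numbers. Tools like Google Analytics or Revinate Marketing's campaign reports surface this kind of comparison for you; the function below is only meant to show the idea:

```python
from math import sqrt, erf

def ab_test_result(conversions_a, sent_a, conversions_b, sent_b):
    """Compare two conversion rates with a two-proportion z-test and
    return both rates plus a two-sided p-value."""
    rate_a = conversions_a / sent_a
    rate_b = conversions_b / sent_b

    # Pooled rate under the assumption that A and B perform the same
    pooled = (conversions_a + conversions_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_b - rate_a) / se

    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return rate_a, rate_b, p_value

# Example with made-up numbers: 5,000 sends per version
rate_a, rate_b, p = ab_test_result(150, 5000, 190, 5000)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p-value: {p:.3f}")  # p is about 0.027
```

A p-value well below 0.05, as in this example, suggests the difference between the versions is unlikely to be random noise.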
8. Determine your end goal. What do you want to achieve with this test? With emails, the goal is usually to improve open, click, and conversion rates. Establish a hypothesis to guide your A/B test.
Here’s the best way to create a hypothesis:
- Observe your current trends. What is happening that you want to change?
- Find a possible reason that may be keeping you from your goal.
- What element should you change to fix this?
- What is your goal rate? How will you know when you are successful?
Revinate’s A/B Testing Solution
4 things you should be A/B testing: subject lines, visuals, content, and CTAs.
1. The perfect subject line
The subject line is one of the most prominent elements of your campaign, and it plays a large role in your open rate. Through A/B testing, you can determine which subject line gets the highest opens. Here are some elements to consider when A/B testing your subject line:
- Length: What is the optimal length? A study from Return Path shows it’s around 61-70 characters. Very short subject lines have also proven to be eye-catchers, such as “Sale,” “Last Chance,” or “A Personal Note.”
- Word order: Should you present the main offer at the beginning or towards the end? Consider these examples: “Use this discount code to get 25% off your next purchase” vs. “Get 25% off your next purchase using this discount code”
- Personalization: A simple first name can make all the difference. Compare “Tom, celebrate fall in Chicago!” with a generic “Celebrate fall in Chicago!”
- Emojis vs. no emojis: A highly debated topic; it all depends on your brand and the type of campaign. Remember, relevance is key
2. Which visuals work best
Using visuals in your email campaigns can be a powerful way to get your message across. But too many images, or the wrong ones, can distract the recipient. Consider these elements:
- Images: Number of images, scenery vs. people, exterior vs. interior
- Design: Layout, background, bullet points vs. paragraphs
- Colors: Bright vs. pale, branded, contrast
- Fonts: Style, size
3. Which content has the biggest impact
The copy is another important thing to think about:
- Length: Will your text fit on one screen, or will your recipient have to scroll? We suggest a maximum of three scrolls to view the entire email
- Header: One line vs. two lines, question vs. statement
- Personalization: Opening your message with the recipient’s name vs. no name and keeping it generic
- Offer: Does “Save 30%” perform better than “Book 2 nights, get 3rd free”?
4. Which CTAs get the most clicks
Lastly, the call-to-action (CTA) helps move your customer to the next step, often one step closer to booking! Consider these:
- Button vs. text: Buttons vs. hypertext links
- CTA copy: “Book Now” and “Stay With Us” are examples of copy you could test to get your recipients to click
- CTA placement: Where does your CTA perform best? In the middle? Before the scroll?
- Number of CTAs: Do you include several CTAs (e.g., “Book A Stay” and “See Menu”) or a single focused CTA like “Learn More”?