How do I set up an A/B test?

This article tells you how to set up an A/B test for a trigger or newsletter.

Written by Francine
Updated over 2 weeks ago

By A/B testing an email, you learn more about your audience. For example, test on open rate, click-to-open rate, or the number of unsubscribes.

You can run an A/B test for a campaign and a trigger.

Before we begin, it is important to know the differences between an A/B test for a campaign and one for a trigger. Because of these differences, you do not set a target group size or a test duration for a trigger.

Target audience

  • Campaign (newsletter): you determine the size of the two test groups and the remaining group before the start of the A/B test.

  • Trigger: variant A and variant B are alternated among the matching profiles; you have no control over this distribution.

Test duration

  • Campaign (newsletter): a campaign is sent at a time you choose, which is why you set a test duration for the A/B test. After the test duration ends, the winning variant is sent to the remaining group.

  • Trigger: a trigger is sent automatically as soon as a profile matches the targeting filters, so an A/B test for a trigger has no end. Variant A and variant B simply keep alternating among the matching profiles.
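As a rough sketch of the trigger-side behavior described above, alternating variants among matching profiles amounts to round-robin assignment. The function and profile names below are illustrative, not the platform's actual API:

```python
# Hedged sketch: how a platform might alternate variants A and B among
# profiles that match a trigger. Names here are illustrative only.
from itertools import cycle

def assign_variants(profiles):
    """Alternate matching profiles between variant A and variant B."""
    variants = cycle(["A", "B"])
    return {profile: variant for profile, variant in zip(profiles, variants)}

assignments = assign_variants(["anna", "bram", "carla", "daan"])
print(assignments)  # {'anna': 'A', 'bram': 'B', 'carla': 'A', 'daan': 'B'}
```

The key point is that you cannot influence this split for a trigger: each new matching profile is simply handed the next variant in turn.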

Tip: Before activating an A/B test, consider what you want to achieve with the results.


In this article:

  • Step 1: Set test group size for the A/B test (campaign)

  • Step 2: Set A/B test duration (campaign)

  • Step 3: Set the performance indicator for the A/B test

  • Step 4: Sender and subject

  • Step 5: Create the A/B test

  • Step 6: Send the winning variant and analyze the results

To enable A/B testing, in the Timeframe step of a new trigger or newsletter, turn on the A/B testing toggle.


Step 1: Set test group size for the A/B test (campaign)

This setting only applies to A/B tests for newsletters. Are you setting up an A/B test for a trigger? Then skip ahead to step 3: set the performance indicator.

For a campaign, you determine the size of the two test groups for the A/B test. This option appears after you turn on the A/B testing toggle (in step 2: email settings).

You specify the distribution of the target group in percentages, so the group sizes always scale with the total audience of the newsletter. Absolute numbers for your test groups are not shown here.

To perform a representative test, we recommend using a minimum target group of 400 profiles.

Is your target group smaller? Then you will be notified. You can still run the test, but the results may be less representative.
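The percentage split above can be pictured with a few lines of code. This is a hedged sketch, not the platform's actual calculation; the 400-profile threshold mirrors the recommendation above, and everything else is illustrative:

```python
# Hedged sketch: splitting a campaign audience into two test groups and a
# remaining group by percentage. The 400-profile minimum mirrors the
# recommendation in the article; the rest is illustrative.
def split_audience(total, test_a_pct, test_b_pct):
    group_a = total * test_a_pct // 100
    group_b = total * test_b_pct // 100
    remaining = total - group_a - group_b
    representative = total >= 400  # below this, results may be less reliable
    return group_a, group_b, remaining, representative

print(split_audience(10000, 10, 10))  # (1000, 1000, 8000, True)
print(split_audience(300, 25, 25))    # (75, 75, 150, False)
```

Because the split is expressed in percentages, the same settings keep working as your audience grows or shrinks.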


Step 2: Set A/B test duration (campaign)

This setting only applies to A/B tests for newsletters. Are you setting up an A/B test for a trigger? Then skip ahead to step 3: set the performance indicator.

The test duration determines when the winning variant is sent to the remaining target group. It should be long enough to collect data: a longer test duration gives recipients in the test groups more time to open your email.

When will the winning variant be sent?

  • Do you choose to send the newsletter immediately? Then the test starts as soon as you click “send.” The winning variant is sent once the test duration is over.

  • Do you choose to schedule the newsletter? Then the test starts at the scheduled time, and the winning variant is sent once the test duration is over. For example: the newsletter is scheduled on June 1 at 9 a.m. with a test duration of 12 hours. The winning variant will be sent on June 1 at 9 p.m.

Tip: set the test duration to 24 hours. That way the winning variant is sent at the same time of day as the original newsletter.
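The scheduling arithmetic above can be checked with a short snippet. A hedged sketch; the function name and the year in the example are illustrative:

```python
# Hedged sketch of the send-time arithmetic described above: the winning
# variant goes out once the scheduled start plus the test duration has passed.
from datetime import datetime, timedelta

def winning_send_time(scheduled_start, test_duration_hours):
    return scheduled_start + timedelta(hours=test_duration_hours)

start = datetime(2024, 6, 1, 9, 0)   # June 1, 9:00 a.m. (year is illustrative)
print(winning_send_time(start, 12))  # 2024-06-01 21:00:00, i.e. 9 p.m.
```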


Step 3: Set up performance indicator A/B test

In an A/B test for both a trigger and a newsletter, you can test on three performance indicators:

  • Open rate. The variant with the highest open rate wins. To test the open rate, vary the sender name, email subject, or pre-header between the variants. Change only one of the three at a time; otherwise you'll never know which change was the deciding factor in picking the winner.

  • Click-to-open rate. The variant with the highest click-to-open rate wins. This indicator is about the content of your email, so make the content of variant A differ from variant B: change only a button, a hero image, or a piece of text. Test which variant gets the most clicks and tailor your content to your target audience.

  • Unsubscribes. The variant with the fewest unsubscribes wins. People often unsubscribe because the content does not interest them, so this indicator gives you insight into which content connects best with your recipients.

By default, the indicator Open rate is set. To change the performance indicator, click on the field. Select the desired indicator and, if necessary, remove the unwanted performance indicator by clicking on the cross.

Always test one element at a time. If you select all three indicators, they are weighted equally, so each indicator counts for 1/3. Are you testing two indicators at the same time? Then they are also weighted equally, 50/50.
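The equal weighting described above can be sketched as a simple weighted score per variant. This is a hedged illustration, not the platform's actual formula: the metric names and numbers are made up, and unsubscribe rate is counted inversely because fewer unsubscribes is better:

```python
# Hedged sketch: scoring two variants on equally weighted indicators, as the
# 1/3 and 50/50 weighting above describes. Metric names and figures are
# illustrative; unsubscribe rate is inverted since fewer unsubscribes wins.
def score(variant, weights):
    total = 0.0
    for metric, weight in weights.items():
        value = variant[metric]
        if metric == "unsubscribe_rate":
            value = 1 - value  # fewer unsubscribes is better
        total += weight * value
    return total

weights = {"open_rate": 1/3, "click_to_open_rate": 1/3, "unsubscribe_rate": 1/3}
variant_a = {"open_rate": 0.30, "click_to_open_rate": 0.12, "unsubscribe_rate": 0.010}
variant_b = {"open_rate": 0.27, "click_to_open_rate": 0.15, "unsubscribe_rate": 0.008}
winner = "A" if score(variant_a, weights) > score(variant_b, weights) else "B"
```

With two indicators, the same logic applies with weights of 0.5 each.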


Step 4: Sender and subject

In the next step, sender and subject, let variant A and variant B differ from each other.

Step 5: Create the A/B test

For an A/B test you create two versions of the same email: variant A and variant B.

In the email settings, under the settings of the A/B test, you will find all sender settings for variant B. Fill in the desired sender, subject, and pre-header. Are you testing for open rate? Then these settings need to differ from variant A.

Click through to the design of the email. At the top of the green bar you will find the option to switch variants.

Always create variant A first. This is automatically duplicated to variant B. As soon as you save variant B and switch to variant A, the variants are separated. Changes in variant A are no longer automatically transferred to variant B.

In the summary, you will again find all settings for both variants.


Step 6: Send the winning variant and analyze the results

The steps that follow after starting the A/B test differ depending on the type of email you are testing (campaign or trigger):

  • Are you running an A/B test for a campaign? Then you will see the status “testing.” Once the test duration is over, the winning variant is automatically sent to the remaining group of recipients. You don't have to do anything for this. Read here how to analyze the results of the A/B test for a newsletter.

  • Are you running an A/B test for a trigger? Read here how to analyze the progress of the A/B test for a trigger.
