It is possible to A/B test your trigger. For example, would you like to test on open ratio, click ratio or the number of unsubscribes? You can! In this article I will explain how to do this.

Step 1. Create a new trigger (or use an existing trigger)

You start by creating your trigger. In step 1 you select your target group.

In step 2, the magic of A/B testing begins.

Step 2. Enable A/B testing

Now you are ready to create your A/B test. To enable A/B testing for your newsletter, click 'Enable' under A/B testing:

The screen for A/B testing now expands and you can fill in everything for your A/B test. This is the screen you will see:

Step 3. Explanation of all fields in your A/B test

The screen is divided into the A variant (top), the A/B test settings (middle) and the B variant (bottom). I will cover all of these fields below.

A variant:

The A variant is the first variant you will see on the screen. Here you fill in the following fields:

  • Name of sender

  • Sender e-mail

  • E-mail subject

  • Text for the pre-header

This is the same as creating a campaign.

The A/B test settings:

  • Which indicators do you want to test for?

Fill in what you want to test for. You can choose from open ratio, click ratio and unsubscribes. If you choose all three, we test the newsletters on all three indicators and you can decide which newsletter will be the winner.

B variant:

The B variant is the second variant you see on the screen.

Here, you fill in the following fields:

  • Name of sender

  • Sender e-mail

  • E-mail subject

  • Text for the pre-header

Have you filled out everything? Then go to the next step.

Step 4. Create the e-mail for variant A and B

In this step you create the e-mail that belongs to your trigger. At the top you will see a drop-down menu where you can choose between variant A and variant B:

Always create variant A first. Variant A is duplicated live to variant B, so you don't have to create your newsletter twice. Now save variant A.

❗When you save variant A and continue with variant B, the variants are disconnected from each other! This means that any changes you make to variant A will no longer be applied to variant B.

Now continue with variant B. In variant B you make some changes that you want to test. You will see that variant A stays as it is.

Have you created the layout of your e-mail as usual? Then the summary shows that you are running an A/B test and what the differences are:

Tips for testing

We would like to give you a few tips on what to test for:

- Open ratio

To test your open ratio, vary the sender name, e-mail subject and pre-header between the two variants.

- Click ratio

We measure the number of clicks in your newsletter. Your click ratio depends on the content of your e-mail, so make the content of variant A different from variant B. Change only a button, hero image or text. Which variant gets the most clicks? Test it and make sure your content matches your target group.

- Unsubscribes

We measure the number of unsubscribes per newsletter. Many people unsubscribe from the newsletter because the content is not interesting. Try to put yourself in your customer's shoes: why would you unsubscribe from a newsletter? Are there no interesting offers in your newsletter? Or is it the articles themselves? Test thoroughly why your customers unsubscribe from the newsletter.

Always test one element at a time. If you test all three indicators, they are weighted equally, so each indicator counts for 1/3. Are you testing two indicators at the same time? Then they are weighted equally, 50/50.
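To make the weighting concrete, here is a small, purely illustrative Python sketch of how equal weights could be combined into one score per variant. The function, field names and numbers are our own example, not the actual implementation:

```python
def combined_score(open_ratio, click_ratio, unsubscribe_ratio, indicators):
    """Illustrative only: combine the selected indicators with equal weights.

    Unsubscribes count negatively, so we use (1 - unsubscribe_ratio).
    """
    values = {
        "open": open_ratio,
        "click": click_ratio,
        "unsubscribe": 1 - unsubscribe_ratio,  # fewer unsubscribes = better
    }
    weight = 1 / len(indicators)  # 1/3 for three indicators, 1/2 for two
    return sum(weight * values[i] for i in indicators)

# Testing all three indicators: each one weighs 1/3
score_a = combined_score(0.30, 0.05, 0.01, ["open", "click", "unsubscribe"])
print(round(score_a, 3))  # 0.447

# Testing two indicators: each one weighs 50%
score_b = combined_score(0.30, 0.05, 0.01, ["open", "click"])
print(round(score_b, 3))  # 0.175
```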

Step 5. Results of your A/B test

You can see the results of your A/B test live. This is your dashboard for your A/B test results:

On the left-hand side, you can see the probability, as a percentage, of each variant being the best performing one.

The pie chart shows how well the variants are doing compared to each other: it shows how much of the total score was achieved by each variant. A 50/50 distribution of the score means that both variants are equal. A 1/3 (33.33%) versus 2/3 (66.67%) distribution, for example, means that one of the variants scores twice as well as the other. The score itself is calculated on the basis of the performance indicators (clicks, opens, etc.).
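As a concrete example of that distribution: if variant A scores 0.10 and variant B scores 0.20, variant B takes two thirds of the pie. A purely illustrative sketch, not the actual calculation in our software:

```python
def score_shares(score_a, score_b):
    """Illustrative: each variant's share of the total score, as shown in the pie chart."""
    total = score_a + score_b
    return score_a / total, score_b / total

print(score_shares(0.10, 0.10))  # (0.5, 0.5) -> both variants are equal
print(score_shares(0.10, 0.20))  # (0.333..., 0.666...) -> B scores twice as well as A
```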

Next to that, you see variant A with its performance:

And then variant B with its performance:

Are you curious about the "regular" statistics that you always have with a trigger? Click on one of the lines of variant A or B. This will take you to the unique performance of the newsletter in question.

At the bottom left you will see the test progress. Here you can see how far the test has progressed and what the winner might be on the basis of the data collected:

On the bottom right you will then see the performance of exactly what you are testing. In this example, you can see that the test is being conducted on open ratio, click ratio and the number of unsubscribes:

By default, the unsubscribe bar is set to 100%: this means that there are no unsubscribes. The more people unsubscribe, the further the bar goes down. The number of unsubscribes is a negative factor (while click and open ratio are positive factors). Therefore, keep in mind: the higher these bars are, the better.
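In other words, the unsubscribe bar is shown inverted. A tiny, purely illustrative sketch (the numbers are made up):

```python
def unsubscribe_bar(unsubscribes, recipients):
    """Illustrative: height of the unsubscribe bar, where 100% means no unsubscribes."""
    return 100.0 * (1 - unsubscribes / recipients)

print(unsubscribe_bar(0, 1000))   # 100.0 -> no unsubscribes at all
print(unsubscribe_bar(25, 1000))  # 97.5  -> the bar drops as people unsubscribe
```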

Based on your performance, you can also see a plot below that provides full insight into how variant A and variant B perform:

Have you changed something in your e-mail in the meantime? You will also find this in the plot:

An adjustment to a variant may cause the variant to perform better or worse. This change in performance is immediately visualised by the plot. If an upward trend can be seen after an adjustment, then the adjustment has had a positive effect on the performance indicator and vice versa. Please note that an increase in the number of unsubscribes is never a good sign.

Sample Rates

Your sample rates indicate in percentages how often variant A and how often variant B should be sent. The default is 50/50.
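You can think of the sample rate as the probability that a recipient receives variant A or variant B each time the trigger fires. A minimal, illustrative sketch, not our actual code:

```python
import random

def pick_variant(sample_rate_a=0.5):
    """Illustrative: pick variant A with probability `sample_rate_a`, otherwise variant B."""
    return "A" if random.random() < sample_rate_a else "B"

# With the default 50/50 sample rate, roughly half the recipients get each variant.
sends = [pick_variant(0.5) for _ in range(10_000)]
print(sends.count("A"), sends.count("B"))
```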

👉Would you like to use our Artificial Intelligence assistant to optimize your triggers?

Then turn the slider on:

  • The "Use relative performance" switches the "AI assistant" on or off.

Based on what you are testing for, our software determines which variant scores best and automatically shifts the bar for you. Have you set the bar at 40/60, for example? Then the system checks whether that distribution can be matched. Through AI, the bar shifts gradually, so the sample rate moves slowly.

How does this work? 🤔

The AI assistant (if enabled) checks daily how well the variants perform relative to each other and automatically adjusts the sample rates accordingly. The assistant takes care not to adjust the sample rates too sharply when one of the variants has a 'bad day'. Enabling this function ensures that variants that perform well are used more often, without you having to keep track of it manually.
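The exact algorithm is part of our software, but conceptually a daily update could look something like this. This is a rough sketch with made-up numbers; the step size and the limits are assumptions for illustration only:

```python
def adjust_sample_rate(rate_a, score_a, score_b, max_step=0.05):
    """Illustrative: nudge variant A's sample rate towards its relative performance.

    The step is capped so one 'bad day' cannot swing the rates too hard.
    """
    target = score_a / (score_a + score_b)        # relative performance of A
    step = max(-max_step, min(max_step, target - rate_a))
    return min(0.95, max(0.05, rate_a + step))    # illustrative clamp, not the real behaviour

rate = 0.5
for score_a, score_b in [(0.12, 0.08), (0.13, 0.07), (0.02, 0.20)]:  # daily scores
    rate = adjust_sample_rate(rate, score_a, score_b)
    print(round(rate, 2))  # 0.55, 0.6, 0.55 -> shifts gradually, never jumps
```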

👉Do you want to be in control of your sample rate?

Then turn off the red slider.

  • Red sliders adjust the sample rates directly and indicate how the sample rates are currently set.

When you change the sample rate, the system applies the change immediately.

Just like a normal trigger, an A/B test with a trigger is an ongoing process, without a final winner.

Sample rate of 100% - 0%

The sample rates can be set to 100% for one variant and 0% for the other variant. This effectively ensures that one of the variants is never selected. This is actually the same as selecting a 'winner'.

❗However, it is important that the 'AI assistant' is not enabled, as this function causes the sample rates to change.

Good luck with testing! 🙌
