
Automatic A/B testing

This post is also available in: Czech

Automatic A/B testing allows you to test multiple variants of the same campaign simultaneously before sending the winning variant to the rest of the audience. The winning variant, evaluated after a specified timeframe using specified metrics, will be used in the following runs of the Flow campaign.

How to set up Automatic A/B testing

First, set up the variations you’ll be testing. Each customer belonging to the test group will receive one of the variants. The rest of the database will automatically receive the winning variant after the test is evaluated.

Right now, the only action you can use as a follow-up is Email.

Variations

  • A test can contain up to 10 variants.
  • Each variant should differ in only one tested attribute (for example, the email subject for Open Rate testing); all other settings of the “Email” action should stay the same. This way you can easily determine which attribute has the most influence on the specified metric.
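
The one-attribute rule can be sketched as a simple check that each variant differs from the first one in exactly one attribute (the field names and helper functions are hypothetical illustrations, not the product’s API):

```python
# Hypothetical sketch: verify each variant differs from the baseline
# in exactly one attribute (e.g. only the subject line changes).
def differing_attributes(baseline, variant):
    """Return the set of attribute names whose values differ."""
    return {key for key in baseline if baseline[key] != variant[key]}

def check_variants(variants, max_variants=10):
    assert 2 <= len(variants) <= max_variants, "a test needs 2-10 variants"
    baseline = variants[0]
    for variant in variants[1:]:
        changed = differing_attributes(baseline, variant)
        assert len(changed) == 1, f"more than one attribute changes: {changed}"
    return True

variants = [
    {"subject": "Spring sale!", "template": "base", "sender": "news@example.com"},
    {"subject": "Don't miss our spring sale", "template": "base", "sender": "news@example.com"},
]
print(check_variants(variants))  # True
```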

Testing Timeframe

  • The time period in which the winner (based on the specified metric) is determined.
  • The timeframe starts counting from the moment a customer is added to the Competitors audience.
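
As an illustration of the per-customer timeframe, a minimal sketch (the 24-hour value and the function name are assumptions for the example): each customer’s events count toward the evaluation only until their individual deadline, which starts when they enter the test audience.

```python
from datetime import datetime, timedelta

# Sketch: the testing timeframe runs per customer, starting from the
# moment the customer is added to the test audience.
TIMEFRAME = timedelta(hours=24)  # assumed example value

def evaluation_deadline(added_at: datetime) -> datetime:
    """The moment this customer's events stop counting toward the test."""
    return added_at + TIMEFRAME

added = datetime(2021, 4, 1, 9, 0)
print(evaluation_deadline(added))  # 2021-04-02 09:00:00
```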

Check the contents of your email template

If there is an issue with your email template, the newsletter might not be sent. Check, for example, the coupons and other template-specific items.

Types of evaluation metrics and their examples

  • Delivery rate: For email delivery testing
  • Open rate: For testing subjects of emails
  • Click rate: For template and click-throughs testing
  • Unsubscribe rate: For unsubscription testing
  • Conversion rate / AOV / Revenue per email: For conversions testing
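
The metrics above are simple ratios over the sent or delivered emails; a minimal sketch of how they could be computed (the counter names and the choice of denominators are assumptions, not the platform’s data model):

```python
# Sketch: evaluation metrics as ratios over raw email counters.
def rates(sent, delivered, opened, clicked, unsubscribed, revenue):
    return {
        "delivery_rate": delivered / sent,          # delivered out of all sent
        "open_rate": opened / delivered,            # opens out of delivered
        "click_rate": clicked / delivered,          # clicks out of delivered
        "unsubscribe_rate": unsubscribed / delivered,
        "revenue_per_email": revenue / sent,
    }

print(rates(sent=1000, delivered=950, opened=380, clicked=95,
            unsubscribed=5, revenue=1900.0))
```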

Evaluating the Automatic A/B test

The winning variant is selected based on the specified metric over the testing timeframe. If multiple variants have an equal value, a random variant will be selected.
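
The selection rule can be sketched as follows, assuming a higher metric value is better (for Unsubscribe rate the comparison would be inverted); this is an illustration, not the platform’s implementation:

```python
import random

# Sketch: select the winner by metric value; exact ties are broken randomly.
def pick_winner(results, rng=random):
    """results: mapping of variant name -> metric value (higher is better)."""
    best = max(results.values())
    tied = [name for name, value in results.items() if value == best]
    return rng.choice(tied)

results = {"A": 0.21, "B": 0.34, "C": 0.34}
winner = pick_winner(results)
print(winner)  # "B" or "C", chosen at random
```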

As soon as the evaluation is done, the winning variant will be sent immediately (to the winner audience from previous cycles/Flow campaign runs). All follow-up runs of the Flow campaign will include 100% of the whole audience.

The results show data for the whole timeframe of the test – you cannot use the date picker to select a specific date here.

Activating the A/B test

The campaign will be sent even if no date is selected in the “Flow activation” action.

Please bear in mind that only fully deactivating the Flow campaign also turns off automatic A/B testing – archiving the campaign has no effect on the Flow campaign’s (de)activation.

Advanced settings for the A/B testing automation

  • Changes in the automatic A/B test settings while the test is still active:
    • Instant evaluation
      • If you set a timeframe that has already passed, you will be notified. If you save the campaign anyway, the automatic A/B test will be evaluated immediately.
    • Test reset
      • If you want to re-evaluate the whole test (and drop all gathered results), use the Test reset.
      • Information about the winning variants, the time of evaluation, and the start time of the test will be deleted.
      • A reset is not needed if you only want to change the evaluation metric or timeframe, or just add another variant to the test; in that case, all previously gathered data will still be considered.
    • Manual variant removal
      • If you want to exclude a specific variant from the test prematurely, you need to remove it completely. Merely disabling the Email action is not enough, because such a variant can still win the test.

Long-term A/B testing

If you wish to use long-term A/B testing (for example, over 60 days), we recommend using the “Split” action (manual A/B testing).

Following the example above, you can use the “Split” action with a 50:50 ratio and keep the Flow campaign active for 60 days. After this time, manually determine which version brought you the best results. You can then create follow-up actions based on this.
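
A stable 50:50 split like the one the “Split” action performs can be sketched with a hash of the customer ID (an illustration only, not the product’s algorithm): hashing makes the assignment deterministic, so the same customer always lands in the same branch across campaign runs.

```python
import hashlib

# Sketch: assign each customer to branch A or B with a stable 50:50 split.
def split_branch(customer_id: str) -> str:
    digest = hashlib.sha256(customer_id.encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

print(split_branch("customer-42"))
```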

Updated on April 7, 2021
