Create an A/B Test

Use data to increase your conversions with the A/B test feature!

With Triggerbee's A/B testing feature, you can test different offers, form lengths, button texts, tone of voice, and much more to find out what performs best with your audience. Experiment to find the key to success!

Instructions

There are two ways of creating A/B tests in Triggerbee. Method 1 is to create an A/B test from scratch, while Method 2 starts from an existing, unpublished Onsite Campaign. Below are instructions for both:

Method 1

1

Navigate to Onsite Campaigns --> A/B Tests.

2

If you don't have any tests already, press "Create your first A/B Test". Otherwise, press "New A/B Test" to create a new test. 

3

Create your variant A just as you would a normal Onsite Campaign: select a purpose and a template, and design it as you wish.
Here's a guide on how to create a campaign

4

When you are done with your design and content, proceed to Campaign Settings. 

5

This is where you enter your A/B Test settings:
  • Hypothesis: Describe in a few words what you want to test, e.g. "Percentage discount or static discount" or "Popup or Callout".
  • Length: Should the test run for 7, 14, or 28 days? The more traffic you have on your webpage and in your selected audience, the shorter the test can be. As a rule of thumb, if you have fewer than 1000 conversions per month, select at least 14 days.
  • Conversions or Clicks: Select whether the winning variant should be calculated based on conversions (collected email addresses) or clicks. Note! Conversions should always be selected if you have an input field for email in your campaign.

6

Enter the rest of the campaign settings just like a usual Onsite Campaign, with an audience, URL-targeting (optional), triggers, and repetition.

Note that regular variants cannot be used in A/B tests, so if you want a desktop and a mobile version, create separate A/B tests for each (which can be a good idea anyway, since it lets you test each device individually).

7

Now it's time to create your variant B - the challenger for the test! Click "Create Challenger" next to your variant A.

8

A copy of your variant A is now created. Click the image to open the editor.

9

Make your adjustments to the design according to your test, e.g. changing the layout, colors, headline, etc.

Note! Make sure to change only one (1) thing in your test, e.g. the headline, so that the statistics correctly reflect that single change.

10

When you are done with the design of variant B, the test is ready to go! Click "Publish" to start your test. If you don't want to start it right away, just leave it as a draft for now. 

Note: From the day you start your test, the test will run for the entered number of whole days, i.e. if you start your 7-day test at 10:00 (10 am) on a Thursday, the test will stop at 23:59 (11:59 pm) the following Thursday.

An A/B test cannot be paused or toggled off and on while it is running, but you can stop it at any time.

Method 2

If you already have an Onsite Campaign that you would like to A/B test, it's possible to create an A/B test based on that campaign.

You create your test by clicking "Create A/B Test" inside the campaign settings of your campaign. The campaign needs to be unpublished in order to create an A/B test, so if your campaign is active, Triggerbee will create a copy of it to start from. Make sure to stop your original campaign so the two don't run in parallel. When you create your A/B test from an Onsite Campaign, the new campaign is moved to the A/B test section, where you can enter all your settings according to Method 1.


You have a winner! How do I interpret the results?

When the A/B test is over, you will receive an email from Triggerbee that presents a winner. You can also select a winner yourself in Triggerbee. Triggerbee selects the winner based on which variant got the most conversions or clicks in relation to the number of views. It might be tempting to proceed to the next test once you have your winner, even if the difference between the two variants is small or close to none. Here are some things to think about when evaluating your test results:

  • Measure the outcome of the other steps of the customer journey. If Variant B won over Variant A by 15%, did it affect the next step in the funnel? Did it result in more members, added products, or purchases?
  • If you have fewer than 1000 conversions per month on your webpage, you would want to see a difference of at least 15% between the two variants to have a good statistical base.
  • If you have over 10 000 conversions per month, you would want to see a difference of at least 5% between the two variants.
     

    Example: You want to test new copy to increase the number of newsletter signups. In Variant A you offer a discount, and in Variant B you offer only tips, without any discount. When the test is over, you see that Variant A has 1200 conversions and Variant B has 1100, a difference of roughly 8%. Even if Variant A looks like the winner based on the raw numbers (1200 over 1100), the results might gradually decrease and return to normal levels within a few weeks.

  • Most tests don't have a clear winner. This is why it is important to not take the result from the A/B test as absolute truth, but rather look at the whole funnel. 
  • Use a hypothesis in every A/B test and prioritize your ideas to know exactly what you are testing, why and what outcome you expect.
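The rules of thumb above can be sketched in a few lines of Python (the function names are hypothetical, and the thresholds are simply the guidelines from this article written as code):

```python
def relative_lift(winner: int, runner_up: int) -> float:
    """Relative difference between the two variants' conversion counts."""
    return (winner - runner_up) / winner

def looks_conclusive(winner: int, runner_up: int, monthly_conversions: int) -> bool:
    """Rule of thumb from this guide: with fewer than 1000 conversions per
    month you want at least a 15% difference; with high volume, 5% is enough."""
    threshold = 0.15 if monthly_conversions < 1000 else 0.05
    return relative_lift(winner, runner_up) >= threshold

# The example above: 1200 vs 1100 conversions is a difference of roughly 8%.
print(round(relative_lift(1200, 1100) * 100, 1))  # 8.3
print(looks_conclusive(1200, 1100, 800))          # False (low traffic)
print(looks_conclusive(1200, 1100, 12000))        # True  (high traffic)
```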

It's a tie!

Sometimes a test can end without a winner, in a draw. A tie is unusual but can happen if you have measured conversions over too short a time period (7 days) or with too narrow an audience. If you get a tie, you can redo your test with the following in mind:

  • Make big changes! The more your variants differ, the bigger the difference in results will be.
  • Increase the length of the test if you have fewer than 1000 conversions per month; 14-28 days is recommended.
  • Widen your audience targeting.

Still need help? Contact Us