
Use A/B split tests to create effective campaign messaging

Updated 21 days ago

You can use A/B testing with your Convert campaigns to compare how different versions of your campaigns perform with shoppers who visit your store.

A/B testing allows you to take a data-driven approach to connecting with your customers. When you set up and run an A/B test, you can monitor how different campaign messages affect key metrics like engagement and conversion rates.

Regular testing can help you identify the most effective messaging strategy and continually fine-tune your campaigns to reach your goals — whether you want to increase your average order value, add more shoppers to your newsletter, or highlight specific products.

Requirements

  • You must have an active Convert subscription
  • You must have Admin permissions to create A/B Tests

Create a new A/B test

When you create a new A/B test, your original campaign automatically becomes the “control variant”, keeping all your current settings, audience triggers and messaging.

  1. Go to the Convert menu
  2. Select your store from the sidebar, then select Campaigns
  3. Select a campaign from the list or create a new campaign
  4. Scroll down and select Create A/B Test

Selecting a campaign from the Convert menu in Gorgias, then clicking on Create A/B test

Setting up your test variants

After you create a new A/B test, you’ll want to set up variants to compare how your campaign performs with different messaging.

You can create up to two additional variants to compare against the original campaign (your control variant). New variants start with the same settings as the original campaign, such as your audience triggers and conditions.

Make changes to your variant campaign’s messaging to measure its effectiveness with shoppers who visit your store.

  1. Go to the Convert menu
  2. Select your store from the sidebar, then select Campaigns
  3. Select a campaign with an A/B test
  4. Select the Variant A tab to write an alternate campaign message

    • To add another variant, go to Test Settings, then select Add Variant
  5. Select Create to finish

In the Variant for your new A/B test, entering a message that you want to test against the original campaign; then clicking Create to finish

Running your A/B test

When you’re ready, you can run your A/B split test. Once started, the test will evenly distribute traffic to your store across your variants to measure their performance.
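As an aside, even traffic distribution like this is often implemented with deterministic hashing, so each visitor is bucketed into the same variant on every visit. The sketch below is purely illustrative and not how Gorgias implements it; the `assign_variant` function and visitor IDs are assumptions for the example.

```python
import hashlib

def assign_variant(visitor_id: str, variants: list[str]) -> str:
    """Deterministically bucket a visitor into one of the variants.

    Hashing the visitor ID spreads traffic approximately evenly across
    variants, and the same visitor always lands in the same bucket.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: split traffic across a control and two variants
variants = ["Control", "Variant A", "Variant B"]
print(assign_variant("visitor-123", variants))
```

Because the assignment is deterministic, a returning shopper sees a consistent experience for the duration of the test.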

While the test is ongoing, you cannot edit or add new variants. If you need to make changes, return to the Test Settings page for your campaign to pause the test and then edit your variants.

  1. From the Convert menu, select a campaign for your store with an A/B test

    • You can use the A/B Test filter to quickly find existing campaigns with split tests
  2. Select the Test Settings tab
  3. Select Start to begin the test

In the campaign that you want to A/B test, selecting the Test Settings tab to click on the Start button and begin the test

Review your A/B test performance and select a winner

While your A/B test is running, you can monitor the ongoing results in the Performance dashboard. Pay attention to metrics like clickthrough rate (CTR) and conversion rate to determine which variant performs best.

When you’re ready, select Stop Test to end the A/B split test. You’ll be prompted to select a winning variant.

  1. From the Convert menu, select the campaign that you’re testing from the sidebar
  2. Select the Test Settings tab, then click on Stop Test
  3. Click on New Campaign from Winner to launch the winning variant as a new, standalone campaign

In a campaign with an active A/B test, selecting the Test Settings page to then stop the test and select a winning variant to adopt as a new, separate campaign.
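To make the comparison concrete, the two metrics mentioned above reduce to simple ratios. This sketch uses made-up numbers (the results dictionary is entirely hypothetical) just to show how CTR and conversion rate are derived:

```python
def clickthrough_rate(clicks: int, impressions: int) -> float:
    """Share of impressions that resulted in a click."""
    return clicks / impressions if impressions else 0.0

def conversion_rate(orders: int, visitors: int) -> float:
    """Share of visitors who placed an order."""
    return orders / visitors if visitors else 0.0

# Hypothetical test results for a control and one variant
results = {
    "Control":   {"impressions": 5000, "clicks": 250, "visitors": 5000, "orders": 60},
    "Variant A": {"impressions": 5000, "clicks": 325, "visitors": 5000, "orders": 85},
}

for name, r in results.items():
    ctr = clickthrough_rate(r["clicks"], r["impressions"])
    cvr = conversion_rate(r["orders"], r["visitors"])
    print(f"{name}: CTR={ctr:.1%}, conversion={cvr:.1%}")
```

In these invented numbers, Variant A leads on both metrics; in practice you would read the equivalent figures from the Performance dashboard.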

Best practices for A/B testing

  • Test a single variable → for clear results, focus on testing one change to your campaigns at a time. This way you can attribute changes in performance to the specific variable that you’re testing.

  • Define success metrics → clearly outline your goals for the test. Whether you want to improve clickthrough or increase the number of orders that a campaign creates, clear metrics will help you measure success accurately.

  • Allow enough time → we recommend running an A/B test for at least two weeks to obtain meaningful results. Stopping a test too early may lead to inaccurate conclusions.

  • Account for external factors → consider external influences like seasonal trends, current promotions or shifts in customer behavior that might influence the results of your A/B test.
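As a rough illustration of why stopping a test too early is risky, you can check whether a difference in conversion rates is larger than random noise with a two-proportion z-test. This is a generic statistical sketch using Python's standard library, not a Gorgias feature, and the sample figures are hypothetical:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value(z: float) -> float:
    """Two-sided p-value from the standard normal distribution."""
    return math.erfc(abs(z) / math.sqrt(2))

# Hypothetical: 60 orders from 5,000 control visitors vs. 85 from 5,000 variant visitors
z = two_proportion_z(60, 5000, 85, 5000)
print(f"z = {z:.2f}, p = {p_value(z):.3f}")
```

With smaller samples the same observed lift would produce a much larger p-value, which is one reason short tests can point to the wrong winner.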