Configure an A/B test in AB Tasty

1. Select Test from the top navigation bar, then click Create in the top-right corner and select the appropriate type of test for your campaign.

2. Name your campaign and set a hypothesis for your experiment based on the following model: If I apply [this change on my webpage] for [this audience], then it will impact [this behavior] and influence [this goal].

Copy the URL of a page you wish to test and paste it into the “Sample URL to load in the editor” field. Select the type of editor you wish to use: the visual WYSIWYG editor or the code editor. Then click “Save & go to next step” in the bottom-right corner.

3. Set your targeting in the “Who”, “Where” and “Trigger” sections.

WHO section: this is the most important one; choose the right segment for the message you have created in the editor.

WHERE section: choose the unique URL (or a saved Page), or a type of page sharing the same structure (such as product pages), on which your message will be visible. The URL used in the editor step remains only a sample of the URL(s) you configure in this step.

TRIGGER section: this step is optional. You can add session-based triggers, such as a required number of viewed pages before displaying a message, the landing page of the session, and so on.

Click “Save & go to next step” at the bottom of the page.

4. Add JS and/or CSS code, use the visual/WYSIWYG editor, or add a widget from the library to create your modification in the unique editor view.
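In the code editor, a modification is plain JavaScript that runs on the targeted page. The sketch below renames a call-to-action button; the `.cta-button` selector and both labels are assumptions for illustration, not part of AB Tasty's API:

```javascript
// Hypothetical code-editor modification: rename a call-to-action button.
// The '.cta-button' selector and the labels are illustrative assumptions.
function newCtaLabel(original) {
  // Variation B: make the call to action more action-oriented.
  return original.trim() === 'Learn more' ? 'Start your free trial' : original;
}

// Apply the change only where a DOM exists (i.e., when running in the browser).
if (typeof document !== 'undefined') {
  document.querySelectorAll('.cta-button').forEach(function (btn) {
    btn.textContent = newCtaLabel(btn.textContent);
  });
}
```

Keeping the new label in a small function like this makes the intent of the variation easy to review before the test goes live.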

5. Set your goals. Here, you can select the goals on which you’d like to report. For example, you can select transactions, bounce rate, number of clicks, etc.

When you test a change on a clickable element (e.g., a CTA button), always turn on action tracking so you can see how visitors respond to the new version. Choose the action tracking related to the element you modified to determine whether your change converts better than the other versions of the same element, since user behavior is likely to be affected by such changes.
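Conceptually, an action tracker just counts clicks on the modified element for each variation. A minimal hand-rolled sketch follows; note that AB Tasty's built-in action tracking is configured in the UI and needs no code, and `sendHit` and the selector here are placeholders:

```javascript
// Placeholder for whatever records the click event in your analytics setup.
function sendHit(eventName) {
  console.log('hit:', eventName);
}

// Attach a click listener to every element matching the selector.
// Returns the number of elements instrumented.
function bindClickTracking(doc, selector, eventName) {
  let bound = 0;
  doc.querySelectorAll(selector).forEach(function (el) {
    el.addEventListener('click', function () { sendHit(eventName); });
    bound += 1;
  });
  return bound;
}
```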

6. Set your targeting by selecting the audience segment for which you’d like to display the test.

Carefully enter the targeted URLs in the “Where” section of your test to ensure that it will display correctly on the pages you want to target (e.g., product pages). We recommend copying and pasting the URL to ensure the integrity of the full string. Use the Device criterion in the triggers section if you want to limit the test to visitors using a specific device. [Optional] Select your triggers to configure the actions or behaviors that will trigger your test to display.
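To see why URL integrity matters, it helps to think of a “Where” rule as a pattern test against the full page URL. The sketch below targets product pages; the domain and path pattern are assumptions for illustration:

```javascript
// Hypothetical "Where" rule: match only product pages on an example domain.
// A single stray character in the pattern would silently exclude real pages.
const productPagePattern = /^https:\/\/www\.example\.com\/product\//;

function matchesTarget(url) {
  return productPagePattern.test(url);
}
```

For example, `https://www.example.com/product/blue-shirt` would match, while `https://www.example.com/checkout` would not.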

7. Set your traffic allocation. Split traffic evenly between the original and each variation. For example, avoid a distribution like original: 20%, variation 1: 50%, variation 2: 30%.
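An even split can be computed mechanically. This hypothetical helper distributes 100% across the original plus all variations, with the last bucket absorbing any rounding remainder:

```javascript
// Split 100% of traffic as evenly as possible across n buckets
// (original + variations). The last bucket absorbs the rounding remainder.
function evenSplit(n) {
  const base = Math.floor(100 / n);
  const shares = Array(n).fill(base);
  shares[n - 1] += 100 - base * n;
  return shares;
}

// evenSplit(3) → [33, 33, 34]: original 33%, variation 1: 33%, variation 2: 34%.
```

A one-point rounding difference like 33/33/34 is fine; what matters is avoiding deliberately skewed splits.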

8. Use the QA Assistant to make sure that the modifications appear correctly on all targeted devices and that your action tracking is set correctly.

9. Before you launch your test to your targeted audience, disable QA mode, and make sure that the targeting of your test has been saved and that traffic is evenly allocated between the variations.

10. Run the test for at least 15 days. Then analyze the test results and apply changes directly to your website.