Run A/B Tests

A/B testing (or split testing) is one of the most powerful tools in your conversion rate optimization toolbelt.

If you do it regularly, your conversion rates should steadily improve over time.

Here are 3 tips for A/B testing your CTAs and landing pages:

1) Use Google Optimize.

Google Optimize is a free tool that makes it easy to split-test just about anything on your site.

Optimize lets you quickly edit your page content to create variants that you can test against the original.

That includes changing text, colors, and even page layout.

You can also use a redirect test to experiment with a completely different page design. Or use the tool to set up personalizations based on geolocation or URL parameters.

[Image: Google Optimize test types]
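By the way, if you're wondering what a redirect test is actually doing, the core idea is simple: each visitor gets bucketed into either the original URL or the variant URL, and they stay in that bucket on every visit. Here's a bare-bones sketch of that idea in Python. To be clear, this isn't how Google Optimize works under the hood, and the URLs and visitor ID below are made up for illustration.

```python
# A minimal sketch of the idea behind a redirect test: bucket each visitor
# into the original page or a variant URL, and keep that assignment sticky.
# (Illustration only -- not Google Optimize's actual implementation.)
import hashlib

VARIANTS = [
    "https://example.com/landing-page",    # original design (hypothetical URL)
    "https://example.com/landing-page-b",  # completely different design (hypothetical URL)
]

def assign_variant(visitor_id: str) -> str:
    """Hash the visitor ID so the same visitor always lands on the same version."""
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % len(VARIANTS)
    return VARIANTS[bucket]

print(assign_variant("visitor-123"))  # same visitor, same URL, every time
```

The sticky assignment matters: if returning visitors flipped between designs, the conversion data for each variant would get muddied.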

2) A/B test your button text along with your headline.

According to Joanna Wiebe of Copy Hackers, one of the most common mistakes people make when they run a split-test is this:

They focus too much on the headline, and not enough on the button text.

In truth, the button text can have an even bigger impact than the headline. And you'll get the best results when you optimize the two together.

Of course, generic CTA text like “click here” can work just fine.

But most pros recommend testing phrases that either A) repeat the promise of your headline, or B) address what your audience will get when they click the link.

For example:

“Get Access Now” or “Double Your Leads”.

3) Don’t waste time on pages without enough traffic.

An A/B test is meaningless if the page doesn’t get enough traffic.

It’s the law of large numbers:

If you flip a coin 5 times, it might come up heads every time.

But if you flip it 5,000 times, you’ll get about 50% heads and 50% tails.
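
A quick simulation makes the point concrete. Here are a few lines of Python you can run yourself (the 50/50 coin stands in for any conversion event):

```python
# Law of large numbers in action: small samples swing wildly,
# large samples settle near the true 50% rate.
import random

for flips in (5, 5_000):
    heads = sum(random.random() < 0.5 for _ in range(flips))
    print(f"{flips} flips: {heads / flips:.0%} heads")
```

Run it a few times and the 5-flip result will bounce all over the place, while the 5,000-flip result barely moves. The same thing happens with conversion rates on low-traffic pages.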

Unfortunately, most of the pages on your site probably don’t get enough traffic to get conclusive test results.

The best way to find out how much traffic you need is to use a statistical significance calculator, like this one from Optimizely.

To use it, start by filling in the page’s current conversion rate (the “baseline conversion rate”).

[Image: Split-test statistical significance calculator example]

Then choose the “minimum detectable effect” and the “statistical significance threshold” (also known as the confidence level).

Usually you can keep the default numbers for those. But feel free to play with them to see how the sample size number changes below.

The blue sample size shown at the bottom tells you how many visits you’ll need for each version of your page in order to test it properly.

Note: that number is per variation.

If the calculator calls for a sample size of 5,000, you’ll need that many visits for each version of the page. So for a simple A/B test where you’re only testing the original version of the page vs. a single additional variation, you’d need 10,000 total visits.
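
The calculator handles the math for you, but if you're curious where that sample size comes from, here's a rough Python sketch using the textbook two-proportion formula. Treat it as a ballpark only: Optimizely's calculator uses its own statistical model, so its numbers will differ, and the baseline rate and lift below are made-up examples.

```python
# Ballpark per-variation sample size for an A/B test, based on the standard
# two-proportion z-test formula (not Optimizely's exact methodology).
from math import ceil, sqrt
from scipy.stats import norm

def sample_size_per_variation(baseline_rate, min_detectable_effect,
                              alpha=0.05, power=0.80):
    """baseline_rate: current conversion rate, e.g. 0.03 for 3%.
    min_detectable_effect: relative lift to detect, e.g. 0.20 for a 20% lift."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_detectable_effect)  # the improved rate you hope to see
    p_bar = (p1 + p2) / 2

    z_alpha = norm.ppf(1 - alpha / 2)  # significance threshold (two-sided)
    z_beta = norm.ppf(power)           # statistical power

    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Example: 3% baseline conversion rate, trying to detect a 20% relative lift
n = sample_size_per_variation(0.03, 0.20)
print(f"~{n:,} visits per variation, ~{2 * n:,} total for a two-variant test")
```

Whatever tool you use, the takeaway is the same: the smaller the lift you’re trying to detect, the more traffic you need.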