Start A/B Testing Today with 5 Simple Steps

by Christina Gillick 07/01/2014

It’s often said, “If you’re not testing, you’re guessing.”

Or, as Kathryn Aragon points out here:

The trouble is, if we aren’t testing, we’re fooling ourselves. We don’t really know what works. We’re just guessing.

But, did you know that not running the proper tests could be just as bad as—or worse than—guessing?

For instance, let’s say you tracked your conversion rates in March. At the end of March, you made some significant changes to your website and watched the results over the next month.

At the end of April, you have two months of data.

Maybe your sales went up… Great! But…

Your test isn’t accurate!

Too many factors are influencing the results when you compare two different versions over two different time periods.

To perform A/B testing accurately you need a few things:

  • Two versions (A and B) tested at the same time. Both versions are identical except for ONE variation.
  • A control (typically your current version or A).
  • A modified version (or treatment—also known as B).

Here’s a good visualization of A/B testing from Optimizely.com:

[Screenshot: A/B testing example from Optimizely.com]

As you can see in the above screenshot, they’re testing ONE element—the color of the “Buy Now” button.

In the original/control/A the button is gray. In the variation/treatment/B, it’s red. Red increased sales by 4.5%!

That’s how effective a simple change can be!

Let’s look at another visualization. This one is from VisualWebsiteOptimizer.com:

[Screenshot: A/B testing example from VisualWebsiteOptimizer.com]

In this example, the variations were delivered equally to the visitors (50% to each one). The page with the orange area performed better than the green one.
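Under the hood, an even split like this is usually done deterministically, so a returning visitor always sees the same variation. Here’s a minimal sketch in Python of one common approach—hash-based bucketing. The `visitor_id` and experiment name are hypothetical stand-ins, not any particular tool’s API:

```python
import hashlib

def assign_variation(visitor_id: str, experiment: str) -> str:
    """Deterministically bucket a visitor into A or B (a 50/50 split).

    Hashing the visitor ID together with the experiment name means the
    same visitor always sees the same variation, and different
    experiments split traffic independently of each other.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "A" if bucket < 50 else "B"

# A returning visitor gets the same variation on every page load.
print(assign_variation("visitor-12345", "homepage-headline"))
```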

Let’s take a look at a real example from WhichTestWon.com:

[Screenshot: A/B testing example from WhichTestWon.com]

In this example, they tested the button copy—“See Product Video” vs. “Watch Demo.” Everything else is the same.

So which one won?

Here are the actual test results (from WhichTestWon.com):

Version A, the ‘See Product Video’ button call to action (CTA), increased form fills on the page following the button by 48.2% at a 98.5% confidence level.

As you can see, if this were your website and you guessed Version B, the button with “Watch Demo,” you’d be wrong (and leave a ton of money on the legendary table).

Here’s another example—also from WhichTestWon.com:

[Screenshot: A/B testing example from WhichTestWon.com]

In this example, a non-profit wanted to increase their donations. They did an A/B test where the only variable was the gift amounts (see above).

The winner?

It was the second version (B), which increased the average donation amount per person by 15%. (Overall revenue increased 16%.)

Clearly A/B testing—when completed properly—will bring you far better results, so let’s get started…

5 Simple Steps to Start A/B Testing Today

1. Determine your goal.

Your goal will vary based on your business.

For instance, a Business-to-Business company might be focused on generating more leads for their sales staff. On the other hand, an e-commerce Business-to-Consumer website might want to increase sales.

What do you want to increase?

Quick Tip: If you’re not sure, start at the top of your funnel.

Why?

Because when you optimize your lead generation form, you’ll have more leads going into your sales funnel.

Alternatively, if you improve your checkout page, you’ll get more sales, but you’ll have the same number of leads going in—effectively placing a limit on your improvement.

Note: Don’t spend a ton of time making this decision. Ideally you’ll do a lot of A/B testing once you have a few tests under your belt. The important thing is that you are testing and improving—whether it’s your lead generation form or home page headline.

2. Decide what to test.

Now that you know your goal and the page you’re going to test, it’s time to decide what element you’ll test.

Here are some options:

  • Your headline.
  • Your offer text.
  • Your button text.
  • Your form fields.
  • The color of your form button.

I could go on and on. There are so many things to test.

Note that some sites will see a dramatic improvement in their conversions just by changing the color of their opt-in button. Other sites, however, will see little improvement from a button color change. They’ll have to test bigger elements—like their headline, offer, and USP (unique selling proposition).

Why? Well, one reason could be that if your website visitors don’t understand how you’ll help them, they won’t opt in—no matter what color the button is.

Quick Tip: If you’re having trouble deciding what to test, choose something—anything. Let’s just get a test up and running. You can always test other elements later.

3. Create your test.

There are two parts to this step.

The Creative Part

The first is what you might call “the creative”—or making another version of the element you’re testing.

For instance, maybe your headline doesn’t immediately tell your visitor “What’s in it for me?” Write another version of your headline, this time focusing more on your visitor.

Note: You don’t have to know that your alternate headline is “better.” That’s why we’re A/B testing. Let the data reveal the best headline.

Important! Don’t try to trick or mislead your visitors to get better results. Read this article by Russ Henneberry to find out why.

The Tech Part

The second part of creating your test is the technology you use to deliver each variation to your visitors.

If you’re intimidated by this step, fear not. Modern technology makes this super easy and fast.

In fact, before writing this article, I’d never used Optimizely.com (or even visited the website). However, in less than five minutes I created an A/B test for my headline. It’s running as we speak.

Quick Tip: Set up your first test at Optimizely.com. They have a free trial so you won’t be risking anything.

4. Wait.

I find this step the most challenging of the five, mainly because waiting isn’t my strong suit. However, to get accurate results, we must wait.

Note: Go ahead, log in and check out your results during your test—just do not (I repeat, DO NOT) stop, pause, or edit your test until it’s complete.

When is my test complete?

You could select a time period (shoot for at least two weeks) or an impression count (aim for a few thousand impressions).

However, I want to make sure my test is as effective as possible, mainly because I don’t want to make changes based on insufficient evidence.

That’s why I use a sample size calculator:

[Screenshot: sample size calculator]

In the above example, I used a 95% confidence level and a 4 for the confidence interval (or margin of error). I left “population” blank (mainly because my day-to-day traffic varies so I’m not sure how many people will see my test per day).

Access the sample size calculator here.
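If you’re curious about the math behind that calculator, here’s the standard sample size formula for an infinite (or unknown) population, sketched in Python. The inputs mirror the screenshot above: a 95% confidence level (a z-score of 1.96), a 4-point margin of error, and a worst-case 50% response distribution:

```python
import math

def sample_size(z: float = 1.96, margin: float = 0.04, p: float = 0.5) -> int:
    """Minimum sample size for an infinite (or unknown) population.

    z:      z-score for the confidence level (1.96 = 95%)
    margin: confidence interval / margin of error (0.04 = +/- 4 points)
    p:      expected proportion; 0.5 is the worst case (largest sample)
    """
    return math.ceil((z ** 2) * p * (1 - p) / (margin ** 2))

print(sample_size())  # 601 -- hence the "at least 600" rule of thumb below
```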

Quick Tip: Let your test run until you have a sample size of AT LEAST 600.

5. Determine your winner!

When your test is over, it’s time to calculate your results—or, more precisely, to determine whether your results are “statistically significant.”

Although it sounds complicated, we’re simply making sure our results will perform the same over the long haul. In other words, how sure are you that your test results are accurate?

A split test calculator, like the one found here, can help you decide.

Here’s a hypothetical example:

[Screenshot: split test calculator showing a hypothetical result]

In the above example, the A/B test resulted in 4 goals (these could be sales or leads, depending on your goal) for the control and 8 goals for the variation (or B).

Some might look at this and say, “Obviously B doubled sales!”

However, there is no clear winner. If this were delivered to an additional 1,211 people, the results could be very different.

Here’s another example:

[Screenshot: split test calculator showing a statistically significant result]

In this screenshot, our sample size was exactly the same. However, our goals (or conversions) were much better. In this case, we can be 99.9% sure that B genuinely outperforms A—that the difference isn’t just random chance. That is statistically significant.
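If you’d like to check numbers like these yourself, here’s a minimal sketch of the two-proportion z-test that most split test calculators are built on. The visitor counts below are my own hypothetical stand-ins for the screenshots above, not the calculator’s actual inputs:

```python
import math
from statistics import NormalDist

def confidence_level(visitors_a: int, goals_a: int,
                     visitors_b: int, goals_b: int) -> float:
    """Two-tailed confidence (in %) that A and B truly differ."""
    rate_a = goals_a / visitors_a
    rate_b = goals_b / visitors_b
    pooled = (goals_a + goals_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(pooled * (1 - pooled)
                        * (1 / visitors_a + 1 / visitors_b))
    z = abs(rate_b - rate_a) / std_err
    return 100 * (2 * NormalDist().cdf(z) - 1)

# First screenshot: 4 vs. 8 goals (assuming ~1,200 visitors per variation).
print(round(confidence_level(1200, 4, 1200, 8), 1))    # ~75% -- no clear winner

# Second screenshot: same traffic, many more goals, B still double A.
print(round(confidence_level(1200, 40, 1200, 80), 2))  # ~99.98% -- significant
```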

At this point, it would be safe to change the control to B and test a new version against B. (Always be testing.)

Note: If your conversion rate is low, you’ll need a bigger sample size—such as a longer test or more traffic—to get statistically significant results.

Don’t Copy Your Competition!

Sites like WhichTestWon.com are extremely helpful in determining what we should test. However, a word of warning:

Do not use someone else’s test results—no matter how good they are—to make changes to your website.

Just because one of your competitors saw better results with a red button, doesn’t mean you will. Instead of blindly copying them, test the change for yourself. You’ll likely be surprised by the results!

Here is a great example, also from WhichTestWon.com:

[Screenshot: security seal A/B test from WhichTestWon.com]

Based on common marketing advice, you’d probably guess a security seal would increase conversions… right?

After all, it’s a common trust element that marketers swear by…

But, the results of the above test show us that adding a security seal doesn’t always increase conversions.

In fact, in this test, version B (the version without the seal) increased opt-ins by 12.6%.

We could guess at why. Maybe the target audience mistook the security seal as a sign they’d be expected to pay for something.

But, the important takeaway here is that A/B testing revealed the true winner.

For more surprising A/B split test results, check out this article.

So, do you use A/B testing to improve your conversions? Why or why not? Let’s talk about it in the comments below.

About the Author

Christina Gillick is a direct-response copywriter. She helps her clients create loyal customers and raving fans through relationship-building copy and marketing. She is also an entrepreneur and the founder of ComfyEarrings – The Most Comfortable Earrings on Earth.


5 COMMENTS

Bret

“Red increased sales by 4.5%!”

A 4.5% increase…from $1,000 to $4,500? I’d love to see how you worked that out.

July 15, 2014

