8 Easily Avoidable Mistakes That Will Ruin Your Landing Page Tests

by Jeremy Smith, 02/27/2014

Talk to any conversion expert (like me), and you’ll hear them chanting the same mantra:  “Test, test, test, test.”

It’s monotone, boring, and probably annoying. But it’s the only way to improve your conversion rates.

The truth about testing is that it’s hard and slow, and you’ll encounter challenges, frustration, and the familiar feeling of wanting to smash your keyboard to smithereens using only your forehead. There’s this weird thing going on with testing, where it either scares the jeebies out of people, or they get this elitist arrogance about it.

The truth of the matter is that we all need to be running landing page tests, whether we loathe it or like it. We’ve got to test our landing pages, and we’ve got to understand why our tests are not producing wins.

The problem is, not enough companies are testing — only 71%, according to econsultancy.com. In many cases, the more costly or important the marketing effort, the fewer companies are testing, as demonstrated in this chart:

Econsultancy study on the percentage of customers doing A/B testing

There are reasons why testing can cause frustration. What you’re about to find out are eight reasons why testing goes awry, and what you can do to sidestep these problems.

Each of these reasons is a common pitfall in the field of landing page testing. By being aware of these issues, you’ll have a leg up on the competition and be able to hone your tests with increasing accuracy.

8 Mistakes That Ruin Your Test Results

Reason #1.  You aren’t doing A/B testing.

A/B testing is when you test two different pages, and figure out which page converts better. Sometimes, it’s called A/B split testing. It’s basically an experiment to see which version — A or B — is going to make you more money. There is such a thing as “multivariate testing,” but you can hold off on that until you’ve done a few A/Bs.

The whole concept is pretty simple, but for some reason people don’t do it as often as they should. Why not? Maybe because it sounds intimidating or complicated, or they don’t exactly understand how drastically it’s going to help them.

Here’s an example from Optimizely, which demonstrates a conversion improvement of 3–5 percentage points by making a color change to the “buy now” button.

Optimizely graph showing lifts in conversion with button color changes

A/B split testing analyzes just two versions at a time. Here’s an example of an information architecture A/B test from Smashing Magazine:

Smashing Magazine's example of A/B testing information architecture

How do you conduct a test?

a. You need a testing tool. Yes, they cost money. Yes, they’re worth it. Unbounce, Visual Website Optimizer, and Optimizely provide solid initial testing platforms.

b. Figure out what you want to test. The options are endless. Different color scheme? Button placement? Picture of a man vs. a picture of a woman? Layout variations? Pick anything. Test it.

An effective test will isolate a single element of design. For example, if version A has a bigger font, a brighter color scheme, and a different picture than version B, how do you figure out which of those differences caused your higher clickthrough rate? It’s best to choose one design element to test at a time. The more you test, the closer you’ll get to conversion heaven.
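
To make the mechanics concrete, here’s a minimal Python sketch of how a testing tool might split traffic for a single-variable test. This isn’t any particular vendor’s implementation (the function name and hashing approach are just illustrative), but the core idea is standard: bucket each visitor deterministically, so a returning visitor always sees the same variant.

    import hashlib

    def assign_variant(visitor_id: str, test_name: str = "button-color") -> str:
        """Deterministically bucket a visitor into variant A or B.

        Hashing the visitor ID (instead of picking randomly on each
        page view) guarantees a returning visitor sees the same
        variant, which keeps the test's measurements clean.
        """
        digest = hashlib.md5(f"{test_name}:{visitor_id}".encode()).hexdigest()
        return "A" if int(digest, 16) % 2 == 0 else "B"

    # The same visitor always lands in the same bucket:
    print(assign_variant("visitor-123"))  # prints the same letter every run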

Here’s an example from Smashing Magazine:

This test uses two different graphics to contact Jason — a phone icon, and a picture of Jason.

Example of a contact page with a person's face

c. Test for a realistic amount of time. Testing will be patently unsuccessful unless you run it long enough. You need enough conversions on both A and B for the comparison to mean anything, which means deciding on a sample size up front and leaving the test running until you reach it.
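
How long is long enough? One common approach (a standard statistical rule of thumb, not something specific to any testing tool) is to compute a required sample size from your baseline conversion rate and the smallest lift you’d care to detect. A rough Python sketch, with made-up example numbers:

    from statistics import NormalDist

    def sample_size_per_variant(baseline: float, mde: float,
                                alpha: float = 0.05, power: float = 0.80) -> int:
        """Visitors needed per variant to detect an absolute lift of `mde`
        (e.g. 0.01 = one percentage point) over the baseline rate."""
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
        z_beta = NormalDist().inv_cdf(power)
        p1, p2 = baseline, baseline + mde
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return int(((z_alpha + z_beta) ** 2) * variance / (mde ** 2)) + 1

    # Example: a 5% baseline, hoping to detect a lift to 6%:
    print(sample_size_per_variant(0.05, 0.01))  # roughly 8,000+ visitors per variant

Divide the result by your daily traffic to each variant and you have a defensible test duration instead of a guess.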

d. Analyze the results, and take action. This is the fun part. Put your test results side by side, and see which version gained more conversions.
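
“Side by side” is easier with a quick significance check. Here’s a minimal Python sketch of a two-proportion z-test (a standard statistical comparison, not a feature of any particular tool; the visit and conversion counts below are invented):

    from math import sqrt
    from statistics import NormalDist

    def ab_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
        """Two-sided p-value for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)  # shared rate under "no difference"
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        return 2 * (1 - NormalDist().cdf(abs(z)))

    # Invented numbers: A converted 120 of 2,400 visitors, B converted 156 of 2,400.
    print(f"p-value: {ab_p_value(120, 2400, 156, 2400):.3f}")  # ~0.026

A p-value under 0.05 is the conventional bar; above it, treat the “winner” as noise and keep the test running.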

That’s all there is to it. A/B testing is a straightforward and effective way to reduce stress, improve sales, and live happily ever after.

Reason #2.  You aren’t testing frequently enough.

At the risk of sounding cliché, I’m going to say it:  Testing should be a way of life. It’s the only way to succeed. Kyle Rush makes the point that frequent testing allowed him and his team to get the data they needed to make a campaign successful. Here’s the video:

http://moz.com/mozcon-videos (scroll about halfway down the page)

Testing needs to happen early and often. Here are my two tips for using testing to the max:

  • Test your landing page at the same time each year. If your business is subject to seasonal variations, compare results year-over-year rather than against a different season.
  • Test your landing page on a recurring and regular basis. In order to be truly representative and action-oriented, you need to be testing all the time. I recommend monthly tests.

The more frequently you test, the more insight you’ll gain, the more changes you’ll make, and the better your website will do.

Kissmetrics experiences a constant deluge of traffic; as a result, they need to test as often as possible. Through frequent testing, they realized that they could reinforce their brand and improve their value proposition.

Their first landing page was good, but not outstanding.

Kissmetrics landing page version A

Frequent testing led them to test, retest, and eventually develop this final page:

Kissmetrics landing page b and winner

(Image from Unbounce)

Frequent testing will give you solid results. Do it, and do it often.

Reason #3.  You’re only testing once and stopping.

One of the problems with irregular testing is that it can lead to skewed results. The Distilled blog points out that testing without the right duration creates inaccurate results. A single test isn’t going to provide you with all the actionable metrics you need.

Let’s say you did a test, pulled some metrics and made some changes. Now what do you do?

Test again.

How do you know your test was representative? Was it a low-traffic slump or a high-traffic quirk? Did you just release a product or were you preparing to release a product? Did you just publish a press release, or did the New York Times just release a piece on your CEO?

Test again.

You’re going to be frustrated with testing if you do it once and then stop. Don’t settle on what you think is a big win. Question your assumptions, roll up your sleeves, and test it again.

Reason #4. You’re only testing the obvious.

One of the big pitfalls of testing is testing only the obvious things. Some obvious tests are as follows:

  • Test a different call to action. “Free Download!” vs. “Download Now!”
  • Test a different headline.
  • Test long copy vs. short copy.
  • Test price differences.

The problem is, it’s not the obvious things that make the biggest differences. It’s the not-so-obvious things that lead to major wins. It’s helpful to examine non-intuitive features that can be enhanced or edited.

The landing page below was tested, and the designers considered it a bit too long. They reduced its length by implementing a lightbox. Here’s what the page looked like before:

Right Signature landing page A page length test

And here’s what it looked like after:

Right Signature landing page B page length winner

The lightbox is a non-intuitive way to reduce landing page length. Here, testing proved its usefulness.

Here are some other landing page testing categories that you should consider.

  • Design – What layout works best? What colors are optimal? Does button size make a difference? Does a parallax site convert better than a traditional one? What font reduces bounce rate?
  • Content – Which is your stronger CTA? Does long copy work better, or short copy? Which slogan improves conversions? How do headings affect scrolling below the fold?
  • Experience – How does a 1-second load time with additional images compare with a 0.5-second load time with no images? Is a one-step download better than a two-step download?

Reason #5.  You aren’t using analytics.

In testing, data is king. You need all the data you can get. The more, the better. If you’re just beginning testing, you’ll feel as if you’re swimming in a lot of useless information, but be patient. The longer you work with data, the more you’ll understand what is essential, what is tangential, and what is a dead giveaway that change is required.

One strategic way to ensure that you’re looking at enough data is to create a list of A/B testing candidates — color, information architecture, button size, layout, content length, etc. Then, move sequentially through the list, testing each one in turn. If you get through your entire list, go through it again with a new round of tests.
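
If it helps to make that concrete, here’s a tiny Python sketch of a rotating test queue (the candidate names are just placeholders):

    from collections import deque

    # Hypothetical backlog of one-element-at-a-time test candidates.
    test_queue = deque([
        "headline copy",
        "button color",
        "information architecture",
        "page layout",
        "content length",
    ])

    def next_test() -> str:
        """Pop the next candidate and re-queue it, so the list cycles
        back around once every element has been tested."""
        candidate = test_queue.popleft()
        test_queue.append(candidate)
        return candidate

    print(next_test())  # "headline copy" this round, "button color" next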

Every bit of data that you gain from your tests will contribute to a better-performing landing page.

Reason #6.  You’re testing at the wrong times.

Testing at the wrong time is worse than not testing at all. If you test your landing page during a Pinterest-fueled traffic spike, your test results will be skewed. If you test during a traffic trough in your slow season, you guessed it: your results will be skewed. If you make decisions based on the Pinterest test crowd, your conversions will suffer, not improve.

Before you test, you may wish to do a traffic review. Watch those ups and downs in Google Analytics, and lengthen or shorten your test period accordingly.
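
A traffic review can be as simple as comparing the last few days against your recent history. Here’s a minimal Python sketch of that sanity check, with invented visit counts (this isn’t a Google Analytics API call, just arithmetic on exported numbers):

    from statistics import mean, stdev

    def traffic_is_normal(recent: list, history: list, z_limit: float = 2.0) -> bool:
        """True if the recent daily average sits within `z_limit` standard
        deviations of historical daily traffic; if False, delay the test."""
        mu, sigma = mean(history), stdev(history)
        return abs(mean(recent) - mu) <= z_limit * sigma

    # Invented numbers: a Pinterest spike pushes the last three days way up.
    history = [1000, 1050, 980, 1020, 990, 1010, 1040, 995, 1005, 1030]
    recent = [2400, 2600, 2500]
    print(traffic_is_normal(recent, history))  # False, so wait for things to settle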

For example, Macquarie University worked to improve their landing page. However, they needed to be strategic, because their traffic experiences seasonal fluctuations.

Macquarie University testing seasonality on their landing page

Reason #7.  You’re testing traffic from everywhere or who-knows-where.

One oft-overlooked feature of testing is traffic source. Let me explain. As you create your test, keep your audience in mind. Some of your audience comes from organic search, some from paid search, some from social media, and so on. Each of these sources provides a distinct audience segment.

It’s perfectly legitimate to test all traffic from all sources. You’ll likely come up with some very helpful data that will lead to profitable solutions. However, you need to test with an eye to these traffic differences and the way they can affect conversions. As you test frequently, you should consider pulling test results that analyze your traffic segment by segment.

Here are some of the ways you can segment your testing (a small analysis sketch follows this list):

  • PPC traffic - What are these people looking for? How often do they convert vs. organic search? What types of images improve the likelihood of their conversion?
  • Email traffic - If your email newsletters are generating clickthroughs, analyze this group carefully. Find out what causes the highest bounce rates. Find out if content geared toward email clickthroughs increases conversions.
  • Organic search traffic – Organic search traffic can be further subdivided. Traffic segment variance is a study all its own. What search terms led visitors to the site? Where are they located? Are they accessing your mobile site or your desktop site? In a report from MarketingSherpa, Marriott Vacations tested their iPhone landing site and discovered techniques that caused their conversion rate to climb from 5% to 7%.
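
If your testing tool lets you export raw visit data, a pass like the Python sketch below shows why a flat overall conversion rate can hide segment-level differences. All the rows here are invented for illustration:

    from collections import defaultdict

    # Hypothetical visit log rows: (traffic_source, variant, converted)
    visits = [
        ("ppc", "A", True), ("ppc", "B", False),
        ("email", "A", False), ("email", "B", True),
        ("organic", "A", True), ("organic", "B", True),
        # ...thousands more rows in a real test
    ]

    def rates_by_segment(rows):
        """Conversion rate for each (source, variant) pair."""
        totals = defaultdict(lambda: [0, 0])  # [conversions, visits]
        for source, variant, converted in rows:
            totals[(source, variant)][0] += converted
            totals[(source, variant)][1] += 1
        return {key: conv / n for key, (conv, n) in totals.items()}

    for (source, variant), rate in sorted(rates_by_segment(visits).items()):
        print(f"{source:8} {variant}: {rate:.0%}")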

Collating this data and then analyzing it will give you powerful insights. CrazyEgg, for example, uses heat map and scroll map reporting that provides even narrower visitor segmentation. I review these metrics daily to determine how visitors navigate each page of my site.

JeremySaid home page using CrazyEgg software

Reason #8.  You’re testing a crappy website.

A dysfunctional website will make testing extremely frustrating. If you’ve been hit with a manual penalty or been de-indexed, you’re basically done for. Rather than waste testing dollars on a compromised site, first take action to improve your site’s condition.

Some people actually have buggy e-commerce sites that have security warnings or even viruses. Yes, I’ve seen it happen.

Sites like this can’t be tested, because any test is bound to fail. Get your site ready before you test.

This website, DSRNY.com, which won the number-two spot on the 2013 list of WebPagesThatSuck, is probably not worth testing:

DSRNY home page

It’s not a guessing game

I’ve seen people try, and fail, playing the lottery with their e-commerce website. “Hm. Maybe I should try a new color scheme?”

You don’t have to guess. E-commerce isn’t roulette. It’s strategy. It’s data-driven, research-backed, tested-and-proven. All you have to do to get the relevant data is to do testing, and do it right.

There are a few things you can do at this point, depending on where you are with testing.

1. Start Immediately. If you haven’t done any testing, start now. A November 2013 report from Econsultancy stated that “companies whose conversion rates have improved carry out 50% more tests on their websites than companies whose conversion didn’t improve.” An easy way to begin is to use one of the testing services mentioned earlier.

2. Build a Test Queue.  If you don’t have a regular testing schedule, create a list of landing page elements that you want to test first. Testing is a vast field, and there are endless variations of tests that you could perform. Rather than throw up your hands in confusion, just pick one element and test it. One test will probably lead to another, and another, and another…

3. Always Be Testing. If you’re frustrated with your landing page tests, keep testing. Frustration is normal. The way to overcome your frustration is not to simply neglect testing, but to keep pushing until you accomplish a breakthrough. If you test enough, you will experience a breakthrough. And that’s a really good feeling.

Can you think of any other critical elements to be a better landing page tester?


About Jeremy Smith

Jeremy Smith is a serial entrepreneur, trainer and conversion consultant, helping businesses like IBM, Dow Chemical, American Express, Panera Bread, and Wendy’s improve conversions and strategically grow their businesses. Jeremy’s experience as the CMO and CEO of technology firms has given him a powerful understanding of human behavior and profit-boosting techniques. Join thousands of in-house marketers by downloading a copy of his latest ebook: Landing Page Optimization for In-House Marketers.


6 COMMENTS

Arun Sivashankaran

Great roundup of testing mistakes. I’d also add “not estimating a sample size” or “not estimating the test duration” to the list. Thanks for putting this together!

February 27, 2014

    Jeremy Smith

    Arun, thanks for the comment. I agree. Tests need to normalize and most marketers don’t give them the chance to do so. Every time I think something is going to win, it normalizes and proves me completely wrong. It’s the one time I love being wrong. Sample size and time are critical elements of any tester’s arsenal.

    February 27, 2014

      Brian

      Arun/Jeremy – To do this effectively, you have to determine sample size prior to starting the experiment, and not stop the experiment until you’ve reached it. To do so, you have to provide the baseline conversion rate (the average of your control so far), as well as the minimum detectable effect (MDE), or the lift you expect. This is hard, because who knows what the lift will be? (That’s why we’re testing!)

      What is your guidance to marketers on how they can avoid classic statistical perils (such as repeated statistical significance testing and concluding a test prematurely) while also not running an experiment too long because the actual impact on conversion rate differs from the MDE they used to calculate sample size? (That is, since the actual effect differs from the MDE supplied, in reality you need a different sample size than you initially thought.)

      Thanks, and great article!

      February 28, 2014

Akash Agarwal

A landing page is the page your website visitors arrive at after clicking on a link. Thanks for sharing these mistakes. It will help us test without making them.

March 4, 2014

