5 Cases Where Best Practice Blew It: Conversion Tips for the Adventurous
Imagine for a moment that you’re looking to break into the billion-dollar soda pop market. Your new drink is fizzy and sweet. It’s refreshing. It’s tasty. And it’s cheap to make and sell. Should be a hit, right?
Then along comes a competitor that completely crushes every notion you ever had about your market. They’ve created a drink that’s not sweet. It certainly doesn’t taste good. And it’s more expensive than anything else on the shelf.
And if you think nobody would ever drink that swill, then you haven’t heard about Red Bull, which is a billion-dollar brand in its own right, with over 40 billion cans consumed so far.
There are many cases where, yes, following a set of guidelines can help you make more meaningful connections with your customers. But doing so blindly can harm your conversion rate more than it helps.
In fact, we found five cases when jumping on the best-practice bandwagon backfired—where doing the exact opposite caused conversion rates to soar.
Removing the Sign-Up Form Increased Conversions by 60%
Vendio, an e-commerce store provider, had a typical landing page that was laid out just as you’d expect it to be according to “best practices.” They had a sign-up form on the left, compelling images, bullet-point benefits, and a large call-to-action button.
In a test, Vendio decided to tell best practices to go pound sand. They redesigned their page without the sign-up form and, instead, added a second page to the process. In this test, only when users clicked the sign-up button were they taken to the form.
In other words, they added an extra step to their conversion funnel—something that any conversion expert would wring their hands at while throwing around words like “obstacle” and “friction.”
Would you believe that Vendio’s sign-up rate increased by 60%?
Why’d They Do That?
Sometimes it pays to push the envelope a bit and try something that flies in the face of convention. In this case, by split testing a version with the registration form vs. a page without, Vendio was able to see that the challenger performed much better in their case.
Now, that isn’t to say that you should jump on your landing pages and rip out your sign-up forms. What worked well for Vendio might not work for you. And that’s precisely why you split test to determine what your audience expects and is likely to act on.
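When you run a split test like Vendio’s, you also need a way to judge whether the lift you see is real or just noise. A two-proportion z-test is one common way to do that. Here is a minimal sketch in Python; the visitor and sign-up counts are made up for illustration, not Vendio’s actual numbers:

```python
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns (lift, p_value) for variant B vs. control A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical numbers: control (form on page) vs. challenger (two-step sign-up)
lift, p = z_test(conv_a=200, n_a=4000, conv_b=320, n_b=4000)
print(f"lift: {lift:.0%}, p-value: {p:.4f}")
```

With these invented numbers the challenger shows a 60% lift and a p-value far below 0.05, so you could call the result significant; with smaller samples, the same lift might not be.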
Outdated Design Trumps Trendy New Look
Uncommon Practitioners, a product page for trauma specialists, wanted to encourage healthcare providers to sign up and receive their treatment videos. As you can see, their control page looks rather dated in its design:
The newer design, seen below, encompasses a variety of web design “best practices,” such as the lightbox-style overlay, a lack of scrolling, and privacy details. Should send conversions through the roof, right?
After testing the two designs, Uncommon Practitioners found that the old page converted nearly 20% better than the new one. That goes to show that embracing trends for the sake of looking “new and hip” can be detrimental to your conversion rate.
Why’d They Do That?
Web designers are (generally) not conversion specialists, and what’s hip and fresh on the design scene may not always translate well in terms of conversion. But that’s not to say that you should be stuck in the past, either. Testing helps you find out what resonates with your audience, so you can build your own set of best practices, ones proven to lift your conversion rate.
In the case of the Uncommon Practitioners site, their audience (45 years and older) couldn’t care less about new design trends. Plus, the founder’s photo adds credibility in their eyes. There’s absolutely zero confusion on what the site is about and what the end user gets. That means, in this case, simple and straightforward is best.
Video vs. Image Slider – Who Won?
You often hear conversion experts tout home page videos as a great way to improve conversion rates, and they can be, in some cases. Device Magic, however, wanted to see whether a video or an image slideshow would increase their conversion rates.
As you can see from their control image, they’ve taken all the “right steps” in creating what should be a solidly performing landing page:
They’ve got their video, bullet-point benefits and a call-to-action link to read more. Compare that with this simpler, cartoonish image slider design:
Which one would you choose?
If you chose the image slider cartoon version, you’d be right: it increased conversions by 35% and sign-ups by 31%.
Why’d They Do That?
It’s worth noting that this was not a true A/B test, since it wasn’t only the slider and the video that changed, but also the headline, bullet points and call-to-action.
Ideally, you’d want to test one item at a time, but in this case, the company quickly learned that simplicity rules, especially for complex products that take a great deal of explaining.
Color Changes Everything
Green is good, right? Green lights tell us when to go. Green is noted as a positive, fresh and rejuvenating color. So it makes sense to color your Buy Now buttons green—or does it?
GSM, a mobile phone retailer in the Netherlands, tested three variations of their Buy Now button: one as a green link, another as a green button, and a third as an orange button:
Which one do you think performed the best?
As it turns out, the orange buttons increased engagement on the site by 5%. A larger test is being conducted to see whether this also translates into long-term sales; according to the original case study, sales did increase, but the lift wasn’t statistically significant (yet).
Why’d They Do That?
In the case of GSM, this test presented an interesting challenge. All of the buttons sitewide had to be tested at the same time due to limitations of their e-commerce platform. So rather than skew the results, they created a separate style sheet for each button’s color and style and let the test run from there. Clever, right?
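One simple way to run that kind of sitewide variant assignment (not necessarily what GSM did) is to hash each visitor’s ID into a bucket and serve the matching style sheet. The variant file names below are hypothetical; a minimal Python sketch:

```python
import hashlib

# Hypothetical style sheets mirroring GSM's three button treatments
VARIANTS = ["green-link.css", "green-button.css", "orange-button.css"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically map a visitor to one style sheet.

    Hashing the visitor ID (instead of picking at random on each
    page view) guarantees a returning visitor always sees the same
    button treatment, which keeps the test results clean.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

print(assign_variant("visitor-42"))  # same visitor, same sheet, every time
```

Because the assignment is a pure function of the visitor ID, it works even when every page on the site has to participate in the test at once.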
Here again, it simply shows that green does not always mean good, and that eye-catching buttons (especially ones that look clickable) can improve engagement rates considerably.
Product Page Tells Social to Shove It
Social sharing buttons are everywhere these days. It seems you can’t shop for anything without being forced to like it, tweet it or pin it. But one Finnish hardware retailer had a different idea. They wanted to see if removing social sharing buttons would improve their clicks on their call-to-action button:
“But wait!” you gasp. “What about all that social proof? Who’s going to want to buy that product if there aren’t some likes or shares behind it?”
Surprisingly, more people. Nearly 12% more, in fact, which is a sizable chunk of the site’s audience.
Why’d They Do That?
Although best practices would dictate that every little bit of social proof helps, in this case it served as a distraction to getting people to click that all-important call-to-action button. And removing distractions is something every aspiring conversion pro should aim for!
Share Your Thoughts!
Each of these examples clearly shows that best practices don’t always work when broadly applied to your overall strategic conversion optimization plan. That’s why it’s a smart idea to test and determine what works best for your audience—even if it seems to go directly against what your peers and competitors are doing.
Have you done your own experiments where the results flew in the face of convention? Share your findings and thoughts below!
Read more Crazy Egg articles by Sherice Jacob.