An Introduction To A/B Conversion Testing On a Shoestring Budget
In a perfect world, we would always test websites with people from our target audiences. In this world, we would also be able to make this an iterative process: design, test, refine, test again, repeat.
Unfortunately, we don’t always have the budgets for this type of testing and we are forced to make assumptions about how a user will behave when they see:
- the layout of a page and copy on it.
- a call-to-action button.
- the data we are trying to collect from them on an opt-in or contact request.
So, what do you do when your website isn’t performing up to your (or your client’s) expectations?
- Leads are just trickling in
- Bounce rates are sky high
- Site visitors are exiting like the website is on fire
The goal of this post is to walk you through the process of evaluating your problems, trying to identify the cause and doing some low-cost testing to see if your solution works.
“Houston, we have a problem”
When we have a hunch something is wrong, we first need to confirm there is a problem. This is where our analytics tools help – Google Analytics, CrazyEgg and GetClicky are three I use in particular.
The Problem: Conversion rates on a lead form are too low
So, if our problem is a lack of leads being generated by the website, let’s narrow down to some potential causes, with an end goal of finding something to test.
Possible Cause: Low traffic on the lead form page
Analytics can easily confirm or refute this theory. So, we log in to our trusty (and free) Google Analytics account to check the traffic stats on the lead form page.
If there is low traffic on the lead page, we could test the calls-to-action that drive visitors to it. Is the call-to-action text compelling? Should we add call-outs within the body copy? Do we need to consider new traffic-generation strategies such as search engine optimization, pay-per-click or social media?
If there is ample traffic on the lead page, we could test the organization of the form. Is the form functioning properly? What fields are required? Are we requesting too much information? Are the form field descriptions confusing?
For the purpose of this post, let’s assume that we have determined that there is ample traffic on the lead page. We’ve looked at our heat maps and people are clicking on our calls-to-action throughout the site. So, we decide to edit something about the form to see if we can get a bump in generated leads.
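Before changing anything, it helps to write down the baseline conversion rate so there is a number to beat. A quick sketch of that arithmetic, using hypothetical figures pulled from an analytics dashboard (not real data):

```python
# Hypothetical numbers from an analytics report (illustrative only).
visits = 1200   # sessions that reached the lead form page last month
leads = 18      # completed form submissions in the same period

conversion_rate = leads / visits * 100
print(f"Lead form conversion rate: {conversion_rate:.1f}%")  # 1.5%
```

Whatever this number is for your site, record it: it is the "version A" figure every variation will be measured against.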
Skip the lab testing; test real users
To keep things low-cost, we are going to avoid lab testing, which can be costly. Instead, we will test the behavior of our real users on the site.
We’ve narrowed it down to one testable hypothesis: something about the form is causing people to abandon it.
Now, onto the testing.
We create a copy of our form page, and on the new version we place the field labels above the inputs (top-aligned) rather than beside them. An A/B testing script then splits incoming traffic between the original page and the new one.
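An A/B testing script normally handles the traffic split for you, but the underlying idea is simple enough to sketch. A minimal server-side example in Python, assuming you have some stable per-visitor identifier such as a session cookie (the identifier and function name here are hypothetical):

```python
import hashlib

def form_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into variant A (the original
    form) or variant B (top-aligned labels). Hashing the id means the
    same visitor always sees the same version across repeat visits."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "B" if int(digest, 16) % 2 else "A"

# e.g. pass in the visitor's session or cookie id:
print(form_variant("session-1234"))
```

The important property is consistency: splitting on a hash of a persistent id, rather than a fresh coin flip per page view, keeps each visitor's experience stable for the life of the test.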
Interpreting your Results
After enough visits have accumulated to draw a statistically significant conclusion, we ask the question:
Did the top-aligned labels make a difference in generating leads?
If the change increased leads to a point we are happy with, we remove the A/B testing script and make the new version of the form our default form page.
If the change doesn’t significantly improve conversion, or actually reduces it, we keep the original, form a new hypothesis and come up with a new version B to test.
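The post doesn’t prescribe a particular statistical method, but one common way to check whether the difference between the two forms is real rather than noise is a two-proportion z-test. A sketch using only the standard library, with hypothetical visit and lead counts:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-tailed p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 18 leads from 1200 visits on A,
# 34 leads from 1180 visits on B.
z, p = two_proportion_z(18, 1200, 34, 1180)
print(f"z = {z:.2f}, p = {p:.3f}")
```

By convention, a p-value below 0.05 is treated as significant; anything higher means you either need more traffic or the variation genuinely made no difference.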
Rinse and Repeat
I’ve described a pretty simplistic test here, but this process is something we do over and over again to ensure we have the most optimized site for our users. Once you feel comfortable with your first test, you can use A/B and multivariate testing for other page elements throughout the site including layout, headlines, body copy, font sizes, graphics, colors and buttons.
And you can do all of this A/B testing on a shoestring budget. Happy testing!
Image courtesy of Tempesttea