The Deceptive Danger Hiding In Your A/B Testing Data
It’s a dangerous game to play. Blind trust in A/B data.
Sure, data is seductive. After all, numbers don’t lie.
Using A/B testing data to make decisions about your marketing is smart, but it is only one piece of the puzzle.
When numbers are the only criteria driving our decision-making, the tactics used to achieve those numbers can become very dangerous.
A great example is the figurative (and sometimes literal) asterisk placed next to Major League Baseball records set during the 1990s and early 2000s, a time when steroid use was rampant in the league. Sure, Barry Bonds hit more home runs than any other baseball player. But how he did it isn’t represented in the numbers. Bonds has admitted to using steroids to boost his performance.
When the goal is simply to “hit the numbers,” the methods used can become unethical and even unlawful.
When A/B testing the User Interface on a website, be particularly careful when evaluating the data.
There is a name for it…
Most discussion of user interface design centers on identifying and fixing problems that exist because the UI designer was lazy or uneducated.
But what of UI design that is intentionally confusing? Intentionally tricky?
A wiki has sprung up exposing those who create user interfaces designed to trick people. It’s called Dark Patterns.
The Dark Patterns wiki covers a number of dubious tactics used by marketers and designers to improve conversions at the cost of credibility, trust and brand reputation.
As with most things, there is gray area. There are subtle uses of trickery that are hardly noticeable and others that would make a con artist cringe.
Overt deception, anyone?
I can imagine that if the goal is simply to make more money, that this pop-up ad tested through the roof:
It appears to be a pop-up from Windows, but is in reality an advertisement for a virus protection program. This is downright deceptive, and intentionally so.
Sometimes it’s not what you said. It’s what you didn’t say.
Audible.com was called out for designing a check-out process that failed to notify customers that prices were monthly.
Surely these check-out pages performed better in A/B tests than the current one, if conversion numbers were the only criteria:
To Audible’s credit, they quickly updated their process and released a statement apologizing for the confusing check-out process.
I can imagine the ensuing meetings between management and developers at Audible after this was exposed.
Management: Why doesn’t the checkout screen tell customers that this is a monthly fee?
Developer: Because when we didn’t tell them it was monthly, more people bought the subscription.
Management: Yes, but they are angry at us now.
Developer: You told us you wanted to make more money. Right?
Management: Yes… but, err… never mind. Go back to work.
Is this deceptive or just smart UI design?
Tiger Direct does well. They are undoubtedly masters of A/B testing.
If you know the consumer electronics business, you know that there are significant profits to be made by selling “protection plans.”
When you add a product to your cart on Tiger Direct, a protection plan is pre-selected for you. Note that you must click “Add Protection” to add the selected plan. It is not added without clicking this button.
Perhaps Tiger Direct is taking advantage of Jakob Nielsen’s adage “People tend to stick to the defaults” with this tactic.
Contrast this with how Best Buy handles their protection plans. In this case, there is no pre-selected plan set as default.
Personally, I see nothing wrong with what Tiger Direct is doing with this interface. I don’t think anyone would think this is worthy of reporting to the Better Business Bureau.
Is it subtle? No doubt.
Is it intentional? No question.
Best Buy and Tiger Direct have designed their UI with purpose. It’s been tested. Hopefully, in both cases, the test data was interpreted through the lens of larger business goals.
It starts with the goal
Let’s use a concrete example, shall we?
A company that sells a music download service offers a 30-day free trial, intended to convert users into paid customers once the trial expires.
Landing pages are created to communicate the free trial.
But the goal of the initial test is vague.
Example A/B Test 1:
Goal: Increase paid subscriptions
Data: Landing Page B generates 77% more paid subscriptions than Landing Page A
Action: Use Landing Page B
It’s a no-brainer, right? Landing Page B generates more subscriptions.
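This is exactly the readout a naive test produces. Here is a minimal sketch of how that kind of lift figure is computed; the visitor and conversion counts are hypothetical, invented for illustration (the article only gives the 77% lift):

```python
# Naive A/B readout: compare raw conversion rates and report the lift.
# All counts below are hypothetical examples, not data from the article.

def conversion_rate(conversions: int, visitors: int) -> float:
    return conversions / visitors

def relative_lift(rate_a: float, rate_b: float) -> float:
    """Relative improvement of B over A, e.g. 0.77 means 77% more."""
    return (rate_b - rate_a) / rate_a

# Hypothetical counts for the two landing pages
rate_a = conversion_rate(130, 10_000)   # Landing Page A: 1.3%
rate_b = conversion_rate(230, 10_000)   # Landing Page B: 2.3%

print(f"Lift of B over A: {relative_lift(rate_a, rate_b):.0%}")  # → 77%
```

Notice that nothing in this calculation captures *how* Page B earned those conversions, which is the whole problem.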
Think Barry Bonds. The goal was to hit more home runs. The tactics were irrelevant.
The test is incomplete because there is no context applied to the data. The question is WHY does Landing Page B convert better? What tactics are being used to increase conversion?
Get context through analysis
Here is an example analysis for A/B Test 1:
Data Analysis: Landing Page B offers our free service for 30 days. After the 30 days, they are automatically converted to our paid service. The language used on the page does not clearly notify site visitors that they will be automatically added to our paid service.
Landing Page A clearly states that the free trial expires after 30 days and requires site visitors to opt in before they are charged, which likely results in fewer paid conversions.
Be careful here
Let’s look at the same example A/B test, with a different goal.
Example A/B Test 2:
Goal: Convert landing page visitors into lifetime customers
Data: 87% of customers from Landing Page B cancel within three months of subscribing. 92% of customers from Landing Page A are still customers after one year.
Action: Use Landing Page A
Data Analysis: The deceptive nature of Landing Page B causes a high level of early cancellations.
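The same data can be re-scored against the lifetime-customer goal. In this sketch, the retention figures (92% of Page A customers active after a year; at most 13% of Page B customers, since 87% cancel within three months) come from the example above, while the conversion rates and traffic are hypothetical, carried over from the earlier illustration:

```python
# Re-evaluate the test against the goal "create lifetime customers":
# count customers from each page still subscribed a year later.
# Conversion rates and traffic are hypothetical; retention rates are
# taken from the article's example (13% for B is an upper bound,
# since 87% cancel within just three months).

def active_after_one_year(visitors: int, conv_rate: float,
                          retention_rate: float) -> float:
    """Expected customers from this page still subscribed after a year."""
    return visitors * conv_rate * retention_rate

visitors = 10_000
page_a = active_after_one_year(visitors, 0.013, 0.92)  # hypothetical 1.3% conversion
page_b = active_after_one_year(visitors, 0.023, 0.13)  # hypothetical 2.3% conversion

print(f"Page A: {page_a:.0f} lasting customers; Page B: {page_b:.0f}")
```

Even with Page B converting far more visitors up front, Page A ends the year with roughly four times as many paying customers under these assumptions, which is why the second test reverses the decision.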
Where is the disconnect?
In our second example, a different goal led to a different action than in our first.
The reason? The deceptive nature of Landing Page B was not congruent with the more specific goal of creating lifetime customers.
The disconnect lies in the goals communicated by strategists and acted upon by tacticians.
Landing Page B may not meet the following goals either:
- Build and maintain a strong reputation in our industry
- Create a recurring revenue stream through subscriptions
- Build customer evangelists that grow our referral network
Make more money? Sure, we can do that. Hit more home runs? No problem.
The goal drives our tactics and eventually our decisions.
Don’t judge me
I’m no Barney Fife. I didn’t just fall off the turnip truck.
I know that many will choose Landing Page B every day and twice on Sunday. The goal may be that simple — make more money. Reputation, lifetime value and customer evangelists … pfft.
But, for those that are in this for the long haul, beware. Your goals will drive the tactics that are used to achieve them.
Would Barry Bonds have done things differently if he had known what it would do to his reputation? We can’t know for sure. The allure of “hitting the numbers” can be irresistible.
Business goals inform our decisions. This includes the UI decisions made from A/B test data. Communicate clear goals to everyone making decisions on a strategic and tactical level. Evaluate not only numbers but the context surrounding those numbers.
That is unless you want to be the Barry Bonds of your industry.