Two men walk into a bar.
One of them is a marketer, and the other is a ‘growth hacker’…
The growth hacker says to the marketer that their sales page is doomed because the colour of the ‘Buy Now’ button is red instead of orange.
The marketer laughs this off and tells the growth hacker he’s crazy. ‘We’ve carefully crafted our offer, our sales copy and our messaging. The colour of a button is not going to change anything.’
So: Which one of them is right?
The point of this neat little parable is that we can’t know for sure which of these characters is right. Only testing the hypothesis and collecting the data can reveal who’s truly correct.
In the world of A/B or ‘split’ testing, HiPPOs no longer rule. That is to say, the ‘Highest Paid Person’s Opinion’ is only as good as the data used to back it up. Otherwise, the only thing that matters is that, like Dorothy following the yellow brick road, marketers follow the data.
What is Split Testing?
A/B testing, as the name indicates, is a method of optimisation that pits two versions of the same thing — for example, a landing page, a sales page, an email campaign, a website, a PDF opt-in or an advertisement aimed at the same audience — against each other to see which one ‘converts’ higher or elicits a greater response, be it via shares, clicks or actual purchases.
The reasons ‘why’ a user takes one action over another on two different versions of the same thing don’t matter here.
All that matters is that, once the ‘testing period’ is over, a clear winner emerges, and this winner — with its combination of copy, design and functionality — is what gets used as the final or ‘live’ version.
At its core, A/B testing measures user interactions with a page, product, service or offer. Data, in this case, is both the product and the tool: it can be used to create better products, which in turn generate even more actionable data from which businesses can gain even deeper insights.
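Stripped to its mechanics, an A/B test is just a random 50/50 split of traffic plus a per-variant tally of conversions. Here is a minimal, purely hypothetical Python simulation (the variant names and conversion rates are invented for illustration):

```python
import random

def assign_variant(rng: random.Random) -> str:
    """Randomly split traffic 50/50 between the two versions."""
    return "A" if rng.random() < 0.5 else "B"

def run_test(num_visitors: int, conversion_rates: dict, seed: int = 42) -> dict:
    """Simulate a test period and return visits and conversions per variant."""
    rng = random.Random(seed)
    results = {"A": {"visits": 0, "conversions": 0},
               "B": {"visits": 0, "conversions": 0}}
    for _ in range(num_visitors):
        variant = assign_variant(rng)
        results[variant]["visits"] += 1
        # Whether this visitor converts on the version they were shown.
        if rng.random() < conversion_rates[variant]:
            results[variant]["conversions"] += 1
    return results

# Hypothetical example: variant B converts slightly better than variant A.
results = run_test(10_000, {"A": 0.03, "B": 0.04})
```

In a real test the ‘conversion’ would be an actual click, sign-up or purchase rather than a simulated coin flip, but the bookkeeping is the same.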
Iterative, responsive and granular: That is split testing done right. And its use and efficacy? The numbers speak for themselves:
- Businesses with over 40 landing pages generated a whopping 12 times more leads than those with 1-5 landing pages. (Hubspot)
- President Obama raised an additional $60 million using A/B testing (Event360)
- A/B testing is the most used testing method for improving conversion rates (Steelhouse)
When A/B testing is in action, it measures the impact of differences. And, as you can see, those differences can be stark. Though it does take some time to run, the results can help marketers and optimisers make changes in a more agile manner.
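How does a ‘clear winner’ emerge from those measured differences? One common approach (a standard statistical technique, not something specific to this article) is a two-proportion z-test, sketched below; the visit and conversion counts are invented for illustration:

```python
import math

def z_score(conv_a: int, visits_a: int, conv_b: int, visits_b: int) -> float:
    """Two-proportion z-test: how surprising is the gap between the two
    conversion rates if the variants actually performed the same?"""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled conversion rate under the "no real difference" hypothesis.
    p = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(p * (1 - p) * (1 / visits_a + 1 / visits_b))
    return (p_b - p_a) / se

# |z| > 1.96 roughly corresponds to 95% confidence that the
# difference is real rather than random noise.
z = z_score(conv_a=150, visits_a=5000, conv_b=200, visits_b=5000)
significant = abs(z) > 1.96
```

If |z| stays below the threshold, the honest conclusion is that the test hasn’t yet demonstrated a real difference, and the test period should continue.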
Using Split Testing in Audience Targeting
Before the era of digital marketing, ‘conversion specialists’ were known as ‘direct response advertisers’. Running creatives against each other is not a new concept. In fact, David Ogilvy himself says,
Never stop testing, and your advertising will never stop improving — David Ogilvy
On Facebook’s ads platform, split testing is not only common practice but is now so prevalent that it has become a self-serve tool, built as standard into ad creation.
The structure of a Facebook ads campaign allows users to split one ad multiple ways, testing for variables across:
- Creative (image used and copy options)
- Delivery optimisation
- Audience targeting
The two most useful variables to test here, and the most likely to give advertisers a noticeable impact, are ‘Creative’ and ‘Audience’, especially when it comes to Facebook ads marketing.
Because of the way Facebook Ads is set up, it will automatically show less and less of a lower-performing ad, self-selecting the higher-performing ad early in your test period.
This means it’s important for advertisers to get their ‘Audience’ targets right and test only one interest or behaviour at a time, so they can set up multiple campaigns with a focused audience target down the line.
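The one-variable-at-a-time rule above can be sketched in code: each test campaign copies a control ad and changes exactly one field, so any difference in performance can be attributed to that single change. The field names below are purely illustrative, not the Facebook Ads API:

```python
# Hypothetical control ad; field names are invented for illustration.
control = {
    "creative": "pool-photo-v1",
    "audience_interest": "home improvement",
    "optimisation": "link clicks",
}

def make_variant(control: dict, field: str, new_value: str) -> dict:
    """Copy the control ad and change exactly one variable."""
    variant = dict(control)
    variant[field] = new_value
    return variant

# Two separate tests, each isolating one variable:
creative_test = make_variant(control, "creative", "pool-photo-v2")
audience_test = make_variant(control, "audience_interest", "landscaping")
```

Because each variant differs from the control in only one field, a winning variant tells you exactly which change mattered.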
After an A/B test, these audiences can then be saved and re-used for retargeting and building a sales funnel that begins its first leg with a Facebook ad.
A/B testing with saved audiences can also be used to test another advert against the same audience, giving a sense of whether the product, service or offering resonates with the forecasted audience.
And, finally, A/B testing can help create something Facebook calls a ‘Lookalike’ audience: capture those who clicked through an advertisement and, using Facebook’s platform, create an audience that is very similar in behaviours and demographics, yet completely fresh.
This is an excellent way to get an entirely new crop of eyes on your ad without any extra effort in digital marketing.
Using Split Testing to Predict Behaviour
This strategy of A/B testing focuses on behavioural targeting, not simply conversion and click-through rates.
Predicting behaviour gets easier the longer your user or visitor remains on your website, takes multiple actions and takes advantage of the multiple offers you might have posted.
For example, a blog post of particular interest to one kind of ‘ideal visitor’ might draw that visitor in, and they may end up downloading an offer you’ve embedded in the post as a content upgrade.
Why is this important? Because it tells you that your content upgrades are working. It may also tell you which content upgrades are working. And, as you zoom out, you can tell where your visitors are coming from and what they’re clicking through first, next and last.
A pattern may soon start to emerge as more than one user clicks through the same ‘route’ or takes the same actions. The results of an A/B test on a website will reveal a number of things, including what parts of your homepage and website need further refinement so that you’re guiding your users where you want them to go (i.e. user experience), rather than risking having them be overwhelmed, confused, annoyed or, worst of all, leave.
It can also show which parts are most popular (in which case, you can begin to brainstorm how to ‘give the people more of what they want!’ so to speak).
Using A/B testing on a homepage to track behavioural actions requires a smart hook-up between Google Analytics and Google Tag Manager. It’s beyond the scope of this post to get into the details, but the rough idea works like this:
- A business owner or online entrepreneur hoping to make sales through their website sets up a regular home page with a menu bar and makes sure the site is being tracked by Google Analytics
- The site/business owner also sets up custom tracking scripts using Google Tag Manager
- GTM can be set up so that a ‘trigger’ fires once an ‘event’ (a particular action) is taken; the event data is then sent to GA and reported on the dashboard.
The interesting thing is that sales and marketing teams can use this tracked behaviour over time, especially with returning users who have been assigned a ‘UserID’.
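As a rough sketch of what can be done with that tracked behaviour, the snippet below rebuilds each user’s ordered ‘route’ through a site from event rows of the kind an analytics export might contain. The user IDs, timestamps and action names are invented for illustration:

```python
from collections import defaultdict

# Hypothetical event rows: (user ID, timestamp, action taken).
events = [
    ("user-1", 10, "view_homepage"),
    ("user-2", 11, "view_homepage"),
    ("user-1", 12, "click_blog_post"),
    ("user-1", 15, "download_content_upgrade"),
    ("user-2", 16, "click_pricing"),
]

def build_routes(events):
    """Group events by user and order them by time, giving each
    returning user's 'route' through the site."""
    routes = defaultdict(list)
    for user_id, timestamp, action in sorted(events, key=lambda e: e[1]):
        routes[user_id].append(action)
    return dict(routes)

routes = build_routes(events)
# routes["user-1"] is the ordered path:
# ['view_homepage', 'click_blog_post', 'download_content_upgrade']
```

Routes like these are exactly the patterns the previous section describes: shared sequences of actions that start to reveal distinct user personas.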
By understanding clearly the motivation of the buyer, the preference they have to gather information and make decisions, you can use that information to create the A/B testing scenarios that test your customer experience hypothesis and refine your understanding of the buyer. — Carole Mahoney, CEO, MiM
So what about predicting that behaviour?
Well, over time, the tracked behavioural data will help marketers and website owners put together a very distinct picture of the user’s experience of the site, as well as segment users and use their actions to build distinct ‘user personas’.
You’ll be able to tell what users value and which actions they are most likely to take next.
As a side benefit, you’ll also be able to tell which visitors aren’t in your target audience and, based on that, how your marketing messages can be shifted, refined or rethought.
Using Split Testing to Transition Online
This last example brings together the power of Facebook Ads marketing along with external landing page software tracking and Google Analytics to help physical, brick-and-mortar businesses start to build an online presence and lead generation funnel.
A/B testing is the perfect way to build a campaign that is going to be evergreen and there is no more evergreen marketing goal than generating qualified prospective buyers.
Let’s take a look at how this might work.
A pool-and-landscaping business that has been running locally for a while decides that their efforts would be more cost-effective if they ‘automated’ and ‘streamlined’ using digital marketing practices.
Their goal is to build an ‘email list’ of subscribers who, essentially, will also be a ready-made list of potential buyers after a series of interactions (which would be the job of content marketing — a tale for another post!).
At this point, the company is not sure about user personas — they’re still learning. They also are not sure if their web design actually serves their business. They’re looking for data on both.
What’s a business to do with these three goals? Simple. Create a lead-generation pipeline that starts with a Facebook ad and redirects the user to a landing page, where they can download a lead magnet in exchange for their email (which the business can now use for remarketing purposes).
Using A/B testing, users will be sent to two different versions of the landing page, two different versions of the homepage and will be allowed to download two different opt-ins: One regarding pools, and one regarding finished basements.
One landing page, in one version of the ‘test’, may include a link to the website while the other, for example, may not. Once a user is on the website, what actions are being taken? This is where Google Analytics comes in.
This A/B test can also reveal whether their opt-in was in alignment with what they might purchase. Let’s say a user downloaded the opt-in for pools because that was the ad shown. But, once on the website, this user might skip the menu information for pools and head straight for basement finishing services.
Conclusions drawn from the incoming data results of an A/B test not only help to give a physical business a digital footprint, but they can also start to reveal even more potent and unintended information.
In this example, if a majority of users ended up clicking on information for basement finishing, this physical business might have to re-think their offerings — if not in ‘real life’ then, at least, online.
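A sketch of how such an unintended signal might be surfaced from the tracked data: flag users whose dominant browsing topic differs from the opt-in they downloaded. The session records below are invented for illustration:

```python
# Hypothetical session records: which opt-in each user downloaded
# versus which service pages they actually browsed.
sessions = [
    {"user": "u1", "opt_in": "pools", "pages_viewed": ["pools", "basements", "basements"]},
    {"user": "u2", "opt_in": "pools", "pages_viewed": ["pools", "pools"]},
    {"user": "u3", "opt_in": "pools", "pages_viewed": ["basements"]},
]

def most_viewed_topic(pages):
    """The topic this user browsed most often."""
    return max(set(pages), key=pages.count)

def mismatched_users(sessions):
    """Users whose dominant browsing topic differs from their opt-in."""
    return [s["user"] for s in sessions
            if most_viewed_topic(s["pages_viewed"]) != s["opt_in"]]

flagged = mismatched_users(sessions)  # u1 and u3 lean towards basements
```

If the flagged list grows large, that is exactly the kind of data-driven nudge to re-think the online offering described above.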
A/B tests are incredibly useful and, when hooked up to other aspects of digital marketing practices, can be very powerful. While Big Data is making a major push and bigger businesses are certainly pivoting, there’s no reason why a well-run, strategic A/B test in any of these scenarios would fall short of the kind of insight Big Data might promise.
Now that you’re a believer, there’s only one note of caution to leave you with: A/B tests work best when they have a definite goal, an actual question and a hypothesis that follows.
However, the caveat is that A/B testing can be used to further test the results of a previous, more general A/B test.
No matter what you choose, A/B testing is now a true tool in your arsenal.
ABOUT THE AUTHOR
Matt Williamson is the Company Director of Digital Marketing Adelaide. He keeps busy doing what he enjoys most – talking. He has a passion for social media and all things digital marketing.
Matt is an internet entrepreneur, social media expert, multiple business owner and digital marketer who specialises in assisting businesses to generate more revenue via the power of online advertising and digital marketing. Matt regularly speaks at events as a keynote speaker focusing on digital marketing & social media.
View Matt’s LinkedIn: https://www.linkedin.com/in/matthewdjwilliamson