A 5-minute read.
If there’s one thing you need to know about good marketing practice, it’s that good marketers test their work. If you’re a business owner or marketer, you should be testing everything that goes in front of a user. Whether you’re looking to increase revenue, sign-ups, conversions, social shares, or engagement, A/B testing and optimization can help you achieve the best results. There’s a running joke in the marketing world that A/B actually stands for “Always Be Testing.” It’s a great reminder that you can’t get stellar results unless you compare one strategy to another, and A/B testing examples can help you visualize the possibilities.
You wouldn’t invent a product and send it out into the world without asking a few friends if it was a good idea, right?
You shouldn’t assume that you know what will catch a buyer’s attention or what will make the most people fill out your sign-up form. That’s why data-driven decisions are the most effective.
So, what’s the recipe for high-impact success?
Truthfully, there isn’t one. What works for one business won’t work for another. And vice versa.
But just because you can’t replicate the same result, that doesn’t mean you can’t get inspired by other tests.
Here are 7 high-impact case studies. While the same tests may not get you the same results, they can inspire you to run tests of your own!
In-Depth Case Studies
1. Electronic Arts
The Goal: Increase Revenue
When Electronic Arts released a new version of one of its most popular games, they wanted to get it right. The homepage for SimCity 5 would undoubtedly do well in terms of sales; however, EA wanted to capitalize on its popularity.
EA wanted to maximize its revenue from the game immediately upon release as well as through pre-sale efforts, so they A/B tested different versions of the sales page to identify what would drive more purchases.
The original version of the pre-order page offered 20 percent off a future purchase to anyone who bought SimCity 5, displayed as a banner across the top of the page. But according to the team, the promotion was not driving the increase in pre-orders they had expected.
The variation eliminated the pre-order incentive. The test led to a shocking result: the variation with no offer messaging at all drove 43% more purchases! Fans of SimCity 5 weren’t interested in an incentive; they just wanted to buy the game.
The A/B test revealed important information for EA: many people who play a popular game like SimCity don’t play any other games, so the 20-percent-off offer didn’t resonate with them.
If you make assumptions about your target audience, you’ll eventually get it wrong. Human behaviour is difficult to predict, and without hard data you’re only guessing; you need A/B testing to generate data you can rely on.
Most people believe that direct promotions drive purchases, but for EA this turned out to be false. Testing gave them the information needed to maximize revenue in a way that would not have otherwise been possible.
2. Wallmonkeys
The Goal: Optimize Homepage
Wallmonkeys, a company that sells amazing wall decals for homes and businesses, wanted to optimize its home page for clicks and conversions to ultimately drive more sales.
The original homepage featured a stock-style image with a headline overlay. There was nothing wrong with this: the image is attractive without being too distracting, and the headline and CTA align well with the company’s goals.
First, Wallmonkeys used heat-maps to see where users were navigating on the homepage. Heat-maps and scroll-maps let you decide where to focus your energy: if you see lots of clicking or scrolling in certain places, you know people are drawn to those parts of your website.
As you can see above, there was lots of activity on the headline, CTA, and search bar.
After generating the user behaviour reports, Wallmonkeys decided to run an A/B test. They swapped the stock-style image for an alternative that showed visitors what they could do with the company’s products.
The new design’s conversion rate was 27% higher than the original’s.
However, they wanted to keep testing. For the next test, they replaced the headline with a prominent search bar. The idea was that customers would be more drawn to items they were specifically interested in.
The second A/B test resulted in a conversion rate increase of 550%.
By not stopping after the first 27% increase, the company unlocked greater profit potential and a better user experience for its visitors.
Just because one A/B test yields an improvement doesn’t mean you can’t do better. Wallmonkeys realized that and launched a second test. It proved to be super successful!
3. comScore
The Goal: Generate More Leads
comScore wanted to run an experiment on its product pages to generate more leads. The page featured a customer quote, and the team wanted to test different presentations of that quote.
The original layout featured the quote mixed among other content and displayed on a less-than-eye-catching grey background.
The team experimented with several different designs, orientations and layouts to see if a different visual treatment would make their social proof convert more visitors into leads.
The experiment ran with 2,500 visitors, and variation 1 soon emerged as the winner. A vertical layout with the client logo displayed prominently above the testimonial increased the product page’s conversion rate by 69% compared to the original.
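A lift like that only counts if the sample is large enough to rule out chance. As a rough sketch (the visitor splits and conversion counts below are hypothetical, not comScore’s actual figures), a standard two-proportion z-test in plain Python shows how you can check whether an observed difference in conversion rates is statistically significant before declaring a winner:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / n_a: conversions and visitors for the control,
    conv_b / n_b: conversions and visitors for the variation.
    Returns the z-score and the two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 2,500 visitors split evenly, control converts at ~5%,
# variation at ~8.5% (roughly a 69% relative lift)
z, p = two_proportion_z_test(63, 1250, 106, 1250)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Most A/B testing tools run a check like this for you behind the scenes, but it’s worth understanding what “significant” means: a big percentage lift on a tiny sample can still be noise.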
Quick Case Studies
Moveexa tweaked their headline to add the word “supplement” and increased conversions by 90%!
Highrise tested different headline and subheadline combinations to see how they affected sign-ups. The variation telling visitors that signing up is quick (the one labelled test) resulted in a 30% increase in clicks.
Humana tested two different banners on their home page. The original was cluttered with a ton of copy and a less noticeable call-to-action. The variation was cleaner, with a strong CTA. The variation led to 433% more clickthroughs.
RummyCircle wanted to test how differently worded mobile Facebook ads affected sign-ups. In previous desktop testing, the team had found that engaged users – those who commented on the ad – were more likely to click through and sign up, so commenting facilitated lead generation. On mobile, however, the test showed that asking users for comments actually decreased conversions: the second version saw a 225% increase in sign-ups.
These companies saw amazing results because they started testing! If you want to see amazing results, you’ve got to get started too. For more information and help on A/B testing, be sure to contact us 🙂
Be sure to subscribe to our mailing list to get free tips and more (like blog updates) delivered directly to your inbox. You can subscribe by clicking here.
Thanks for reading and if you have any questions, email us or drop a line below 🙂 We look forward to hearing from you!