A/B testing: a surefire way to boost traffic on your website

Canvs Editorial · Published in UX Collective · 7 min read · Jul 20, 2021

If you are on the internet, you are participating in A/B testing, probably without even knowing it.

Let’s talk about the most common example we observe every day. Whenever you log into Netflix, the featured show on the home page is always different. I remember watching a travel show primarily because it showed up on my home page and looked quite aesthetic. There have also been instances where I watched a show because the thumbnail caught my attention.

Well, that is not by chance. It is part of a well-structured Netflix strategy: showing different versions of the same element to different users.

This is A/B testing.

The basic idea of A/B testing is to present different versions of content to different users, gain insight into which version performs better, and use those results to improve the user experience. The performance metric could be traffic, conversions, engagement, and so on.

It is a decision-making tool and one of the most reliable ways to learn about user behaviour.
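Conceptually, the mechanics are simple: assign each user to a variant deterministically, then compare the chosen metric across the two groups. A minimal sketch in Python (the function names and the conversion numbers are illustrative, not any specific tool's API):

```python
import hashlib

# Deterministic bucketing: hashing the user and experiment IDs together
# gives each user a stable variant, so they always see the same version.
def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def conversion_rate(conversions: int, visitors: int) -> float:
    return conversions / visitors if visitors else 0.0

# Compare the two versions on the chosen metric (here, conversion rate).
# The counts below are made-up numbers for illustration.
rate_a = conversion_rate(120, 2400)   # control
rate_b = conversion_rate(156, 2400)   # variant
winner = "B" if rate_b > rate_a else "A"
```

Real platforms add statistical significance checks on top of this comparison, but the split-measure-compare loop is the core of every A/B test.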

If leveraged correctly, it can move mountains. For instance, Unbounce increased the click-through rate by 90% by just changing “your” to “my” on the CTA button copy: “Start your free 30-day trial” to “Start my free 30-day trial”.

A designer can test a myriad of design elements ranging from imagery to copy on CTA buttons.

Check out this interesting story of how Google tested 41 shades of blue to see which got the most clicks.

Let’s look at a few examples of how making simple changes to digital products has led to dramatic improvements.

1. A/B testing at Netflix:

Featured show on the Netflix home page (Source)

Netflix hypothesised that personalising the content displayed on the home page would result in more people retaining their subscriptions. Netflix tests different variations of an experience on subsets of its users, keeping one variation as the control. It iteratively improves the experience by making the winning variation the control in the next series of tests.

“Thinking of product development as a series of experiments leads to stronger designs.”

Different artworks for a Netflix show (Source)

According to Netflix, a typical member first glances at a show’s artwork, then at its other details, before diving in to watch. So Netflix decided to experiment with different artworks for the same show to see which one increased the click-through rate. This helped them identify artwork that enabled members to find a story they wanted to watch faster and engage more with each title.

Moreover, the featured show on Netflix’s home page changes every time you log in, log out, or switch users. These changes are a part of deep research and experimentation by Netflix to get you to watch more.

2. A/B testing article headlines:

Buzzfeed articles (Source)

Many media agencies, news websites, and other digital products A/B test certain elements of the content they put out.

Buzzfeed, known for its pop-culture content, brings in millions of global readers. To ensure such high levels of engagement each month, Buzzfeed’s editors make sure that every element of the article catches the viewer’s attention.

“We A/B test pretty much everything. Not just the headline but also the number (listicles) in the headline. The thumbnail that’s rendered on Facebook or Twitter: Sometimes people don’t even see the headline; they just see an image and say, ‘Wait. What is in that image?’” — Menendez, former data infrastructure engineer at Buzzfeed

3. A/B testing email campaigns:

Adding the subscriber’s name to the subject line increases email opens by 14%. (Source)

A/B testing an email campaign means sending different versions of an email to subsets of subscribers to see which version brings in better results.

The variations can be chosen from a myriad of options: changing the subject line, adjusting the content length, or using a personal sender name rather than a company name.

Even the tiniest of changes can garner better results, like changing the word order or adding the receiver’s name to the subject line.

Consider these two subject lines for the same email:

  • Use this discount code to get 25% off your next purchase
  • Get 25% off your next purchase using this discount code

The second subject line works better because it puts the benefit at the front of the sentence.
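In practice, an email tool sends each version to a random half of a small test group and then rolls out the winner to everyone else. A rough sketch of that split (the addresses, seed, and open counts are all made up for illustration):

```python
import random

# Build a hypothetical test group and shuffle it with a fixed seed so
# the split is reproducible.
subscribers = [f"user{i}@example.com" for i in range(1000)]
rng = random.Random(42)
shuffled = subscribers[:]
rng.shuffle(shuffled)

half = len(shuffled) // 2
group_a = shuffled[:half]   # gets "Use this discount code to get 25% off..."
group_b = shuffled[half:]   # gets "Get 25% off your next purchase..."

def open_rate(opens: int, sent: int) -> float:
    return opens / sent

# Suppose the email platform reports these opens (illustrative numbers):
rate_a = open_rate(140, len(group_a))
rate_b = open_rate(181, len(group_b))
winner = "B" if rate_b > rate_a else "A"
```

The random shuffle matters: splitting alphabetically or by signup date could bias one group toward a different kind of subscriber and skew the result.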

4. A/B testing landing pages:

A/B testing a landing page (Source)

Landing pages are one of the most crucial parts of a conversion funnel. With only a limited time to make a good first impression, it’s essential to optimise the landing page.

Changes to many elements can directly affect conversions, like changing a word in the headline, or its size or placement. Other such elements include the copy, the CTA, sign-up forms, and the page’s overall layout.

Here’s a customer story about how tweaking the CTA button design increased the CTR by 20%-47%.

5. A/B testing YouTube thumbnails:

YouTube thumbnail artwork (Source: Anna Yashina on Dribbble)

One of the most advanced ways to improve a YouTube channel is A/B testing. Creators use it for everything from choosing the right thumbnail to selecting suitable video titles, tags, targeted keywords, and descriptions.

As of now, YouTube doesn’t offer a built-in A/B testing feature, but TubeBuddy comes to the rescue. Creators can use it to analyse which version gets higher traffic and a higher CTR.

Here’s a detailed video of how one can earn more views by A/B testing thumbnails.

A/B testing is an excellent tool to grow a YouTube channel as it allows one to make data-driven decisions.

Here are a few other case studies about products that benefited from optimisation through A/B testing:

  1. How Discovery Channel saw a 6% lift in click-throughs into its video content.
  2. How Zalora increased its checkout rate by 12.3% by optimising its product pages.
  3. How Obama raised an extra $60 million in the 2007 U.S. election campaign just by changing his CTA button text from “Sign Up” to “Learn More”.

The downsides of A/B testing:

1. It takes a lot of time and resources:

Developing a hypothesis demands a lot of background research, data collection, surveys, trend analysis, and much more.

Once that’s done, it takes considerable effort to narrow the options down to just two variants. Moreover, since the test requires different versions of the same thing, designers and developers must work twice as hard and use more time and resources.

2. It’s challenging for low-traffic websites:

A/B tests work fine for larger companies like Amazon, Netflix, and Google, which have millions of users to test on. A higher number of users helps gather insights faster, whereas a low-traffic website can take weeks to see any significant result.
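A back-of-the-envelope sample-size calculation shows why. Using the standard two-proportion formula at 95% confidence and 80% power (the baseline and lift below are illustrative numbers, and the z-values are hard-coded rather than looked up):

```python
import math

# Rough sample size per variant needed to detect a given relative lift
# over a baseline conversion rate (95% confidence, 80% power).
def sample_size_per_variant(baseline: float, lift: float) -> int:
    p1 = baseline
    p2 = baseline * (1 + lift)          # e.g. a 10% relative lift
    z_alpha, z_beta = 1.96, 0.84        # two-sided 5% alpha, 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a 10% relative lift on a 3% baseline takes tens of thousands
# of visitors per variant, which is weeks of traffic for a small site.
n = sample_size_per_variant(0.03, 0.10)
```

Note that the required sample shrinks as the baseline rate or the expected lift grows, which is exactly why high-traffic products with large effects can conclude tests in days while small sites cannot.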

3. It can be risky for financial products:

It can be a risky game to A/B test certain design elements on a financial service product. These products cannot afford to experiment on certain users, since real people’s money is at stake.

4. It only refines the existing designs:

A/B testing works only for refining existing designs, not for building products from scratch. So make sure you wield this tool like a scalpel, not an axe.

Bottom line:

Underhanded strategies can show up in any design form, and a designer needs to steer clear of them. Testing content that tricks users into buying something they didn’t intend to may produce a higher conversion rate, but that success would be short-lived and wouldn’t reap long-term results.

The result of an A/B test should be a win-win: an enhanced experience for users, and increased conversions and traffic for the brand.

Conclusion:

In a world where data is king, A/B testing is a powerful tool to gain valuable insight into your business and how to improve it.

Leveraging A/B testing can help you improve your website, increase its conversion rate, and learn more about how your customers behave online.

Let the results speak for themselves.

The Canvs Editorial team comprises Paridhi Agrawal (Editorial Writer and Researcher), Aalhad Joshi and Debprotim Roy (the Editor’s Desk), and Abin Rajan (Content Operations).

Follow Canvs on Instagram and LinkedIn. Don’t forget to follow us here on Medium as well for more design-related content.

The UX Collective donates US$1 for each article we publish. This story contributed to World-Class Designer School: a college-level, tuition-free design school focused on preparing young and talented African designers for the local and international digital product market. Build the design community you believe in.
