How A/B testing can transform your B2B marketing


Recently, we brought you a recap of a CRAVE session about benchmarks and analytics by Movéo’s director of data and insights, Jiani Zhang. In a follow-up session, she presented an in-depth look at the power of A/B testing. Here are some of the key takeaways:

Types of A/B testing

There are two forms of A/B testing: gradual and radical. Gradual A/B testing changes only one element at a time so that you can definitively attribute any differences between the two to that variable. If you are testing two differently shaped call-to-action buttons, you would change only the shape of the button, not the color, text, placement or anything else.

In radical A/B testing, many elements are changed at once, and the whole is treated as one variable. For example, two dramatically different versions of a landing page, with different images, copy and form design may be tested against each other.

While radical testing cannot identify the effect of each individual change, it is often a good choice for B2B marketers. It has the advantage of testing larger changes quickly, and it is especially useful when it is difficult to gather enough impressions for a statistically significant result. Before applying the results of a radical test to your entire project, consider repeating it in a few places to confirm the results. If possible, run a few gradual tests on specific elements as well.

Why perform A/B testing?

A/B testing compares two variants of a given landing page, email or digital ad to determine which one performs better based on a specific metric or metrics. It’s an efficient method to optimize campaigns and to gain understanding of the elements that drive success.

Customer journeys are enormously complex in the digital sphere, and A/B testing can pinpoint aspects of a campaign as small as the text on one call-to-action button or the color choices in a banner ad that draw customers in or push them away. Ultimately, the increased understanding of customers’ values and behaviors offered by A/B testing can help you develop more relevant content, build better user experiences and improve ROI.

For the most effective A/B test, perform qualitative research first to determine which elements to test. Then, ensure that you are tracking results in a way that is clear and easy to measure. Make sure that you collect data in the same manner for each version you test to streamline analysis.

What can be tested?

All forms of A/B testing compare two versions of the same element on a certain channel. You can test calls-to-action, copy, forms, videos, content display, site navigation and more. Even within those categories, there are numerous things to test. Try these on for size:

    • Calls-to-action: text, colors, sizes, shapes, locations of images
    • Copy: headlines, paragraphs vs. bullet points, shorter vs. longer copy
    • Forms: length of form, inclusion of special offer
    • Videos: autoplay vs. click-to-play
    • Site navigation: style of menu, order of menu options


For test results to be valid, they must be based on an appropriate sample size, measured in number of impressions. The minimum and ideal sample sizes vary with the expected conversion rate, so take that into account when planning your test.
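To make the sample-size point concrete, here is a minimal sketch of the standard normal-approximation formula for a two-proportion test. The conversion rates below are illustrative assumptions, not figures from the session:

```python
import math

# A hedged sketch: approximate per-variant sample size for an A/B test
# comparing two conversion rates. Uses the common normal-approximation
# formula for a two-proportion test; the rates below are assumptions.

Z_ALPHA = 1.96   # two-sided significance level of 0.05
Z_BETA = 0.8416  # statistical power of 0.80

def sample_size_per_variant(p1: float, p2: float) -> int:
    """Impressions needed in EACH variant to detect a lift from p1 to p2."""
    p_bar = (p1 + p2) / 2
    numerator = (Z_ALPHA * math.sqrt(2 * p_bar * (1 - p_bar))
                 + Z_BETA * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Example: detecting a lift from a 5% to a 6% conversion rate
print(sample_size_per_variant(0.05, 0.06))
```

Note how quickly the required sample grows as the expected lift shrinks: detecting a one-point lift at a 5% baseline takes thousands of impressions per variant, which is exactly why low-traffic B2B pages often favor radical tests over gradual ones.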

Case study: Movéo for Molex

While working on the Chinese version of Molex’s website, Movéo helped design and execute an A/B test on a landing page with a form submission for a gated report. In this radical A/B test, the original version of the landing page offered several calls-to-action for visitors to engage with, including one that led to a form to download the gated content. The variant landing page focused entirely on that single content offering, presenting the form and call-to-action.

The test focused on two metrics: bounce rate and lead conversion rate. In the test, the original landing page demonstrated a lower bounce rate but also a lower rate of form submissions than the variant. While it may at first seem that the test ended in a tie, further statistical analysis revealed that the difference in bounce rates was statistically significant while the difference in form completions was not. Since it kept more visitors engaged on the site, the original landing page was therefore considered to be the winner.
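The significance check described above can be sketched with a pooled two-proportion z-test. The visit and bounce counts below are hypothetical placeholders, not data from the Molex test:

```python
import math

# A hedged sketch: a pooled two-proportion z-test, the kind of analysis
# that could determine whether a difference in bounce rates between two
# landing-page versions is statistically significant. All counts below
# are hypothetical, not data from the Molex test.

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Return (z statistic, two-sided p-value) for proportions a vs. b."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical bounces: 2,000 of 5,000 visits vs. 2,200 of 5,000 visits
z, p = two_proportion_z_test(2000, 5000, 2200, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05, so this gap would be significant
```

A difference that looks meaningful on a dashboard can still fail this test when sample sizes are small, which is why the Molex analysis treated only the bounce-rate gap, and not the form-completion gap, as a real result.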

For more on Movéo’s work with Molex, read our case study.
