Everything we do in marketing is a test. We may be confident in what we promote because historical and current data help us understand what could be successful, but positive ad performance is never a guarantee.

When it comes to testing on LinkedIn Ads, just like on any other digital ad platform, we can either throw spaghetti at the wall and hope something sticks, or we can test systematically (hint: the latter is the better option).

So what should A/B tests look like on LinkedIn Ads? You’ve come to the right blog post. Let’s hit it!


What Does a Proper A/B Test Look Like on LinkedIn Ads?


When we talk about A/B testing on LinkedIn Ads, we’re talking about testing two variations of a single variable in your ads.

This could be a test between two variations of an intro, headline, image, etc., with everything else in the ad kept the same. This makes for a true test because you can directly attribute an ad’s success to the one variable you changed.

For example, say you’re testing two ads that differ in their intros, headlines, and images. If one outperforms the other, it’s hard to conclude which variable (intro, headline, or image) drove the result.

On the other hand, if the only difference between your two ads is the intro text, all else remaining the same, then when one outperforms the other, you can be confident that the intro text is what drove its success.
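
To make that concrete, here’s a minimal sketch in plain Python. These dictionaries and field names are hypothetical, for illustration only (this is not the LinkedIn Ads API), but they show what “change exactly one variable” looks like:

```python
# Hypothetical ad definitions for illustration only; these dictionaries
# are not the LinkedIn Ads API, just a way to show a single-variable test.
base_ad = {
    "headline": "Get Our LinkedIn Ads Benchmark Guide",
    "image": "benchmark-guide.png",
    "offer": "Free benchmark guide",
}

# Each variation copies the base ad and changes exactly one field.
ad_a = {**base_ad, "intro": "Wondering how your LinkedIn Ads stack up?"}
ad_b = {**base_ad, "intro": "See the benchmarks top B2B advertisers hit."}

# Sanity check: the ads differ in exactly one field, so any performance
# gap can be attributed to the intro text.
differing = [key for key in ad_a if ad_a[key] != ad_b[key]]
assert differing == ["intro"]
```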

Note that you don’t necessarily have this luxury when A/B testing different offers. When testing offers, your imagery, intro text, and headlines will likely all differ, but finding out which offer outperforms the other is still a very worthwhile test.

Once you’ve found an offer that works and generates the desired results, you can optimize for better performance by A/B testing other variables. If you need help deciding which offers to start with, here’s our guide on which offers tend to work well on LinkedIn Ads.

So with all this being said, does that mean you’re limited to running only two ads at a time in each campaign? Not necessarily.

You can increase your ads’ Frequency (the average number of times your ad is delivered to a single member of your target audience) to up to five times in a 48-hour period if you run at least five different ads in a single LinkedIn Ads campaign.

However, Frequency caps at five per 48 hours, so running more than five ads in a given campaign won’t increase your Frequency beyond that.

That said, when it comes to A/B testing, you may not want to run that many unique ads anyway. What we’ve found works well is running up to four ads in a given campaign but still only testing two ad variations.

What you’re doing here is essentially testing between two ad variations but duplicating each ad one time, resulting in four ads total in a campaign. This allows you to still A/B test between one variable while increasing your ad Frequency at the same time.
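
If it helps to see that as a rough model, here’s a sketch. The min(ads, 5) rule is our reading of the Frequency behavior described above, not documented LinkedIn delivery logic:

```python
# A rough model of the Frequency behavior described above: the per-member
# delivery ceiling per 48 hours appears to scale with the number of ads
# in the campaign, capped at five. This is our reading of the numbers in
# this post, not documented LinkedIn delivery logic.
def max_frequency_per_48h(num_ads: int) -> int:
    return min(num_ads, 5)

# The recommended setup: two variations (A and B), each duplicated once.
campaign_ads = ["Intro A", "Intro A (copy)", "Intro B", "Intro B (copy)"]

print(max_frequency_per_48h(len(campaign_ads)))  # 4 deliveries per member per 48 hours
print(max_frequency_per_48h(7))                  # still capped at 5
```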


Measuring Results and Concluding Your Test


When it comes to measuring the results of your A/B tests, there are several metrics you can compare. The most relevant are likely to be your click-through rate (CTR) and conversion rate (CvR).

Many of the variables you test at the ad level are likely to affect your CTR. Variables like your intro text, headline, and imagery, for example, may influence whether or not someone clicks on your ad.

Variables like the offer you’re promoting or your landing page experience are likely to affect your CvR.
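
As a quick refresher, here’s how those two metrics are computed, using made-up numbers (we’re defining CvR as conversions per click here, since we’re pairing it with CTR):

```python
# CTR and CvR from raw counts; the numbers are made up for illustration.
impressions = 10_000
clicks = 120
conversions = 18

ctr = clicks / impressions   # click-through rate: clicks per impression
cvr = conversions / clicks   # conversion rate: conversions per click

print(f"CTR: {ctr:.2%}")  # CTR: 1.20%
print(f"CvR: {cvr:.2%}")  # CvR: 15.00%
```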

So when analyzing ad performance, depending on the variables you’re testing, these are the metrics you can measure for success. To know where your ad performance stands, check out this post for a list of LinkedIn Ads benchmarks.

You might also be wondering: how long should I run my test, and how do I know when to conclude it? To answer this, it’s important to understand statistical significance.

In layman’s terms and as it relates to LinkedIn Ads, statistical significance is a mathematical approach to determining whether or not you can say with confidence that one ad is outperforming the other.

To illustrate this, say the ads you’re A/B testing have about 10,000 impressions each. If one ad has 500 clicks and the other has 5,000, you can say with confidence that the ad with 5,000 clicks is the clear winner.

However, if you only have 100 impressions per ad and one ad has 10 clicks while the other has 50, you may see an apparent winner, but you’re analyzing a very small data set. In this case, the next 100 impressions could drastically change the results of your test.

So when analyzing the statistical significance between two ads, it’s important to make sure your data set is large enough to confidently conclude a winner.

Going back to our question, “how do you know when to conclude your test?”, the short answer is: when you have enough data to confidently establish statistical significance between your two ad variations.
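
If you want to check this yourself, one common approach is a two-proportion z-test on the two ads’ CTRs. Here’s a minimal sketch (this is standard statistics, not a LinkedIn Ads feature): the first call reuses the 10,000-impression example above, and the second uses made-up numbers that are too close to call.

```python
# Two-proportion z-test comparing the CTRs of two ads. This is a standard
# statistical test, not a LinkedIn Ads feature; numbers are illustrative.
from math import erf, sqrt

def two_proportion_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Return (z score, two-sided p-value) for the difference in CTR."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled click rate under the null hypothesis that both ads perform equally.
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# The example above: 500 vs. 5,000 clicks on 10,000 impressions each.
z, p = two_proportion_z_test(500, 10_000, 5_000, 10_000)
print(f"z = {z:.1f}, p = {p:.3f}")  # huge |z|, p ~ 0: a clear, significant winner

# Made-up closer results: 120 vs. 140 clicks on 10,000 impressions each.
z, p = two_proportion_z_test(120, 10_000, 140, 10_000)
print(f"z = {z:.1f}, p = {p:.3f}")  # p ~ 0.21: not significant, keep testing
```

A common rule of thumb is to call a result significant when the p-value falls below 0.05, meaning there’s less than a 5% chance you’d see a gap that large if the two ads actually performed the same.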

Statistical significance is not an easy topic to explain or understand. Thankfully, this podcast episode on LinkedIn Ads testing methodology explains it really well and sheds even more light on how to know when to conclude your LinkedIn Ads tests. Check it out!


How to Decide What to Test


Your offer is the most important aspect of your LinkedIn Ads. If your offer is good, you’ll generate leads at low costs. If not, lead volume will be low and costs will be high.

So, if you have multiple offers, start by testing between those first. As mentioned earlier, once you find an offer that works well, then start testing other ad variables in order to optimize performance.

Once you’ve decided on an offer, the next most important aspect of your LinkedIn Ads is the intro text. People tend to log in to LinkedIn with a specific purpose in mind, not for leisure, which means we as advertisers have only a limited window to capture their attention.

Your messaging needs to be both captivating and compelling in order to see LinkedIn Ads success, so test intros to find the one that’s going to bring you the best results.

After these two variables, you have the freedom to test whatever you’re interested in. Testing offers and intros first is the most efficient way we’ve found to identify early on what our audience responds to, and then optimize from there.


Test, Test, Test!


Rather than taking the “spray and pray” approach, systematic testing on LinkedIn Ads is a sure-fire way to find ads and offers that work. Rarely do we get ads right the first time; it takes patience and constant testing to find the right combination of variables that results in LinkedIn Ads success.

What tests have you tried? What have you found to be most successful? What hasn’t been successful? We want to hear from you, so feel free to leave a comment below!

Also, if you can’t already tell, we really dig this stuff. 😉 In the 11 years we’ve run LinkedIn Ads, we’ve spent $150M+ on the platform, are official LinkedIn Marketing Partners, and have managed some of the largest LinkedIn Ad accounts in the world.

B2Linked increases your lead quality while lowering costs at the same time. Say goodbye to wasted ad spend! If you want to ramp up your LinkedIn Ad efforts, apply to work with our team of experts.

Thanks for reading and happy advertising!


Written by Eric Jones
