Show Resources

Here are the resources we covered in the episode:

AJ’s post describing how the AB testing tool works

Episode on AB testing

Follow AJ on LinkedIn

NEW LinkedIn Learning course about LinkedIn Ads by AJ Wilcox

YouTube Channel

Contact us at with ideas for what you’d like AJ to cover.

A great no-cost way to support us: Rate/Review!

Show Transcript

Ad rotation settings on LinkedIn Ads. Should you rotate your ads evenly to get better data for your AB tests, or leave it on optimized for performance? It’s a trick question, and we’ll cover exactly why on this week’s episode of the LinkedIn Ads Show.

Welcome to the LinkedIn Ads Show. Here’s your host, AJ Wilcox.

Hey there LinkedIn Ads fanatics. As most modern ad platforms do, LinkedIn has an ad rotation setting that at first glance appears very intelligent to use, but it’s not. And we’re going to talk today about why. I’ll even share with you why I call this the “charge me more and show me less” button. In ad rotation, there are two different options. There’s the default, called optimized for performance. And then there’s one called rotate ads evenly. I want to love rotate evenly because we’re always testing. But it is rarely the best option to use, and we’ll cover why. First off in the news, Shae on our team brought to my attention a new rollout called comparison metrics, and I’ve noticed this in many of my accounts since. What it is, is when you choose your time range, you can choose a comparison range as well. So if you’re looking at the last week, it’ll automatically show you the changes from the previous week, along with percent changes next to each metric. So in the screenshot that Shae took for us, I can see that in the past seven days for this client, our spend is down 37%, while visits are only down 23.9%, which shows cost savings. Cost per click was down 18%, and the number of impressions was only down 11.5%. It’s hardly what I would call a game changer, but this kind of data can be really helpful just for understanding what my performance is doing over time, and being able to see that right within the platform. I applaud LinkedIn for coming out with changes like this. I wanted to highlight a couple reviews. We have DenisefromCumbria, and Denise, I’m sorry, I can’t figure out who you are on LinkedIn just searching for Denise and Cumbria. But she said, “At last LinkedIn ads takes the floor. AJ is an expert in the space and is sharing some detailed information here. Episode One is packed with detail”. Denise, thank you so much. 
I sincerely try to pack every episode with detail. So I do hope you’re still a listener; we’d love to give you a shout out there. And if I can find out who you are, I’ll shout you out by your full name and title and all that. Then we have username CGProverb. CG says, “A fantastic source of information, five stars. I recently found this podcast when AJ was interviewed on another marketing podcast I follow. I have to say, after only listening to a few episodes, the content is invaluable. I am hooked. Thanks, AJ, for sharing this with us.” CG, I’m so glad to have you hooked. Thanks for listening. Same goes to you: if you want to reach out to me and let me know who you are, I’d love to shout you out, full name and title and company and all that. Okay, back to the topic at hand. Let’s hit it.

So we get asked a lot about the rotate ads evenly option versus the optimized for performance option. And I first started playing with this feature back in 2011. And like I mentioned in the intro, I really wanted to love it, because I’m always running an AB test. So of course, I would want both of my ads, or maybe even all four of my ads, to be shown evenly, so I can tell which one is getting a better click through rate. As we know with the way that LinkedIn Ads works, when you launch, let’s say, two ads in a campaign, one is probably going to get significantly more impressions than the other. So I wanted to even that out. So I started testing this, and I actually started out testing it at a large scale. And I’m really glad that I was spending hundreds of thousands on this test, because it allowed me to see the impact very quickly. And now I’m sharing this impact with you. Every single time I turned on this option, I noticed the same thing happened: impressions would drop, and this is significant, and my effective cost per click, no matter how I was bidding, would increase significantly. At the time, I had to go to my LinkedIn reps and ask, what is going on here? Why would this be happening? But LinkedIn, in later years, came out and actually changed the definition of what this option is. So if you open up campaign manager, you have to actually go into editing one of your campaigns, and then get to the ad step. And right at the top of the page, as you’re seeing your ads, it’ll say ads in this campaign, and there’s a little cog, a little wheel. And as you click it, you’ll see the two different ad rotation options. The definition for optimized for performance says recommended, and I also recommend this. It says this option delivers impressions to the creatives evenly at first to learn which creative performs best, then more impressions will be delivered to the creatives with the best performance. 
Then when you read the definition for rotate ads evenly, it has a nice clue in it. It says, “this option enters each creative into the auction evenly”, and I’m adding a little bit of emphasis here, “giving the ad a fair opportunity to compete for an impression without taking performance into account”. So if you read into this enough, you’ll understand why this is actually deleterious for your performance. What’s happening is it’s not showing each ad evenly to your audience. What it’s doing is entering each ad evenly into the auction. But the auction is very strict. With the auction, better performance wins, and worse performance gets hammered. Here’s what’s actually happening. Let’s say you have one ad that has a relevancy score of seven, and another one that has a relevancy score of four. Both ads are getting entered into the auction evenly. That’s exactly what it says it’s going to do. But the ad with a relevancy score of seven is going to win more of the auctions it’s entered into, and it’s going to win them at a better rate. So maybe your effective cost per click from this ad is going to be $10. But what about the ad with a relevancy score of four? It’s entered into the auction just as often as the other ad. But because it only has a relevancy score of four, it’s going to lose most of the auctions it’s entered into. And when it does win, you’re going to pay a significant premium, because the ad has a poor relevancy score. So when you run these auctions one after another, over and over, you start to see that you’ve lost a lot of impressions, because you were entering a loser into the auction just as often. And the times when those ads do win the auction, your prices went up, and you had to pay significantly more for them. So this is why I call this option the charge me more and show me less button. These are not great dynamics. And because of this, I don’t recommend this option for most advertisers. 
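The auction math here can be illustrated with a toy simulation. To be clear, this is a hedged sketch, not LinkedIn’s actual auction: the win probabilities and cost-per-click figures below are made-up numbers chosen only to show the direction of the effect, that entering a low-relevancy ad into the auction just as often produces fewer total impressions at a higher blended cost per click.

```python
import random

def simulate(entries_a, entries_b, p_win_a=0.7, p_win_b=0.3,
             cpc_a=10.0, cpc_b=16.0, seed=42):
    """Toy auction model: each entry for an ad wins with that ad's
    (assumed) win probability; each win is billed at that ad's
    (assumed) effective CPC. Returns (impressions, blended CPC)."""
    rng = random.Random(seed)
    wins_a = sum(rng.random() < p_win_a for _ in range(entries_a))
    wins_b = sum(rng.random() < p_win_b for _ in range(entries_b))
    impressions = wins_a + wins_b
    spend = wins_a * cpc_a + wins_b * cpc_b
    return impressions, spend / impressions

# Rotate evenly: 500 auction entries each, including the weak ad
even_imps, even_cpc = simulate(500, 500)
# Optimized: the platform shifts most entries to the stronger ad
opt_imps, opt_cpc = simulate(900, 100)

print(even_imps, opt_imps)  # even rotation wins fewer auctions overall
print(even_cpc, opt_cpc)    # and pays a higher blended cost per click
```

Under these assumed numbers, even rotation "charges you more and shows you less": the weak ad loses most of its entries (fewer impressions) and its occasional wins come at a premium (higher blended CPC).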
There are limited cases where we do recommend it, though, and we’ll get into those right here after the break.

The LinkedIn Ads Show is proudly brought to you by B2Linked, the LinkedIn Ads experts.

Managing LinkedIn ads is a massive time and money investment. Do you want a return on that investment? Consider booking a discovery call with B2Linked, the original LinkedIn Ads performance agency. We’ve worked with some of the largest LinkedIn accounts over the past 12 years. And our unique scientific approach to ads management, combined with our proprietary tools, allows us to confidently optimize and scale your LinkedIn Ads faster and more efficiently than any other agency, in-house team, or digital ads hire could. Plus, we’re official LinkedIn partners. Just go to, we’d absolutely love the chance to get to work together.

Alright, let’s jump back into the cases where we might actually recommend even ad rotation. And I know a lot of times what we tell you is don’t listen to LinkedIn recommendations, because they’re not always in your favor. But in this case, LinkedIn is actually really clued in. You’ll notice that optimized for performance is the default ad rotation setting on most ad formats, and it used to be for all ad formats. The one that I would pick out was the sponsored messaging ad formats, so that was message ads and conversation ads. The only way to bid for those was per send, which, if you put this into the same parlance as the rest of the advertising that we do, means you can only pay per single impression that you were sending. So if you had two different message ads going head to head, the only way for LinkedIn to know which one of these was performing better was with opens. But an open is not a very true form of engagement here. It doesn’t tell you which ad was performing better. That comes down to the click, the actual click. But clicks on sponsored messaging oftentimes happen three to six days later, after the ad has already been sent, and that’s not enough notice for the auction to take into account. So we would notice that when we ran sponsored messaging campaigns with optimized for performance, one of the ads would get shown a disproportionate amount of the time, and the other had next to no impressions. We used to do this manually, where we would go switch all of our sponsored messaging campaigns over to rotate ads evenly. And this allowed us to do more of what we actually wanted to do, which is AB testing our two creatives. It’s been two to three years now, but LinkedIn caught on and realized that this is the experience that advertisers should be having with sponsored messaging. So now when you open a sponsored messaging campaign, your ad rotation setting is going to be set to even. 
But all the other ad formats, as far as I know, are all by default set to optimized for performance, which is exactly the right way to go in my opinion. I think defaults should be set for the majority of advertisers, and this is one case where LinkedIn got the defaults totally right. This is in advertisers’ best interests. We also had a situation recently where one of our clients isn’t going specifically for high click through rates, because this is very much a long play for them. They’re just doing brand awareness at this point. And what we noticed with this client is that pretty much every ad we launch has about an average click through rate, but we have a lot of different creatives that we want to test and learn about. So what we ended up doing was turning on even ad rotation here. And it’s actually turned out really well for us, but that is only because the ads had similar click through rates to begin with. If one of your ads has a lower click through rate, inevitably you’re going to start getting punished with lower impressions and higher cost per click. So far, those are the only two cases where we actually recommend even ad rotation. But if any of you have found any other great use cases, I’d love to hear about it. Reach out to us at

Okay, so that leaves a giant hole for us on LinkedIn Ads, because we still want to do this AB testing. But if the ad rotation setting that was built for AB testing ends up being bad for us, then what’s an advertiser to do? Well, LinkedIn heard us loud and clear. This is a tool that has been coming for a long time, and we actually posted about it a couple of weeks ago. If you go to the left hand navigation inside of campaign manager, to test, you’ll notice you can now create an AB test. And we’ve talked about this in a past episode. But we also did a whole post about it. You’ll find that post down in the show notes if you want to check it out. And LinkedIn actually reached out to us about that post to let us know what was coming in the future, because currently, we don’t find this feature very helpful. The main reason why is that when you create a new test, whether you set up campaign versus campaign, audience versus audience, or ad versus ad, it creates two brand new campaigns, each one with a single ad in it. And I do not think that this is the best environment for an AB test. These are two campaigns competing against each other from the very beginning, with no history. So honestly, we haven’t even used this feature; I just don’t find it useful. But LinkedIn did reach out to us on it and let us know, and I’m going to share a quote here. They said, “The good news is that customers can use the winning AB test campaign as an evergreen campaign following the conclusion of the test. Additionally, allowing for existing campaigns in new AB tests is in our roadmap. So more to come.” So backing up a little bit, one of the biggest reasons why we don’t like having new campaigns created every time we launch a new test is that those campaigns are now just littering the account. 
We’re going to have several, if not tens, twenties, or fifties, of campaigns that are absolute garbage, which we then just have to mark as archived and try to ignore inside the platform. The AB testing feature that would be useful to me is AB testing ad creatives in a single campaign. That way, it’s not littering up the account, and as we move forward, the winner gets to stay in the campaign and the loser we can just turn off. And it sounds like, from this response, that’s going to be the functionality in the future. But right now, we’re stuck with having to create new campaigns. If this topic is really interesting to you, you may want to go back and listen to episode 36. That was all about AB testing on the ads platform, and we do cover it pretty intensely there. But I do want to share with you now the basics of how I approach AB tests on the platform.

First, I’m always going to start by leaving ad rotation on optimized for performance, like I mentioned, except for when we’re using sponsored messaging ad formats. I’m also going to run multiple campaigns targeting the same audience at the same time. An example of how I might do that: one campaign targets the same ICP by their job title, while another might target them by job function with seniority. So same ICP, but different campaigns and ways of reaching them. Then I’m going to launch the same AB test in both of those campaigns. And it’s a little bit more complex than I’m making it sound, because we’re probably going to have more than two campaigns targeting the same ICP. And the more you have, the better this data is going to be. But let’s say in one campaign, ad A wins and gets better reach at lower cost. But in the next campaign, LinkedIn actually decided that ad B won, and ad B got more impressions at a lower cost. You might be wondering, okay, which of these ads is actually the winner? So that’s why you might want 3, 4, 5 campaigns targeting your ICP, which is one of the reasons why you would hire us to do this rather than trying to do it yourself. But I realize that’s not in the cards for everyone, hence why I’m sharing our approach here. When you run this test over time, you’ll notice that the platform in general will start to prefer one of your ads, either ad A or ad B. And now you know, regardless of whether LinkedIn made the right call in each of those campaigns as to which ad won or not, you get the overall data telling you that, yes, ad A is the winner here, and you should go all in on ad A. Or you have the data telling you that, hey, with one of these targeting types, or to one of these seniorities, ad B is actually the better ad creative. And so you can leave ad B running in those and leave ad A running in the ones where it won. You’ve got lots of options here. But the principle stands. 
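The cross-campaign roll-up described above can be sketched in a few lines. Everything here is hypothetical: the campaign names and the impression and click counts are invented purely to illustrate the point, that ad B can win inside one campaign while ad A is still the overall winner once you sum impressions and clicks across all campaigns targeting the same ICP.

```python
from collections import defaultdict

# Hypothetical per-campaign results for the same AB test run in
# two campaigns targeting the same ICP (all numbers are made up).
results = [
    {"campaign": "job-title",          "ad": "A", "impressions": 12000, "clicks": 84},
    {"campaign": "job-title",          "ad": "B", "impressions": 3000,  "clicks": 15},
    {"campaign": "function-seniority", "ad": "A", "impressions": 2500,  "clicks": 13},
    {"campaign": "function-seniority", "ad": "B", "impressions": 9000,  "clicks": 58},
]

# Pool impressions and clicks per ad across every campaign
totals = defaultdict(lambda: {"impressions": 0, "clicks": 0})
for row in results:
    totals[row["ad"]]["impressions"] += row["impressions"]
    totals[row["ad"]]["clicks"] += row["clicks"]

for ad, t in sorted(totals.items()):
    ctr = t["clicks"] / t["impressions"]
    print(f"Ad {ad}: {t['impressions']} impressions, CTR {ctr:.2%}")
```

With these invented numbers, ad B has the better click through rate inside the function-seniority campaign, but ad A comes out ahead on the pooled totals, which is exactly the kind of overall signal the multi-campaign approach is after.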
If you run an AB test in just one campaign, LinkedIn may choose the wrong ad to give the better relevancy score to. Over time it’ll shake out and get better, but I usually want reliable ad data right from the very beginning. All right, I’ve got the episode resources for you coming right up, so stick around.

Thank you for listening to the LinkedIn Ads Show. Hungry for more? AJ Wilcox, take it away.

Like we mentioned in this episode, there’s the post describing how the AB testing tool inside of campaign manager works. You can go and read that post; it’s a great one. There’s also episode 36, all about AB testing, so we’ve linked to that as well. Now, if you or anyone you know is looking to learn more about LinkedIn Ads and managing them, check out the course that I did on LinkedIn Learning all about LinkedIn Ads. It is by far the highest quality and the lowest cost course out there. If this is your first time listening, welcome, we’re excited to have you here. And I invite you to subscribe. If this is not your first time listening, please go and rate and review the podcast on whatever player you use. Most are doing this on Apple Podcasts. I’ve heard several mention that they’ve done it on Spotify; I just don’t get to see the reviews on Spotify. But I’m imagining at some point they’re going to roll it out, and I’ll be able to see them and start shouting you out for them. With any questions, suggestions, or corrections, please do reach out to us at And with that being said, we’ll see you back here next week. I’m cheering you on in your LinkedIn Ads initiatives.