Facebook Ad Relevance Score: What Does It Mean for Advertisers?

“A rating of 1 to 10 based on how your audience is responding to your ad. This score is calculated after your ad receives more than 500 impressions.” – Facebook Ads reporting tooltip

When ad relevance became available in reporting for my Facebook clients, my initial reaction was that the score merely acted as a proxy for click-thru rate. Glancing at my campaigns and ads, this appeared to generally be the case, and it made sense to me theoretically: the true measure of your audience's reaction to your ad, and thus the ad's relevance to that particular audience, is whether or not they click on it. So, great, we get to clean up our dashboards, and marketing execs too busy to discern between a 0.48% CTR and a 0.53% CTR can now make decisions on a 1-to-10 scale and move on with their days that much quicker.

However, now that ad relevance (AR) score data has been available for about two months, I decided to take a deeper dive and test my own hypothesis. Basically, I wanted to find out whether there was any real meaning behind the relevance score when the usual "hard" metrics I use every day to judge ad performance are still readily available. The client account I used as an example operates in half a dozen major US cities, and its primary success metric is phone lead generation, ultimately recruiting for a part-time job, all of which is clearly stated in all of its ads. The account has spent tens of thousands of dollars on Facebook since relevance score was added on January 29, so I pulled down engagement and conversion data alongside AR scores for 128 separate ads and ran some basic pivots.

[Screenshots: pivot tables of performance metrics by relevance score cohort]

For reference: I grouped each ad into a cohort based solely on its assigned relevance score, and there were zero ads with an AR score of 1 or 10 in my report. Click-thru rate is based on Website Clicks, not overall Clicks. All ads are newsfeed ads, running across desktop and mobile (no right-hand side).

If there's a good reason to aim for a 9 out of 10, I'm not seeing it, at least when I view my ads in aggregate and in retrospect. In fact, if you're aiming for a high conversion rate, high CTR, low CPC, and low CPA, it turns out that a 7 is better than an 8 or a 9, and I can't imagine how bad a 10 would look! And if none of those metrics matters to you or your clients, then I'd like to talk to you about a bridge I'm looking to sell out in New York.

I'm stumped. My best guess is that ad relevance has very little to do with these KPIs, and everything to do with Likes and Comments (positive) and Flags/Reports and Ad Hides (negative). Even so, you would think those qualities of an ad would correlate with direct response performance in aggregate!

What do you think is going on? Are you seeing more of a correlation between a high relevance score and peak ad performance? Is this all one big conspiracy orchestrated by The Zuck to drive me insane? Let me know in the comments!
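If you want to run the same kind of cohort breakdown on your own account, here is a minimal sketch in Python with pandas. The file name (ad_report.csv) and column names (relevance_score, impressions, website_clicks, spend, conversions) are placeholders for whatever your exported report actually uses, not Facebook's real export headers, so rename them to match your data.

```python
# Minimal sketch of the relevance-score cohort pivot described above.
# Assumes a CSV export with hypothetical columns:
#   relevance_score, impressions, website_clicks, spend, conversions
import pandas as pd

ads = pd.read_csv("ad_report.csv")

# Group ads into cohorts by their assigned relevance score.
cohorts = ads.groupby("relevance_score").agg(
    num_ads=("relevance_score", "size"),
    impressions=("impressions", "sum"),
    website_clicks=("website_clicks", "sum"),
    spend=("spend", "sum"),
    conversions=("conversions", "sum"),
)

# Derived metrics per cohort; CTR uses Website Clicks, not overall Clicks.
cohorts["ctr"] = cohorts["website_clicks"] / cohorts["impressions"]
cohorts["cpc"] = cohorts["spend"] / cohorts["website_clicks"]
cohorts["cpa"] = cohorts["spend"] / cohorts["conversions"]
cohorts["conv_rate"] = cohorts["conversions"] / cohorts["website_clicks"]

print(cohorts.round(4))

# Quick per-ad check of how strongly relevance score tracks CTR.
ads["ctr"] = ads["website_clicks"] / ads["impressions"]
print(ads[["relevance_score", "ctr"]].corr(method="spearman"))
```

Note that summing impressions and clicks before dividing weights each cohort by its volume; averaging the per-ad CTRs instead would treat every ad equally, and the two can tell different stories.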

  • Great post, Dylan.

    I’ve seen the same thing with another lower-spend client – certainly not a tight correlation between performance or business metrics and ad relevance. I have also noticed that even eCPM does not seem to correlate very well. At the very least, I’d think that low relevance scores would correlate with high eCPMs, and vice versa, but so far that does not always seem to be the case.

  • I’ve dug into this just a little bit, and I do believe you are correct in your assumption that relevance score (RS) has mostly to do with likes, shares, and comments. I’ve noticed that our newsfeed ads with an RS of 8+ have a lot of social interaction, along with conversions. Those with a lower RS typically have a poor CTR, very little social interaction, and few conversions. I just looked through our highest-volume client and noticed something: 90% of the ads with an RS over 7 are remarketing ads and have over 1% CTR. I also noticed that multi-ad layouts have a higher RS on average. This may require mashing together a bunch of different data to see if there’s any correlation.

    • Yes, we need to look at more data. I definitely think we don’t want to assume that a high RS will always (or even mostly) correlate with conversion performance. And certainly, when looking at any data, we’d want to break out retargeting separately from new-audience targeting. Thanks, Curt.
