‘If you can’t measure it, you can’t improve it’
“You’re joking, seriously?!” – this is the response I received from my new MD upon admitting that I had used the Advertising Value Equivalent (AVE) method of coverage measurement in my previous role before I joined the team at Conscious Communications.
“Not for every client... some clients just preferred it...” were the excuses I floundered through by way of explanation. The exchange did encourage me to think about measurement, the tools we use as PR professionals and the seemingly industry-wide inconsistency in measurement strategies, in a way I hadn’t since the theory module of my PR master’s degree many moons ago.
Let’s back up a little for the uninitiated. In a nutshell, AVE is a method of measuring earned PR coverage by comparing it to how much it would cost to buy an advert of a similar size or volume in the same publication. AVE attributes a monetary value to PR work and, as is often the case, can pull in impressive-looking numbers and demonstrate seemingly huge ROI statistics to clients. The issue is that the numbers often don’t add up: the ROI is empty because AVE fails to measure the impact of message delivery or the engagement of audiences in any way, and serves only to produce a meaningless figure. In my experience, PR agencies hold on to AVE because it gives clients a tool to sell the company’s PR investment back to Financial Directors, framing it in a way which fits pre-existing models and “speaks their language”.
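For readers who like to see the sums, the AVE calculation described above boils down to a single multiplication: the fraction of a page the coverage occupies, times the publication’s advertising rate for that space. A minimal sketch, using entirely hypothetical figures rather than any real rate card:

```python
def ave(coverage_fraction_of_page: float, full_page_ad_rate: float) -> float:
    """Advertising Value Equivalent: what an advert of the same
    size would cost in the same publication (hypothetical model)."""
    return coverage_fraction_of_page * full_page_ad_rate

# A half-page article in a title whose full-page ad costs £10,000
# would be reported as £5,000 of "value":
print(ave(0.5, 10_000))  # 5000.0

# The same coverage in a title with a £600 full-page rate -- say, a
# small-circulation subscriber guide -- scores far lower, however
# influential the endorsement actually is:
print(ave(0.5, 600))  # 300.0
```

The second call illustrates the core criticism: the output tracks the host publication’s ad rates, not the quality of the audience or the strength of the endorsement.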
As a measurement method it is littered with discrepancies and issues. Chief amongst them is its enslavement to frequently changing advertising rates: AVE targets set at the beginning of the year or campaign period can suddenly become completely unattainable when rates change or fall.
Also, many PR professionals hate it. AVE doesn’t really speak to the value of truly effective public relations. The simplest example of this I’ve ever heard came from my PR MA lecturer, which I’m sure she won’t mind me re-hashing here. When working as a PR consultant, after lots of hard work and relationship building, she managed to secure inclusion for one of her clients in a Which? Guide. As a PR professional, I can recognise this as a big win: the Which? Guides are widely read and incredibly well respected, and inclusion carries one of the strongest third-party endorsements in the UK. But because the mention was only small in the physical guide itself, and the guide is only distributed to subscribers, the AVE figure it generated was incredibly low – circa £300.
£300?! That figure doesn’t accurately represent the impact of an endorsement from such a trusted and valued third party, nor the strategic targeting of consumers many times more likely to convert to sales than readers of a traditional media title. Most importantly, it certainly doesn’t represent the value of the coverage to the client and its target audiences.
So, to sum up AVE: it doesn’t really work, and lots of people don’t like it. But to some, it’s familiar and it’s easy.
However, this is not the attitude of the dynamic team at Conscious Communications! Like so many great agencies now, we refuse to be part of the problem and have instead taken steps to become part of the solution, delivering alternative methods to clients that actually work. We approach each client and project differently, taking time to establish what “great” will look like.
We work with our clients to determine individual targets for each element of our, often-diverse, programmes, and then bake them into the activity from the very start. Quality of messaging takes precedence over quantity of coverage in our approach, and a matrix of key messages is mapped from the outset to ensure it runs through all of our work.
As the great business titan Peter Drucker pointed out, “If you can’t measure it, you can’t improve it”. Conscious Communications rises to this challenge with a measurement strategy that secures buy-in from all stakeholders at the outset, and again at review stages, enabling both us as the delivery team and the client to take a long, laser-focused look at what we’ve achieved. Determining where the big wins have been, for any piece of activity, is just as important as mapping the challenges and areas for improvement, enabling a more effective approach in the future.