Article: PR analytics: don't be a dinosaur!

Author: Olly Cooper

I recently attended an event discussing best practice in PR measurement and evaluation and, whilst most PR professionals will (I hope) be implementing these principles already, it gave me some valuable insight into measurement that’s fit for the future, and into how some of the largest institutions in the UK implement their evaluation strategies.

Evaluation is often seen as an intimidating term, wheeled out at the end of the month when client reports are due. It often carries a heavy burden of expectation, which can add unnecessary pressure.

Here are my top takeaways from the event:

1. Things have moved on

If you’re still using AVEs (advertising value equivalents – the cost of buying the space taken up by a particular article, had the article been an advertisement) to measure a campaign’s success, then you need to clean the cobwebs away and join the liberated masses.

The worldwide PR industry has denounced AVEs as flawed – for those still hung up on employing AVEs as a measurement metric, have a read of AMEC’s guide to why AVEs are invalid. The main reason for not using them is simple: PR isn’t advertising and therefore can’t be measured as such.

2. You can’t measure everything

Alex Aiken, Executive Director for Government Communications, argued that you can’t measure everything, and you shouldn’t. Measurement should be about evaluating things which contain an action to improve and develop, or to inform the next campaign.

OASIS is an acronym used for all government communication strategies in order to help bring clarity to planning and executing often complex campaigns.
• O: Objectives
• A: Audience insight
• S: Strategy/ideas
• I: Implementation
• S: Scoring/evaluation

The aim is to help make the planning process simpler and easier to remember. OASIS should be viewed in the context of a wider campaign, e.g. we want to increase traffic to the KISS website by 12% and we’re going to launch an influencer campaign to achieve this.

Monitoring outputs and outcomes throughout your campaign will allow you to make minor adjustments to the implementation, and to review and refresh the approach after each phase of the campaign. So, when it comes to evaluation, discussions are much more focused on areas to improve and develop rather than just hitting KPIs.

3. Evaluation isn’t about reach

More than three quarters (78%) of Brits feel they’re better informed than ever before, yet only four in ten read beyond the headline. With growing distrust of social media, and with some influencers buying followers, traditional ways of measuring reach are becoming inaccurate. The Pepsi Refresh Project is a great case study in how measuring by reach alone can damage a company.

Clients are more likely to be interested in out-takes and outcomes. The extent to which the audience is aware of the message, and has understood and remembered it, either validates that the campaign is working or acts as an early warning that the strategy may need adjustment.

Outcomes provide concrete proof, such as a rise in sales that can be traced to PR and can be the strongest basis for estimating a return on the PR investment.

The general consensus in the room was that PR analytics across the board are still far from a well-oiled machine. Interestingly, 70% of PR pros recognise a skills gap in data analytics, and many were calling for backing from professional bodies and clearer direction from clients on what they want success to look like.

At KISS we can see how a large proportion of campaigns are now integrated. PR doesn’t operate in a vacuum; it’s part of the wider marketing mix, which means measuring impact isn’t isolated to PR activity. Evaluation should be a collaborative approach between client and agency – KPIs shouldn’t be set by an organisation and then locked away in a dark cupboard until your next six-month review.

Evaluation should be used as a method for creating the most effective communication campaign, not as an indicator of how well you can do your job. Agreeing the different ways of evaluating up front is always a priority at KISS – and it will be different for each client depending on their desired outcomes.
