Metrics are useful because everyone can agree what success looks like before the work gets done…
…but there’s always a danger that success will start to look like metrics and stop looking like revenue and profit if each stakeholder doesn’t understand the metrics that have been put in place.
Examples of this include:
- Judging a site’s quality by something like Domain Authority – there are so many reasons why I don’t agree with this. It’s a metric made up by Moz; the team behind it openly admits that it might not correlate with PageRank, and it’s based largely on links – yet we no longer know which links Google actually counts for a site, as many may be in a disavow file.
- Judging someone’s influence by their social followers. This is an empty metric for a number of reasons: firstly, people can (and do) buy followers, so this doesn’t mean that they’re influential. Secondly, someone can be popular and therefore have lots of followers, but that doesn’t mean they’ll influence behaviour. For example, a parody account for a celebrity could have millions of followers but if they tweet about a product it doesn’t mean I’ll trust them and buy it.
- Number of links – targeting someone on the number of links they build not only wrongly motivates them to go after anything they can get, it also runs against any notion of quality. You’d rather have five links from national newspaper sites that pass on a lot of authority than 25 links from low quality sites, right? But setting a target for number of links means you end up with the latter – and implies you see that as more valuable.
But, in a world in which so many people do care about metrics, it’s clear that they’re not going anywhere – so what are the ones that do matter? I think a lot of this can be learned after a campaign, rather than assessing something’s value before you’ve done it, and you can then use these observations to inform your future campaigns.
Note: it’s not just SEOs that come up with pointless metrics – we were doing it in PR for years with the most ridiculous metric of advertising value equivalent (AVE), so this isn’t just an SEO critique!
What is a good link?
A good link passes 3 things: authority, trust and relevant traffic (not just PageRank, TrustRank and any old traffic).
Authority (or PageRank)
Toolbar PageRank will never be updated again and has arguably been useless since 2011. The SEO industry is forced to rely on third parties’ best guess metrics as to how PageRank is transferred, with Moz’s Domain Authority usually considered a front runner.
“Domain Authority is a score (on a 100 point scale) developed by Moz that predicts how well a website will rank on search engines. Use Domain Authority when comparing one site to another or tracking the “strength” of your website over time. We calculate this metric by combining all our other link metrics – linking root domains, number of total links, MozRank, MozTrust etc. – into a single score.”
So Domain Authority is meant to predict how well a website ranks – not how much authority can be transferred from one site to another.
That would depend on, among other things, the link’s location and type. A link further up the page is likely to transfer more authority than one further down. Links in author bios are likely to be discounted completely. A sitewide link will probably transfer more authority – unless it’s in a sidebar or a footer, in which case it might transfer none.
The Mozscape Index is another sticking point. Have you ever audited a backlink profile? In my experience Open Site Explorer can pull 50-60% of the links accessible in a backlink profile at any one time – it has to be combined with Search Console, Bing Webmaster Tools, Ahrefs and Majestic to get a decent sample. Even then, Search Console displays only a snapshot of the links Google has access to. At one point we were estimating this at 10% – which is why a reconsideration request is sometimes turned down citing link examples that we haven’t even seen before.
Open Site Explorer also can’t see a website’s disavow file so it not only displays linking URLs that might not be counted but it uses these when calculating its metrics too. Several of our clients have upwards of 75% of their link profiles in their disavow files – several more gain upwards of 10,000 anchor text links every week which are disavowed but still used to calculate Moz metrics. This is obviously going to make any kind of “link velocity” metrics completely useless as well – another factor used to calculate Domain Authority.
An actual link profile with brand terms redacted. Yes, most of this is in the disavow file.
So Domain Authority is a made up metric, based on factors including the number of links – which is inaccurate – and MozRank and MozTrust, which are also made up.
Returning to Moz’s use case for Domain Authority – “use Domain Authority when comparing one website to another”. Ignoring the fact that Moz states it should really only be used to estimate how well a site will rank and not how much authority could be transferred from one site versus another, I think it’s worth asking ourselves why we would need to compare two websites:
The answer is that we’ve got finite resources. But despite the industry’s dependence on Moz metrics and PageRank, I don’t think link metrics are the first thing any link builder looks at.
- First we’ll look at relevance. There are a bunch of websites in the Philippines with great metrics that the brand we’re working with just isn’t relevant to…and notice I said that we’re not relevant to them, not the other way around. When you reach out to a blogger you shouldn’t treat them like you’re asking them to do you a favour, or like they’ve won the lottery either. You’re giving them something – a story, something to write about – because your brand is relevant to them and their audience, not the other way around.
- Second we’ll look at legitimacy. We’ll visit the social profiles and read some posts. Does the site sell links? Is there a disclosure policy? Is there an affiliate scheme? Who does the blogger link out to? A handy trick is to go to bing.com (yes, seriously) and do a linkfromdomain: search (all one word, as if you were doing a site: search in Google) – you can see every external site a website is linking to.
- Then we’ll look at reach. Does the site get comments? Does the owner get tweeted at? Does the site get links? Your link target is not a domain – your target is a person. You should treat them like you’re trying to win a customer.
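To make the linkfromdomain: trick concrete – example.com is just a placeholder for the blogger’s domain – you’d type this into Bing’s search box:

```
linkfromdomain:example.com
```

Adding a keyword after the operator (e.g. `linkfromdomain:example.com casino`) narrows the results, which is a quick way to spot a site linking out to dodgy niches.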
So regardless of whether you think link metrics are worthless or not, you probably only use them to differentiate between two sites with all other things being equal. Why would you need to do this? Well, if you’ve got a link budget. You’ve got £50 and only one blogger can have it. Link metrics are a hangover from the days when we paid for links and we just don’t live in that world anymore.
This is the exact same reason why there’s no longer any point caring about the IP blocks of linking sites. The IP block never had any bearing on the amount of authority passed. But if one site on a network was caught selling links, Google would take down the whole thing.
Trust (or TrustRank)
On the subject of paid links:
Paid links pass PageRank. When a website is caught selling links Google doesn’t prevent that site from ‘voting’ for other sites, which is basically how linking works – but it doesn’t trust those votes so it doesn’t count them.
Majestic’s Trust Flow metric is meant to emulate how TrustRank works and the principle is totally sound (I’d go so far as to say that I actually like the Trust Flow metric).
- Start with a list of “trusted” sites. Literally manually compile a list of websites you think that Google is likely to trust
- Calculate how many links there are in the chain between the website you’re looking at and a trusted site
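Those two steps amount to a breadth-first walk over the link graph. Here’s a toy sketch of the principle – to be clear, this is not Majestic’s actual calculation, and the graph, seed list and decay factor are all invented for the example:

```python
from collections import deque

def trust_scores(link_graph, seed_sites, decay=0.5):
    """Breadth-first walk outward from a hand-picked seed set.

    link_graph maps each site to the sites it links out to;
    trust decays with every hop away from a trusted seed."""
    scores = {site: 1.0 for site in seed_sites}
    queue = deque(seed_sites)
    while queue:
        site = queue.popleft()
        for target in link_graph.get(site, []):
            if target not in scores:  # first (shortest) path wins
                scores[target] = scores[site] * decay
                queue.append(target)
    return scores

# invented example graph
graph = {
    "bbc.co.uk": ["blog-a.com"],
    "blog-a.com": ["blog-b.com"],
    "blog-b.com": ["linkfarm.biz"],
}
print(trust_scores(graph, ["bbc.co.uk"]))
# each hop from the seed halves the trust passed on
```

With a decay of 0.5, every extra hop from a trusted seed halves the trust passed on – which is also why a single distrusted domain early in a chain is so damaging.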
The problems are basically the same as those with Domain Authority:
- We can’t really know which websites Google trusts and which it doesn’t. Even if we knew the criteria Google uses, neither MozTrust nor Trust Flow has ranked as many websites as Google has
- The numerical score gives no indication of whether any link in the chain comes from a site that Google knows is a link seller. One paid link – or, more worryingly, even a legitimate link from a domain that has been caught selling links – and the chain ends. No TrustRank is passed.
It’s for this reason that Toolbar PageRank was actually a better measure of TrustRank than of PageRank – we could tell when a site had been caught selling links because its PageRank was halved or reduced to zero. Which raises a question: the sites caught selling advertorials a few years ago – like the Independent – often had their PageRank halved. Does Google trust links from those websites?
Looking at the patent Google filed for TrustRank back in 2009:
“A search engine system provides search results that are ranked according to a measure of the trust associated with entities that have provided labels for the documents in the search results.”
So a website labels another website – e.g. links out – and since 2009 Google has determined the trustworthiness of the linking source. The patent also goes into “annotations” – e.g. anchor text.
“For example, an entity [this is 2009 and Google is talking about entities] such as a digital camera expert [remember Google is looking for expertise, authority and trust] operating a website devoted to digital cameras, may create an annotation associating the label “professional review” with a particular review of a digital camera on some third party site (e.g., on the site of a news publication). In addition, the system maintains information about trust relationships between entities, such as individual users, indicating whether (or the degree to which) one entity trusts another entity.”
So let’s make some assumptions based on the patent:
- If your website is labelled (e.g. linked to) by another website using a particular annotation (e.g. anchor text), the link will pass more trust if that anchor text actually appears on your page, because Google understands that what the linking site is talking about directly references what’s on the other side of the link. For example: if you’re pushing out a survey and news publications are linking to your homepage while referencing the survey results, you could improve the trust flowing through those links by creating an optimised page containing the survey results and getting them to link to that instead. So I’d suggest going through your link profile, looking at the anchor text you’ve got, and tweaking the copy on your pages to reflect your inbound links wherever possible
- The trust flowing through a link depends on the relationship between the two entities, so getting more than one link from the same author increases the amount of trust flowing through the links. I write 2-3 blog posts per week on various sites and about half of them link to Screaming Frog’s SEO Spider page. Therefore Google assumes that I really fucking trust Screaming Frog and as long as I’m not selling or buying links (I’m not) you can take those links to the bank. So “one night stands” in link building have got to end – maybe you get more PageRank with a more diverse link profile but you get more TrustRank if you keep getting referenced in the same trusted places.
- There is a trust relationship between the entities i.e. sites. A link is an indication that one website trusts another website. Google then corroborates whether it trusts the linking entity using a “second trust rank” (this is in the patent) to form what it calls a “trust factor”. The relationship between the entities is crucial and Google needs to understand whether the site being linked to trusts the site linking to it. Three years after filing the TrustRank patent Google launched the Disavow Links tool. Do you think Google did this to help penalised businesses? If Google cared about penalised businesses it would roll out Penguin more than once every 18 months.
The disavow file is a massively multiplayer trust rating engine, where millions of SEOs dictate to Google whether or not they trust a linking domain. Google has flat out denied that adding a website to your disavow file – even at scale – will influence how that website ranks. But Google will not say a word as to whether it will continue to trust a website that appears in a hundred thousand disavow files.
As I said above: Moz and Majestic can’t see anyone’s disavow files; may or may not be able to detect whether a link is paid; and may or may not begin with a pool of trusted sites that looks anything like Google’s.
Relevant traffic

It’s obviously difficult to estimate in advance how much traffic a website is likely to send. Your best guess is to judge how engaged the website’s audience is, but the real differentiator is what you offer that audience on top of what the journalist or blogger is writing about.
This is what we call link earning.
Earning coverage is relatively straightforward:
- Once you’ve decided on your media target work out what you can offer them that’s newsworthy. Data or an expert opinion usually do fine as long as you’ve got your audience and timing right.
- Make it as easy as possible for the site to cover your story. Write a press release containing quotes from someone in your organisation, make sure the format is something that works on the website (e.g. if you want a website to embed something make sure the dimensions work). Journalists are busy and they don’t want to work for a story.
Turning that coverage into links is more difficult. You need to hold something back on your own website. As above: host the full survey results on your site; visualise the data; publish a full interview with your thought leader on your blog; create a quiz or a game that a journalist can’t steal. Having something prepared in the first place is much more effective than getting back in touch with a site owner who hasn’t linked and asking them to edit their page.
If a journalist annotates your website with a call to action – “see the results/play the game here” – people will click through. We make sure we do this because we report on traffic.
We report on links before and after we build them.
- Before we build, we include the URL, name and contact details of the blogger or journalist we want to reach out to, as well as the angle we’ll approach them with. We know what they write about, so we know what we think they’re most likely to cover. We send this over to our clients in the form of a seeding list – and we cross-reference with internal PR teams to make sure we’re not going after contacts the client already has a relationship with, for example.
- Afterwards, we include the URL, the referral traffic, conversions and assisted conversions, and the social shares and comments the post has received. No link metrics at all – and we make sure to point out that a post with more social shares isn’t better than one with fewer; it’s simply an indication that the page has been seen. We report on “nofollow” links; we report on non-linking mentions. A non-linking mention is an annotation – there’s every chance it can pass trust even if it doesn’t pass PageRank. A “nofollow” link can pass traffic even if it doesn’t pass trust.
Some reports get more detailed. We update referral traffic to see whether more has been sent. We use 90-day cookies in the ecommerce reports in Analytics to see if more referral traffic has converted. Know your customers’ purchase cycle: if you place a link on the 20th of January and report on the 1st of February, you won’t capture conversions on a 6-8 week purchase cycle. Revisit it. You could even use SEMrush to see what the linking pages rank for, estimating how much search traffic the page still gets. Funnily enough, nobody asks for that.
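The purchase-cycle point is easy to sanity-check with a couple of dates (the dates and the six-week cycle below are illustrative, not from a real client):

```python
from datetime import date, timedelta

link_placed = date(2015, 1, 20)      # illustrative dates
report_date = date(2015, 2, 1)
purchase_cycle = timedelta(weeks=6)  # bottom of a 6-8 week cycle

window = report_date - link_placed
print(window.days)                   # only 12 days of data
print(window >= purchase_cycle)      # False - reporting too early
```

A report run on the 1st of February covers less than a third of even the shortest purchase cycle, so most conversions haven’t happened yet.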
Judging link value
If we really want to look at how valuable a link from a site is, then the only way we can do this is by looking in Google Analytics after we have the link to see how much traffic the link drove, as well as how many direct or assisted conversions it made.
Don’t get me wrong, I’m not expecting a high rate of conversions – PR is not a direct response channel – but we do see some coming through. This can then inform strategy going forward, as you learn which publications actually drive traffic and potential customers to your site. You can then look at who you might want to work with on an exclusive in the future, to ensure that you secure the link from them.
Again in Google Analytics, we can look at metrics such as time on site and bounce rate. This can help to judge the success of a campaign, both as a whole and at traffic from individual sites. We can also look at the bigger picture and check the amount of views the campaign page(s) received overall to see how much interest it generated within the target audience.
Remember to annotate your Google Analytics profile with campaign launch dates and links from big publications, as these will skew your traffic (in a good way!).
Outside of GA, we can look at metrics such as social shares to see how many people were compelled to advocate and share with friends.
Depending on the campaign, you might also have included a download or some form of data collection. These are good metrics to measure as they actually show engagement and interest level from people.
Another metric to help shape future campaigns is response or success rate, which is really simple to work out from the number of people you contacted with the campaign and the number of replies you got (and, eventually, the number of links).
We use this to measure how different types of campaigns stack up for different clients, which helps to shape our strategy in terms of what’s performing best. It then gives you an idea of how many people you’ll need to contact in the future to get a decent amount of coverage and exposure.
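As a sketch of that arithmetic – the function name and all the figures below are made up for illustration:

```python
def outreach_stats(contacted, replies, links):
    """Simple outreach funnel: response rate and link conversion rate."""
    return {
        "response_rate": replies / contacted,
        "link_rate": links / contacted,
    }

# invented figures for illustration
stats = outreach_stats(contacted=120, replies=30, links=12)
print(f"{stats['response_rate']:.0%} replied, {stats['link_rate']:.0%} linked")
# 25% replied, 10% linked
```

Tracked per campaign type and per client, the link rate tells you roughly how many contacts you’ll need next time to hit a coverage target.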
Finally, you can use a metric like search visibility taken from Searchmetrics to look at the value of links and a campaign. You’ll probably need to wait a couple of months to see the impact, but you should be able to see an increase after a significant campaign. The example below shows how visibility for a client improved significantly after we launched a campaign for them in June 2014:
Metrics aside, what should we really be looking at to judge a person’s influence or a site’s quality?
A good way to judge the quality of a site is by looking at the amount of engagement it receives, which helps you to understand how it influences its audience. If a site receives a lot of social shares and comments, it clearly has an engaged audience, and if the writer’s own social channels receive a lot of engagement, that indicates yet more influence.
One of the most important factors is relevance, to both the campaign and the client. By this I don’t mean that a fashion client should only ever work with fashion blogs and magazines – fashion has relevance for other niches. Travel, for example, is relevant if your campaign is about the fashion capitals of the world. The relevance has to be there for both the campaign and the client.
Feedback and sentiment
Looking at the feedback that you get to the campaign is another great indicator of its success. This can be looking at responses from journalists or bloggers that you’ve emailed, or it could be from looking at tweets and other comments about the campaign.
This will give you a good feel as to the sentiment and whether it has been received well or not. Negative feedback can also be helpful to make sure you shape a campaign differently in the future.
Google Analytics can show you the next pages that people went on to visit after your campaign page(s), which is great to see whether they went on to read more content or visit commercial pages. This shows how relevant your campaign was to them and what people generally wanted to do after viewing it.
Sure, you could probably come up with metrics for these signals as well – I’m sure some people will – but they’re so interchangeable depending on the campaign and the niche you’re targeting that they become restrictive rather than helpful. The point is that the person running the campaign should be expert enough to tell whether a person or website is going to be influential and valuable for the campaign, and you should trust their judgement.