Since the introduction of Google's Penguin and Panda updates last year, many webmasters have struggled to find a tool to help clean up their backlinks. We faced the task of sorting the 3,500+ backlinks to our site about diving in Gran Canaria (www.davyjonesdiving.com) into those worth keeping, those of no use, and those that needed to be disavowed.
Fortunately we already had a comprehensive SEO suite with link-building and extensive backlink-management features: SEO PowerSuite. It already had 95% of what we wanted, and by using it carefully we were able to radically cut the time required to analyse and clean up our backlinks.
The suite consists of four major tools, but for this exercise we started with the backlink profiling tool, SEO SpyGlass. It has an import option for Google Webmaster Tools output, so our first step was to download the 'complete' list that Google will let you see and import it straight into a new project. One minor tip: 'normalise' the URLs to be either all with or all without 'www.' before you import the data, as this removes duplicates and makes future database matching easier.
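As a rough illustration of that normalisation step, here is a minimal Python sketch. It assumes the exported list is just a sequence of URLs, and it standardises on the non-'www.' form (either convention works, as long as you pick one):

```python
from urllib.parse import urlparse, urlunparse

def normalise(url: str) -> str:
    """Collapse 'www.' and non-'www.' variants of a URL to one form."""
    parts = urlparse(url.strip())
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]  # drop the 'www.' prefix
    return urlunparse(parts._replace(netloc=host))

def dedupe(urls):
    """Normalise every URL and drop duplicates, preserving order."""
    seen = set()
    out = []
    for u in urls:
        n = normalise(u)
        if n not in seen:
            seen.add(n)
            out.append(n)
    return out

# Example: the two variants below collapse to a single entry.
print(dedupe(["http://www.example.com/page", "http://example.com/page"]))
```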
When the data is imported into SEO SpyGlass, each backlink is immediately validated: the link is read, the page analysed, and additional information about the page or domain is pulled from other sources including Google, Alexa, the Yahoo Directory and DMOZ. The tool's own algorithm then assigns a link value to each backlink, and this is where we start to use the product 'back to front'. Normally I look at which pages are generating good link value, but here we can also sort and select the low-value (and zero-value) links.
There is still a problem with the sheer volume of entries, so I use the 'tags' column to label different groups of records. I make up tags such as 'GOODLINK', '404ERROR' and 'SITEDOWN' and apply them to the different groups of links.
It can still take several hours to work through a big list of links, but with many parameters visible for each link you can quickly slice and dice, especially when you combine the tags with the ability to sort on any column and apply filters. For example, some links in the 'zero' category are highly relevant to my website, and by putting a geographic term in the filter I can find, say, all links with 'canary' in the domain. I was also easily able to identify pages with over 100 outbound links (including one page with 26,000 outbound links!).
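The same slicing can be sketched in a few lines of Python over an exported report. This is only an illustration of the idea, not the suite's internals; the field names and sample rows below are assumptions:

```python
# Hypothetical rows as they might look in an exported backlink report.
links = [
    {"url": "http://canary-dive-blog.example/links", "value": 0, "outbound": 26000},
    {"url": "http://travel.example/gran-canaria",    "value": 0, "outbound": 40},
    {"url": "http://spam.example/directory",         "value": 0, "outbound": 800},
]

# Zero-value links whose address mentions a geographic term may still be
# relevant and worth keeping, so pull them out for a second look.
relevant = [l for l in links if l["value"] == 0 and "canary" in l["url"].lower()]

# Pages with over 100 outbound links are likely low-quality link directories.
link_farms = [l for l in links if l["outbound"] > 100]

print([l["url"] for l in relevant])
print([l["url"] for l in link_farms])
```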
Finally, you end up with a tag against each link showing what to keep, what to disavow, and which site owners to contact to request removal or a change.
The next step was negotiating with website owners, so I imported that group of URLs into a second product in the suite, Link Assistant. It is normally used for link building, not link destruction! However, it has an integrated email client with templates and the ability to search out contact email addresses on websites, so I modified one of the templates into a 'please remove my link' message, processed the data and sent it out.
By using SEO PowerSuite I believe I saved many hours, and when we came to review the data a couple of months later it took half the time to regenerate a disavow file for Google, covering sites that were down, pages where the link had been removed, and downright spammy sites.
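Google's disavow file itself is just plain text: one entry per line, either a full URL or a `domain:example.com` line, with `#` comments allowed. A minimal sketch of turning the tagged list into such a file, assuming simple (url, tag) pairs and the tag names used earlier (your own tags will differ):

```python
from urllib.parse import urlparse

def build_disavow(tagged, bad_tags=frozenset({"SITEDOWN", "404ERROR", "SPAM"})):
    """Turn (url, tag) pairs into the text of a Google disavow file.

    Disavowing whole domains here, which is a choice, not a requirement;
    individual URL lines are equally valid in the disavow format.
    """
    lines = ["# Disavow file generated from tagged backlink list"]
    domains = set()
    for url, tag in tagged:
        if tag in bad_tags:
            domains.add(urlparse(url).netloc.lower())
    lines += ["domain:" + d for d in sorted(domains)]
    return "\n".join(lines)

print(build_disavow([
    ("http://spam.example/a", "SPAM"),
    ("http://dead.example/b", "SITEDOWN"),
    ("http://good.example/c", "GOODLINK"),
]))
```

Rerunning a script like this over a freshly tagged export is what makes the periodic review quick: the tagging is the slow human step, and the file falls out of it.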