Category Archives: Web Technologies

Posts related to Web Technologies such as SEO for diving centres and other small businesses

Tool for backlink cleaning

Since the introduction of the Google Penguin and Panda updates last year, many webmasters have struggled to find a tool to help with cleaning up their backlinks. We were faced with the task of sorting through the 3,500+ backlinks to our site about diving in Gran Canaria, working out which were of any use, which were not, and which needed to be disavowed.

Fortunately we already had a comprehensive SEO tool with a backlink building tool and extensive backlink management options – SEO PowerSuite. It already has 95% of the features we wanted, and by using it carefully we were able to radically cut the time required to analyse and clean up our backlinks.

The suite consists of four major tools, but for this exercise we started with the SEO profiling tool, SEO SpyGlass. It already has an import option for Google Webmaster Tools output, so the first step was to download the ‘complete’ list that Google will let you see and import it straight into a new project. One minor tip here: ‘normalise’ the URLs to be either all with or all without ‘www.’ before you import the data, as this removes duplicates and makes future database matching easier.
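The normalisation step above is easy to script yourself before importing. This is a minimal sketch, assuming your backlinks are a plain list of URLs; `normalise_url` is a hypothetical helper, not part of any SEO PowerSuite API.

```python
from urllib.parse import urlsplit, urlunsplit

def normalise_url(url: str, keep_www: bool = False) -> str:
    """Force the hostname to consistently drop (or include) 'www.'
    so duplicate backlink rows collapse to a single entry."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if keep_www:
        host = "www." + host
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

# Two rows that are really the same backlink collapse to one after normalising:
links = [
    "http://www.example.com/page",
    "http://example.com/page",
]
unique = sorted({normalise_url(u) for u in links})
```

Whether you standardise on the ‘www.’ form or the bare domain does not matter much, as long as it matches how your project and Webmaster Tools data refer to the site.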

When the data is imported into SEO SpyGlass it is immediately validated and each link is checked: the link is read, the page analysed, and additional information for the page or domain is pulled in from other sources including Google, Alexa, the Yahoo Directory and DMOZ. Based on the tool’s own algorithms a link value is calculated for each backlink, and this is where we start to use the product ‘back-to-front’. Normally I look at which pages are generating good link value, but here we can also sort and select the low (and zero) value links.

There is still a problem with the sheer volume of entries, so using the ‘tags’ column I add different tags to different groups of records: I make up tags such as ‘GOODLINK’, ‘404ERROR’ and ‘SITEDOWN’ and apply them to the different groups of links.

It can still take several hours to work through a big list of links, but with many different parameters visible for each link you can quickly slice and dice, especially when you combine this with the ability to sort on any column and apply different filters. For example, some links in the ‘zero’ category have high relevance to my website, and by putting a geographic term in the filter I can find, say, all links with ‘canary’ as part of the domain. I was also easily able to identify pages with over 100 outbound links (including one page with 26,000 outbound links!).

Finally, you end up with a tag against each backlink showing what to keep, what to disavow, and which site owners to contact to request removal or change.

To complete the next step – negotiating with website owners – I imported the relevant group of URLs into a second product in the suite, Link Assistant. This is normally used for link building, not link destruction! However, it has an integrated email client with templates, and the ability to search out contact email addresses on websites. So I modified one of the templates into a ‘please remove my link’ message, processed the data and sent it out.

By using SEO PowerSuite I believe I have saved many hours, and when we came to review the data a couple of months later it took half the time to regenerate a disavow file for Google, disavowing sites that were down, pages where the link had been removed, and downright spammy sites.
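Once every backlink carries a tag, generating the disavow file is mechanical. Google’s disavow format is a plain text file where `#` lines are comments, `domain:example.com` disavows an entire domain, and a bare URL disavows a single page. The record structure and the `SPAM` tag below are hypothetical stand-ins for an exported, tagged list.

```python
from urllib.parse import urlsplit

# Hypothetical tagged records exported from the backlink review.
records = [
    {"url": "http://linkfarm.example/page1", "tag": "SPAM",     "whole_domain": True},
    {"url": "http://deadsite.example/old",   "tag": "SITEDOWN", "whole_domain": True},
    {"url": "http://blog.example/post",      "tag": "404ERROR", "whole_domain": False},
    {"url": "http://goodsite.example/",      "tag": "GOODLINK", "whole_domain": False},
]

# Which tags mean "disavow this link" is your own editorial decision.
DISAVOW_TAGS = {"SPAM", "SITEDOWN", "404ERROR"}

lines = ["# Disavow file generated from tagged backlink list"]
for r in records:
    if r["tag"] not in DISAVOW_TAGS:
        continue
    if r["whole_domain"]:
        # domain: entries disavow every link from that domain
        lines.append("domain:" + urlsplit(r["url"]).netloc)
    else:
        # a bare URL disavows just that one page
        lines.append(r["url"])

disavow_txt = "\n".join(lines)
```

Because the tags persist in the project, rerunning this a couple of months later against refreshed data is exactly the quick regeneration described above.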

Beware the statistics in niche SEO markets

How well are you doing with your SEO?

This is a question anyone involved in promoting an online business is constantly asking. And if you have even the most basic knowledge of SEO you will already know that the best place to start getting good, actionable information is Google’s own products – Google Webmaster Tools and Google Analytics.

However, when we watched these figures recently there seemed to be some anomalies; in particular, the most popular search keyword was showing a well-below-par click-through rate (CTR).

The website in question was about Diving Gran Canaria with Davy Jones Diving, which in worldwide terms can be considered a very tiny ‘niche’ market, with just a small number of main competitors and a market of visiting tourists. The customers come from all over Europe, and it was noticeable that the CTR for English queries was lower than the CTR for German or Spanish ones.

Then it finally dawned on us: because several companies are all monitoring their position and rankings, probably on a daily basis – some of them with automated tools – this leads to a disproportionate increase in overall impressions in a small niche market compared to a bigger market with good search volumes. In this case all the main players are managed in English, and English has traditionally been the strongest market for scuba diving. So if 6 competitors all check their status twice a day on average, then after 30 days they will have generated 6 × 2 × 30 = 360 impressions between them. If the market is only running at, say, 2,000 impressions per month, around 18% of the apparent volume comes from competitive monitoring of the SERPs by the main players in the market.
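The back-of-envelope arithmetic is worth spelling out, since it generalises to any small market. All the figures below are illustrative assumptions from the scenario above, not measurements.

```python
# Estimate how much competitor rank-checking inflates impression counts
# in a small niche market. Every figure here is an assumption.
competitors = 6            # main players watching the SERPs
checks_per_day = 2         # average rank checks per competitor per day
days = 30                  # roughly one month
market_impressions = 2000  # assumed total monthly impressions in the niche

monitoring_impressions = competitors * checks_per_day * days
share = monitoring_impressions / market_impressions
```

The smaller the market, the larger `share` becomes for the same monitoring habits, which is exactly why big-market CTR intuitions break down in a niche.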

So the best advice in small markets is to apply some caveats and segmentation to your stats; you will have to learn how to interpret your data more carefully to take the best decisions from it.