If you read part 1, you'll have seen I promised to explain how to make your site fully indexable by Google.
Indexing is the process by which Google crawls the pages and content of your site and adds them to its database, so they can be ranked on its search results pages.
Why is this important?
If your website is not set up to allow full indexing by Google, it will not only hold you back from ranking your carefully developed content; Google will also read it as a sign of poor user experience and penalise your ranking potential.
Looking for Crawler Errors
Log in to Webmaster Tools. (If you already have a Google account (Adwords, Google+, etc.) you can use that login; you will also need to have put the Google Analytics tracking code on to your site.)
Go to the Google Index tab and see how many of your pages are being indexed.
Below this it will show a list of pages and the types of errors.
404 'page not found' errors. As I mentioned, this is where Google can't find a URL for a page. It usually happens if you have changed domains or moved the site around. But they need to be fixed – they create a poor UX, and if you have too many of them Google may slap you with a penalty. To fix these, look at the error in the control panel and it will show the links to those pages. You will need to amend the links to prevent Google (and your users) navigating to them in the future, and remove or redirect the page itself.
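Where the missing page has simply moved, a permanent redirect is often the cleanest fix. A minimal sketch for an Apache .htaccess file – the paths and domain here are placeholders, not real addresses:

```apache
# Hypothetical example: the old URL redirects permanently (301)
# to the page's new location, so visitors and Google both land safely.
Redirect 301 /old-page.html http://www.example.com/new-page/
```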
Robots.txt files can be poorly configured. This file is a list of instructions telling Google what not to index. Check it doesn't have a 'User-agent: *' rule followed by 'Disallow: /', because this effectively tells Google to ignore the whole site.
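For illustration, here is the difference between a robots.txt that blocks everything and a more typical safe one (the /admin/ path is just an example):

```text
# BAD - this tells every crawler to ignore the entire site:
User-agent: *
Disallow: /

# Typical safe setup - crawl everything except, say, an admin area:
User-agent: *
Disallow: /admin/
```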
.htaccess file. This needs to be checked by a programmer; it requires too much detail to explain here, but again it needs to be configured correctly.
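As one illustration of the kind of thing a programmer will check, this hypothetical snippet redirects the non-www domain to the www version, so Google indexes a single canonical address (example.com is a placeholder):

```apache
RewriteEngine On
# If the request came in without the www prefix...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...send a permanent (301) redirect to the www version of the same URL.
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```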
Nofollow commands – your pages all have meta tags in their code. They should all be marked up with the target keywords you want to rank for. But check they don't have 'nofollow' (or 'noindex') commands – for obvious reasons!
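The tag to look for sits in the head of each page. A sketch of what to remove, and what an indexable page looks like:

```html
<!-- BAD for ranking: tells Google not to index the page or follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Fine: explicitly allows indexing (omitting the tag has the same effect) -->
<meta name="robots" content="index, follow">
```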
Sitemaps – if you see a sitemap crawling error, it's probably because your sitemap is out of date, so update it and re-submit.
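A sitemap is just an XML file listing the URLs you want indexed. A minimal example you could update and re-submit in Webmaster Tools (example.com and the page paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>
```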
DNS errors – if you have server issues, Google's spiders will struggle to reach and index your site.
Issues brought over – if you have bought your site, it is possible it came with a penalty. You will need to file a reconsideration request with Google. If you do look at buying a site in the future, always check its history – you could start by using the Wayback Machine to see the site and its pages as they were in the past.
Looking for Syntax errors
You should also check your site for what are called 'syntax errors'. This is where the wrong code has been used to build your pages. Run a check using the W3C markup validation service.
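To give a flavour of what the validator flags, here is a hypothetical fragment with a classic syntax error – an element closed with the wrong tag:

```html
<!-- The validator will report that this <p> was closed with </div> -->
<p>Welcome to our site</div>
```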
Looking for poor website structure
The dreaded in-links
Inbound links have become less important over the years but still remain a key area in ensuring your site is indexed. I say 'dreaded' because there is still so much talk about their importance that many think simply having loads of links will sort their ranking. Wrong. The site itself is far more important, but yes, links still have a role, and it is important to have quality links pointing in to your site from other sites. That quality is defined primarily by the linking site's relevance to yours and, to a lesser degree, by its size.
Links take a while to generate but in the short term you can update your social network pages with links to the site to get things moving.
You could also write offsite content – content relevant to your site but hosted elsewhere, such as guest posts on blog sites. Make sure when you do this that the sites are good quality. Do a quick check on the links into their sites – are they natural or paid for?
So I hope this helps.
Webmaster Tools is not the sexiest part of SEO work but can produce real dividends.
Keep your eye out for part 3 – so much to tell!