Google Webmaster shouldn't be anything new to most webmasters out there. I myself have been using this tool for almost six months, but the only features I used back then were sitemap submission and the statistics tab. I never paid attention to the other tools until I started reading more SEO blogs recently. Since then, I have been fully utilizing this tool, and this post will highlight some of the important features you should be using.
1. The first page you will see is a summary of your website. This page is very useful if you're managing a CMS (content management system) based website that holds thousands of records a day, where most of the content URLs are dynamically generated by your CMS script.
As you can see below, after I started implementing robots.txt around two weeks ago, I have around 13k URLs excluded from the search engines.
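For illustration, excluding large swaths of auto-generated CMS pages from crawling can be done with a few Disallow rules. A minimal sketch, where the directory names are hypothetical examples and not from any real site:

```
# Hypothetical robots.txt — path names are examples only
User-agent: *
Disallow: /search/
Disallow: /print/
```

Every path prefix you disallow here will show up over time in the "URLs restricted by robots.txt" count on the summary page.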
There are also almost 2k URLs not found, because their content has been deleted. Normally Googlebot is faster than me at finding newly submitted content (I set the crawl rate to faster).
There are 3 unreachable URLs which do exist, but I think Googlebot was unable to reach them at that moment due to various factors, such as a network or DNS issue, or the server being in an error state.
2. I had read about this on a few SEO blogs before but didn't take any action until recently. On this page, you have the option to display your URLs with or without the www prefix. This simple tweak is good in terms of SEO and avoids your web pages being treated as duplicate content.
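A common companion fix, done on your own server rather than in Google Webmaster, is a 301 redirect so that only one version of each URL is ever served. A hedged Apache mod_rewrite sketch, assuming an Apache server with mod_rewrite enabled and `example.com` standing in for your domain:

```apache
# .htaccess sketch — replace example.com with your own domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The permanent (301) redirect tells search engines which version is canonical, so the preferred-domain setting and your server agree.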
3. Crawl rate. I set my older site to the fastest mode as soon as the option became available. However, please make sure your server is powerful enough to handle the extra requests before turning it on. For the newer site, the 'fastest' mode is disabled with the message: 'The rate at which Googlebot crawls is based on many factors. At this time, crawl rate is not a factor in your site's crawl. If it becomes a factor, the Faster option below will become available.'
4. If you have a robots.txt file created for your website, this is where you can test whether it is correct. If you plan to make changes to the file, you can test them here first. You also have the option to test it against different Google crawlers, such as Googlebot and Googlebot-Image.
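You can also sanity-check rule changes offline before uploading them, using Python's standard `urllib.robotparser` module. A small sketch, where the rules and paths are made-up examples:

```python
# Sketch: check hypothetical robots.txt rules against different Google
# crawlers offline, using Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot-Image
Disallow: /photos/

User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The generic Googlebot only matches the wildcard group here,
# so /photos/ stays fetchable for it but /admin/ does not.
print(parser.can_fetch("Googlebot", "/photos/cat.jpg"))        # True
print(parser.can_fetch("Googlebot", "/admin/login"))           # False
# Googlebot-Image matches its own group and is kept out of /photos/.
print(parser.can_fetch("Googlebot-Image", "/photos/cat.jpg"))  # False
```

This mirrors what the online tool does: each crawler is matched against the most specific User-agent group that applies to it.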
5. After discovering that my website had a lot of pages in the supplemental index, I started removing whole directories that I don't need, 404 URLs, dynamic pages, and a few other types of URLs that I think would cause my content to be considered duplicate. There are 4 different types of removal to choose from, depending on your needs. You can also remove your entire site from the index if you're crazy enough.
6. The sitemap is one of the most powerful tools in Google Webmaster. It accepts .txt and .xml sitemaps, which are now accepted by all search engines as a standard sitemap format. Google still recommends submitting your sitemap through this tool, even though sitemap auto-discovery was agreed on by all the major search engines earlier this month.