So, Why Aren’t You Using Google Webmaster Tools?
Posted January 26, 2015 by Billy McAllister in Internet Security, SEO
It’s one of those feelings where you just want to bang your head against the wall: the urge to walk into every internet marketing company and explain the benefits of using Google Webmaster Tools. It seems comical to want to “rank on page one of Google” for highly competitive keywords while ignoring messages from the search engine itself. This post lays out only four reasons to use Google Webmaster Tools, but the benefits for your website, your business, and your sales engine are plentiful.
1. Crawl Errors – Monitoring 404 Errors
The Crawl Errors page delivers webmasters incredibly valuable details about errors that users, their customers or potential customers, are experiencing on their website. The most common crawl error is the 404, meaning a user tried to access a page that simply isn’t there anymore. As a result, the user’s journey and experience on your site was interrupted, and users and Google alike do not like that. 404s do your business no good, and they happen all the time! Thankfully, Google Webmaster Tools compiles them neatly into a downloadable CSV or Google Docs spreadsheet.
Fixing a 404 is actually quite simple. A 301 redirect not only fixes the 404 itself, but also ensures that most of the “link juice” pointing at the dead page is passed through to its replacement. Otherwise, all of that link juice, well, dies. After you implement the 301 redirect, simply check off the box in the report and mark the error as “fixed”.
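If your site runs on Apache, a 301 redirect is a one-line addition to your .htaccess file. The paths below are made-up examples, so substitute the URLs from your own Crawl Errors report:

```apache
# Permanently redirect a removed page to its closest replacement
# (/old-services.html and /services are hypothetical example paths).
Redirect 301 /old-services.html /services

# With mod_rewrite enabled, a whole retired directory can be
# redirected in one rule (example paths again):
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```

Other servers (nginx, IIS) have their own equivalents; the key is that the response carries the 301 status code rather than a 302 or a meta refresh.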
Navigate to your Crawl Errors report by: Logging into Google Webmaster Tools > Crawl > Crawl Errors
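Once your redirects are in place, you can re-check the URLs from your exported report to confirm nothing still 404s. This is a hypothetical sketch, not part of Webmaster Tools; `crawl_errors.csv` and its `URL` column are assumptions, so check them against your actual export:

```python
import urllib.request
import urllib.error

def check_status(url):
    """Return the HTTP status code for a URL (404 for missing pages)."""
    try:
        # urllib follows redirects automatically, so a working 301
        # comes back with the target page's status (usually 200).
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def still_broken(urls, status_fn=check_status):
    """Filter a list of URLs down to the ones that still return 404."""
    return [url for url in urls if status_fn(url) == 404]

# Usage (assumes a hypothetical 'crawl_errors.csv' export with a 'URL' column):
#   import csv
#   with open("crawl_errors.csv", newline="") as f:
#       urls = [row["URL"] for row in csv.DictReader(f)]
#   for url in still_broken(urls):
#       print("Still 404:", url)
```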
2. Demoting SiteLinks
Sitelinks are automatically generated by Google and displayed in the search results. They commonly appear when a user performs a branded search, meaning the user was likely seeking information about a specific company or brand directly. These sitelinks appear just below the first result and are likely derived from the top landing pages in your Google Analytics. Remember that Google is just trying to serve users information as quickly as possible, and sitelinks are Google’s way of saying, “You’re probably looking for one of these pages.”
However, it’s not uncommon for a page you’d rather not feature to appear as a sitelink. It may be a popular page on your site, but it’s not where or what you want a user to see first. Thankfully, demoting a sitelink (removing it from the search result) is even easier than fixing a bunch of 404 errors! All you have to do is perform some searches on your brand name and identify the sitelinks you don’t want to appear for those searches. Then enter each one into the Sitelinks section in Google Webmaster Tools and it will go away. Boom!
Navigate to your Sitelinks section by: Logging into Google Webmaster Tools > Search Appearance > Sitelinks
3. Security Issues
Wouldn’t it be amazing if there were some automatic way to be notified that your site is being hacked? Man, that would just be revolutionary… oh wait, there is! It’s the ‘Security Issues’ section of Google Webmaster Tools. The caveat here is that you will not be notified if you do not have Webmaster Tools set up!
If your site is under attack, Google will actually add a line in the search results to let users know that there is some funny business going on with the site (see image below). Now ask yourself, “How many users would voluntarily click on a search result that is flagged as hacked?” The answer should be, “Hardly any!”
4. Robots.txt Tester
Google Webmaster Tools has a relatively new robots.txt Tester that could save you a lot of headaches (and money)! A single typo or accidental keystroke in robots.txt could block Google from crawling your entire website. The robots.txt file is an important and effective way to tell Google which parts of your site its crawler should stay out of. (Strictly speaking, it blocks crawling rather than indexing; a page that must never show up in results at all also needs a noindex directive.) As a website owner or manager, you know there are plenty of pages on your site that users have no business accessing, and adding those paths to your robots.txt file will keep them out of crawlers’ reach.
Even if you are not that savvy with the technical side of website management, you can test your file as many times as you want, and the tool will tell you whether you got it right.
Navigate to your robots.txt Tester by: Logging into Google Webmaster Tools > Crawl > robots.txt Tester
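A minimal robots.txt looks like the following. The `/admin/` and `/checkout/` paths are made-up examples; adapt them to the sections of your own site. Note that a single misplaced slash is exactly the kind of typo the tester catches:

```
# Rules below apply to all crawlers
User-agent: *
# Keep crawlers out of private sections (example paths)
Disallow: /admin/
Disallow: /checkout/

# Careful: "Disallow:" with no path allows everything,
# while "Disallow: /" blocks your entire site.
```

Paste a rule set like this into the tester, enter a sample URL, and it will tell you whether that URL is allowed or blocked before you ever put the file live.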
In conclusion, if you want to rank well in Google search results and actually compete, there are rules you have to play by, and Google Webmaster Tools is one of the tools that helps you follow them. Maintaining the overall health of a website is no easy task. Thankfully, Google makes it a little easier to manage, and it’s free.