This Search Engine Optimization Utility Can Sometimes Be Read Incorrectly
Intended audience for this post: Beginning users of Google Webmaster Tools
- Google Webmaster Tools can sometimes give warnings that can be ignored
- You sometimes have to be patient and do a little digging to figure out whether a warning is really valid
I think anyone who follows this blog much will realize how valuable Google Webmaster Tools is in the world of search marketing. It’s a veritable Swiss Army Knife of useful tools and information. Any professional SEO, and many webmasters wearing SEO hats from time to time, will visit Webmaster Tools regularly to assess the health of their site by looking at the number of pages Google is crawling, whether Google is reporting any problems, inventorying links, checking search interactions, and so much more.
Additionally, a quick check of Webmaster Tools from time to time can reveal serious trouble spots before they cause major damage. I remember one instance where I made a change to a robots.txt file to block the Chinese search engine Baidu from needlessly hammering our customer’s site. Through carelessness I put a line in the wrong place and inadvertently blocked Googlebot from the entire site. Fortunately my able assistant Eliathah did a health check on the site the next day by checking Webmaster Tools and was alerted to the problem. In fact, when Googlebot is blocked from a site, Google considers it such an unusual event that they put a warning message front and center on your GWT dashboard.
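To illustrate the kind of mistake involved (a reconstructed sketch, not the actual file): in robots.txt, a Disallow line applies to whichever User-agent group it falls under, so a misplaced line can silently widen a block meant for one crawler into a block on all of them.

```
# Intended: block only Baidu's crawler
User-agent: Baiduspider
Disallow: /

# Everyone else may crawl everything
User-agent: *
Disallow:

# Careless version: the "Disallow: /" line now sits under the
# catch-all group, so every crawler -- including Googlebot --
# is blocked from the entire site
User-agent: *
Disallow: /
```

Because crawlers silently obey whichever group matches them, a mistake like this produces no error anywhere except the warnings GWT eventually surfaces.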
Google gives out other warnings and error messages on the primary dashboard of Webmaster Tools, and it makes sense to pay attention to all of them. But that doesn’t mean all of them require a panic-mode reaction and response.
Here’s a recent example. We had a client with a new website and one of our novice techs was disconcerted to log into Webmaster Tools and see a notice for a whole bunch of warnings, as illustrated in the next screenshot. In fact, this shows 7 times more warnings than there are pages in this brand new site. So here’s how I’d recommend following up on warnings, using this particular warning as an example.
First click on the warning text, which will take you to a page that shows you what Google calls “Examples” of the type of warning being reported. Note the description. This one says that the URLs have been blocked by your robots.txt (see screenshot).
Also note especially the date in the “Detected” column. This will give you some context. What was going on around that date? In this case, is it safe to assume that the site was still in preview? Could it be that the robots.txt was blocking these URLs because the site hadn’t been published yet?
Think through anything that was going on that might have generated that warning, and then ask yourself if that situation is the same now.
So let’s say that you think this particular warning is no longer valid. There’s a way to test that, especially if the warning has to do with possible blocking by your robots.txt. As this next screenshot shows, click on “Health” and then “Blocked URLs.” The table at the top will tell you how many URLs are being blocked. More importantly, it will allow you to test a particular URL and see whether it’s currently being blocked.
In the next screenshot I’ve illustrated how to use the “Test URL” function in the “Blocked URLs” section of GWT. First I selected one of the URLs that Google reported (seen in my second screenshot above) and pasted it into the URLs field (1). Now I can click on the “Test” button (2) and see the results of my test below it (3). In this case it indicates that the URL is “Allowed,” which means that we’re okay to ignore these warnings.
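If you want to run the same allow/deny check outside of GWT, Python’s standard-library robots.txt parser performs an equivalent test. This is a minimal sketch: the rules and the example.com URLs below are hypothetical stand-ins, and in practice you would fetch your site’s live robots.txt rather than supply the lines inline.

```python
from urllib import robotparser

# Hypothetical robots.txt rules supplied as a list of lines.
# (Against a live site you would instead call rp.set_url(...)
# followed by rp.read() to fetch and parse the real file.)
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: Baiduspider",
    "Disallow: /",
    "",
    "User-agent: *",
    "Disallow: /private/",
])

# The same question GWT's "Test URL" answers:
# is this URL allowed for this crawler?
print(rp.can_fetch("Googlebot", "http://example.com/page.html"))    # True
print(rp.can_fetch("Googlebot", "http://example.com/private/x"))    # False
print(rp.can_fetch("Baiduspider", "http://example.com/page.html"))  # False
```

A check like this is handy for confirming that an old “blocked by robots.txt” warning no longer reflects the file you have published today.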
The bottom line on this is to remember that Google warnings were valid at the time they were detected, but that doesn’t necessarily mean you still have an issue. Take a bit of time to check it out, and you’ll save time in the long run by avoiding correcting problems that are no longer problems. (Thanks to Horizon Web Marketing for this demonstration example and screenshots.)
Google Webmaster Tools are just one of the many utilities that we cover in our Search Engine Workshops SEO Master class. To see our schedule of upcoming classes, click here.
Very informative article, Ross. I am also having a similar problem with my website. Around 2 weeks ago, due to a server error (which caused my site URLs to redirect infinitely), my search traffic from Google dropped almost 99%. The site started working fine after a couple of hours, but Google has not started sending traffic to me to this date.
I checked GWT, and it shows ‘1 blocked URL’ but I am unable to find out which URL, as there is no warning at all. In my whole Webmaster Tools account everything seems fine except this one, and I am starving for traffic.
Before all this happened, I used to enjoy 600-700 unique visitors a day, but now it is limited to 10-12.
Would you be able to help? Thanks.
@Sanat, what does your robots.txt look like?
@Aashish Sorry I overlooked your question for so long. Are you still having a problem with this?
Your article helped SGMAG. GWT was showing warnings in our sitemap. I went looking for answers and found this article. Following your directions, I found out I had mistakenly disallowed */media in robots.txt, thereby blocking all files in this category. I wanted to block images and forgot I had a media category. Anyway, thanks to your simple and clear instructions we got it fixed. Thanks a million!
I love getting feedback like this, Ally. Thanks for taking a moment to share it.
I have fixed all the errors in the robots.txt but I still get the warnings, and my traffic dropped because of this. How can I regain my traffic?
I am getting a 404 error page warning in GWT that starts with “clickLog?ck=”. When I mark it resolved, it returns again in a couple of days.
How can I get rid of it permanently?