Published: Jul 14, 2007 - 03:51 pm
Story Found By: vanessafox 2967 Days ago
A warning is something like an invalid date on an entry within the Sitemap file. For something like that, the Sitemap would still be accepted; the invalid date value just wouldn't be used.
In addition, it looks like they've launched some new alerts for the warnings section. They'll crawl a sampling of the URLs listed in the Sitemap and report any problems they find with them. At first glance this may seem the same as the Crawl Errors report, but it's actually different: crawl errors are errors Googlebot hit while accessing pages during the regular crawl process that is part of indexing, whereas these new crawling problems are reported before the regular crawling and indexing process begins.
You can use this to fix problems before Googlebot comes around to index the pages (which could help prevent a slowdown of the indexing process), and you can also use it as a sanity check to make sure the process you use to create the Sitemap isn't generating a bunch of bogus URLs.
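That kind of sanity check is easy to run yourself before submitting. Here's a minimal sketch in Python (the file contents, function names, and the exact checks are my own assumptions, not anything Google publishes) that scans a standard sitemaps.org-format file for entries with non-absolute URLs or a `lastmod` value that isn't a valid date:

```python
# Minimal Sitemap sanity check (illustrative sketch, not Google's validator).
# Flags entries whose <loc> is not an absolute URL or whose <lastmod>
# is not a valid YYYY-MM-DD date -- the kind of entry that would draw
# a warning in Webmaster Central.
import re
import xml.etree.ElementTree as ET
from datetime import date

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def valid_w3c_date(value):
    """Accept the date-only form of the W3C datetime format (YYYY-MM-DD)."""
    m = re.fullmatch(r"(\d{4})-(\d{2})-(\d{2})", value)
    if not m:
        return False
    try:
        date(int(m.group(1)), int(m.group(2)), int(m.group(3)))
        return True
    except ValueError:
        return False

def check_sitemap(xml_text):
    """Return a list of (url, problem) pairs for suspect entries."""
    problems = []
    root = ET.fromstring(xml_text)
    for url_el in root.findall(NS + "url"):
        loc = url_el.findtext(NS + "loc", default="").strip()
        lastmod = url_el.findtext(NS + "lastmod")
        if not loc.startswith(("http://", "https://")):
            problems.append((loc, "loc is not an absolute URL"))
        if lastmod is not None and not valid_w3c_date(lastmod.strip()):
            problems.append((loc, "invalid lastmod: " + lastmod.strip()))
    return problems

# A tiny sample file: the second entry has a bogus date.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc><lastmod>2007-07-14</lastmod></url>
  <url><loc>http://example.com/a</loc><lastmod>2007-13-40</lastmod></url>
</urlset>"""

print(check_sitemap(SAMPLE))
```

Running the check against the sample file reports only the entry with the impossible date (month 13, day 40), while the well-formed entry passes silently.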
Good job, Webmaster Central team!