There are plenty of reasons why you’d want Googlebot to recrawl your website ahead of schedule. Maybe you’ve cleaned up a malware attack that damaged your organic visibility and want a clean bill of health so rankings recover faster; or maybe you’ve implemented site-wide canonical tags to eliminate duplicate content and want those changes recognized quickly; or you want to accelerate indexing for that brand-new resources section on your site.

To force recrawls, SEOs typically use tactics like resubmitting XML sitemaps, using a free ping service like Seesmic Ping (formerly Ping.fm) or Ping-O-Matic to try to coax a crawl, or firing a bunch of social bookmarking links at the site. Trouble is, these tactics are pretty much hit-or-miss.
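For the sitemap route, you don’t even need a third-party service: Google accepts a plain HTTP GET to its sitemap ping endpoint. Here’s a minimal sketch in Python, assuming your sitemap lives at a placeholder address; a successful ping only tells Google the sitemap changed, it doesn’t guarantee when the crawl happens.

```python
# Minimal sketch: ping Google's sitemap endpoint to nudge a recrawl.
# The sitemap URL below is a placeholder -- swap in your own.
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical sitemap location

ping_url = "http://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe="")

with urlopen(ping_url) as response:
    # A 200 response means the ping was received, not that a crawl is imminent.
    print("Ping returned HTTP", response.getcode())
```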

Good news is, there’s a better, more reliable way to get Googlebot to recrawl your site ahead of your standard crawl rate, and it’s 100 percent Google-endorsed.

Meet “Submit URL to Index”

Last year, Google updated “Fetch as Googlebot” in Webmaster Tools (WMT) with a new feature, called “Submit URL to Index,” which allows you to submit new and updated URLs that Google themselves say they “will usually crawl within a day.”

For some reason, this addition to WMT got very little fanfare in the SEO sphere, and it should have been a much bigger deal than it was. Search marketers should know that Submit URL to Index works as advertised, and is very effective at forcing a Google recrawl and yielding almost immediate indexing results.

Quick Case Study and Some Tips on Using “Submit URL to Index”

Recently, a client started receiving a series of notifications from Webmaster Tools about a big spike in crawl errors, including 403 errors and robots.txt file errors. These kinds of preventive alerts from WMT are relatively new and part of Google’s continued campaign to give site owners more visibility into their site’s performance (and help them diagnose performance issues), which started with the revamping of the crawl errors feature back in March.
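Before asking Google to recrawl anything, it’s worth reproducing those errors yourself. Here’s a rough diagnostic sketch using Python’s standard library; the site and page URLs are placeholders, and fetching with a Googlebot user-agent string only approximates what Googlebot sees.

```python
# Rough diagnostic sketch: check a page's HTTP status and whether
# robots.txt allows Googlebot to crawl it. URLs are placeholders.
from urllib import robotparser
from urllib.request import Request, urlopen
from urllib.error import HTTPError

SITE = "http://www.example.com"          # hypothetical site
PAGE = SITE + "/resources/new-guide/"    # hypothetical URL flagged in WMT

# 1. Check whether robots.txt blocks Googlebot from the page.
rp = robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()
print("Googlebot allowed:", rp.can_fetch("Googlebot", PAGE))

# 2. Check the HTTP status the page actually returns.
try:
    with urlopen(Request(PAGE, headers={"User-Agent": "Googlebot"})) as resp:
        print("HTTP status:", resp.getcode())
except HTTPError as err:
    # A 403 here would line up with the crawl errors reported in WMT.
    print("HTTP error:", err.code)
```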