SEO myth: a page loses visibility or relevance if it is not crawled frequently by the spider


I often hear SEOs say there is a link between a page’s crawl frequency and its visibility on the SERPs, as if each spider request brought a boost in visibility.

While it is true that websites with high PageRank are crawled by Google more often and more deeply, we cannot conclude there is any correlation with rankings. A resource is crawled more often simply so the spider can check whether its content has changed.

If a page has had the same content for years, there is no point in the spider crawling it over and over, since the search engine already has its content. If a page’s content changes often, the spider should crawl it more frequently in order to always have the up-to-date version.

Search engines can estimate approximately how often a resource’s content changes. This lets the spider crawl frequently updated pages more often and rarely updated pages less often.
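
To make the idea concrete, here is a deliberately simplified sketch in Python: it is not how any search engine actually works, but it shows the general mechanism of comparing a content hash between fetches and lengthening or shortening the recrawl interval accordingly.

```python
# Illustrative simplification only (not a real search engine's algorithm):
# keep a content hash per URL and adjust the recrawl interval depending on
# whether the page changed since the last fetch.
import hashlib

def update_crawl_interval(old_hash, new_content, interval_days,
                          min_days=1, max_days=90):
    new_hash = hashlib.sha256(new_content.encode("utf-8")).hexdigest()
    if new_hash != old_hash:
        # Content changed: come back sooner next time.
        interval_days = max(min_days, interval_days // 2)
    else:
        # Content unchanged: wait longer before the next visit.
        interval_days = min(max_days, interval_days * 2)
    return new_hash, interval_days
```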

Theoretically, webmasters can contribute to the spider’s decisions by creating an XML sitemap and adding a “changefreq” element to each URL entry, indicating how often the resource changes. This element, however, has been overused by webmasters in the past, so Google no longer gives it much consideration.
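
For reference, a minimal sketch of generating such a sitemap entry with Python’s standard library; the URL, date, and “daily” value are invented for illustration, and changefreq is only a hint to the spider, not a command.

```python
# Minimal sketch: build a sitemap.xml entry that includes the optional
# <changefreq> hint (values and URL are illustrative only).
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

urlset = ET.Element(f"{{{NS}}}urlset")
url = ET.SubElement(urlset, f"{{{NS}}}url")
ET.SubElement(url, f"{{{NS}}}loc").text = "https://www.example.com/blog/latest-news"
ET.SubElement(url, f"{{{NS}}}lastmod").text = "2016-03-01"
ET.SubElement(url, f"{{{NS}}}changefreq").text = "daily"  # a hint, not a command

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```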

This post is the English translation of a chapter of the Italian ebook “Mitologia SEO”, written by Enrico Altavilla.