Why the evergreen Googlebot is such a big deal [Video]

Jun 1, 2020 | Search Engine Optimization News





The evergreen Googlebot was a huge leap forward in Google’s ability to crawl and render content. Prior to this update, Googlebot was based on Chrome 41 (released in 2015) so that the search engine could index pages that would still work for users on older versions of Chrome. The drawback, however, was that sites built with modern features might not have been supported. This discrepancy created more work for site owners who wanted to take advantage of modern frameworks while still maintaining compatibility with Google’s web crawler.

Always up-to-date. “Now, whenever there is an update, it pretty much automatically updates to the latest stable version, rather than us having to work years on actually making one version jump,” said Martin Splitt, search developer advocate at Google, during our crawling and indexing session of Live with Search Engine Land. Splitt was part of the team that worked on making Googlebot “evergreen,” meaning that the crawler will always be up-to-date with the latest version of Chromium; he also unveiled it at the company’s I/O developer conference in 2019.

Twice the work. Before the advent of the evergreen Googlebot, one common workaround was to use modern frameworks to build a site for users, but to serve alternate code for Googlebot. This was achieved by identifying Googlebot’s user agent, which included “41” to represent the version of Chrome it was using.

This compromise meant that site owners had to create and maintain an alternate version of their content specifically for Googlebot, which was laborious and time-consuming.
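The workaround described above (often called “dynamic rendering”) boils down to a server-side user-agent check. Here is a minimal, illustrative Python sketch: the `serve`, `render_static_snapshot`, and `render_spa_shell` functions are hypothetical stand-ins, and the user-agent string is representative of what the Chrome 41-based Googlebot sent, not an exhaustive match.

```python
# Illustrative sketch of the pre-evergreen "dynamic rendering" workaround:
# detect Googlebot by its user-agent string and serve pre-rendered HTML
# instead of the modern JavaScript app. Function names are hypothetical.

# Representative user agent of the old, Chrome 41-based Googlebot.
OLD_GOOGLEBOT_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
    "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

def render_static_snapshot() -> str:
    # Stand-in for a pre-rendered HTML snapshot built for the crawler.
    return "<html><!-- pre-rendered snapshot for the crawler --></html>"

def render_spa_shell() -> str:
    # Stand-in for the modern JavaScript app served to users.
    return "<html><!-- JavaScript app shell for users --></html>"

def serve(user_agent: str) -> str:
    # The check is pinned to the "Chrome/41" token old Googlebot sent.
    if "Googlebot" in user_agent and "Chrome/41" in user_agent:
        return render_static_snapshot()
    return render_spa_shell()
```

Note that the check above is keyed to the “Chrome/41” token, which is exactly what made this approach fragile once Googlebot’s user agent began to change.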

Googlebot’s user agent, revisited. Part of the issue with updating Googlebot’s user agent to reflect the latest version of Chromium was that some sites were using the above-mentioned technique to identify the web crawler. A changed user agent could mean that a site owner who wasn’t aware of the update served no content to Googlebot at all, leaving the site uncrawled, and consequently unindexed and unranked.

To prevent disruption of its services, Google communicated the user agent change in advance and worked with technology providers to ensure that sites would still get crawled as usual. “When we actually flipped . . . pretty much no fires broke out,” Splitt said.
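The difference between a check that survived the switchover and one that broke can be sketched in a few lines. In this hedged example, the function names are hypothetical and the user-agent strings are representative of the old and new formats, not exact matches for every Googlebot variant.

```python
# Version-pinned vs. version-agnostic crawler detection (illustrative).

# Representative user agents before and after the evergreen update.
OLD_GOOGLEBOT_UA = (
    "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/41.0.2272.96 Safari/537.36 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)
NEW_GOOGLEBOT_UA = (
    "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
    "Googlebot/2.1; +http://www.google.com/bot.html) "
    "Chrome/124.0.6367.0 Safari/537.36"
)

def is_googlebot_pinned(ua: str) -> bool:
    # Brittle: stops matching as soon as the Chrome version changes.
    return "Googlebot" in ua and "Chrome/41" in ua

def is_googlebot_robust(ua: str) -> bool:
    # Keyed to the stable "Googlebot" product token instead.
    return "Googlebot" in ua
```

The robust check keeps matching as the embedded Chrome version rolls forward, which is why Google’s advance communication mattered most for sites relying on the pinned style of check.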

Why we care. The evergreen Googlebot can access more of your content without the need for workarounds. That also means fewer indexing issues for sites running modern JavaScript. This enables site owners and SEOs to spend more of their time creating content instead of splitting their attention between supporting users and an outdated version of Chrome.

About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.
