No more server-side rendering?

A couple of weeks ago, Google announced a long-awaited update to Googlebot. It now runs the rendering engine from Chrome 74, the same version we’re currently running in our browsers. What’s more, Google announced that from now on it will keep the engine regularly updated to ensure continued support for new web technologies.

It’s still missing support for some features, but various people have reported their websites now being crawled and rendered correctly by Googlebot. You can use ES6 syntax and many other modern features without polyfills or Babel transpilation, or at least with far less of it than if you were targeting browsers from 2015.
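For instance, here’s a rough sketch of the kind of plain modern JavaScript (ES6 and later) that the updated Googlebot should now be able to render without transpilation; the element id and API URL are made-up placeholders for illustration:

    // Modern features: class, async/await, fetch, arrow fn, template literal.
    class ArticleList {
      constructor(container) {
        this.container = container;
      }

      async load(url) {
        const response = await fetch(url);      // no fetch polyfill needed
        const articles = await response.json();
        this.container.innerHTML = articles
          .map(a => `<li>${a.title}</li>`)      // arrow fn + template literal
          .join('');
      }
    }

    // '#articles' and '/api/articles.json' are placeholder names.
    new ArticleList(document.querySelector('#articles')).load('/api/articles.json');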

It’s not all rainbows and unicorns, however: it’s known that Google crawls websites with a JavaScript-enabled browser far less often than without one, which is to be expected given that crawling a site with JS is considerably more expensive. So if you have a client-side website and you’re fine with Google updating its contents every other day, you’re all set; but if you need Google to pick up updates multiple times per day, you’ll still need to provide an SSR version.
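One common middle ground here is dynamic rendering: serve pre-rendered HTML only to known crawlers and the regular client-side app to everyone else. Below is a minimal sketch assuming Node and Express; the user-agent list is abbreviated and renderToHtml is a stand-in for a real SSR pipeline (e.g. ReactDOMServer.renderToString):

    const express = require('express');
    const app = express();

    // Crude bot detection by user agent; real lists are longer.
    const BOT_UA = /googlebot|bingbot|yandex|duckduckbot/i;

    // Placeholder for a real server-side rendering pipeline.
    async function renderToHtml(url) {
      return `<html><body><h1>Pre-rendered view of ${url}</h1></body></html>`;
    }

    app.get('*', async (req, res) => {
      if (BOT_UA.test(req.headers['user-agent'] || '')) {
        // Crawlers get fully rendered HTML.
        res.send(await renderToHtml(req.url));
      } else {
        // Regular users get the client-side app shell.
        res.sendFile(__dirname + '/public/index.html');
      }
    });

    app.listen(3000);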

Another thing to consider is support for search engines other than Google. Various experiments show that some of them can execute JavaScript to some extent, but certainly not as well as Google does. If you care about those search engines, you’ll still have to use SSR as well.

Still, this is a huge step forward from Google. The average website has grown considerably in size over the last few years, and Google also has to parse CSS to check for hidden content, so it’s reasonable to expect that soon enough it will have the capacity to crawl JavaScript sites as often as non-JavaScript ones. It’s just a matter of time.
