News

John Mueller of Google Predicts Dynamic Rendering Will Become Obsolete in a Few Years

Google’s John Mueller suggests that dynamic rendering is likely to be a temporary solution for aiding web crawlers in processing JavaScript. He anticipates that eventually, all web crawlers will be capable of processing JavaScript, potentially eliminating the need for dynamic rendering in a few years.

Mueller made this prediction during a recent Google Webmaster Central hangout in response to a site owner’s inquiry about the viability of using dynamic rendering. The question posed was:

“We’re thinking of the option to start only serving server-side rendering for bots on some of our pages. Is this an accepted behavior by Google & friends nowadays? Or do you see any objections on why not to do this?”

Mueller assured the site owner that dynamic rendering is an approach Google accepts. However, he indicated that websites may not need to depend on it as heavily in the future. Googlebot can already process all types of JavaScript pages, and Mueller expects other crawlers to advance similarly. He describes dynamic rendering as a temporary workaround while those other crawlers improve, clarifying that "temporary" could span several years.
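The setup the site owner asked about, serving server-rendered HTML only to bots, boils down to branching on the User-Agent header. The sketch below is a minimal illustration, not a production recipe: the bot list and function names are hypothetical, and real deployments typically pair a check like this with a headless-browser renderer and a cache.

```python
# Minimal sketch of dynamic rendering: serve pre-rendered HTML to known
# crawlers and the normal client-side app shell to everyone else.
# The signature list below is illustrative, not exhaustive.

BOT_SIGNATURES = ("googlebot", "bingbot", "twitterbot", "facebookexternalhit")

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def choose_response(user_agent: str) -> str:
    # In a real setup, the pre-rendered branch would be filled by a
    # headless-browser render (cached), and the other branch would serve
    # the regular JavaScript bundle.
    if is_bot(user_agent):
        return "prerendered HTML"
    return "client-side app shell"
```

As Mueller notes, the same pre-rendered branch can sometimes be served to regular users as well, since delivering ready-made HTML is often faster than waiting for client-side JavaScript to execute.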

It’s noteworthy that dynamic rendering was only introduced last year at Google I/O 2018, and now, just over a year later, Mueller forecasts that this approach may only be necessary for a few years. Observers will be interested to see how this prediction evolves over time.

Mueller’s full response starts at the 18:38 mark, where he elaborates:

“So you can definitely do this, from our point of view. This is what we call, I believe, dynamic rendering, where you pre-render pages for specific users, typically crawlers and social media agents, which normally cannot process JavaScript. This can also be beneficial for users sometimes, significantly speeding up HTML delivery. While designed for bots, it’s worthwhile to explore if it’s advantageous for users too.

From our perspective, this approach is acceptable. However, over the long term, it might become less necessary. Googlebot can now crawl nearly all JavaScript-type pages, and I suspect other user agents will adapt over time.

Ultimately, I view this as a temporary workaround—potentially for a few years—until all relevant user agents can process JavaScript effectively.”
