Bing’s Tips for Optimizing JavaScript Sites for Search Crawlers

Bing has shared recommendations for optimizing websites built with JavaScript for search crawlers.

Crawling JavaScript sites is more complex than crawling static HTML sites because the crawler must download and execute the site's JavaScript files before it can render the page. This client-side rendering involves multiple HTTP requests, unlike the single HTTP request needed to fetch a static HTML page.

Having numerous HTTP calls to render a single page is not ideal. However, Bingbot has a strategy to handle this issue.
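The difference in request counts can be illustrated with a small sketch (the HTML snippets and the request-counting function are hypothetical, not Bingbot's actual logic): a client-rendered page forces the crawler to fetch every referenced script before rendering, while a prerendered page is complete after the initial request.

```javascript
// Illustrative sketch: each external script or stylesheet in the
// initial HTML adds one more HTTP request the crawler must make
// before it can render the page.
function countExtraRequests(html) {
  const matches = html.match(/<script[^>]+src=|<link[^>]+href=/g);
  return matches ? matches.length : 0;
}

// A typical client-rendered app shell: content only appears after
// the framework and app bundles are downloaded and executed.
const clientRendered =
  '<html><head><script src="framework.js"></script>' +
  '<script src="app.js"></script></head>' +
  '<body><div id="root"></div></body></html>';

// A prerendered page: the content is already in the HTML.
const prerendered =
  "<html><body><h1>Article title</h1><p>Full content</p></body></html>";

console.log(countExtraRequests(clientRendered)); // 2
console.log(countExtraRequests(prerendered));    // 0
```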

While Bingbot can render JavaScript, it doesn't support all the features available in the latest web browsers, and processing JavaScript at scale while minimizing HTTP requests remains challenging. Googlebot faces the same limitations.

Bing provides the following advice to reduce HTTP requests and ensure its web crawler can consistently render the most complete version of a site:

– Program the site to detect the Bingbot user agent.
– Prerender the content server-side and output static HTML.
– Use dynamic rendering as an alternative to heavy reliance on JavaScript.
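The steps above amount to dynamic rendering: detect the crawler's user agent and serve it prerendered static HTML, while regular visitors get the normal client-rendered app. A minimal sketch, assuming hypothetical helper functions and illustrative user-agent patterns (this is not Bing's published detection logic):

```javascript
// Hypothetical user-agent patterns for known search crawlers.
const BOT_PATTERN = /bingbot|googlebot/i;

function isSearchBot(userAgent) {
  return BOT_PATTERN.test(userAgent || "");
}

// Dynamic rendering: pick the prerendered static HTML for crawlers,
// the client-rendered app shell for everyone else. Both versions
// must carry the same content to avoid cloaking (see below).
function selectResponse(userAgent, prerenderedHtml, clientShellHtml) {
  return isSearchBot(userAgent) ? prerenderedHtml : clientShellHtml;
}

// Example usage with illustrative page bodies:
const prerendered = "<html><body><h1>Full content</h1></body></html>";
const shell =
  '<html><body><div id="app"></div><script src="app.js"></script></body></html>';

const bingUA = "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)";
console.log(selectResponse(bingUA, prerendered, shell) === prerendered); // true
```

In a real deployment the prerendered HTML would typically come from a server-side rendering step or a prerendering service, not a hard-coded string.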

These recommendations will help improve the predictability of crawling and indexing by Bing and should benefit other web crawlers too.

JavaScript for Dynamic Rendering = Cloaking?

A common question Bing encounters is whether serving prerendered content to search crawlers is technically considered cloaking.

Bing states that as long as the same content is displayed to all visitors, it is not considered cloaking.

Here is the exact quote:

“The good news is that as long as you make a good faith effort to return the same content to all visitors, with the only difference being the content is rendered on the server for bots and on the client for real users, this is acceptable and not considered cloaking.”
