In 1998, when all our servers fit in Susan Wojcicki's garage, few people thought about how JavaScript or CSS interact with the search robot: these technologies were rarely used. At most, the creators of some pages used JavaScript to make design elements... blink. Much has changed since then. The web is now full of colorful, interactive sites that make heavy use of JavaScript. Today we'll talk about indexing resources with a complex interface and structure:
why Google handles them the way modern browsers do;
why you should allow access to external resources;
how our system analyzes JavaScript and CSS code.
Previously, we processed only the textual content in the body of the HTTP response and did not analyze how the page looks in a browser with JavaScript enabled. As a result, we could not include in search results pages whose important content is rendered only by JavaScript. Neither users nor webmasters were happy about that.
To solve this problem, we began processing JavaScript code as well when analyzing pages. This is not easy, but it is worth the effort. Over the past few months, our system has rendered a large number of sites the way an ordinary JavaScript-enabled browser would. Unfortunately, indexing does not always go smoothly. Let's look at a few problems that can affect your site's position in search results, and ways to prevent them:
If access to JavaScript and CSS resources is blocked (for example, by a robots.txt file) and Googlebot cannot process them, our system will not see pages the way users do. To make indexing more efficient, allow our robots to crawl these resources. This is especially important if you have a mobile site: based on the structure of the JavaScript and CSS code, we can determine whether your pages are optimized for smartphones and tablets.
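For example, a site that had previously blocked its script and style directories could open them up again with a few lines in robots.txt. A minimal sketch; the /js/ and /css/ paths are hypothetical, so use your site's actual directories:

```
# Let Googlebot fetch the resources it needs to render pages.
# The directory names below are examples only.
User-agent: Googlebot
Allow: /js/
Allow: /css/
```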
If your web server cannot cope with the volume of requests to crawl these resources, this can also prevent content from being processed properly. Make sure your servers can handle these requests at the required scale.
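One common way to soften that load, which we sketch here under the assumption of a Node.js server using the Express framework, is to serve scripts and stylesheets with long-lived cache headers so repeat fetches are cheaper. The paths and the one-week max-age are arbitrary example values:

```js
// Minimal sketch: static JS/CSS served with Cache-Control headers.
var express = require('express');
var app = express();

// maxAge sets the Cache-Control: max-age header; '7d' is one week.
app.use('/js', express.static('public/js', { maxAge: '7d' }));
app.use('/css', express.static('public/css', { maxAge: '7d' }));

app.listen(8080);
```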
We also recommend creating a simplified version of your site, so that users can view your content even in a browser that does not support JavaScript. A simplified version also helps those who have JavaScript disabled. Finally, not all search engines can currently execute JavaScript.
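One way to think about such a simplified version is progressive enhancement: serve the core content in the initial HTML and let JavaScript add extras on top. A minimal sketch, in which the /comments endpoint is hypothetical:

```html
<!-- The article text is in the HTML itself, so it is visible even
     when no JavaScript runs at all. -->
<article>
  <p>The full article text is served here by the server.</p>
</article>

<script>
  // JavaScript only enhances the page, e.g. by loading comments;
  // the content above does not depend on this code running.
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/comments?article=123');  // hypothetical endpoint
  xhr.onload = function () {
    // Render xhr.responseText into the page here.
  };
  xhr.send();
</script>
```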
If the JavaScript code is too complex or convoluted, Google may analyze it incorrectly. Consider whether you can simplify the code without sacrificing functionality.
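As one illustration of what simplification can look like (not an exhaustive rule), building DOM nodes directly tends to be easier for automated renderers to follow than injecting markup mid-parse with document.write:

```html
<script>
  // Harder for a renderer to follow: markup injected during parsing.
  // document.write('<p>Delivery is free on orders over $50.</p>');

  // Simpler and equivalent here: build the node and append it.
  var note = document.createElement('p');
  note.textContent = 'Delivery is free on orders over $50.';
  document.body.appendChild(note);
</script>
```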
Sometimes JavaScript removes content from a page rather than adding it, which also makes indexing difficult. If that content is available to users, make sure it remains accessible to Googlebot.
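A minimal sketch of the safer pattern, using a hypothetical collapsible "Specifications" section: keep the text in the markup and let JavaScript toggle its visibility instead of deleting the node outright, so the rendered DOM still contains the content:

```html
<section id="specs">
  <h2>Specifications</h2>
  <p>Weight: 1.2 kg. Battery life: 10 hours.</p>
</section>

<script>
  // The script adds behavior (click to collapse) but never removes
  // the content node, so the text stays in the rendered page.
  var heading = document.querySelector('#specs h2');
  var body = document.querySelector('#specs p');
  heading.addEventListener('click', function () {
    body.style.display = (body.style.display === 'none') ? '' : 'none';
  });
</script>
```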
We are currently working on a new tool that will make it easier to find errors in your code and help webmasters understand how Google processes their sites. It will appear in Webmaster Tools in the coming days. If you have any questions, ask them in our help forum and continue the discussion in our webmaster community.