
Updating the “Crawl Errors” section in Google Webmaster Tools

The Crawl Errors section is one of the most popular features in Webmaster Tools, and today we have rolled out a significant improvement that will make it even more useful.
We now detect and report many new types of errors. To help make sense of the new data, we have divided the errors into two types: site errors and URL errors.


Site errors


Site errors are errors that are not specific to a particular URL; they affect your entire site. These include DNS resolution failures, problems connecting to your web server, and problems fetching your robots.txt file. We used to report these as URL errors, but that didn't make much sense, because they are not specific to individual URLs. Instead, we now track the failure rate for each type of site-wide error, and we will also try to send you alerts when these errors become frequent enough to warrant attention.
Viewing site errors
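The three site-wide checks described above could be sketched in Python roughly as follows; the helper name, the plain-HTTP scheme, and the timeout are illustrative assumptions, not part of Webmaster Tools itself:

```python
import socket
import urllib.error
import urllib.request
import urllib.robotparser

def site_level_check(host):
    """Run the three site-wide checks: DNS, server connectivity, robots.txt."""
    results = {"dns": False, "connect": False, "robots_txt": False}
    try:
        socket.gethostbyname(host)  # DNS resolution
        results["dns"] = True
        with urllib.request.urlopen("http://%s/" % host, timeout=10):
            results["connect"] = True  # the web server answered
        parser = urllib.robotparser.RobotFileParser("http://%s/robots.txt" % host)
        parser.read()  # robots.txt could be fetched and parsed
        results["robots_txt"] = True
    except OSError:
        pass  # the first failing check leaves the remaining flags False
    return results
```

A failure at any of these stages would count as a site error, since every URL on the host is affected.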
In addition, if you don't have (and haven't recently had) any errors in these areas, as is the case for many sites, we won't bother you with this section. Instead, we'll just show you green check marks so you know everything is in order.
A site with no recent site errors

URL errors


URL errors are errors that are specific to a particular page. This means that when Googlebot tried to crawl the URL, DNS resolved correctly, it could connect to your server, and it could fetch and read your robots.txt file, but then something went wrong when requesting that specific URL. We divide URL errors into categories based on the cause of the error. If your site serves Google News or mobile (CHTML/XHTML) content, we show separate categories for those errors.
URL errors
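A per-URL check in this spirit, assuming the site-level checks already pass, could look roughly like this in Python; the crawler User-Agent string and function name are made up for illustration:

```python
import urllib.error
import urllib.request

def check_url(url):
    """Request one URL; return its HTTP status, or None on a connection failure."""
    req = urllib.request.Request(
        url, headers={"User-Agent": "Mozilla/5.0 (compatible; example-crawler)"})
    try:
        with urllib.request.urlopen(req, timeout=10) as response:
            return response.status  # the page answered, e.g. 200
    except urllib.error.HTTPError as err:
        return err.code  # a URL error, such as 404 Not Found or 500
    except urllib.error.URLError:
        return None  # could not connect at all (a site-level problem)
```

Statuses returned from the `HTTPError` branch correspond to URL errors: the host and server are reachable, but this one address fails.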

Less is more


We used to show you up to 100,000 errors of each type. Trying to absorb that much information is like drinking from a fire hose, and you had no way to tell which errors were important (your home page is down) and which were less so (someone made a typo in a link to your site). It was not realistic to review all 100,000 errors with no way to sort, search, or mark an error as fixed. In the new version, we focus on surfacing the most important errors first. For each category, we determine what we believe are the 1,000 most important errors. You can sort and filter this top 1,000, view details about each error, and tell us when it has been fixed.
Filtering and sorting errors by any column
Some sites have more than 1,000 errors of a given type; you can still see the total number of errors for each type, as well as a graph of errors over the last 90 days. For those worried that 1,000 error details plus total counts won't be enough, we are considering an API that would let you download every last error. Please let us know if you would find this useful.

We also removed the lists of pages blocked by robots.txt, because although they can sometimes be useful for diagnosing a problem with your robots.txt file, they were often full of pages you had blocked intentionally. We really wanted to focus your attention on errors, so information about URLs blocked from our crawler will soon move to the “Crawler access” feature in the “Site configuration” section.

Dive into details


Clicking an individual URL error in the main list opens a panel with additional details, including when the URL was last crawled, when the problem was first detected, and a brief explanation of the error.
Detailed error description
In the error details panel, you can click the link of the failing page to see for yourself what happens when you try to visit it. You can also mark the error as fixed (more on that below!), view help content for the error type, list the Sitemaps that contain this URL, see other pages that link to it, and even have Googlebot fetch the URL to get more information or to verify that it now works.
View pages that link to this URL.

Take action!


We are pleased to announce that the new Crawl Errors section lets you focus on fixing the most important problems first. We rank the errors so that those at the top of the list are ones you can act on right away, whether that means fixing broken links on your own site, fixing bugs in your server software, updating your Sitemap files to prune dead URLs, or adding 301 redirects so that users reach the “real” page. We determine this ranking from many factors, including whether the URL is included in a Sitemap, how many places link to it (and how many of those are on your own site), and whether the URL has recently received traffic from search.
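The remedies listed above can be loosely sketched as a triage helper: given the HTTP status Googlebot saw for a URL, suggest a likely fix. The status groupings below are our own illustrative assumptions, not the exact rules Webmaster Tools applies:

```python
def suggest_fix(status):
    """Map an HTTP status code seen by the crawler to a likely remedy."""
    if status == 404:
        return "fix the broken links, or add a 301 redirect to the real page"
    if status == 410:
        return "remove the dead URL from your Sitemap files"
    if 500 <= status <= 599:
        return "fix the errors in your server software"
    if status in (301, 302):
        return "update links and Sitemaps to point at the final URL"
    return "no action needed"
```

In practice the right fix also depends on context (for example, whether the missing page has a natural replacement), so a helper like this is only a starting point.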

Once you believe you have fixed the problem (you can verify your fix with Fetch as Googlebot), you can tell us by marking the error as fixed, provided you are a user with full access rights. This removes the error from the list. Errors you have marked as fixed will not reappear in the top list unless we encounter the same error again when recrawling the URL.
Select errors and mark them as fixed.

We have put a lot of work into the new Crawl Errors section, and we hope you find it very useful. Let us know what you think, and if you have any suggestions, visit our forum!

Source: https://habr.com/ru/post/139884/
