
Ten Popular SEO Myths About Google Promotion

Hi, Habrahabr!

Among webmasters promoting sites on Google, a number of SEO myths and misconceptions have emerged and taken hold.

The reasons are a lack of information or confirmation from the source (Google), and inattentive reading or misunderstanding of the official documentation.


1. The existence of the "Sandbox"


It is claimed that Google applies a special algorithm to young sites that suppresses their ranking.

Webeffector writes in its wiki encyclopedia:
“From the outside, a site in the sandbox looks like this. The resource seems fine: it is in the Google cache, and no complaints are raised against it. Yet it not only fails to appear on the results page for key queries unique to that site, it does not even come up first when its own address is typed into the search box! For the search engine, the resource seems not to exist. If so, rest assured: the site is in the Google sandbox. The filter lasts from 6 months to a year.”

What really happens?

There is no “sandbox,” says Google employee Andrey Lipattsev. “For a couple of months we collect all the data about a site, and during that time it is ranked based on what it is; after that it may begin to rank either higher or lower.”

Google's FAQ says the same thing in other words:
“At first, the pages of a new site may be ranked without taking some factors into account [for example, the geo-dependence of its content - author's note], because those factors have not yet been determined for the site. Exact and near duplicates of pages, content copied from other sites, thin pages, poor-quality texts and other shortcomings may not be identified immediately. But later they will take effect, and the site will lose its initial positions.”

2. Google's "snot"


Probably every webmaster has heard of the supplemental index, or its slang name "snot." It is believed that you can get a list of the site pages that are in the main index using the operator site:yoursite.ua/& . Accordingly, a list of pages from the supplemental index can be obtained with the combination site:yoursite.ua -site:yoursite.ua/& .

There were indeed reasons to believe that a supplemental Google index existed: it was described on the Google Webmaster Central Blog on July 31, 2007.

Almost 8 years have passed since that publication, and everything has changed.

“If you enter site:pdapps.ru/ in the search box, you get the whole index of pages, and if you enter site:pdapps.ru/&, that is the main index, which in theory should appear in search. Correct me if I'm wrong,” a webmaster asked Andrey Lipattsev on the Webmaster Help Forum.

“You are indeed mistaken,” he answered. “Adding & to the query does not give you any useful information.”

A Google specialist, whom we asked to clarify the situation, said that the existence of a supplemental index is outdated information.

3. Google Filters


“Filter” is webmaster slang for some processing of site data that affects its ranking and indexing.

Google does not use any filters. Using this slang distorts the meaning of messages, since it is not always clear what exactly a given webmaster means by the word “filter.”

If you discuss or publish any information related to Google, it is advisable to use precise terminology.

In simplified form, the Google search engine resembles a clockwork mechanism in which all the “parts”, the algorithms, constantly interact with each other. So if you face a drop in your site's ranking in the SERP, exclusion from the index and so on, it is better to speak of “the [name] algorithm” or of manual actions.

The word “filter” is not used anywhere in Google's official help or publications for webmasters, including interviews.

4. The effect of behavioral factors on ranking


Another SEO myth is that Google takes behavioral factors into account for ranking purposes. Webmasters write articles on the topic in large quantities and debate it on forums.

Google does not use behavioral factors or social signals in ranking sites. Andrey Lipattsev, a specialist on Google's Search Quality team, said so at the CyberMarketing-2014 conference [1].

Since that statement he has been asked about it at almost every conference. Webmasters simply refuse to believe the seemingly obvious.

Later Andrey Lipattsev confirmed the previously expressed point of view:
“We do not take data submitted through Google Analytics into account when ranking. 'Clicking' the links in the results through pseudo-services that imitate users is a waste of resources that webmasters could put to better use elsewhere.” Matt Cutts has likewise said that metrics such as CTR and bounce rate are not used in the ranking algorithm.

To illustrate the general approach, here are quotes from two more Google employees.

John Mueller said: “I do not assume that [user behavior] is something #Google will use as a ranking factor.” [2]

Using clicks directly in ranking would not make sense because of the noise, said Gary Illyes [3].

5. "Not until the first star"


Remember that slogan from a famous brand's commercial? Among webmasters there is an opinion that it is enough to add valid structured-data markup to a page's code, and rich snippets will appear in the search results.

The point is that the markup itself, however valid, does not guarantee a rich snippet. We recommend studying Andrey Lipattsev's answer carefully:
“... I would recommend removing the purchased links pointing to the site, since while they are present the display of rich descriptions may be switched off automatically, and merely disavowing them may not be enough. At the latest regular re-evaluation, the site fell slightly below the threshold above which rich descriptions are shown for its pages. Such re-evaluations happen constantly, for all sites. Given that the display of rich descriptions is by no means guaranteed, I would simply keep working on the site as before and let Google make its assessments and decisions.
Note that rich descriptions can also be shown for a site: query, since special criteria apply to it.”

6. All pages of the site should be in the index


Open any SEO forum and you will find a thread titled “Why Google does not index the page ...”. The myth is that absolutely all pages of a site should be in the index.

“The index is not made of rubber,” Google engineer Rinat Safin said at a conference.

Googlebot periodically crawls the site and passes the relevant information to the algorithms responsible for assessing site quality, ranking and indexing pages.

Suppose a site has two pages with roughly the same content, but one of them is better than the other. Google's resources are not limitless either, so in our example only one page will be added to the index: the one with the better and more informative content.

Individual pages of a site may be excluded from the index. For more details, see our Habrahabr article “Does your site have a problem with Google?”

The exact number of indexed pages can only be found in Google Search Console.

7. The importance of 404 errors for SEO


A common misconception among webmasters is that pages returning a 404 error (the standard Not Found HTTP response code) may adversely affect the indexing or ranking of the site. Many believe that the Panda algorithm takes this information into account when assessing site quality.

Google Webmaster Trends Analyst Gary Illyes stated Google's position very clearly: “The opinion that 404 errors entail any penalties for the site is fundamentally wrong.”

“The web is volatile; old links can lead nowhere. Googlebot is indifferent to such links [that is, they do not affect any metrics - author's note]. If you find broken links, just fix them so visitors can use the site normally,” said John Mueller at one of the video hangouts. “I would not treat this as something important from an SEO point of view. It is just part of site maintenance: keeping the links up to date.”

“A 404 error on an invalid URL does not harm your site's indexing or ranking in any way. Even if there are 100 or 10 million such errors, they will not hurt your site's rating,” writes John Mueller on his Google+ page.

On August 11, 2015, a discussion on Twitter raised the question of whether 404 errors affect how the Panda algorithm rates a site.

Gary Illyes gave a negative answer to this question.
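John Mueller's advice boils down to "just fix the broken links." That routine check is easy to automate; below is a minimal sketch in Python using only the standard library. The URL list and the User-Agent string are placeholders of our own, not anything Google prescribes.

```python
# Minimal broken-link checker: reports every URL that does not answer 200 OK.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_links(urls, timeout=10):
    """Return (url, status) pairs for links that fail or return a non-200 code."""
    broken = []
    for url in urls:
        # HEAD avoids downloading page bodies; some servers may not support it.
        req = Request(url, method="HEAD", headers={"User-Agent": "link-checker"})
        try:
            with urlopen(req, timeout=timeout) as resp:
                if resp.status != 200:
                    broken.append((url, resp.status))
        except HTTPError as e:        # 404 Not Found, 410 Gone, 5xx ...
            broken.append((url, e.code))
        except URLError as e:         # DNS failure, refused connection ...
            broken.append((url, str(e.reason)))
    return broken

if __name__ == "__main__":
    for url, status in check_links(["https://example.com/",
                                    "https://example.com/no-such-page"]):
        print(url, status)
```

Feeding it the list of internal links from your sitemap and fixing whatever it reports is exactly the kind of routine maintenance Mueller describes.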

8. Come back, I will forgive everything


It is no secret that most sites use, or have used, purchased (unnatural) links for promotion.

Google can recognize such links and, like Viktor Marianych from “Nasha Russia,” punishes severely. Having received a manual action for purchased links, a site may lose its ranking and visibility in the search results.

Having got the manual action revoked, webmasters are often perplexed: why don't the site's positions come back? They forget that the site achieved those earlier positions fraudulently.

Analyzing one such site on the Webmaster Help Forum, Andrey Lipattsev replied:
“You completely overlook that someone was buying links to 'optimize' the site. The effect of that doping had to wear off sooner or later, and that is what happened. At the moment your site is ranked as it should be, without regard to the purchased links.”

9. I disavowed everything


Continuing from the mistake described in point eight, consider another myth: “Did the site receive a manual action for purchased links? Disavow the purchased (low-quality) links and you will be happy.”

This is not entirely true. Adding such links to the disavow tool is indeed an important step in “healing” an affected site, but it is clearly not enough.

According to the official documentation, the webmaster must first try to remove the problematic links. It is then highly desirable to include evidence of that work in the reconsideration request. Google's requirement to physically remove problem links is partly because deleting a previously disavowed link from the disavow file restores that link's status.

Unnatural links that could not be removed are then added to the disavow tool.
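For reference, the file uploaded to the disavow tool is a plain text file with one entry per line: a full URL to disavow a single page, or a domain: entry to disavow every link from a whole domain; lines starting with # are comments. A minimal sketch (all the domains below are placeholders):

```text
# Link-selling pages we asked to have removed, without success
http://spam.example.com/stuff/comments.html
http://spam.example.com/stuff/paid-links.html

# Disavow all links from this domain
domain:shadyseo.example.com
```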

We read the documentation:
Simply adding all backlinks to the disavow file is not considered a solution, and you will not pass the review.
The reconsideration request should describe in detail the actions taken to eliminate the problems.
It should also show all the results of those actions.

10. Disavowed links and Search Console


Some webmasters think that once a link has been submitted to the disavow tool, it should in time disappear from the “Links to your site” section of Google Search Console.

These are different tools. On the help forum a webmaster asked: “... is it possible that only the backlinks that physically exist are shown in the webmaster panel, and the disavow tool does not affect this list?” To which Andrey Lipattsev (Google) replied: “That version is correct.”

Links added to the disavow tool are not physically removed, so it is only logical that they cannot disappear from Search Console.

We hope the information presented in this article proves useful for webmasters.

Notes


1. searchengines.ru: “Google does not use social and behavioral signals in ranking.”
2. @SeoTelegraph (Twitter magazine about SEO): tweet from Aug 12, 2015.
3. Blogger: “Search Engine Optimization: How does Google determine content quality?”

Source: https://habr.com/ru/post/275909/

