
SEO for Google in 2018: everything new is well-forgotten old

The days when updates to Google's search algorithms rolled out in large batches and bore lovingly chosen zoological names are behind us. In the fall, one of the company's representatives casually noted that the algorithms are now adjusted several times a day, and the general public is notified of only a small fraction of these changes. This was to be expected: as artificial intelligence gains momentum, the development of the ranking system accelerates and becomes less discrete.



But this approach makes it harder for an ordinary site owner to separate the wheat from the chaff. To feel more confident, I decided to conduct a brief review of the innovations and strengthening trends that will affect search engine optimization in 2018, along with the recommendations that make sense to follow in their context. Below you will find five major trends.

Fast, and for a long time


UX manuals go on about the importance of fast loading, but as it turns out, it matters not only to impatient users. Just the other day, news appeared on the Google blog that page speed will be taken into account in ranking starting this July. The official statement is reassuring, promising that the penalties will hit only the most negligent, but the community suspects the requirements will be quite strict. While the average visitor, according to statistics, abandons a hanging page by the sixth second, for Google, according to one of its representatives, the optimal load time is two to three seconds. To check how your site is doing on this front, official sources recommend tools such as the Chrome User Experience Report, Lighthouse, and PageSpeed Insights. Many webmasters opt for AMP, which speeds things up considerably, but before following their example, weigh the downsides of that decision.
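If you want to automate such checks, below is a minimal sketch that queries the PageSpeed Insights API for a performance score. The v5 endpoint and response fields are taken from Google's public API documentation; the site URL and the optional API key are placeholders.

```python
# Minimal sketch: fetch a Lighthouse performance score from the
# PageSpeed Insights API (v5). Endpoint and field names follow
# Google's public docs; adjust if the API version changes.
import requests

def pagespeed_score(url, strategy="mobile", api_key=None):
    params = {"url": url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    resp = requests.get(
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
        params=params,
        timeout=60,
    )
    resp.raise_for_status()
    data = resp.json()
    # Lighthouse reports performance as a score between 0 and 1
    return data["lighthouseResult"]["categories"]["performance"]["score"]

print(pagespeed_score("https://example.com"))  # e.g. 0.87
```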
Another time-related signal that RankBrain is paying close attention to, according to an official statement by Nick Frost in September, is the duration of the session on the site. The pattern here is transparent: the longer a user stays on the site, the more relevant the information presented there is to their query, and the higher the site deserves to be placed in the results. According to SearchMetrics statistics, the average session duration for pages in the top ten is three minutes and ten seconds. With such numbers, it is perhaps unsurprising that position in the results also correlates with the volume of text: pages containing more than 2,000 words achieve the best results.

Requests "from the voice"


Around 40% of users now talk to Google every day through their phones: speech recognition has reached a level of accuracy where it genuinely saves time, simplifies things, and has become a full-fledged alternative to the keyboard. Experts predict that within the next three years the number of voice and typed queries may equalize, so it makes sense to start adapting now. There is, however, some good news: the restructuring required is not all that radical - it is enough that your key queries fit the patterns of natural speech. If you can read them aloud without feeling like a malfunctioning robot ("hd 1080 quality download torrent"), things are most likely not that bad.
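For a quick self-audit, here is a purely illustrative heuristic (my own assumption, not anything Google publishes) that flags which keywords in a list read like natural speech. The word lists and thresholds are arbitrary:

```python
# Toy heuristic: does a keyword read like conversational speech
# (question words, natural function words) rather than a noun stack?
QUESTION_WORDS = {"what", "how", "why", "where", "when", "who", "which", "can", "does"}
FUNCTION_WORDS = {"to", "a", "the", "for", "in", "of", "my", "near", "is", "do"}

def looks_conversational(keyword):
    words = keyword.lower().split()
    if not words:
        return False
    starts_like_question = words[0] in QUESTION_WORDS
    has_natural_syntax = any(w in FUNCTION_WORDS for w in words)
    return starts_like_question or has_natural_syntax

for kw in ["hd 1080 quality download torrent", "how to make a screenshot"]:
    print(kw, "->", looks_conversational(kw))
# hd 1080 quality download torrent -> False
# how to make a screenshot -> True
```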



[Figure: an example of a Featured Snippet (extended description) in search results]

It goes without saying that queries phrased as questions ("what is a transaction", "how to take a screenshot") will grow in popularity; it is equally natural that the results for them will feature Featured Snippets (specially formatted extended descriptions), which the voice assistant reads out first. As a result, the value of this format rises even further. There are no exact instructions for getting into the circle of lucky pages Google selects for extended descriptions, but, according to experts, the chances are improved by a concise, direct answer placed right under a question-style heading, and by clean formatting as short paragraphs, lists, or tables.


Personalization of search results


Adapting search results to the interests of a particular user has been a constant in the development of search services, and Google is no exception. In its striving to guess exactly what a person wanted to see when entering a query, the company collects, carefully stores, and feeds to RankBrain any available information about them and their actions: what their profile says, what they searched for before, which pages held their attention, which device they search from. The main purpose of processing this hodgepodge of data is to infer the user's intent, place it in the context of established habits and preferences, and select from the index the results most relevant to that picture.

All of the above relies on deep semantic analysis of page content. This means that the simple times when it was enough to add a couple of exact-match keywords and tune the keyword density are a thing of the past. RankBrain now parses the lexical composition of the entire text to make sure the source really knows what it is talking about - that is, that the keywords do not stand alone, but are surrounded by the words the algorithm associates with the relevant topic.
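To make the idea concrete, here is a deliberately crude sketch of "topical context": counting how many terms from a hand-picked topic vocabulary appear in a text. Google's actual semantic analysis is, of course, far more sophisticated, and the vocabulary here is a made-up example:

```python
# Crude proxy for topical coverage: the share of terms from a
# hand-picked topic vocabulary that appear in the page text.
import re
from collections import Counter

# Hypothetical vocabulary for a climbing-related topic
TOPIC_TERMS = {"carabiner", "rope", "harness", "belay", "ascent", "route"}

def topical_coverage(text):
    tokens = Counter(re.findall(r"[a-z]+", text.lower()))
    found = {t for t in TOPIC_TERMS if tokens[t] > 0}
    return len(found) / len(TOPIC_TERMS)

page = "Choosing a rope and harness for your first ascent: belay basics."
print(f"topic coverage: {topical_coverage(page):.0%}")  # 67%
```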

In addition to the relevance of the content, the algorithm checks whether what the site offers matches the user's underlying intent - to learn, buy, download, calculate. On the whole this plays into owners' hands: if your pages provide visitors with different kinds of value (for example, a climbing guide and an online equipment store), you need not worry about them competing for the same place in the results. In theory, the robot should sort them into different categories on its own, but to be safe, you can build in keywords that name the visitor's target action.
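As an illustration of what such intent bucketing might look like, here is a toy rule-based classifier. The markers are arbitrary assumptions of mine, not Google's actual categories or methods:

```python
# Toy rule-based intent classifier: bucket a query by the
# visitor's apparent target action using marker phrases.
INTENT_MARKERS = {
    "transactional": ["buy", "order", "price", "cheap"],
    "informational": ["what is", "how to", "guide", "tutorial"],
    "navigational": ["login", "official site"],
}

def classify_intent(query):
    q = query.lower()
    for intent, markers in INTENT_MARKERS.items():
        if any(m in q for m in markers):
            return intent
    return "unknown"

print(classify_intent("buy climbing rope"))      # transactional
print(classify_intent("how to tie a figure 8"))  # informational
```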

Finally, scrupulous study of user context, coupled with growing mobile traffic, leads to what is called hyperlocalization - building results with a strong reliance on the user's geolocation. Accordingly, companies whose services have at least some geographic anchoring will not go wrong by emphasizing it in their semantic core.

Mobile versions take priority


Talk of Google paying more attention to mobile versions goes back to 2016, but judging by the announced dates, it will soon finally become reality. A clarification is in order: the state of the mobile site will not merely be taken into account in evaluation - it will become the defining parameter, even when the user searches from a desktop. The mobile version of a site is now treated as the primary one.

So what to do? First of all, of course, properly adapt the site for phones. Google stresses that it is better to have no mobile version at all (in that case the robot will quietly fall back to analyzing the desktop version - if there is nothing there, there is nothing to judge) than to make one carelessly. Besides technical characteristics, the page's conformance to UX principles will play a role in ranking - first of all, sensible markup. In addition, experts advise making sure that the content of the desktop and mobile versions matches: carefully optimized texts will be of little use if they are trimmed to save space on a small screen.
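A rough way to spot-check such content parity, assuming the site serves different HTML to mobile user agents, is to compare the visible word counts of the two versions. The user-agent string and threshold below are arbitrary:

```python
# Rough content-parity check: compare visible word counts of the
# desktop and mobile renditions of a page.
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

MOBILE_UA = ("Mozilla/5.0 (Linux; Android 8.0; Pixel 2) "
             "AppleWebKit/537.36 (KHTML, like Gecko) "
             "Chrome/63.0 Mobile Safari/537.36")

def visible_words(url, user_agent=None):
    headers = {"User-Agent": user_agent} if user_agent else {}
    html = requests.get(url, headers=headers, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):  # drop non-visible content
        tag.decompose()
    return len(soup.get_text(" ").split())

url = "https://example.com"
desktop, mobile = visible_words(url), visible_words(url, MOBILE_UA)
print(f"desktop: {desktop} words, mobile: {mobile} words")
if mobile < 0.8 * desktop:  # arbitrary threshold
    print("warning: mobile version may be missing content")
```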

Link weight: from quantity to quality


Link exchanges, aggregators, bulk purchases - that is a whole era in SEO, but in 2018, after a long decline, it will probably come to an end. Gary Illyes shed light on Google's position on the link question in September of last year with this rather candid statement:

"In short, if you publish high-quality content that is actively cited on the internet - and I'm not talking only about links, but also about mentions on social networks and people just discussing your brand and all that crap - then you're doing great."

In other words, the point is not that link mass has lost weight as a ranking factor (according to official representatives, it remains the leading parameter alongside content) - it is that mentions without links are gradually reaching a comparable level of significance. The same algorithms Google uses for semantic analysis of content allow it to track whether others are talking about your company and, if so, in what tone. Accordingly, a bad reputation on the web will be harder to offset with a large number of "bare" links, and, conversely, companies that are discussed with enthusiasm, especially on reputable sites, can earn a lot of points even if no one links to their official website. SEO, in effect, is starting to merge with good old PR. The recommendations here essentially boil down to keeping track of mentions and trying to ensure that more good things are said about you.
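Tracking "bare" mentions can be partly automated. Here is a sketch that finds occurrences of a brand name in a page that appear as plain text rather than links; the brand name and HTML are hypothetical placeholders:

```python
# Find "bare" brand mentions: text occurrences of the brand name
# that are not wrapped in a link.
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def unlinked_mentions(html, brand):
    soup = BeautifulSoup(html, "html.parser")
    mentions = []
    for node in soup.find_all(string=lambda s: brand.lower() in s.lower()):
        if node.find_parent("a") is None:  # not inside a link
            mentions.append(node.strip())
    return mentions

html = ('<p>Great gear from AcmeClimb.</p>'
        '<a href="https://acmeclimb.example">AcmeClimb</a>')
print(unlinked_mentions(html, "AcmeClimb"))
# ['Great gear from AcmeClimb.']
```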

Incidentally, discussion is worth stimulating not only on third-party resources but also on your own site, even though many consider the comment section a relic of the past. It turns out that Google values it even more than activity on social networks.

So, on the whole, 2018 does not seem to hold anything shocking: good content is still valued, and black- and gray-hat promotion methods still meet resistance. However, the effectiveness of the technology behind these rules is steadily growing, and the traditional game of cat and mouse is increasingly losing its point - it is more profitable to follow the rules than to try to circumvent them. Mobile focus, fast loading, more specific and detailed search queries, and compliance with the basic requirements for content and brand reputation - this is what the SEO community should tune in to for the near future.

Source: https://habr.com/ru/post/347436/

