
What a web developer should keep in mind to get SEO right

Today it is hard to remain a specialist in just one thing. When building a new site or making various changes, you can often take care of the main things in advance, so that everything turns out clean and tidy right away, not only in terms of the code but also in terms of SEO.


So what should you keep in mind to do the job well and keep the SEO specialists happy with your work?

Brush up on general knowledge


First of all, you should probably improve your own knowledge of SEO (and perhaps of contextual advertising). Knowledge of this kind in the field of internet marketing never hurts, and it gives you an advantage: you do everything consciously, not just "as written in the spec". It is also useful for self-checking. This is the general foundation on which the remaining points are layered.

Close links from indexing



If you have worked with SEO audits, you probably know what external links, cyclic links, and dangling nodes are.

In general: external links lead to other sites, which means they take your visitors somewhere else; cyclic links lead back to the page they are on (also bad); dangling nodes are, for example, links to downloadable documents that do not lead on to any other pages. To all such links we add the attribute rel="nofollow", which tells search robots that the link does not need to be followed or taken into account. This way we preserve the page weight. For external links it is also worth adding target="_blank": that way your site stays open in the user's tab, and there is a better chance they will come back.

A frequent example of cyclic links is the menu, the logo, and breadcrumbs. These typical elements often contain links to the current page, and it is important to remember to close them from indexing right away (hide them from robots).

The most typical external links are the link to the site developer in the footer, links to social networks, news sources, the company's clients, phone numbers, email addresses, and so on.
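
As a minimal sketch of how such links might look in markup (the addresses, file names, and class names here are made up purely for illustration):

  <!-- External link: closed with nofollow and opened in a new tab -->
  <a href="https://external-site.example" rel="nofollow" target="_blank">Our partner</a>

  <!-- A "dangling node": a link to a downloadable document -->
  <a href="/files/price-list.pdf" rel="nofollow">Download the price list</a>

  <!-- A cyclic link: the breadcrumb item for the current page is better output as plain text -->
  <span class="breadcrumb-current">Contacts</span>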

Glue the mirrors together



Speaking of reducing duplicates, one cannot fail to mention the gluing of mirrors. It means that each of your pages should be available at exactly one address, while in practice the same page may be reachable at something like:

  http://site.ru/page
  https://site.ru/page
  https://www.site.ru/page
  https://site.ru/page/

That is, there are three main types of mirrors: by protocol (http / https), by the trailing slash at the end of the URL, and by the presence of the coveted three letters "www".

When setting up redirects, it is important to remember that for gluing mirrors together you should use a 301 (permanent) redirect, not a 302 (temporary) one.
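
As a rough sketch, such gluing can be done with 301 redirects in .htaccess, assuming an Apache server with mod_rewrite enabled (site.ru is a placeholder: substitute your own domain and the main mirror you actually want to keep):

  <IfModule mod_rewrite.c>
    RewriteEngine On
    # www.site.ru -> site.ru (permanent redirect)
    RewriteCond %{HTTP_HOST} ^www\.site\.ru$ [NC]
    RewriteRule ^(.*)$ https://site.ru/$1 [R=301,L]
    # http -> https (permanent redirect)
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://site.ru/$1 [R=301,L]
  </IfModule>

The trailing-slash mirror is often handled by the CMS itself; if not, it can be glued with a similar rewrite rule.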

If you have sorting / filtering / page navigation implemented through GET parameters, as a rule this also creates duplicates for the search engine. Therefore, on such pages it is better at least to set up gluing. For this, a special link <link rel="canonical" href="link to the main page"> is added to the <head>. It tells the robot that this page is not the main one and that the main version lives at the address in the href attribute. This usually gets rid of a lot of duplicates of your pages.
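
For example, a catalog page opened with sorting parameters could point back to its clean address (the URL here is purely hypothetical):

  <!-- On the page https://site.ru/catalog/?sort=price -->
  <link rel="canonical" href="https://site.ru/catalog/">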

Set the correct meta tags and microdata



Equally important is setting up the page meta tags. The title and description should be filled in at least for the key pages (the home page, about the company, contacts, catalog, services), and it is better to also take care of the rules by which meta tags will be generated for categories, products, or news items added later. You can take the page heading as the title and add general information to the description in the style of <meta name="description" content="Your description. More information on site.ru">.

In an ideal world, you can add structured markup right away. Well, at least Open Graph. This item comes up more and more often in implementation audits; it is usually quick and simple to do, but it can add decent value in how the site is displayed and how attractive it looks to customers. More about structured markup can be read here.
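
A rough sketch of such a block in the <head> (the texts and addresses are placeholders):

  <title>Plastic windows in Moscow | Company Name</title>
  <meta name="description" content="Production and installation of plastic windows. More information on site.ru">
  <!-- Minimal Open Graph markup for nicer previews when the page is shared -->
  <meta property="og:type" content="website">
  <meta property="og:url" content="https://site.ru/">
  <meta property="og:title" content="Plastic windows in Moscow | Company Name">
  <meta property="og:description" content="Production and installation of plastic windows.">
  <meta property="og:image" content="https://site.ru/images/preview.jpg">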

Also create a standard robots.txt. You can usually find one on the internet for the CMS you are using; it is different for each. Many CMSs also have an automatic generator or plugins that take care of this item.
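
The exact contents depend on the CMS, but the general skeleton looks roughly like this (the disallowed paths below are examples, not a ready-made recipe):

  User-agent: *
  Disallow: /admin/
  Disallow: /search/
  Disallow: /*?sort=

  Sitemap: https://site.ru/sitemap.xml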

Optimize the site loading speed



One of the main factors in a site's appeal to the user is its loading speed. This is understandable: waiting two minutes for a page to load was normal in the 2000s, but today the pace of information consumption is several times higher, and so are users' expectations. If you have to watch a preloader for more than five seconds, it gets annoying. When it comes to loading speed, SEO specialists are usually guided by the Google PageSpeed Insights tool.

The main focus of this audit is image optimization. Nowadays it is simply indecent to upload huge pictures without any processing or compression. If your picture weighs 2-3 megabytes, it usually means your page will load much longer than it could, and that you do not care about your product. This alone can immediately cost your page around 40 points (if things are really bad) in that same PageSpeed Insights.

The service also suggests moving JavaScript to the bottom of the page, since loading it blocks rendering of the visible content, as well as using file compression and the browser cache. Server modules are used for this, configured through .htaccess directives. You can prepare fairly universal rules with caching and compression periods in advance and then drop them into a project.

It looks like this:

  <IfModule mod_expires.c>
    ExpiresActive on
    ExpiresDefault "access plus 6 hour"
    ExpiresByType image/jpeg "access plus 7 day"
    ExpiresByType image/gif "access plus 7 day"
    ExpiresByType image/png "access plus 7 day"
    ExpiresByType image/x-icon "access plus 7 day"
    ExpiresByType text/css "access plus 6 hour"
    ExpiresByType application/javascript "access plus 6 hour"
    ExpiresByType application/x-javascript "access plus 6 hour"
    ExpiresByType application/x-shockwave-flash "access plus 6 hour"
  </IfModule>

  <IfModule mod_deflate.c>
    <filesMatch "\.(js|css|html|php|jpg|jpeg|png|gif|svg)$">
      SetOutputFilter DEFLATE
    </filesMatch>
  </IfModule>

Keeping the number of connected third-party files to a minimum also belongs to the topic of caching and compression. If the site we load them from stops working, we may not notice it in time. Besides, we cannot minify, compress, or cache an external resource. That is why it is good practice to host fonts and scripts (such as jQuery) on our own server.

Create a page for 404 errors


All non-existent pages should return the 404 response code. Many people forget about this. It is also recommended that this page contain the common navigation elements (header / menu), so the user can move on to a page they are interested in, plus a few links, usually to the home page and to the previous page. It is important that the user understands they have landed on a non-existent page, but is not pushed into leaving your site altogether.
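
On the server side, the key thing is that the response status really is 404 and not 200. In PHP, for example, a custom error page might start roughly like this (the template path is hypothetical):

  <?php
  // Make sure the missing page actually returns the 404 status code
  http_response_code(404);
  // Then render the usual layout: header, menu, and so on
  include __DIR__ . '/templates/header.php';
  ?>
  <h1>Page not found</h1>
  <p>The page you were looking for does not exist. Go back to the <a href="/">home page</a>.</p>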

Do not forget about the adaptive / mobile version



Every day, more and more searches are made from mobile devices. Therefore, if you make changes that affect how the site looks on desktop, do not be lazy: check that nothing in the adaptive version breaks because of it. If there is no mobile version, it is at least good practice to bring everything into a non-irritating fluid layout, so that the site does not have to be constantly dragged left and right when viewed from a phone. Besides, having a mobile version improves your position in the search results.
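
At the very minimum, a fluid layout starts with the viewport meta tag and a couple of media queries; a sketch with made-up class names:

  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* The container shrinks with the screen instead of forcing horizontal scrolling */
    .container { max-width: 1200px; width: 100%; margin: 0 auto; }
    /* Hide secondary blocks on narrow screens */
    @media (max-width: 768px) {
      .sidebar { display: none; }
    }
  </style>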

Close the development environment from indexing



If you are developing a new site or working on a subdomain, you need to close it from indexing so as not to fall under the affiliate filter (when there are several identical sites, the search engine decides which one is the main one, and only that one will do fine). This is usually done through a special robots.txt file in which everything is closed from all robots.
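
For a development copy, that robots.txt is as simple as it gets:

  # Close the entire development site from all robots
  User-agent: *
  Disallow: /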

The most important thing here is not to forget to remove this file afterwards, otherwise the search robot will never find out about your new site, and the client will surely be very unhappy.

A cherry on top (pagination pages)



For those who want to stand out, and for the perfectionists: we tidy up the pagination pages. The list of actions is quite simple (a sample <head> for such a page is sketched right after the list):

  1. We glue the page to the main one with the canonical tag (<link rel="canonical" href="link to the main page">).
  2. We add special tags to the <head> that tell the robot where the previous and next pages are. For this we use <link rel="next" href="address of the next page"> (on all pages except the last) and <link rel="prev" href="address of the previous page"> (on all pages except the first). Moreover, if we are on the second page, the pagination parameters should not be added to the link to the first page.
  3. On all pages except the first, we add the suffix "- page N" to the <title>, the description meta tag, and the <h1> (where N is the page number). This helps us avoid yet another whole series of duplicates.
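
For example, the <head> of the second page of a catalog might look roughly like this (the addresses are placeholders):

  <!-- https://site.ru/catalog/?page=2 -->
  <title>Catalog - page 2 | Company Name</title>
  <link rel="canonical" href="https://site.ru/catalog/">
  <link rel="prev" href="https://site.ru/catalog/">
  <link rel="next" href="https://site.ru/catalog/?page=3">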

Comment things out correctly


Often the client decides to remove some banner, or simply a block that cannot be filled in yet. And then, half a year or even a year later, they "miraculously" remember it and ask to bring it back. How you manage that is your problem. So if you need to hide things that may later be needed again, it is better to comment them out or keep a copy of the file. And the comment should not be a plain HTML one of the form <!-- ... -->. Such a comment stays in the page source, the search engine will see it, and it may well count it as junk code. Besides, it increases the amount of HTML being downloaded. It is therefore better to wrap such a comment, for example, in a PHP comment. And do not forget to note what it refers to, because it is far from certain that you yourself will remember in a year.
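
A sketch of the difference, assuming the templates are ordinary PHP files (the block and the class name are invented):

  <!-- Bad: an HTML comment stays in the page source and is downloaded by every visitor -->
  <!-- <div class="promo-banner">New Year sale!</div> -->

  <?php /*
    Promo banner hidden at the client's request; may be needed again later.
    A PHP comment never reaches the browser or the search engine.
  <div class="promo-banner">New Year sale!</div>
  */ ?>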

Final checklist



To sum up, we can put together an approximate checklist for ourselves:

  1. Close external, cyclic, and dangling links from indexing (rel="nofollow", plus target="_blank" for external ones).
  2. Glue the mirrors together (protocol, "www", trailing slash) with 301 redirects, and use rel="canonical" for GET-parameter duplicates.
  3. Fill in the title and description, add structured markup (at least Open Graph), and create robots.txt.
  4. Optimize loading speed: compress images, move scripts to the bottom, enable caching and compression.
  5. Make a proper 404 page that returns the 404 response code.
  6. Check the adaptive / mobile version.
  7. Close the development environment from indexing (and do not forget to open it again on release).
  8. Tidy up the pagination pages (canonical, prev / next, "- page N").
  9. Comment out hidden blocks on the server side, not in plain HTML.

These are perhaps the main points to keep in mind when developing or optimizing a site in order to approach the Zen of search engine promotion (well, as far as a developer can).

Source: https://habr.com/ru/post/417503/

