
Is it possible, with 0 rubles up front and 20 thousand rubles a month, to license the server software, refresh the corporate server fleet, and get proper hosting on top of that? I did it!

Our farm lived in the office: four servers about three years old and software "honestly purchased" at Gorbushka for 150 rubles per CD.
The hardware ran well enough, apart from two things: when the accountants hammered out their reports, the disks bogged down... and the disks themselves died off from time to time. The disk failures were the most painful part, since backups ran only at night and only two of the servers had RAID.
For a long time I tried to solve the riddle of the frequent disk failures; at first I assumed I was just getting defective drives... but I understood the real reason only after a tour of a real data center.
So I decided to pose as a serious prospective client and get myself invited to a proper data center, to look around and think. I typed the magic spell "data center services" into Google and phoned my way through the first two pages of results. It turned out almost all of them are cagey: some said I should just hand my server over at the building entrance and within a week (!) they would connect it, others didn't even know the address of their own DC. And the results were padded with small hosters who rent space in other people's data centers, pass it off as their own and take a markup. Only one DC invited me to come and see the hardware in person: e-Style in Bibirevo. I happily agreed and set off, without putting the tour off until tomorrow.

I arrived and parked in the inner lot behind the barrier (I had given my plate number to the manager, and a pass had been ordered). The first thing that surprised me was the huge modern buildings with flags and "R-Style" and "e-Style" set in concrete; they inspire confidence... Six months earlier I had dealt with hosters where, just to find the office, you had to interrogate every security guard and street sweeper wandering between the dumpsters, only to end up in some nook in a basement.
At the entrance the guard phoned the manager and confirmed my pass, for some reason without the typical business-center rudeness and waiting around... it turned out they run their own security service and answer for every client they let in.
They took me into the data center, accompanied by a manager and the engineer on duty. In principle they told me nothing new; almost everything I heard I had already read about the data center on the e-styleisp.ru website. But there are interesting details that for some reason nobody writes about:

1. Clean, cold air coming up from under the perforated floor. It turns out the room is completely dust-free and the air is clean... I asked the engineer about it; he says the air conditioners have powerful filters, and even so they are cleaned and replaced weekly.

2. I saw the large air-conditioning cabinets and asked how they were powered. It turns out those units, 54 kW each (!), are also fed through the UPS. I was surprised and asked why. It turns out that big air conditioners don't start immediately after power is restored: they idle for a while, equalizing the pressure in their compressors. So if the power blips for a couple of seconds in an ordinary data center, the air conditioners either hang completely or take 5 to 25 minutes to restart. There are only two ways to deal with this: either build a large (10-20 tonne) coolant buffer tank (in a chiller-fancoil system), or run the air conditioners through UPS units. They say that in Moscow only three commercial data centers did this properly, all three launched in 2008 and 2009; everything else is a class below, with the air conditioning at best on diesel generators and a few minutes of dead time... People here just don't like doing things right: it isn't much more expensive, but the reliability class is completely different. (I later did a back-of-the-envelope estimate of that buffer tank; see the sketch right after this list.)
3. No static electricity. I hadn't even imagined this could matter so much, but remembering a few dead computers in the office and the constant spend on replacement components, I realized this problem really can only be solved this way. Throughout the hall runs a ground loop made of thick copper strip, absolutely every piece of hardware is bonded to the loop, and the loop itself runs over a separate thick cable to its own earth at the VSU. The engineer started explaining the two earthing systems, PE and FE, and how the electrical codes make them rather involved... to be honest, I didn't follow all of it. What struck me more was how static buildup depends on air humidity. It turns out static is usually noticeable in winter, when indoor air is heated: heating dries the air out (relative humidity drops sharply), and dry air and dry surfaces conduct current much worse, so the charge has nowhere to drain. And all air conditioners dry the air heavily! Humidification ("steam injection") exists only in the large precision units, which nobody installs in an office for cost reasons, only in a data center.

4. I won't even bother describing the pair of diesel generators in front of the building, the battery rooms and the room-sized UPSes; all of that is as standard as everywhere else. But there is one more detail: as you know, all equipment has to be stopped and powered down periodically, such is life... fans and batteries wear out, and sometimes something simply burns out... and the more complex the system, the higher the chance that some element in it will die. The engineer spent about five minutes walking me through the elaborate procedure for taking an individual UPS subsystem out of service, all of it WITHOUT DISCONNECTING THE LOAD! Imagine completely replacing a large UPS (two cabinets weighing about a tonne and a half), with terminals and breakers rated for hundreds of kilowatts, and all of it without stopping the connected servers! I was impressed. Switchover to the backup systems takes less than half a mains sine-wave period and is completely invisible to the servers and equipment.
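
About that buffer tank from item 2: here is the back-of-the-envelope estimate I mentioned, a minimal Python sketch. The heat load, allowed temperature rise and coolant specific heat are my own illustrative assumptions, not figures from the tour.

# Rough estimate: how long a chilled-coolant buffer tank can carry the heat
# load while the chillers are restarting. All inputs are illustrative assumptions.

def ride_through_minutes(tank_tonnes, heat_load_kw, delta_t_k=5.0,
                         specific_heat_kj_per_kg_k=4.0):
    """Minutes of cooling the tank provides before it warms by delta_t_k."""
    mass_kg = tank_tonnes * 1000.0
    stored_energy_kj = mass_kg * specific_heat_kj_per_kg_k * delta_t_k
    return stored_energy_kj / heat_load_kw / 60.0   # kJ / (kJ/s) -> s -> min

# A 10-tonne tank against an assumed 200 kW heat load, allowing a 5 K rise:
print(round(ride_through_minutes(10, 200), 1), "minutes")   # ~16.7

So a tank in the 10-20 tonne range really does cover that 5-25 minute chiller restart window.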

Back in my office, I went over to the server rack and realized that the disks really were dying from dust, heat and vibration, because all three are there in abundance. That even stung a little. Before, I could blame it all on the manufacturers and their marketing, but now I understood that wasn't the point. Fine, something had to be done. I never did figure out how to provide proper conditions in an ordinary office: my estimate for a dedicated server room was shot down immediately, and nobody was going to clean filters every day.
Colocation rates are not exactly cheap, and hauling the old servers into the data center would have been an expensive and stupid idea. I came up with a different approach: only one server stayed in the office, assembled from the old ones (I pulled all the disks from the servers and rebuilt them into one box with RAID-6), and on it I deployed the domain controller and file storage. For everything else (VPN, web, MS Exchange with antispam, 1C, and a couple of specialized databases) I rented a server in the e-Style data center.
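
A side note on why RAID-6: two parity blocks per stripe cost you two disks' worth of capacity, but the array survives any two simultaneous disk failures. A tiny sketch of the arithmetic, with a purely illustrative disk count and size rather than my actual layout:

# RAID-6 keeps two parity blocks per stripe: usable space is (N - 2) disks,
# and the array survives any two simultaneous disk failures.

def raid6_usable_tb(num_disks, disk_tb):
    if num_disks < 4:
        raise ValueError("RAID-6 needs at least 4 disks")
    return (num_disks - 2) * disk_tb

# Example: eight 1 TB disks pulled from the old servers (illustrative numbers)
print(raid6_usable_tb(8, 1.0))   # 6.0 TB usable, any 2 disks may die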

The new genuine Intel server with a Xeon 5520 and 24 GB of DDR3 RAM, hosted on a 100 Mbit channel, costs my company 14 thousand rubles a month including deployment. The server supports virtualization, so I immediately put ESXi on it and created 4 virtual machines. I split the workloads neatly across the VMs; the hardest part was moving the accountants to terminal sessions: printers and the new 1C HASP keys kept surfacing, but I won. Another trick: I ordered the extra "external iSCSI storage" service, which turned out cheaper, faster and more reliable than building my own array... When I deployed the VMs from images on that storage, I couldn't understand how a full OS install finished in 4-5 minutes... that just doesn't happen, yet I saw it! After all, the box is attached over a gigabit link, so transfers ought to be slower than plain SATA, but in real work the external iSCSI RAID flies; the secret is most likely a multi-gigabyte cache.
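
Out of curiosity I later did the raw arithmetic on that gigabit link; a quick sketch, where the protocol-overhead figure and the single-disk rate are assumptions, not measurements:

# Rough throughput comparison: gigabit iSCSI link vs. a single SATA disk.
# The ~10% Ethernet/TCP/iSCSI overhead and the disk rate are assumptions.

GIGABIT_LINK_MB_S = 1000 / 8            # 125 MB/s raw line rate
ISCSI_OVERHEAD = 0.10                   # assumed protocol overhead
iscsi_effective = GIGABIT_LINK_MB_S * (1 - ISCSI_OVERHEAD)   # ~112 MB/s

SINGLE_SATA_DISK_MB_S = 100             # assumed sequential rate of one disk

print(f"iSCSI over gigabit: ~{iscsi_effective:.0f} MB/s")
print(f"single SATA disk:   ~{SINGLE_SATA_DISK_MB_S} MB/s")

# The link is not the bottleneck for one disk's worth of I/O, and with a
# multi-gigabyte write-back cache on the array, small writes are acknowledged
# from RAM - which is why the installs felt impossibly fast.
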
I also added license rental to the contract: MS Server 2008, MS Exchange and the user licenses. It turned out inexpensive (about 5% of what the software would cost to buy outright), an official contract with a Microsoft Gold Partner completely reassures the comrades in uniform, and on top of that you can upgrade to new versions at no cost and give up part of the licenses at any time...
The databases and files still had to be uploaded; I decided against pushing them over the Internet right away - better to drive over myself and see the server with my own eyes at the same time. I arrived, they led me to my server (my full name and contacts were on the tag), I asked for a console, they brought a monitor and keyboard - and it really was my server! I installed the system on it over IP-KVM; right up to the last moment I had a feeling there was a catch... that it would be some blade or a second-hand box... but no, a NEW, separate, honest-to-goodness 1U server. I plugged a drive into the hot-swap SATA bay, copied everything over, and left it there to hold backups - that won't hurt.
The data center also provides "monitoring", which I use (it's free!). It works like this: there is a "paper" runbook plus a list of SNMP objects and their acceptable values; if a value goes outside the allowed range, the engineer on duty opens my runbook and acts on it... I simply wrote "call me" and a list of phone numbers in it, and for the less critical things added "don't call at night, wait until morning". If something shows up in SMART on my drives, or a fan stops, they call me right away, and since all the hardware is rented, they also replace everything themselves; all that's left for me is to approve it and agree on the timing, if needed.
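
The same kind of threshold check is easy to script yourself; here is a minimal sketch using pysnmp, where the host address, community string, OID and limit are placeholders I made up, not the data center's actual values:

# Minimal sketch of an SNMP threshold check - the same idea the duty
# engineers follow by hand. Host, community, OID and limit are placeholders.
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

HOST = "192.0.2.10"                      # placeholder server address
OID = "1.3.6.1.4.1.2021.13.16.2.1.3.1"   # placeholder OID of a temperature sensor
MAX_ALLOWED = 45                         # allowed upper bound (degrees C)

error_ind, error_status, error_index, var_binds = next(
    getCmd(SnmpEngine(),
           CommunityData("public", mpModel=1),    # SNMP v2c, community "public"
           UdpTransportTarget((HOST, 161)),
           ContextData(),
           ObjectType(ObjectIdentity(OID)))
)

if error_ind or error_status:
    print("SNMP query failed:", error_ind or error_status.prettyPrint())
else:
    value = int(var_binds[0][1])
    if value > MAX_ALLOWED:
        print(f"value {value} is out of range (> {MAX_ALLOWED}) -> call the owner")
    else:
        print(f"value {value} is within the allowed range")
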
As a result, the average load on the servers (in terms of CPU utilization) is now about a third of what it was, the databases run much faster, and everything works quickly and reliably. Among my personal achievements: I mastered ESXi; it turned out no harder than lab exercises, and I got the hang of it in two days. An interesting thing, and it is the future.

Source: https://habr.com/ru/post/78297/

