
Data centers that look like chicken coops, and working in Antarctica: a selection of unusual data centers

Today we would like to tell you about several unusual data centers built in different parts of the world. How do you keep a data center running in the harsh conditions of Antarctica? And why do Yahoo's data centers look like chicken coops? Read on to find out.


/ photo Krishna CC

The "coldest" data center on the planet


By modern standards, the Ice Cube Lab data center is neither the largest nor the most powerful: it has about 1,200 computing cores and three petabytes of storage. However, one feature sets it apart from the rest - it is located in Antarctica, at the Amundsen-Scott South Pole Station.
The data center serves the neutrino detector of the IceCube Observatory, which consists of arrays of optical sensors embedded in the ice at a depth of more than a kilometer. The system helps scientists detect flashes of light caused by neutrinos from various astronomical phenomena and is used, among other things, to study dark matter.

The IceCube IT staff consists of just a few people, most of whom only travel to the station in summer for scheduled work; the rest of the support is provided remotely. Most communication with the center goes through the Iridium satellite network, although its data transfer rate is only 2,400 bps.

With such a low speed, the team has to resort to a number of tricks: compressing email attachments, multiplexing connections, and even using wireless links to connect field stations. Still, things are not all bad: for eight hours a day, NASA's GOES-3 satellite (a former weather satellite) provides the station with a megabit communication channel.
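To get a sense of what these numbers mean in practice, here is a rough back-of-the-envelope estimate. The link speeds come from the article; the 5 MB attachment size is an arbitrary example for illustration:

```python
# Rough transfer-time estimate for the two links mentioned above.
# The 2,400 bps (Iridium) and ~1 Mbps (GOES-3) speeds come from the article;
# the 5 MB attachment size is an arbitrary example.

file_size_bits = 5 * 1024 * 1024 * 8   # a 5 MB email attachment

for name, bps in [("Iridium, 2400 bps", 2_400), ("GOES-3, ~1 Mbps", 1_000_000)]:
    seconds = file_size_bits / bps
    if seconds >= 3600:
        print(f"{name}: about {seconds / 3600:.1f} hours")
    else:
        print(f"{name}: about {seconds:.0f} seconds")
```

On the Iridium link the same attachment that takes under a minute over GOES-3 would tie up the channel for almost five hours, which is why aggressive compression and multiplexing are unavoidable.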

Running a data center in Antarctica means accounting for a number of peculiarities. One of the main problems is near-zero humidity. Because of it, employees working near the servers have to wear antistatic vests and make sure that all grounding requirements are met.

In addition, low humidity damages the magnetic tapes used to store and transfer data - they literally begin to crumble. Engineers have to rehydrate them slightly by periodically placing the cartridges in the station's greenhouse. The staff even considered installing a humidifier in the server room, but abandoned the idea because of possible problems with condensation.

Another challenge is cooling the data center. It might seem that you could simply "open the door" and let the cold do the job. However, outside temperatures can drop to -70 degrees Celsius, which would quickly knock out the equipment. Instead, a dedicated ventilation system (with no air conditioning) regulates the air temperature by metering the flow of cold outside air. Even so, the station staff say it occasionally freezes up in extreme conditions.

Despite all the difficulties, the team is proud to keep the 150 Ice Cube Lab servers running near the South Pole with availability above 99.5%.
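For context, an availability of 99.5% corresponds to roughly 44 hours of permitted downtime per year. A quick sketch of the arithmetic (only the 99.5% figure comes from the article):

```python
# Downtime budget implied by a given availability level.
# Only the 99.5% availability figure comes from the article.

availability = 0.995
hours_per_year = 365 * 24

downtime_hours = (1 - availability) * hours_per_year
print(f"{availability:.1%} availability allows ~{downtime_hours:.0f} hours of downtime per year")
# -> roughly 44 hours
```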

Last year, the Ice Cube Lab team published a short video tour of the data center. You can watch it at the link.

One of the greenest data centers


LEED is a private green building certification program launched back in 1998. Companies whose buildings comply with LEED standards consume up to 25% less energy and may also be eligible for tax benefits from the state.

Today, about 25 data centers around the world hold LEED Platinum status, the highest level in the program. One of the first to receive it, back in 2009, was the Citi Data Center, which occupies 2 hectares in Frankfurt.

The Citi Data Center uses several energy-saving technologies. The facility is designed so that, 65% of the time, its cooling system relies on fresh outside air. Reverse osmosis reduces the amount of sediment in the cooling towers used to chill water, and rainwater is collected to irrigate the greenery on the data center grounds.

All operational waste is sent for recycling, and the "environmental friendliness" of the data center was maintained even during construction: construction waste was kept out of the city landfill, and careful design reduced the total length of the cabling needed by 250 kilometers.


/ photo Open Grid Scheduler CC

Data centers made of building blocks


When we talk about data centers, the image that usually comes to mind is a huge room full of server racks and equipment. However, there is a more compact alternative - the modular data center. Such data centers usually come as self-contained blocks with their own cooling systems, which let you quickly deploy IT infrastructure at whatever scale you need.

For example, the transportation company Ecotality used Instant modular data centers to move to a new office in 8 weeks, saving 120 thousand dollars on data center floor space and cutting cooling costs by 65%.

Some companies (for example, IBM and Elliptical) "pack" servers and supporting equipment into shipping containers. In this form, data center modules can be quickly transported over long distances by truck or by ship.

The advantage of "portable" data centers is the ability to place them closer to their customers or sources of information. One example is the Nautilus project , whose creators placed equipment on barges. This concept allows not only to use water for cooling the centers, but also to significantly increase their mobility.

Yahoo chicken coops


Yahoo's new data center project is called Yahoo Compute Coop. This is not a joke or a marketing ploy: the data centers really do look like chicken coops, and the resemblance is more than skin deep. The elongated shape of a chicken coop, together with a raised roof section, creates natural ventilation - warm air rises and escapes outside through dedicated openings.


Such a Yahoo data center can be cooled with plain outside air. When the outdoor temperature is between 21 and 29 degrees Celsius, air enters the data center through adjustable louvers (1) in the building's walls. A ventilation unit (2) then pushes it through a mixing chamber (4) into the server room (3); in this mode the air is not cooled, only filtered.

Fans mounted on the server racks blow the now-warm air into the internal corridor (5), from where it rises by natural convection into the "attic" (6) and is vented outside through adjustable louvers at the top (7).

If the outside temperature is above 29 degrees, the air is first cooled in the mixing chamber. If it is below 21 degrees, part of the hot air that has already passed through the server room is recirculated into the mixing chamber to warm the intake.
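The intake logic above boils down to three cases. A minimal sketch, assuming a simplified decision function (the 21 and 29 degree thresholds come from the article; the function and mode names are hypothetical):

```python
# A rough sketch of the intake-air logic described above (illustrative only;
# the temperature thresholds come from the article, everything else is a simplification).

def intake_mode(outside_temp_c: float) -> str:
    """Decide how the mixing chamber should treat outside air."""
    if outside_temp_c > 29:
        # Too hot: cool the intake air in the mixing chamber first.
        return "cool in mixing chamber"
    if outside_temp_c < 21:
        # Too cold: recirculate hot exhaust air from the server room
        # back into the mixing chamber to warm the intake.
        return "mix in recirculated hot air"
    # 21-29 C: outside air is used as-is, only filtered.
    return "filter only"

for t in (-5, 18, 25, 33):
    print(f"{t:>3} C -> {intake_mode(t)}")
```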

Yahoo did not arrive at this design right away. The company began building its own data centers in 2007, and at first they were unremarkable - ordinary rooms with active air cooling. The next project was called YTC (Yahoo Thermal Cooling): there, server fans blew hot air into a special enclosed area, from which it was forced out through an intercooler.

The project "chicken houses data centers" is the third iteration of the Yahoo project to optimize cooling systems in large enterprises with tens of thousands of computers. The company has already built several data centers according to a new plan. One of the first was opened in Lockport, New York. At the same time, its energy efficiency ratio ( PUE ) is 1.08 points, which is comparable with European data centers that use cold climate features for cooling racks.

In addition, the corporation has registered about 3 thousand patents, and intends to sell the technology to other large firms.




Source: https://habr.com/ru/post/358948/

