
Friday format: Unusual data center solutions



/ photo Dennis van Zuijlekom CC

The ever-growing load on computing systems is forcing companies to look for new approaches to data center design. In this article, we offer an overview of some of the most unusual data center solutions.



Underground data centers


Modern data centers consist of many thousands of servers processing confidential user information. To keep that data safe, companies take unexpected steps, such as building data centers underground. The advantages of this approach include rapid deployment: there is no need to construct a dedicated building, and installation can proceed in any weather.

An area of 1.5 square kilometers can be prepared for operation in 60 days. Underground facilities are also extremely resistant to natural disasters, a very useful property in tornado-prone regions.

An example of an underground data center is the brainchild of Iron Mountain, located at a depth of 67 meters in an abandoned 145-acre mine near Pittsburgh. Underground it stays cool: the temperature in the Iron Mountain mine remains below 14 degrees Celsius, which saves on cooling. For every 1 kW of server power, only 0.56 kW is spent on cooling, whereas a 1:1 ratio is typical for conventional data centers.
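
To put these numbers in perspective, here is a back-of-the-envelope comparison, a rough sketch using only the figures quoted above (a real PUE calculation would also count UPS losses, lighting, and other overhead):

```python
# Back-of-the-envelope comparison of cooling overhead, using only the
# figures quoted above (real PUE also counts UPS losses, lighting, etc.).

def partial_pue(it_kw: float, cooling_kw: float) -> float:
    """PUE counting only IT load and cooling power."""
    return (it_kw + cooling_kw) / it_kw

iron_mountain = partial_pue(1.0, 0.56)  # ~1.56
conventional = partial_pue(1.0, 1.0)    # 2.0

saved = 1 - iron_mountain / conventional
print(f"Iron Mountain: {iron_mountain:.2f}, conventional: {conventional:.2f}")
print(f"Total power saved per kW of IT load: {saved:.0%}")  # ~22%
```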

Because the abandoned mine sits far from the city, Iron Mountain was also able to save on electricity. The company now pays 5.5 cents per kWh, while data centers in big cities pay from 10 to 17 cents for the same amount of power. It also helps that the data center buys electricity at high voltage and performs the transformation itself: where conventional data centers take in 480 V, Iron Mountain takes in 4,160 V.
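
To get a feel for what this price gap means at scale, here is a rough estimate for a hypothetical facility with a constant 10 MW draw (the load figure is our assumption, not from the article):

```python
# Rough annual electricity bill for a hypothetical facility with a
# constant 10 MW draw, at the per-kWh prices quoted in the article.
LOAD_KW = 10_000          # assumed load, not a figure from the article
HOURS_PER_YEAR = 8_760

def annual_cost_usd(price_cents_per_kwh: float) -> float:
    return LOAD_KW * HOURS_PER_YEAR * price_cents_per_kwh / 100

print(f"Mine (5.5 c/kWh):   ${annual_cost_usd(5.5):,.0f}")  # ~$4.8M
print(f"City (10-17 c/kWh): ${annual_cost_usd(10):,.0f} to "
      f"${annual_cost_usd(17):,.0f}")                       # ~$8.8M to $14.9M
```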

A similar underground data center sits on the Norwegian island of Rennesøy. Built by Green Mountain, it covers 21,000 square meters and extends 100 meters underground. The space previously served as a warehouse for NATO military equipment.

The trend was picked up by LightEdge Solutions, a cloud computing and colocation provider. In 2014, the company invested $58 million in the creation of the world's largest underground complex, covering 18 square kilometers.

Another server farm that Wired finds impressive is located in an underground technology center in the Swiss Alps. Swiss Fort Knox comprises two separate data centers housed in Cold War bunkers, interconnected by redundant fiber-optic communication channels.

Multi-layered protection of the network and IT systems reduces any possible risk of disruption to a minimum, while climate control and power systems reduce dependence on the outside world: the center can operate autonomously for many weeks.

A distinctive feature of this data center is its “inaccessibility”: a thick layer of rock and a three-ton door protect it from man-made disasters, while security systems and a guard service operating to military standards protect it from terrorist threats.

It is worth noting that the data centers described above enlist the help of Mother Nature: water from underground sources is used to cool the computing modules, and Green Mountain draws 8-degree water from the fjord.

Running equipment heats up considerably, and the power draw of individual racks reaches 30-35 kW, so heat removal is one of the most important problems.

DCD Intelligence reports that the heat generated by data centers around the world would be enough to heat the entire United Kingdom. Unwilling to waste that much energy, the Russian company Yandex decided to put the released heat to use. Not long ago, a water reheating station opened in the Finnish town of Mäntsälä, collecting heat from the company's data center.

Mäntsälä now hosts the first of four planned phases of Yandex's server deployment, each rated at 10 MW. Air heated by the servers is driven by fans to the reheating station, warming water from 30 to 60 degrees.
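
As a rough sanity check of these figures, the water flow that one 10 MW phase could theoretically warm from 30 to 60 degrees follows from Q = m·c·ΔT (an idealized estimate that assumes all server heat reaches the water):

```python
# How much water can 10 MW of server heat warm from 30 to 60 degrees C?
# Idealized: assumes all heat is captured, which real systems do not achieve.
P_WATTS = 10e6           # one deployment phase, per the article
C_WATER = 4186           # specific heat of water, J/(kg*K)
DELTA_T = 60 - 30        # temperature rise, K

flow_kg_s = P_WATTS / (C_WATER * DELTA_T)
print(f"Maximum water flow: {flow_kg_s:.0f} kg/s "
      f"(~{flow_kg_s * 86_400 / 1000:,.0f} cubic meters per day)")
# -> ~80 kg/s, roughly 6,900 cubic meters per day
```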

In Russia, introducing such technology is complicated by high energy losses in heat distribution networks, but projects of this kind do exist. In the Irkutsk region, several companies (En+ Group, Huawei, CDS, Lanit) and local authorities plan to build a data center whose server heat will be transferred to the local combined heat and power plant.



/ photo by Sam Howzit CC

Data centers under water (and on water)


Data center designers are trying to make the most of what nature offers. Back in 2008, Google received a patent for a floating data center. The project envisions placing a data center 5 km offshore, an approach that eliminates the need to construct a separate building and exempts the operator from property taxes.

However, Google is in no hurry to bring the idea to life. Instead, the first step in this direction was taken by the startup Nautilus Data Technologies.

Arnold Magcale and Daniel Kekai, the company's co-founders, are building a network of floating platforms to offer equipment hosting services. In their view, such a design protects data centers from natural disasters and makes them mobile: if necessary, a platform can be towed from place to place.

Traditional data centers use enormous amounts of water for cooling; the US National Security Agency's data center in Utah, for example, consumes 1.5 million gallons (about 5.7 million liters) of water per day. The Nautilus team decided to change the concept and moved the data center onto the water, developing an original cooling system that draws water from directly beneath the barge and returns it to the source.

The first commercial Nautilus data center is now being built at the Mare Island naval shipyard. The founders are convinced that operating on a military base will help them protect their technology; subsequent data centers are also planned for military shipyards.

The project's main advantage is rapid deployment: the floating data center holds 800 server racks and can be made ready for operation in 6 months, anywhere in the world with a suitable body of water.

According to Kekai, the data center at the Mare Island shipyard is only a prototype. Grid electricity currently serves as the main power source, but management plans to build next-generation barges with fuel cells, making the data center self-sufficient. What remains is to figure out how to interest conservative customers, who prefer traditional data centers, in this technology.

While Nautilus Data Technologies builds its data center on the water, Microsoft decided to dive in headfirst. The company is working on Project Natick and has already developed a prototype capsule that can withstand the water pressure at depths of hundreds of meters.

“When I first heard about it, I thought: water, electricity, why do this at all?” said Ben Cutler, a Microsoft designer who works on Project Natick. “But the more you think about it, the more sense it makes.”

The Microsoft team is confident that mass adoption of such technology will cut the time needed to deploy a new data center from 2 years to 90 days.

Larry Smarr, a physicist and data-processing specialist, is confident that this approach has a future. He notes that cloud providers spend years searching for suitable data center locations in the hope of exploiting the advantages of the local environment.

Placing a data center under water solves the server cooling problem. Moreover, such capsules can be positioned close to waterfront cities, speeding up web services by reducing the latency that causes Internet users so much inconvenience.

The first experimental vessel of Project Natick was named Leona Philpot, after a character from the Halo game universe. The data center was tested in the Pacific Ocean, one kilometer off the coast, from August to November 2015.

The system was packed with pressure, humidity, motion, and leak sensors to better understand the possible causes of equipment failure. Yet not a single failure occurred during the entire test, which allowed the team to extend the experiment and even run some commercial workloads in the Microsoft Azure cloud during further testing.

During the experiment, Microsoft specialists monitored not only the state of the electronics but also the surrounding underwater environment. Acoustic sensors were used to gauge the noise emitted by the capsule's turbines and its effect on marine life. It turned out that the device's sound is drowned out by the slightest noise of fish swimming past.

Having successfully completed the first stage of the experiment, the research team is planning a new version of the capsule three times larger (Leona Philpot is 2.5 meters in diameter). At this stage, the division intends to bring in a group of alternative-energy specialists to take part in the design.

The company plans to generate electricity from underwater currents, which would also reduce heat emissions into the seawater. So far, the engineers have managed to ensure that no temperature change is detectable just a few inches from the capsule.

It is premature to draw conclusions about whether this bold project is technically feasible, let alone about when the first installations might enter service. For now, Project Natick remains at the research stage, and it is not yet clear whether the concept will be adopted by Microsoft or other cloud providers.

In addition to developing underwater data centers, Microsoft is laying submarine cables across the Pacific between China, South Korea, Taiwan, Japan, and the US West Coast. The New Cross Pacific, as the network is called, will increase data transmission speeds.

Another transatlantic cable, Hibernia Express, was commissioned in September 2015 and runs between Canada, Ireland, and the United Kingdom. So far, Microsoft is the only company that has decided to spend $300 million on laying the AEConnect cable.

The future network, more than 5,400 kilometers long, is designed to carry data at 100 Gbps, and Microsoft will use this transatlantic link to meet the growing demand for broadband and to support its cloud services.
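
For a sense of scale, here is what a single 100 Gbps channel moves per day (our arithmetic, not a figure from the article):

```python
# Daily volume moved by a single 100 Gbit/s channel.
GBPS = 100
SECONDS_PER_DAY = 86_400

terabytes_per_day = GBPS * 1e9 * SECONDS_PER_DAY / 8 / 1e12
print(f"{terabytes_per_day:,.0f} TB per day (~{terabytes_per_day / 1000:.1f} PB)")
# -> 1,080 TB per day, about 1.1 PB
```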

Dave Crowley, Microsoft's managing director for relations with global network operators, noted that of the 230 submarine cables in the world, very few can support coherent transmission at 100 Gbps.



/ photo Arthur Caranta CC

Future data centers


Looking at underwater and floating data centers and the cables stretching across the Atlantic, it seems that the future described in so many cyberpunk novels has already arrived. Further confirmation is the Data Tower concept created by Marco Merletti and Valeria Mercuri.

For its designers, the project is an answer to the question of what an environmentally friendly data center should look like. Their main goal is to explore how natural conditions can be put to use to simplify maintenance and reduce its cost.

Despite the tower's seemingly futuristic character, it could be built with today's technologies. According to the architects, the project is essentially a giant cylindrical three-dimensional motherboard. All components are mounted on the outside, while the interior remains hollow, forming the air duct of the cooling system. Fans are meant to create a natural-draft effect, drawing fresh cold air in from outside and expelling warm air through the hollow core.
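
The natural-draft idea can be quantified with the classic stack-effect formula ΔP = g·H·(ρ_outside - ρ_inside); the sketch below uses an assumed tower height and temperatures, since the concept's exact dimensions are not given here:

```python
# Stack-effect (natural draft) pressure for a tall hollow tower.
# Height and temperatures are our assumptions, not the architects' figures.
G = 9.81                # gravity, m/s^2
HEIGHT_M = 300          # assumed tower height
P0 = 101_325            # atmospheric pressure, Pa
R_AIR = 287.05          # specific gas constant of dry air, J/(kg*K)

def air_density(t_celsius: float) -> float:
    """Ideal-gas density of dry air at atmospheric pressure."""
    return P0 / (R_AIR * (t_celsius + 273.15))

# Cold Icelandic air outside, warm server exhaust inside the core.
draft_pa = G * HEIGHT_M * (air_density(5.0) - air_density(35.0))
print(f"Natural draft: ~{draft_pa:.0f} Pa")  # a few hundred pascals
```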

The tower's planned geographic location also helps with cooling. Iceland's climate is ideal for data centers, and thanks to cheap hydroelectric power, operating costs would be low.

The Data Tower consists of 24 pillars forming the skeleton of the building, onto which blocks of computing equipment are mounted. The blocks protruding from the walls each hold from 4 to 8 standard server racks, which opens up opportunities for scaling. According to Mercuri and Merletti, the Data Tower can accommodate up to 400,000 servers.
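
As a rough sanity check of that figure (the per-rack server density below is our assumption, not the architects'):

```python
# What does 400,000 servers imply in racks and equipment blocks?
SERVERS_TOTAL = 400_000
SERVERS_PER_RACK = 42    # assumed: one 1U server per slot in a 42U rack

racks = SERVERS_TOTAL / SERVERS_PER_RACK
blocks_min = racks / 8   # blocks hold 4 to 8 racks each, per the article
blocks_max = racks / 4
print(f"~{racks:,.0f} racks, i.e. {blocks_min:,.0f} to {blocks_max:,.0f} blocks")
# -> ~9,524 racks, about 1,190 to 2,381 blocks spread over the 24 pillars
```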

In conclusion, it is precisely such ambitious projects that push the rapidly developing IT industry forward. It may well be that in the near future machine rooms will be deployed in caves all over the world, server towers will rise at the North Pole, and the expression “deep web” will take on a literal meaning.

P.S. In our blog on Habr, we try not only to share our own experience running the 1cloud virtual infrastructure service, but also to cover related areas of knowledge. Don't forget to subscribe to updates, friends!

Source: https://habr.com/ru/post/301154/

