
The eBay data center and adiabatic humidification

"The data center equipment should be operated in a temperature range of up to + 25 ° C, and cooling is preferably carried out using chillers or precision air conditioners." Until recently, thanks to the recommendations of the ASHRAE (American Society of Heating, Refrigerating and Air Conditioning Engineers, one of the most respected organizations in the field of refrigeration and air conditioning), this was an axiom. But the cost of electricity for cooling the data center grew along with tariffs and equipment capacity, and in the end, cooling systems began to consume 35-40% of all the energy required for the operation of the data center.


Intro


There is a traditional approach to reducing the energy consumption of a refrigeration unit: finding more efficient refrigerants and tuning system parameters. But this is evolutionary development, in essence a battle for a few extra percent of energy efficiency. Against this background, completely abandoning condensing units and switching to outdoor air can be considered a revolutionary path. In Murmansk or Norilsk such an approach would be fully justified. But a data center with free cooling in a hot desert looks, to a non-expert, like one of those non-obvious marketing moves invented for the sake of the "green" fad and other phenomena we do not yet quite understand.


It is all the more surprising that this solution has very little to do with marketing; such an unconventional approach is driven primarily by economic and technical considerations. The Mercury data center in Phoenix, Arizona, belongs to eBay, a company familiar to all of us. eBay chooses locations for its sites so as to minimize latency for users around the world: the company closes about $2,000 worth of deals every second, so it is vital that its services remain available 24/7/365. Geographically, the Phoenix data center is well placed. But the climate... A hot desert climate, with very hot summers and maximums around +50 °C. One might think that not every refrigeration machine can withstand such conditions, let alone free cooling. But the initial requirements, maximum equipment density and maximum computing power per watt consumed by the data center, left no choice: traditional cooling systems would have shattered all hopes of high energy efficiency. After a thorough analysis, specialists from Global Foundation Services (an eBay division) concluded that free cooling would best deliver the required efficiency, and announced a competition to design a data center in the desert cooled year-round by outside air.

Operating principle


Of course, you cannot take a conventional data center design with traditional server racks and simply rework its cooling system for outside air. Equipment built for the usual IT standard of +25...+27 °C simply would not survive the transition, because free cooling, especially in a hot region, cannot in principle guarantee such temperatures. What is needed is equipment capable of operating normally at higher temperatures. Such equipment was found in the Dell line: modular data centers with Dell PowerEdge servers and rack densities of up to 30 kW.
But what about air temperatures of up to +50 °C? The operating range of the PowerEdge equipment is somewhat wider (up to 45 °C for short peaks), but not that much wider! And here a solution that looks quite wild to most IT specialists was applied: adiabatic humidification, that is, cooling that uses the heat of vaporization of water to extract heat from the air. The idea is very simple: ordinary water, purified of impurities, is sprayed into the dry hot air (and the air in Phoenix is exactly that) as droplets typically 0.06-0.08 mm in diameter. The specific heat of vaporization of water is 2,260 kJ/kg, while the specific heat capacity of air is 1.006 kJ/(kg·°C). Thus, evaporating one kilogram of water can lower the temperature of roughly 2,200 kilograms of air by one degree. In practice, the temperature of the air stream drops noticeably (on average by about 7 degrees, depending on a combination of factors). The downside of this approach is the increase in humidity. Thanks to numerous stereotypes, everyone "knows" that high humidity means death for equipment: failures and premature server outages.
Numerous studies by industry giants have shown that this is not the case: most equipment can withstand a rise in temperature and an increase in air humidity without any harm.
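
To make the arithmetic above tangible, here is a minimal back-of-the-envelope sketch; the water-per-air ratios are illustrative assumptions, not actual figures from the Mercury data center:

```python
# Back-of-the-envelope estimate of adiabatic (evaporative) cooling.
# Illustrative values only; not the actual Mercury operating data.

LATENT_HEAT_WATER = 2260.0   # kJ/kg, specific heat of vaporization of water
CP_AIR = 1.006               # kJ/(kg*degC), specific heat capacity of air

def temperature_drop(water_evaporated_kg: float, air_mass_kg: float) -> float:
    """Temperature drop of an air mass when the given mass of water fully evaporates into it."""
    heat_absorbed = water_evaporated_kg * LATENT_HEAT_WATER   # kJ taken from the air
    return heat_absorbed / (air_mass_kg * CP_AIR)             # degC

# 1 kg of evaporated water vs. 2,200 kg of air: roughly one degree, as in the text
print(round(temperature_drop(1.0, 2200.0), 2))   # ~1.02 degC

# The ~7 degC drop mentioned above needs on the order of 3 g of water per kg of air
print(round(temperature_drop(0.0031, 1.0), 1))   # ~7.0 degC
```
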



Racks and servers designed specifically for such conditions tolerate high humidity as well, of course. Operational experience at the Mercury data center has shown that during short hot spells the evaporation of water can keep the air temperature at a level acceptable for the data center, while for most of the year adiabatic cooling is not needed at all: Phoenix has quite cool months too. There are no "peak" or backup cooling systems in the data center; the equipment is thus cooled by an inexpensive system that, thanks to the absence of complex units, is also highly reliable.

Nuances


Of course, implementing such a system, and in such an unconventional variant at that, involves a number of practical difficulties. It is extremely important to match the droplet diameter to the airflow velocity: if a droplet does not fit the "beauty standards", or the airflow is too fast, it will be carried out of the space where the heat exchange takes place, and the trick will not work. The water must meet fairly strict requirements for CaCO3 content and hardness (8-12 degrees of hardness, where 1 degree of hardness corresponds to 1 mEq/l of CaCO3 impurities), and the pH must not exceed 7, otherwise the elements of the cooling system will suffer corrosion. There are also less obvious difficulties: for example, what do you do with the water that does not evaporate, and how and where do you collect it?
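
As an illustration of how such feed-water limits might be checked in practice, here is a hypothetical sketch using the thresholds from the text above; the function and its use are an assumption for illustration, not eBay's actual tooling:

```python
# Hypothetical sanity check of spray-water parameters against the limits mentioned above.
# Here 1 degree of hardness corresponds to 1 mEq/l of CaCO3 impurities.

HARDNESS_RANGE = (8.0, 12.0)   # acceptable degrees of hardness
PH_MAX = 7.0                   # above this, cooling-system elements risk corrosion

def water_ok(hardness_deg: float, ph: float) -> bool:
    """Return True if the feed water fits the hardness range and the pH limit."""
    in_hardness = HARDNESS_RANGE[0] <= hardness_deg <= HARDNESS_RANGE[1]
    return in_hardness and ph <= PH_MAX

print(water_ok(hardness_deg=10.0, ph=6.8))   # True: within limits
print(water_ok(hardness_deg=15.0, ph=7.4))   # False: too hard and too alkaline
```
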

Profit


However, once these difficulties were overcome, this unconventional cooling system delivered enviable energy efficiency. The PUE (Power Usage Effectiveness, calculated as the total facility power divided by the power of the IT equipment) on an August day was 1.043; that is, auxiliary equipment, including the cooling system, consumes only about four percent of the data center's energy even in summer, and even less in winter, with a PUE around 1.018. The efficiency of condensing systems based on chillers or DX air conditioners is significantly lower: for them, a PUE around 1.3 is an achievement. Even on the hottest days, this "free" cooling system lets the servers run reliably. And remember who owns the site: if there were any doubts about the effectiveness and stability of such a solution, a company whose life depends on the availability of its services would never have adopted it. Yet the Mercury data center, with an area of 12,600 square meters and a capacity of 4 MW, has been operating for over a year.
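
To illustrate the PUE formula mentioned above, here is a small sketch; the absolute power figures are illustrative assumptions back-solved from the published PUE values, not measured data:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# Illustrative numbers only, derived from the PUE values quoted in the text.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

it_load = 4000.0                     # assumed IT load, kW (the ~4 MW capacity)
summer_total = it_load * 1.043       # total draw implied by the August PUE
winter_total = it_load * 1.018       # total draw implied by the winter PUE

print(round(pue(summer_total, it_load), 3))   # 1.043 -> ~4.3% overhead in summer
print(round(pue(winter_total, it_load), 3))   # 1.018 -> ~1.8% overhead in winter
```
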
Interestingly, this cooling system and the placement of data processing modules on the roof of the data center make it possible not only to cool them effectively but also to ramp up computing power quickly when needed. With special cranes, one and a half thousand servers can be lifted onto the roof in twenty minutes; they are then quickly connected to power and water, and within an hour they are in service. The data center can rapidly expand its capacity to 6 MW and has the infrastructure needed to grow it to 12 MW. 12,600 square meters is not much by the standards of modern data centers, but this kind of power and density is serious.
The use of free cooling together with adiabatic evaporative cooling in a "hot" data center is a bold, unconventional, non-obvious, but already proven solution.



Of course, precision air conditioners and chillers are not going anywhere, and raising the average air temperature in an air-cooled data center must be approached carefully. But if ASHRAE, in its 2011 recommendations, recognizes the existence of equipment classes A3 (up to 40 °C) and A4 (up to 45 °C), and eBay is already running such equipment at full tilt, then neither humidity nor elevated temperature should be feared merely because of a rumor about their poor compatibility with servers. Competently selected equipment, an efficient cooling system, and well-established monitoring: these are all the secrets of super-efficient data centers, whose share will surely grow in the coming years, including in our country.

Closer to home


Where do such conclusions come from? The reason is simple: Federal Law FZ-261 sets a fairly rigid framework for all major consumers of resources, along with ambitious energy-efficiency targets: a 40% improvement by 2020. Switching to natural refrigerants and installing new thermostatic valves will not reach those targets; moreover, for a rather large investment they do not always bring any tangible savings. But moving to a fundamentally different data center cooling paradigm based on outside air yields almost exactly the required tens of percent and, given constantly rising electricity tariffs, substantial savings in data center operation. Money, the growing power of server hardware, and the regulatory documents that have been multiplying like mushrooms after rain since FZ-261 appeared: this is what will very soon bring free cooling to domestic data centers.

Source: https://habr.com/ru/post/163821/

