Hurdles for Free Cooling

It is indeed a lot easier for Facebook, Google and Microsoft to operate data centers with "free cooling". After all, the servers inside those data centers are basically "expendable"; there is no need to make sure that an individual server does not fail. The applications running on top of those servers can handle an occasional server failure easily. That is in sharp contrast with a data center that hosts servers of hundreds of different customers, where the availability of a small server cluster is of the utmost importance and regulated by an SLA (Service Level Agreement). The internet giants also have full control over both facilities and IT equipment.

There are other concerns and humidity is one of the most important ones. Too much humidity and your equipment is threatened by condensation. Conversely, if the data center air is too dry, electrostatic discharge can wreak havoc.

Still, the humidity of the outside air is not a problem for free cooling as many data centers can be outfitted with a water-side economizer. Cold water replaces the refrigerant, pumps and a closed circuit replace the compressor. The hot return water passes through the outdoor pipes of the heat exchangers. If the outdoor air is cold enough, the water-side system can cool the water back to the desired temperature.
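The decision logic behind a water-side economizer can be sketched in a few lines. The numbers below (chilled-water setpoint, heat-exchanger approach temperature) are illustrative assumptions, not figures from the article:

```python
# Hypothetical sketch: when can a water-side economizer carry the full load?
# A dry cooler can only bring the water to within some "approach" temperature
# of the outdoor air; below that, compressors must assist.

def economizer_can_cool(outdoor_c: float, supply_setpoint_c: float,
                        approach_c: float = 4.0) -> bool:
    """True if outdoor air alone can cool the return water to the setpoint."""
    achievable_supply_c = outdoor_c + approach_c
    return achievable_supply_c <= supply_setpoint_c

# With an assumed 18 degC chilled-water setpoint and a 4 degC approach,
# free cooling works whenever the outdoor air is at or below 14 degC.
print(economizer_can_cool(10.0, 18.0))  # True
print(economizer_can_cool(16.0, 18.0))  # False
```

The "approach" is the practical limit of any heat exchanger: the water never quite reaches the outdoor air temperature, which is why mild climates matter so much.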

Google's data center in Belgium uses water-side cooling so well that it does not need any additional cooling. (Source: Google)

Most of the "free cooling" systems are "assisting cooling systems". In many situations they do not perform well enough to guarantee, year round, the typical 20-25°C (68-77°F) inlet temperature that CRACs can offer.

All you need is ... a mild climate

But do we really need to guarantee a rather low 20-25°C inlet temperature for our IT equipment all year round? It is an important question, as data centers in large parts of the world could rely on free cooling if the server inlet temperature did not need to be so low.

The Green Grid, a non-profit organization, uses data from the Weatherbank to calculate the amount of time that a data center can use air-side "free cooling" to keep the inlet temperature below 35°C. To make this more tangible, they publish the data as color-coded maps. Dark blue means that air-side economizers can be effective for 8500 hours per year, which is basically year round. Here is the map of North America:

About 75% of North America can use free cooling if the maximum inlet temperature is raised to 35°C (95°F). In Europe, the situation is even better:

Although I have my doubts about the accuracy of the map (the south of Spain and Greece see many more hot days than the south of Ireland), it looks like 99% of Europe could make use of free cooling. So how do our current servers cope with an inlet temperature of up to 35°C?
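The Green Grid calculation above boils down to counting hours. A minimal sketch, using synthetic hourly temperatures as a stand-in for real weather data:

```python
# Sketch of the Green Grid style estimate: count the hours per year in which
# outdoor air could hold the server inlet below a given limit.
import math

def free_cooling_hours(hourly_temps_c, max_inlet_c=35.0):
    """Return how many hours per year air-side free cooling suffices."""
    return sum(1 for t in hourly_temps_c if t <= max_inlet_c)

# Toy year: 8760 hourly samples oscillating between -5 and 30 degC.
toy_year = [12.5 + 17.5 * math.sin(2 * math.pi * h / 8760) for h in range(8760)]

print(free_cooling_hours(toy_year))                    # 8760: the full year
print(free_cooling_hours(toy_year, max_inlet_c=25.0))  # noticeably fewer
```

The interesting lever is the second parameter: raising the allowed inlet temperature from 25°C to 35°C is exactly what turns large parts of the map dark blue.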

Free Cooling for the Data Center? Servers and High Inlet Temperatures

48 Comments


  • ShieTar - Tuesday, February 11, 2014 - link

    I think you oversimplify if you just judge the efficiency of the cooling method by the heat capacity of the medium. The medium is not a heat-battery that only absorbs the heat, it is also moved in order to transport energy. And moving air is much easier and much more efficient than moving water.

    So I think in the case of Finland the driving fact is that they will get Air temperatures of up to 30°C in some summers, but the water temperature at the bottom regions of the gulf of Finland stays below 4°C throughout the year. If you would consider a data center near the river Nile, which is usually just 5°C below air temperature, and frequently warmer than the air at night, then your efficiency equation would look entirely different.

    Naturally, building the center in Finland instead of Egypt in the first place is a pretty good decision considering cooling efficiency.
  • icrf - Tuesday, February 11, 2014 - link

    Isn't moving water significantly more efficient than moving air, because a significant amount of energy when trying to move air goes into compressing it rather than moving it, whereas water is largely incompressible?
  • ShieTar - Thursday, February 13, 2014 - link

    For the initial acceleration this might be an effect, though energy used for compression isn't necessarily lost, as the pressure difference will decay via motion of the air again (but maybe not in the preferred direction). But if you look at the entire equation for a cooling system, the hard part is not getting the medium accelerated, but keeping it moving against the resistance of the coolers, tubes and radiators. And water has much stronger interactions with any reasonably used material (metal, mostly) than air. You also usually run water through smaller and longer tubes than air, which can quickly be moved from the electronics case to a large air vent. Also, the viscosity of water itself is significantly higher than that of air, specifically if we are talking about cool water not too far above the freezing point, i.e. 5°C to 10°C.
  • easp - Saturday, February 15, 2014 - link

    Below Mach 0.3, air flows can be treated as incompressible. I doubt bulk movement of air in datacenters hits 200+ mph.
  • juhatus - Tuesday, February 11, 2014 - link

    Sir, I can assure you the Nordic Sea hits ~20°C in the summers. But still that temperature is good enough for cooling.

    In Helsinki they are now collecting the excess heat from a data center to warm up the houses in the city area. So that too should be considered. I think many countries could use some "free" heating.
  • Penti - Tuesday, February 11, 2014 - link

    Surface temp does, but below the surface it's cooler. Even in small lakes and rivers; otherwise our drinking water would be unusable and come out of the tap at 25°C. You would get legionella and the like. In Sweden the water is not allowed to be, or at least is not considered to be, usable over 20 degrees at the inlet, or out of the tap for that matter. Lakes, rivers and oceans can stay at 2-15°C at the inlet year round here in Scandinavia if the inlet is appropriately placed. Certainly good enough if you allow temps over the old 20-22°C.
  • Guspaz - Tuesday, February 11, 2014 - link

    OVH's datacentre here in Montreal cools using a centralized watercooling system and relies on convection to remove the heat from the server stacks, IIRC. They claim a PUE of 1.09.
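    For readers unfamiliar with the metric mentioned here: PUE (Power Usage Effectiveness) is simply total facility power divided by the power that actually reaches the IT equipment. The load figures below are made up for illustration:

    ```python
    # PUE = total facility power / IT equipment power.
    # A PUE of 1.0 would mean zero cooling/distribution overhead.

    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        """Power Usage Effectiveness for a facility."""
        return total_facility_kw / it_load_kw

    # A 1 MW IT load plus 90 kW of cooling and distribution overhead:
    print(round(pue(1090.0, 1000.0), 2))  # 1.09
    ```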
  • iwod - Tuesday, February 11, 2014 - link

    Exactly what I was about to post. Why didn't Facebook, Microsoft or even Google manage to outpace them? A PUE of 1.09 is still, as far as I know, an industry record. Correct me if I am wrong.

    I wonder if they could get it down to 1.05
  • Flunk - Tuesday, February 11, 2014 - link

    This entire idea seems so obvious that it's surprising they haven't been doing this the whole time. Oh well, it's hard to beat an idea that cheap and efficient.
  • drexnx - Tuesday, February 11, 2014 - link

    there's a lot of work being done on the UPS side of the power consumption coin too. FB uses Delta DC UPSes that power their equipment directly at DC from the batteries, instead of the wasteful invert-to-480VAC-three-phase, then rectify again at the server PSU level. They also use Eaton equipment with ESS, which bypasses the UPS until there's an actual power loss (for about a 10% efficiency pickup when running on mains power).
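    The efficiency argument in this comment can be sketched numerically. The per-stage efficiencies below are assumptions for illustration, not measured figures for Delta or Eaton hardware:

    ```python
    # Illustrative comparison: a classic double-conversion UPS path versus an
    # eco-mode/ESS bypass that only engages the inverter on a mains failure.

    def delivered_power(load_kw, stage_efficiencies):
        """Power left after passing through each conversion stage in turn."""
        p = load_kw
        for eff in stage_efficiencies:
            p *= eff
        return p

    double_conversion = [0.95, 0.95]  # rectify, then invert (assumed 95% each)
    eco_bypass = [0.99]               # near-straight-through on mains (assumed)

    print(delivered_power(100, double_conversion))  # ~90 kW reaches the PSUs
    print(delivered_power(100, eco_bypass))         # ~99 kW
    ```

    The gap between the two paths is roughly the "10% efficiency pickup" the comment describes.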
