
VOICES | DATA CENTRE COOLING

Broader perspective

The idea that data centre loads work most reliably at an ambient 18-21°C is outdated, according to Professor Ian Bitterlin, who says ASHRAE guidance offers much wider limits for temperature and relative humidity

The recent CIBSE Journal article on data-centre cooling and, specifically, humidification ('Mist opportunity', CIBSE Journal, April 2019) is misleading, as it lays down a set of thermal requirements for the reliable operation of information and communications technology (ICT) servers and storage equipment that were superseded more than a decade ago. The basis of the article, adiabatic or evaporative systems as a low-energy form of cooling in a data centre, is not an issue. However, suggesting that data centre loads 'operate most reliably at an ambient temperature of 18-21°C and a relative humidity (RH) of more than 45%... and air conditioning is essential' is rooted in the past, and will waste a lot of cooling energy in the process.

Temperature and humidity control in data centres is a long story of continuing energy reduction, which started in the mid-1950s with IBM machine rooms and has accelerated in development since 2004 through the auspices of ASHRAE. The original requirements from the IBM Planning Guide for mainframe rooms were based on tight temperature and humidity limits, for two technical reasons.

First, magnetic tapes did not like rapid temperature variation over time, as it caused read/write errors; this manifested itself in a 21°C ±1K temperature specification, although it was not stated where that was to be maintained. The result was the application of precision air conditioning with a return-air setpoint of 21°C, cold data centres with air below the raised floor at 12-15°C, and more than 50% bypass air.

Second, punched cards for data sorting, made of thin compressed cardboard, were sensitive to humidity.
Below 45% RH, the high-speed card-sorting machines produced a prodigious static electric charge that cracked like mini-lightning discharges every time an operator touched the cabinet frame. Above 55% RH, the cards absorbed moisture and the sorting machines jammed. The resulting specification, 50% RH ±5%, made precision humidity control a necessity. This was not seen as a problem from the mid-1950s to 2000, however, because energy was cheap, and the business enablement of ICT produced near-exponential opportunities for commercial cost-effectiveness and growth.

It was not until 2004 that ASHRAE changed the cooling landscape of data centres, and it has continued to do so by publishing the Thermal Guidelines.1 The committee that writes these is populated by the ICT hardware OEMs, and it issues the only globally authoritative specification for temperature, humidity and air quality for loads with embedded microprocessors. ASHRAE is responsible for producing ANSI documents, so goes much further than one would assume a trade association to go. Its conditions are segregated into Recommended and Allowable envelopes for a range of electronic loads, with Class 1 most often used for data-centre equipment, although vendors such as Dell offer servers built to Class 3 as standard.

The ASHRAE Thermal Guidelines run to many chapters, and it is almost wrong to quote just the temperature limits given in tabular form. However, to show how far the industry has come since the 18-21°C and >45% RH type of specification mentioned in 'Mist opportunity', the latest version (2015) for Class 1 Allowable is 15-32°C, with a humidity range based on dew point (DP) from a lower limit of -12°C DP and 8% RH to an upper limit of 17°C DP and 80% RH, with many limiting conditions. For Class 3 (just for example), Allowable is 5-40°C, and from -12°C DP and 8% RH to 24°C DP and 90% RH. Far, far wider than the article suggests.
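The Allowable figures quoted above can be sketched as a simple envelope check. This is a minimal illustration of my own, not an ASHRAE tool: it uses the quoted Class 1 and Class 3 limits only, approximates dew point with the Magnus formula, and deliberately ignores the "many limiting conditions" (altitude derating, rate-of-change limits and so on) that the full Thermal Guidelines impose; the function names are hypothetical.

```python
import math

# 2015 Allowable envelopes as quoted in the text (simplified):
# (min dry-bulb °C, max dry-bulb °C, min RH %, max RH %, min DP °C, max DP °C)
ENVELOPES = {
    "Class 1": (15.0, 32.0, 8.0, 80.0, -12.0, 17.0),
    "Class 3": (5.0, 40.0, 8.0, 90.0, -12.0, 24.0),
}

def dew_point_c(dry_bulb_c: float, rh_percent: float) -> float:
    """Approximate dew point from dry-bulb and RH via the Magnus formula."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * dry_bulb_c / (b + dry_bulb_c)
    return b * gamma / (a - gamma)

def within_allowable(ashrae_class: str, dry_bulb_c: float, rh_percent: float) -> bool:
    """Check a single air condition against one simplified Allowable envelope."""
    t_lo, t_hi, rh_lo, rh_hi, dp_lo, dp_hi = ENVELOPES[ashrae_class]
    dp = dew_point_c(dry_bulb_c, rh_percent)
    return (t_lo <= dry_bulb_c <= t_hi
            and rh_lo <= rh_percent <= rh_hi
            and dp_lo <= dp <= dp_hi)

print(within_allowable("Class 1", 25.0, 50.0))  # True
print(within_allowable("Class 1", 35.0, 50.0))  # False: above the 32°C limit
print(within_allowable("Class 3", 35.0, 50.0))  # True
```

The point the check makes concrete is the same as the article's: 25°C at 50% RH, far outside an 18-21°C regime, sits comfortably inside Class 1 Allowable, and even 35°C remains within Class 3.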
The data centre tends to follow ASHRAE limits with a couple of years' delay, so the 2011 Class 1 Recommended is still often quoted in design specifications: 18-27°C, with a humidity range based on DP from a lower limit of 5.5°C DP up to 60% RH and 15°C DP, with Allowable at 15-32°C and a humidity range of 20% to 80% RH.

Low-energy adiabatic and evaporative systems are being used successfully in some indirect-air, data centre cooling applications to the 2015 ASHRAE guidelines. In rarer cases, such as Google and Facebook, direct-air adiabatic systems outside of ASHRAE limits are being used where the ICT hardware is bespoke, not commercially standard.

The old, tightly controlled conditions, exacerbated by partial load and the absence of variable-speed fans or pumps, or of free-cooling opportunities, often resulted in cooling systems that consumed nearly as much energy as the load; basically, a coefficient of performance (COP) of 1.0 to 1.4.

References:
1 Thermal guidelines for data processing environments, 4th edition, ASHRAE, 2015

Professor IAN F BITTERLIN MCIBSE is a consulting engineer and visiting professor at the University of Leeds

www.cibsejournal.com | June 2019