COST MODEL | DATA CENTRE COOLING

Table 1: Benefits and drawbacks of three data centre cooling systems*

1. Traditional air cooling with chiller and Crac units
   PUE range (cooling only): 1.4 to 2.0
   Pros: Tried and tested; proven technology; simple controls methodology
   Cons: Hot spots, inefficient; highest PUE range; 24/7 operation required; no free cooling

2. Indirect air cooling
   PUE range (cooling only): 1.1 to 1.3
   Pros: No risk of leaks in white space; limited maintenance needed
   Cons: Acoustic challenges; impact on net white space for ducts; reduced flexibility in distribution; space for bulk water storage required; IAC plant located close to data hall

3. Cool water with chiller assist (CHW cooling with free cooling)
   PUE range (cooling only): 1.1 to 1.3
   Pros: Flexibility for distribution; higher achieved white-space ratio; simple controls methodology; cost-effective for low PUE; better control of supply air distribution
   Cons: Large plant space required; increased risk of water leaks; large bulk water storage required; higher maintenance costs

* All based on a 1,500W.m-2 IT load density requirement and Tier III certification; based on experience of designing data centres.

The source of the cooling water is free-cooling cooling towers located externally, usually on the roof. Ambient air is used to cool the warm return water from the Crac units, with adiabatic cooling added during the warmer months. At peak times, when the towers approach their cooling-load limit, refrigeration chillers run in parallel with them.

Table 1 summarises the pros and cons of each system. Bear in mind that numerous factors work in tandem within any given solution; for example, net-to-gross area, efficiency, power load, capital expenditure (capex) and total cost of ownership (TCO) combine to determine the best solution for the client. Always make sure defined parameters are set, so that any solution can be measured against these critical factors.
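The free-cooling-with-chiller-assist sequencing described above can be sketched as a simple dispatch rule. This is an illustrative sketch only: the function name, the 90% assist threshold and the capacities in the example are assumptions for demonstration, not figures from the cost model.

```python
def dispatch_cooling(load_kw: float, tower_capacity_kw: float,
                     assist_threshold: float = 0.9) -> dict:
    """Split a cooling load between free-cooling towers and chiller assist.

    The towers carry the load alone until it approaches their capacity
    (here, 90% by default); beyond that point, refrigeration chillers
    run in parallel and pick up the remainder. Illustrative only.
    """
    tower_limit = assist_threshold * tower_capacity_kw
    if load_kw <= tower_limit:
        return {"towers_kw": load_kw, "chillers_kw": 0.0}
    return {"towers_kw": tower_limit, "chillers_kw": load_kw - tower_limit}

# Example: a 4,800 kW peak load against 5,000 kW of tower capacity
# forces the chillers to assist with the load above the 4,500 kW limit.
print(dispatch_cooling(4800, 5000))
```

In practice, this decision would sit in the BMS and account for ambient wet-bulb temperature and adiabatic operation, but the parallel-running principle is the same.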
This will ensure the best-fit solution can be determined. In reality, most data centres use air- or water-based cooling solutions, and this is where our cost comparison has focused.

Table 2: Cost comparison of cooling methods in a 3,000m2 data centre

Notional data centre:
- Net technical space (m2): 3,000 (four data halls)
- Load density (W.m-2): 1,500
- IT load (kW): 4,500

Cost per kW of cooling installation (£/kW):

Crac units and chiller
- Crac units (N+2): 240
- Air-cooled chillers (N+1), including CHW pipework distribution, pumps and other associated plant: 890
- BMS controls and power supplies to the above mechanical plant: 320
- Total: 1,450

Indirect air cooling
- IAC units (N+2): 760
- Process water installation to IAC units, including water storage tanks, pumps and water treatment; supply and return ductwork to IAC units, including attenuation: 220
- BMS controls and power supplies to the above mechanical plant: 280
- Total: 1,260

Crac units and hybrid cooler with chiller assist
- Crac units (N+2): 240
- Hybrid cooler (N+1) and water-cooled chillers (N+1), including CHW pipework distribution, process water storage tank and pipework distribution, pumps and other associated plant: 1,130
- BMS controls and power supplies to the above mechanical plant: 350
- Total: 1,720

Notes on the above costs:
- Hot/cold aisle containment is excluded
- Main contractor prelims and OHP are excluded
- Building/structural/architectural works, dedicated fresh air systems and electrical infrastructure are excluded

The future is already in place, however, with some clients opting for immersion cooling, by which servers are immersed in a liquid coolant for direct cooling of the electronic components. Immersing servers has been shown to improve rack density, cooling capacity and other design-critical factors. Test projects in which data centres are located in the sea could result in some significant changes in this industry in the future. Aecom is carrying out advisory work with Atlantis, which proposes to build a data centre on the site of its tidal-energy centre, off the coast of Scotland.
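Table 2's rates are expressed per kW of cooling installation, so whole-installation capex for the notional facility follows directly from multiplying by the 4,500kW IT load. A minimal sketch (the dictionary simply restates the table's total £/kW figures):

```python
# Cooling capex for the notional 4,500 kW facility, derived from the
# total cost/kW figures in Table 2. The only logic is the multiplication.
IT_LOAD_KW = 4_500

COST_PER_KW = {  # £/kW of cooling installation (Table 2 totals)
    "Crac units and chiller": 1_450,
    "Indirect air cooling": 1_260,
    "Crac units and hybrid cooler with chiller assist": 1_720,
}

for system, rate in COST_PER_KW.items():
    capex = rate * IT_LOAD_KW
    print(f"{system}: £{capex:,}")  # e.g. indirect air cooling: £5,670,000
```

On capex alone, indirect air cooling is the cheapest of the three; the article's wider point stands, though, that capex must be weighed alongside PUE, white-space ratio and TCO before a solution is selected.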
It demonstrates how high-power-demand data centres could help fund the emerging tidal-power sector, thereby contributing to the future decarbonisation of the data centre sector. CJ

ABOUT THE AUTHORS
This article has been written by Associates Nichola Gradwell and James Garcia, of Aecom's cost-management team in London, with assistance from Mike Starbuck and Anirban Basak, of Aecom's engineering team.

November 2019 | www.cibsejournal.com