Overall global data centre CAGR is estimated at 11 per cent for 2016 to 2020. McCulloch commented on the significance of this growth: “It’s interesting to note that CAGR for data centre cooling is actually higher than that for overall data centre growth. Although to a certain extent this reflects the shorter period being examined for overall growth, it also speaks to the central role that cooling plays in the modern data centre. As data volumes become larger and requirements become more complex, so cooling plays an increasingly important role in enabling the data centre to handle these requirements stably and profitably.”
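For reference, compound annual growth rate (CAGR) over n years is defined as CAGR = (ending value / starting value)^(1/n) − 1. An 11 per cent CAGR across the four years from 2016 to 2020 therefore implies overall growth of roughly 1.11^4 ≈ 1.52, i.e. the market expanding by around half over the period.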
McCulloch went on to give examples of technologies that will ultimately depend on a data centre’s cooling capabilities: “Artificial Intelligence (AI), the Internet of Things (IoT), Virtual Reality (VR), and Augmented Reality (AR) – what all these technologies have in common is that they are predicted to grow in prominence over the next few years, and that they have complex and demanding data requirements. In order to thrive in this age of exponentially growing data volumes, organisations are increasingly turning to high performance computing (HPC) services. This technology is geared towards solving problems which involve huge data sets, and the market for it is expected to reach $33bn (£26bn) by 2022.
“Where does cooling come into this picture? For a data centre to comfortably accommodate HPC it must be able to support its superior processing power, which often involves concentrating more computing power in higher-density racks. This produces far more heat than the standard data centre configuration, meaning that cooling the space efficiently becomes far more important for supporting this technology.”
Huge data centre providers are going to great lengths to minimise their cooling costs. For example, Facebook located a data centre in near-arctic Luleå, northern Sweden, whilst Microsoft is experimenting with an underwater data centre. McCulloch concluded by challenging the notion that maximising cooling efficiency requires huge expenditure: “You don’t need to have the vast resources of a Facebook to access the kind of cooling efficiencies needed to support the ever-expanding data requirements of new technologies. There are various other approaches available, including liquid or conductive cooling.
“We echo the approach of the Nordic data centres with Direct Fresh Air Cooling (DFAC), which draws the ambient air surrounding the data centre into the building, filters it for contaminants, and mixes it so that it is at the optimal temperature and humidity for operations. This enables us to deliver an ultra-low design Power Usage Effectiveness (PUE) of 1.13 and a contracted PUE of 1.2.”
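For context, Power Usage Effectiveness is the ratio of total facility energy to the energy consumed by the IT equipment itself: PUE = total facility energy / IT equipment energy. A design PUE of 1.13 therefore means that for every kilowatt delivered to the IT load, only around 0.13 kW goes to cooling, power distribution and other overheads, while the contracted PUE of 1.2 caps that overhead at 20 per cent.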