Environmental control has key role to play
Projected growth for the data centre cooling market illustrates the key role that environmental control will play in the data centre of the future.
Fresh forecasts from Occams Business Research & Consulting (OBRC) predict that the global data centre cooling market will grow at a compound annual growth rate (CAGR) of 14.95 per cent between 2016 and 2023. According to Greg McCulloch, CEO of colocation provider Aegis Data, this prediction illustrates the core role that cooling will play in enabling data centres to support the demanding technologies of the future, and the need for data centres to have efficient and sophisticated cooling systems in place.
The CAGR for the overall global data centre market is estimated at 11 per cent for 2016 to 2020. McCulloch commented on the significance of this growth: “It’s interesting to note that CAGR for data centre cooling is actually higher than that for overall data centre growth. Although to a certain extent this reflects the shorter period being examined for overall growth, it also speaks to the central role that cooling plays in the modern data centre. As data volumes become larger and requirements become more complex, so cooling plays an increasingly important role in enabling the data centre to handle these requirements stably and profitably.”
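To put those growth rates in concrete terms, the short Python sketch below compounds each CAGR over its stated forecast period. The rates come from the forecasts quoted above, and the calculation is simply standard compound-growth arithmetic; no absolute market sizes are given in the article, so only the implied growth multiple is shown.

```python
# Rough illustration: compound each forecast CAGR over its stated period.
# The growth rates come from the forecasts quoted above; absolute market
# sizes are not given, so only the implied growth multiple is computed.

def growth_multiple(cagr: float, years: int) -> float:
    """Total growth factor implied by compounding a CAGR over a number of years."""
    return (1 + cagr) ** years

cooling = growth_multiple(0.1495, 2023 - 2016)  # data centre cooling, 2016-2023
overall = growth_multiple(0.11, 2020 - 2016)    # overall data centre market, 2016-2020

print(f"Cooling market, 2016-2023: grows to about {cooling:.2f}x its starting size")
print(f"Overall market, 2016-2020: grows to about {overall:.2f}x its starting size")
```

Over its seven-year window the cooling market is projected to reach roughly 2.65 times its 2016 size, against roughly 1.52 times for the overall market over its shorter four-year window, which is the comparison McCulloch draws.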
McCulloch went on to give examples of technologies that will ultimately depend on a data centre’s cooling capabilities: “Artificial Intelligence (AI), the Internet of Things (IoT), Virtual Reality (VR), and Augmented Reality (AR) – what all these technologies have in common is that they are predicted to grow in prominence over the next few years, and that they have complex and demanding data requirements. In order to thrive in this age of exponentially growing data volumes, organisations are increasingly turning to high performance computing (HPC) services. This technology is geared towards solving problems which involve huge data sets, and the market for it is expected to reach $33bn (£26bn) by 2022.
“Where does cooling come into this picture? For a data centre to comfortably accommodate HPC it must be able to accommodate its superior processing power, which often involves concentrating more computing power in higher density racks. This produces far more heat than the standard data centre configuration, meaning that efficiently cooling this space becomes far more important for supporting this technology.”
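As a minimal sketch of the point McCulloch is making: essentially all of the electrical power drawn by IT equipment is dissipated as heat, so rack power density is, to a first approximation, the heat load the cooling plant must remove. The rack counts and densities below are illustrative assumptions rather than figures from the article.

```python
# Minimal sketch: nearly all power drawn by IT equipment is dissipated as
# heat, so rack power density approximates the load the cooling system must
# remove. Rack counts and densities below are illustrative assumptions,
# not figures from the article.

def hall_heat_load_kw(racks: int, kw_per_rack: float) -> float:
    """Approximate heat load (kW) a data hall's cooling system must handle."""
    return racks * kw_per_rack

standard_hall = hall_heat_load_kw(racks=100, kw_per_rack=5)   # typical rack density
hpc_hall = hall_heat_load_kw(racks=100, kw_per_rack=25)       # high-density HPC racks

print(f"Standard-density hall: {standard_hall:.0f} kW of heat to reject")
print(f"HPC-density hall:      {hpc_hall:.0f} kW of heat to reject")
```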
Huge data centre providers are going to great lengths to minimise their cooling costs. For example, Facebook located a data centre in near-arctic Luleå, northern Sweden, whilst Microsoft is experimenting with an underwater data centre. McCulloch concluded by challenging the notion that maximising cooling efficiency requires huge expenditure: “You don’t need to have the vast resources of a Facebook to access the kind of cooling efficiencies needed to support the ever-expanding data requirements of new technologies. There are various other approaches available, including liquid or conductive cooling.
“We echo the approach of the Nordic data centres with Direct Fresh Air Cooling (DFAC), which draws the ambient air surrounding the data centre into the building, filters it for contaminants, and mixes it so that it is at the optimal temperature and humidity for operations. This enables us to deliver an ultra-low design Power Usage Effectiveness (PUE) of 1.13 and a contracted PUE of 1.2.”
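For readers unfamiliar with the metric, PUE is total facility power divided by the power delivered to IT equipment, so the quoted figures imply roughly 13 per cent (design) and 20 per cent (contracted) overhead on top of the IT load. The sketch below works through that arithmetic with an assumed 1,000 kW IT load, which is purely illustrative.

```python
# Sketch of what the quoted PUE figures mean: PUE is total facility power
# divided by IT equipment power, so PUE 1.13 implies roughly 13% overhead
# (cooling, power distribution, losses) on top of the IT load.
# The 1,000 kW IT load is an illustrative assumption.

def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power implied by a given IT load and PUE."""
    return it_load_kw * pue

it_load = 1000.0                                   # assumed IT load in kW
design = facility_power_kw(it_load, pue=1.13)      # design PUE quoted above
contracted = facility_power_kw(it_load, pue=1.2)   # contracted PUE quoted above

print(f"Design PUE 1.13:    {design:.0f} kW total, {design - it_load:.0f} kW overhead")
print(f"Contracted PUE 1.2: {contracted:.0f} kW total, {contracted - it_load:.0f} kW overhead")
```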