Space and power: what’s really limiting data centre capacity

Perceived limitations in data centres may not be what they first appear, as growing demand drives fresh examination. By Markus Gerber, Senior Business Development Manager, nVent Schroff.

The world has seen increasing demand for digital services in recent decades. This demand has only grown since the pandemic, when digital services were not just a boon but a lifeline for many during times of lockdown and isolation.

As we moved beyond the pandemic, work practices changed forever, with more and more people seeking to work remotely, whether from home or from other geographies. This, combined with evolving business models and digital transformation, has driven demand still higher and created new requirements that have seen the likes of edge computing proliferate.

All of this has driven growth in data centres, but it has also increased the pressure to meet demand. That pressure is putting space, density, and power under the spotlight as potential limitations.

Scale of growth

To get an idea of the scale of growth in digital services in recent decades, data volume is a key indicator. According to Statista, the volume of data created, captured, and consumed has grown from 2 zettabytes in 2010 to 97 zettabytes in 2022, and is expected to reach 181 zettabytes in 2025. Despite this near-exponential growth, the International Energy Agency estimates that data centre energy demand has only risen from 194 terawatt hours (TWh) in 2010 to just over 200 TWh in 2022.
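The contrast is stark enough to be worth quantifying. A back-of-envelope sketch using the Statista and IEA figures above (illustrative arithmetic only) shows data volume growing roughly 48-fold over the period while energy demand grew by only about 3 per cent:

```python
# Back-of-envelope comparison of data growth vs. data centre energy growth,
# using the Statista and IEA figures cited above.
data_zb_2010, data_zb_2022 = 2, 97        # zettabytes
energy_twh_2010, energy_twh_2022 = 194, 200  # terawatt hours

data_growth = data_zb_2022 / data_zb_2010        # ~48.5x
energy_growth = energy_twh_2022 / energy_twh_2010  # ~1.03x

# Compound annual growth rates over the 12-year span
years = 2022 - 2010
data_cagr = data_growth ** (1 / years) - 1       # ~38% per year
energy_cagr = energy_growth ** (1 / years) - 1   # ~0.25% per year

print(f"Data volume grew {data_growth:.1f}x ({data_cagr:.0%}/year)")
print(f"Energy demand grew {energy_growth:.2f}x ({energy_cagr:.2%}/year)")
```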

These two contrasting figures show the extraordinary strides made in computing energy efficiency over that period, especially in pure processing power. With Moore's Law in effect throughout, the benefits are clear. Now, though, no less a figure than Nvidia CEO Jensen Huang has suggested that the Moore's Law effect may be coming to an end. While this is disputed, there can be little doubt that processors will continue to become ever more powerful, producing more heat in the process.

To meet that demand towards 2025 and beyond, data centres are likely to run into limitations, with space chief among them.

Space and power

Space was often seen as one of the chief limitations for data centres. It was typically described in terms of the ability to power equipment per unit of area, such as watts per square foot or per square metre. This was a useful rule of thumb for specification and facility design, and architects would plan cooling and power around such measures. Under this approach, the data centre progressively became hotter, drawing more power to support ever greater levels of processing. In an air-cooled data centre, this required more and more air to be pumped through, meaning that for every watt drawn, less and less went to compute.
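The industry's standard measure of this overhead is Power Usage Effectiveness (PUE): total facility power divided by the power actually delivered to IT equipment, with 1.0 as the theoretical ideal. A minimal sketch with purely hypothetical numbers (not measurements from any particular facility) shows how cooling inflates the figure:

```python
# Illustrative PUE calculation (hypothetical numbers, not measured values).
# PUE = total facility power / IT equipment power; 1.0 would be perfect.
it_load_kw = 1000   # power consumed by servers, storage and network gear
cooling_kw = 700    # fans, chillers and air handlers in a heavily air-cooled room
other_kw = 100      # lighting, power distribution losses, etc.

total_kw = it_load_kw + cooling_kw + other_kw
pue = total_kw / it_load_kw
print(f"PUE = {pue:.2f}")  # 1.80: 0.8 W of overhead for every watt of compute
```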

As a result, data centres in the nineties and early 2000s became less and less efficient in how much of their power went to actual data processing. As chip power kept rising through successive technological developments, and demand for performance grew unabated, operators found themselves needing ever more cooling volume and flow, until they hit barriers of cost, complexity and management. Many reached a threshold where simply pumping a room full of air could no longer cool those chips, an approach that is increasingly unfeasible for much of what is already deployed.

Equipment management

Management too became an issue. As a data centre evolved, equipment was often upgraded, altered, moved around or replaced due to failure. Gaps, spaces and expansions often meant that even carefully implemented methodologies, such as hot aisle/cold aisle systems, ended up working poorly, as airflow management guidelines were ignored in the name of expediency.

This could add to the impression of space limitations when a new project or service was contemplated, when in fact a properly managed facility could take on more before reaching the inevitable limit of pumped-air cooling.

What is clear from this is that while good management and design are key to ensuring that physical space does not limit a data centre's ability to meet demand for digital services, air cooling already is a limitation and will increasingly be so in the future. As other architectures emerge, such as edge computing, new cooling solutions will be required if service demand, physical space, energy efficiency and sustainability needs are to be met.

To meet the emerging demands for digital services in the foreseeable future, data centre operators will need to consider hybrids of air, liquid and direct-to-chip cooling, taking advantage of the specific characteristics of each to provide, appropriately and proportionately, the kind of cooling that enables density to be deployed reliably and economically.

Inefficient medium

There is a clear reality when it comes to cooling: the closer to its source that heat can be captured, the more efficient the process.

Allied to this is the fact that air is a very inefficient medium. A water-based fluid, or a dielectric liquid, is a much more efficient medium for capturing and transporting heat.
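The gap can be put in rough numbers. Per unit volume and per degree of temperature rise, water absorbs on the order of 3,500 times more heat than air; the sketch below uses standard textbook properties at around room temperature:

```python
# Rough comparison of air vs. water as a heat-transport medium,
# using standard textbook properties at around 20-25 degrees C.
air_density = 1.2      # kg/m^3
air_cp = 1005          # J/(kg*K), specific heat of air
water_density = 998    # kg/m^3
water_cp = 4182        # J/(kg*K), specific heat of water

# Volumetric heat capacity: heat absorbed per cubic metre per kelvin
air_vol_heat = air_density * air_cp        # ~1.2 kJ/(m^3*K)
water_vol_heat = water_density * water_cp  # ~4,170 kJ/(m^3*K)

print(f"Water carries ~{water_vol_heat / air_vol_heat:,.0f}x more heat "
      f"per unit volume than air for the same temperature rise")
```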

Even with the likes of hot and cold aisle layouts, rear-door and in-row coolers, blanking plates and efficient cabling, air remains an inefficient medium. While these measures are likely to stay part of the mix for many operators for years to come, other methods must be considered.

Liquid cooling solutions can accommodate greater equipment density than air cooling. Heat captured through liquid cooling can be removed more efficiently from the immediate environs of the equipment and brought to potential reuse opportunities, without a state change.

Developments now available in liquid and direct-to-chip cooling can not only meet today's density demands, relieving physical space limitations, but can also offer a critical upgrade path that allows data centre operators to move towards more efficient methods. This will be crucial as budgets come under pressure amid ongoing inflation and continuing global uncertainty.

Strengths and purpose

With these new cooling techniques and systems, there is no one-size-fits-all approach. Each technique and system has particular strengths and characteristics that must be taken into account to ensure the right performance is delivered for each requirement. In-rack, in-row and direct precision liquid cooling each offer different applications and benefits in achieving an overall density and performance goal, all while delivering efficiency that contributes to sustainability targets.

Data centre operators must be supported in their design and operational objectives by a trusted technology partner with not only in-depth knowledge but also a broad portfolio of solutions to meet each need. Understanding where better-managed air cooling can remain, where liquid cooling can be adopted, and where direct-to-chip cooling can be leveraged is key to getting current needs under control while building a path to future capability.

Improvements and a path forward

By properly examining real or perceived space limitations, data centre operators can determine how best to tackle their density needs. More efficient, precise and controllable cooling solutions will be a key part of that effort.

With efficiency as a central strand of sustainability efforts, hybrid systems of air, liquid and direct cooling techniques can build a path to greater effectiveness in data centre cooling that relieves space pressures, while meeting demand and providing a strong base for future growth.

A trusted technology partner, with broad knowledge and portfolio resources, can guide operators towards the most informed and appropriate use of these now proven technologies to achieve their business ambitions sustainably.
