Modular design eases the commissioning and expansion of data centres

With demand for cloud services growing so rapidly, data-centre architects are looking for efficient, cost-effective ways to build facilities that enable them to match the level of capital investment to current demand, while providing a low-cost way to expand to meet future demand. By Ian Wilcoxson, Channel Manager (Data Centres) EMEA Power Solutions, Kohler

Data centres are the factories of the Information Age, automating and standardising the processing of vast amounts of data into information, entertainment, and insight. Data centres must operate 24/7/365, as close to peak capacity as possible, to make the most of the capital invested in them. The same is true when it comes to building data centres: the entire project must be carefully specified, meticulously planned, and implemented with great precision, to minimise the time that capital is lying idle. With internet traffic growing around 3% a month, delaying the completion of a data centre can leave significant opportunities untapped.
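To put that figure in perspective, here is a rough illustration of the arithmetic (a sketch only, assuming the cited ~3% monthly growth rate holds steady): compounded, it amounts to over 40% growth a year, so even a few months' delay represents a substantial slice of missed demand.

```python
# Illustrative only: compound the ~3% monthly traffic growth rate
# cited above to see what a commissioning delay costs in missed growth.
MONTHLY_GROWTH = 0.03  # assumed constant for this sketch

def growth_over(months: int) -> float:
    """Total traffic growth factor over the given number of months."""
    return (1 + MONTHLY_GROWTH) ** months

annual = growth_over(12) - 1
print(f"Annual growth at 3%/month: {annual:.1%}")   # ~42.6% per year

# A six-month slip in commissioning means the market has grown ~19.4%
# in the meantime -- demand a delayed facility cannot serve.
six_month = growth_over(6) - 1
print(f"Growth missed by a 6-month delay: {six_month:.1%}")
```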

One response to the industry’s demand for capacity is to modularise the process of building data centres. In this approach, servers, networking equipment and ancillary services such as back-up power generation are built at the supplier’s factories and then delivered to site for ‘plug and play’ installation. This approach also means that data-centre architects can specify and build the infrastructure for very large data centres, but only populate them with enough equipment to meet current demand. Again, this ensures that capital investment matches current capacity requirements.

A modular approach can streamline the build process, reduce its carbon footprint, and lower costs. For example, rather than having to support changes to complex systems such as backup generators (gensets) onsite, modular designs can be configured to meet customer needs in the equipment maker’s factory. This speeds up the installation process. It also gives equipment makers the confidence to offer more product options and greater customisation, because they can have direct access to in-house planners, designers, manufacturing expertise, testing equipment, and quality-control systems.

Enabling this kind of modularity requires deep engineering experience and innovative design strategies. For gensets, it means developing own-brand engines, cooling systems and ancillary equipment, then integrating them seamlessly into robust containers or canopies fitted with the latest soundproofing. This demands extensive in-house expertise, as well as one-stop-shop manufacturing capabilities, with generators and their external housings produced in the same factory to ensure quality and consistent technical performance.

This one-stop-shop approach can be extended to the whole process of planning, specifying, building, delivering, commissioning, and maintaining a genset. This is attractive to data-centre architects, who are focused on ensuring that the solution they specify is right for the job and will work when it is needed. And it makes sense for the supplier, who can take end-to-end responsibility for building, integrating, and testing the equipment in their factories so that they can deliver, commission, and maintain it without calling in third parties.

Centralising genset engineering also makes the process more predictable and reduces the risk of unexpected delays. It removes the need to rely on third-party fabricators, whose service quality may vary. It improves product quality, because all the parts are made in a dedicated facility, for a single purpose, under a common quality-control scheme. And it improves reliability, since issues are easier to track, trace to their root causes, and rectify quickly when all the work is done in-house.

At Kohler, all these processes are handled within one manufacturing plant – usually at headquarters in Brest, France – eliminating the need for third-party fabricators and packagers. This helps ensure consistent build quality for a range of power solutions optimised for data-centre applications.

Demand for cloud services is growing rapidly, and data-centre architects need efficient, cost-effective ways to build facilities whose capital investment matches current demand while leaving a low-cost path to future expansion. Modular data centres, together with the modular ancillary equipment such as backup gensets that enables them, make this capital-efficient approach to the design, commissioning, operation, and expansion of data centres much easier to implement.

