The datacentre is dead — and other misconceptions

By Marion Stewart, COO, Pulsant.

If popular wisdom is to be believed, the datacentre is long dead, sent to its demise by the advent of cloud computing. It is true that the traditional datacentre reached its peak during the dotcom boom of the late 1990s. It is also true that the datacentre has taken a backseat to cloud in recent years. But the datacentre is far from dead. If anything, over the last eighteen months colocation and datacentre services have experienced something of a renaissance.

According to recent research, the global datacentre market is expected to grow by 17% over the next four years, reaching £232 billion (USD 284.44 billion).

There is a reason for this, and it’s data: how much we create, how we store it and what we do with it. This is especially relevant as we continue to break records for the sheer volume of data we generate. By 2025, global data volume is forecast to grow from 33 zettabytes to 175 zettabytes, more than a fivefold increase.
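For a rough sense of scale, here is a quick arithmetic check in Python. This is a minimal sketch; the seven-year horizon is an assumption, based on the commonly cited 2018 baseline for these figures.

    # Quick check on the forecast above: 33 ZB growing to 175 ZB by 2025.
    baseline_zb, forecast_zb, years = 33, 175, 7  # 7-year horizon assumed
    multiple = forecast_zb / baseline_zb           # about 5.3x overall
    cagr = multiple ** (1 / years) - 1             # about 27% per year
    print(f"{multiple:.1f}x overall, roughly {cagr:.0%} a year compounded")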

The data opportunity

Businesses are finding great value and opportunity in this data, but not necessarily in its raw form. To take advantage of its potential and draw out the intelligence, the data needs to be processed and manipulated. This is nothing new; after all, this is exactly what the age of big data was about.

Now, however, the landscape has changed and so has the technology that enables it. Take edge computing as an example: with the continued adoption of IoT, the growing number of endpoints (devices) on a network and the rise in the data they generate, we are placing more focus on immediacy. That is, processing data in real time or near real time to ensure we’re getting insights when we need them.

When it comes to edge computing, just as the name suggests, the requirement for speed means placing data processing and storage capabilities closer to where the data is actually being generated.

If you’re monitoring traffic flow through a city, for example, the processing needs to take place quickly, close to the roads being monitored. The intelligence that is generated needs to be fed back to a control room where operators can make decisions about the traffic. If there’s a delay in that information getting back to them, it will be out of date and, in effect, worthless.

In the same vein, if you’re monitoring water levels along a riverbank, the data needs to be collected and processed quickly so that the insights can be used to issue timely flood warnings. Again, immediacy is key.
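To make the pattern concrete, here is a minimal Python sketch of the flood-warning case, assuming a hypothetical sensor feed. The names (process_reading, the 2.5 m threshold) are illustrative, not a real API; the point is that raw readings are evaluated locally, at the edge, and only actionable alerts travel upstream.

    FLOOD_THRESHOLD_M = 2.5  # hypothetical alert level, in metres

    def process_reading(site: str, level_m: float):
        """Runs at the edge: decide immediately, forward only alerts."""
        if level_m >= FLOOD_THRESHOLD_M:
            return {"site": site, "level_m": level_m, "alert": "flood-warning"}
        return None  # routine readings stay local, or are batched later

    # Only the third reading crosses the threshold and leaves the site.
    for level in (1.8, 2.1, 2.7):
        alert = process_reading("riverbank-07", level)
        if alert:
            print("send upstream:", alert)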

While these are just two examples, the use cases for edge computing are wide-reaching. Gartner predicts that by 2025 three-quarters of enterprise-generated data will be created and processed at the edge, outside traditional centralised datacentres and the cloud. That represents a staggering 650% increase on the roughly 10% processed this way today.

The value of the outsourced datacentre

So what role does the datacentre play in edge computing, and why are businesses making more use of outsourced datacentres? Gartner predicts that by 2025, 80% of enterprises will have moved away from on-premises datacentres.

There’s a natural assumption that cloud computing is the answer to everything and that all of this data will simply go into the cloud. While cloud brings organisations many benefits, including flexibility, agility and cost savings, it is not the only option. This is especially true when it comes to edge computing, where latency, cost and compliance are three critical factors shaping how data is collected, processed and used. Cloud may be good for a lot of things, but it is the refreshed datacentre that can perhaps add the most value to edge computing. The reason? Outsourced datacentres are equipped to handle the processing and storage of data and, in certain instances, are designed to outperform public cloud, particularly on those three fronts.

· Latency has long been a concern with cloud computing. If insights aren’t needed in real time, say when monitoring production volumes in a baked goods factory, or the data isn’t business critical, a slight delay is acceptable. But when insights must be processed, recognised and acted on in real time, any delay, however small, can have a devastating impact on the business. Consider a financial institution with a trading floor: performance is essential because, with brokers making high-value investments and deals, a split-second delay can cost millions. As a rule of thumb, light in fibre covers roughly 200 km per millisecond, so every additional 100 km between the data source and the processing site adds around a millisecond of round-trip delay.

· Cost is always a consideration when procuring IT services. Public cloud is cheap and easy to purchase, and it doesn’t cost much to get data into the cloud. The costs start to mount when you process and store that data: the more data, the higher the cost. In a colocation datacentre, by contrast, pricing is typically based on space and power rather than usage, so costs remain steady and predictable regardless of the volume of data you’re processing (see the sketch after this list).

· Compliance also plays a key role in determining where your data should be processed and stored, whether as part of corporate governance requirements and data custodianship or of the wider regulatory landscape. In a nutshell, with a datacentre you always know exactly which facility your data is located in; with public cloud you can usually choose a region, but rarely a specific site.
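As promised above, here is a hedged sketch of the cost argument. The prices are entirely hypothetical (real cloud and colocation pricing varies widely by provider, region and contract); the shape of the comparison is the point: usage-based charges grow with volume, while a colocation fee stays flat.

    CLOUD_PER_TB = 85.0          # hypothetical GBP per TB handled per month
    COLO_FLAT_MONTHLY = 4000.0   # hypothetical GBP per month for space+power

    def monthly_cost_cloud(tb_per_month: float) -> float:
        return CLOUD_PER_TB * tb_per_month  # scales with data volume

    def monthly_cost_colo(tb_per_month: float) -> float:
        return COLO_FLAT_MONTHLY  # flat, regardless of volume

    for tb in (10, 50, 100):
        print(f"{tb:>3} TB/month: cloud GBP {monthly_cost_cloud(tb):>6,.0f}, "
              f"colo GBP {monthly_cost_colo(tb):>6,.0f}")
    # In this toy example the flat fee wins above about 47 TB a month;
    # the real crossover depends entirely on the prices assumed.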

Moving into the future

The growth of IoT and the exponential increase in data have certainly influenced the datacentre market. The more end devices there are, the more data they produce, and the greater the need for that data to be processed, stored and used. As a result, the role of the datacentre will only become more critical.

Add in the concept of edge computing and the datacentre becomes even more important. Computing power will need to move closer to the actual data sources, which means retrofitting datacentres to ingest this information and process it more quickly. This can apply to sites still in service, or to facilities that have been bypassed because they are no longer able to run mission-critical workloads.

For older datacentres, there’s an opportunity for revitalisation – whether that’s upgrading connectivity, boosting the network or improving the electricity supply.

We’re already seeing datacentre providers adapting their commercial models to become more flexible in terms of contract length and the services on offer. The commercial model of colocation is changing.

In the longer term, IoT and edge computing will change how datacentres are built, in terms of mechanical and electrical requirements, as well as where they are built. The edge datacentre will meet the need to be closer to data sources, or be strategically placed to facilitate the transmission of data, while offering organisations commercially flexible options.
