Today’s computational environments are changing rapidly as more companies look to harness HPC and compute-intensive applications across an increasingly wide variety of industries. Indeed, in a short space of time, artificial intelligence (AI) has moved out of the research lab and into our everyday business processes.
Machine learning is getting smarter and, as a result, is being designed into more and more applications. The financial sector is a prime example: PayPal uses machine learning to fight fraud and money laundering, with applications capable of comparing millions of transactions between buyers and sellers to distinguish the legitimate from the fraudulent. Equally, scientists and medical researchers are working on machine-learning models that can predict disease and help in the early diagnosis of illnesses such as cancer. And, in more mainstream environments, a recent study showed that one in four football fans want to see more AI technology used in sports to help referees with those all-too-critical decisions.
Perhaps even more interesting is the potential of AI and machine learning to clean up the environment and slow climate change. PwC research found that applying AI levers could improve productivity across a number of sectors and, in doing so, reduce worldwide greenhouse gas (GHG) emissions by 4 per cent by 2030. Unfortunately, this is only the bright side of the technology. While the more public-facing risks of decision bias and job losses have taken up much of the media attention, far less has been paid to the environmental cost of poorly located AI and machine learning applications. Given the increased focus on the climate emergency, it’s time to take a deeper look at what AI means for the environment and, in turn, for a business’s sustainability credentials.
By design, AI, machine learning and deep learning applications consume an enormous amount of energy in order to function. Large-scale combinatorics, discrete and continuous optimisation, and niche simulation and modelling in the training and development process – all required on demand – necessitate significant and constant access to power.
Unfortunately, many of today’s corporate data centres are not well equipped to deal with the intensive, high-density compute involved in rolling out AI systems at industrial scale. Traditional data centres rely heavily on averages and load balancing across the whole environment to manage power consumption. A machine learning platform running 24 hours a day, as it would in neural-network and deep-learning practice, would tax that environment extensively.
This is because the numerical techniques involved mean that the amount of compute required is significantly greater than in previous generations of HPC. It is not the running of the neural networks per se, but rather the process of the system scanning all the layers of the data to find the correct values and ratios that determine the right answer. That takes days or weeks of processing, plus trillions of attempts, to get the numbers lined up.
When it comes to environmental sustainability, the problem really lies in whether these power-hungry machine learning applications are housed in fossil fuel-powered facilities. This is especially the case for organisations that rely on cloud providers in locations like the UK, where electricity is predominantly generated from natural gas. Data centres already use an estimated 200 terawatt hours (TWh) of electricity each year, contributing around 0.3 per cent of overall carbon emissions worldwide; at a wider ecosystem level, information and communications technology (ICT) accounts for 2 per cent. By 2025, not too far away, Swedish researcher Anders Andrae predicts that data centres will account for ICT’s largest share of global electricity use at 33 per cent, followed by smartphones (15 per cent) and networks (10 per cent).
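To make the location point concrete, a back-of-envelope calculation can show how grid mix dominates a training workload’s footprint. The figures below – a 100 kW rack training for two weeks, and the two grid carbon intensities – are illustrative assumptions, not numbers from this article:

```python
# Illustrative sketch: emissions depend on where the workload runs.
# All input figures are assumptions chosen for illustration only.

def training_emissions_kg(power_kw: float, hours: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """CO2 emissions (kg) for a workload drawing `power_kw` continuously
    for `hours` on a grid with the given carbon intensity (kg CO2/kWh)."""
    return power_kw * hours * grid_kg_co2_per_kwh

power_kw = 100.0        # assumed: one high-density AI rack
hours = 14 * 24         # assumed: two weeks of continuous training

gas_heavy = 0.35        # assumed intensity of a gas-heavy grid
renewable = 0.03        # assumed intensity of a near-100% renewable grid

print(training_emissions_kg(power_kw, hours, gas_heavy))   # ≈ 11,760 kg
print(training_emissions_kg(power_kw, hours, renewable))   # ≈ 1,008 kg
```

Under these assumed figures, the identical workload emits roughly ten times more CO2 on the gas-heavy grid – the energy consumed is the same; only the siting differs.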
As such, organisations keen to explore how this groundbreaking technology could transform their operations may be in for a nasty shock to their carbon footprint if the data centre housing their machines (or the third-party service they access) is powered by non-renewable sources of electricity. Even if a company were to put these applications into the public cloud, it could still end up paying three times what it would pay for an in-house architecture. The predominant business model for hyperscale cloud doesn’t work well when always-on compute is required. Moreover, virtualised servers do not provide the HPC-optimised environments that many AI applications need in order to work as efficiently and effectively as possible.
Of course, the situation is further compounded by the fact that a huge amount of power is also required to keep the servers that run AI and machine learning applications cool, so that they can operate effectively and continuously. The conventional way to cool the environment is with fans and hot aisle/cold aisle arrangements, and this typically soaks up 40 per cent of an organisation’s overall energy bill. For an AI neural network, placing two racks back-to-back means they simply blow hot air at each other, restricting airflow and causing overheating. In an unfortunate catch-22, spacing them out solves one problem – cooling – but creates another – performance – because network cables should be no longer than one metre.
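That 40 per cent cooling share can be translated into a rough Power Usage Effectiveness (PUE) figure. This is a simplified sketch that assumes everything on the bill that is not cooling is IT load (real facilities also have lighting, power-distribution losses and so on):

```python
# Back-of-envelope PUE implied by cooling's share of the energy bill.
# Simplifying assumption: the non-cooling remainder is all IT load.

def pue_from_cooling_share(cooling_share: float) -> float:
    """PUE = total facility energy / IT energy.
    With cooling at `cooling_share` of the bill, IT gets the rest."""
    it_share = 1.0 - cooling_share
    return 1.0 / it_share

print(round(pue_from_cooling_share(0.40), 2))  # ≈ 1.67
```

Under this assumption, a facility spending 40 per cent of its energy on cooling runs at a PUE of roughly 1.67 – it burns two-thirds of a watt of overhead for every watt of useful compute, which is exactly the overhead that free, ambient-air cooling attacks.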
For businesses keen to strike a balance between embracing the promise and potential of AI and protecting their sustainability credentials, the answer is to take a hard look at the location of their data centres. Pivotally, the majority of the equipment involved in training machine learning models does not need to be located near the end user for performance or accessibility.
As such, racks can be comfortably housed in data centres that are serious about their power sources. Iceland, for example, is the only developed nation whose grid power is generated entirely from renewable sources – hydroelectric, geothermal and wind – and its naturally cool climate means powerful AI servers can be cooled using free, ambient air.
Ultimately, while advanced applications like AI offer many new, exciting opportunities – representing a $15trn economic opportunity, according to some estimates – businesses need to look for innovative data centres that can be tailored to their needs: providing solutions that flex naturally between varied resiliency requirements and adapt to a wide range of power densities. The downsides that come with adequately powering these systems, maintaining them, improving them and storing the information they create are significant, but it doesn’t need to cost the earth when you choose an optimised, renewable-energy-powered, HPC-ready environment.