Why Companies are Building Their AI Applications Outside of Public Cloud

By Stewart Laing, CEO, Asanti.


Over the last ten years, a "Cloud First" approach to IT infrastructure has become the default choice, promising scalability, flexibility, and cost efficiency. Both public and private sector organisations embraced the public cloud for its quick setup and global accessibility, making it the backbone of digital transformation strategies. However, our recent research reveals a growing shift away from this approach. A striking 52% of organisations now plan to host and deliver their AI applications on-premise or through colocation facilities rather than relying solely on the public cloud. This movement is fuelled by three primary needs that the cloud often struggles to meet for AI workloads: real-time processing, control, and security.

The reality of cloud costs and latency for AI

The initial appeal of the public cloud is undeniable. With no upfront CAPEX and theoretically infinite scalability, it offers flexibility to adjust computing resources as needed. However, as companies scale their AI operations, the financial reality doesn’t always match the promise. Our research found that 77% of organisations reported operating costs in the public cloud were higher than expected, and 63% indicated that their overall cloud costs exceeded those of their previous non-cloud setup. For data-intensive AI workloads, the monthly costs can quickly add up, sometimes reaching unsustainable levels.

Real-time processing needs are another critical factor. AI applications such as financial trading systems and customer service chatbots require instantaneous data processing and low latency. In our study, 36% of those repatriating applications cited the need for faster data transfer for real-time applications as a key reason for moving away from the cloud. Local or edge data centres allow businesses to process data closer to the source, enabling faster, more efficient processing than the latency often experienced when using the public cloud.

Data sovereignty and control: a critical factor

Beyond cost and latency, data control is a key consideration driving companies away from the cloud. Many AI applications handle sensitive data, requiring compliance with regulatory standards such as the Data Protection Act in the UK, GDPR in Europe, or indeed HIPAA for healthcare data in the US. 41% of survey respondents cited security and compliance concerns as reasons for reconsidering the public cloud. In a public cloud environment, data is often distributed across multiple locations, making it difficult to ensure compliance with location-specific regulations and creating challenges in data oversight.

By using on-premise or colocation data centres, organisations can maintain greater control over their data, meeting regulatory requirements with confidence. This capability is especially attractive to sectors such as healthcare, finance and government, where compliance with strict regulations is essential. 39% of respondents bringing applications back on-premise expressed concerns over the lack of control and customisation in the public cloud, which can be particularly problematic in highly regulated environments.

Local data centres: a viable solution for AI's unique demands

Given these challenges, edge data centres are emerging as the preferred choice for hosting AI applications. Colocation data centres, in particular, offer companies the benefit of local control without the substantial CAPEX required to establish a private data centre. They provide facilities optimised for AI, with power and cooling systems capable of supporting the intensive processing power required by AI workloads. This setup allows organisations to handle their AI applications cost-effectively and with the performance levels necessary for real-time processing, while avoiding the spiralling costs often associated with scaling in the public cloud.

A fundamental reset for IT infrastructure strategy

The shift towards on-premise and colocation for AI workloads could indicate a fundamental change in cloud strategy. A decade of cloud-first enthusiasm is now giving way to a more nuanced view of IT infrastructure. With 67% of IT decision makers indicating they wished they had taken a hybrid approach from the outset, our findings suggest that a mixed model, utilising the cloud for less intensive applications and local data centres for high-performance workloads, could be a more practical approach. As companies rethink their cloud strategy, this shift could have ripple effects across the IT infrastructure industry.

While the public cloud remains valuable, especially for specific workloads, AI’s requirements for substantial data storage, real-time processing, and regulatory oversight are driving companies to reconsider their approach. The insights from our research underscore the importance of building an IT infrastructure that is flexible and adaptable to each application’s unique needs. This evolving trend highlights a potential turning point in the relationship between cloud and on-premise infrastructure, one that will reshape the future of IT in a world where AI applications demand ever-greater performance, efficiency, and control.
