The storage industry is once again at an inflection point. The most recent shift was the move from centralized, on-premises storage to more flexible, scalable solutions that accommodate end users' immediate requirements. This time it is AI that has the storage industry re-evaluating, pivoting and embracing an evolving market.
Over the last few generations, storage vendors have defined not only the technologies but have also actively created the demand for them. It is this particular vision and perception of the market that has led to the development of new technologies. Vendors would evaluate the market from their own, sometimes narrow, perspective and determine what could be improved and what would add value if presented correctly. They would then evangelise the new technology, create the value and, in some situations, create a fear of not having such features.
In many cases these technologies are simply incremental improvements: faster, bigger, more secure. In other cases they provide a new direction that adds to or replaces an existing method. All of these improvements have one thing in common: they are designed to differentiate the vendor from the competition and move away from simple parameters such as performance, price and reliability, which are easily defined and therefore easy to compare.
For example, if a vendor's main value proposition is 40GB/sec performance, a potential buyer could find competitive options within minutes through a simple internet search. Add a feature like 'line-speed variable deduplication' and what could a potential buyer compare? There is no benchmark.
Realistically, increased performance and its ilk are relatively simple to develop. The challenge lies in implementing a business model that can sustain the storage industry's traditionally high margins when the value of a solution comes down to performance, price and reliability. It is much more profitable to focus on features that offer a potential value. For example, should your storage solution be struck by lightning, who knows what the cost would have been without the foresight of purchasing a 'snapshot-driven duplication synchronised replication' solution? That is a viable value proposition to present to, say, eBay, which would lose millions for every moment it lost service.
The AI market is different in two ways:
First, the AI market is evolving and is largely driven by startups and research institutions. It's all about funding: both startups and research institutions require funding to develop new algorithms, build prototypes and push the boundaries of AI capabilities.
It is crucial to allocate funds strategically and prioritise the elements that yield the best results for a project. At the hardware level specifically, GPUs are a high priority and storage is a low one. Storage is certainly needed and must be fast, but ideally not at the cost of fewer GPUs.
Second, the value of the actual data is different. If we look at the traditional IT model and consider the likes of eBay at one end of the scale and the local convenience store at the other, the data is very much the lifeblood of both businesses. For eBay, even one moment of lost access to its data would be catastrophic. The convenience store may not be as time critical, but losing its stock control, credit and debit balances and deliveries would be a nightmare. It is simple to see why storage features are commercially needed for a mainstream business. eBay requires 24/7 uptime with zero recovery time; the convenience store might instead need a timed backup routine with snapshots. Either way, the data often cannot be recreated, and therefore data management and protection features are important and provide realistic value. Data in this case is considered 'output': it is created by the applications, is not easy to recreate and needs protection.
In AI, as important as data is, contrary to the norm it is likely the 'input'. Take, for instance, a dataset consisting of many copies of MRI scans. The valuable process in this case is how the AI reads and analyses the data, and the result it produces, which could be as simple as an algorithm, a model or an uncomplicated decision. No matter how hard vendors try to force-fit traditional storage features, beyond performance, connectivity, reliability and price they simply make minimal commercial or technical sense to this new market, which is focused on pushing innovation and is not concerned with legacy IT or fear and doubt. The new innovators are driving past 'what if this happens' and are focused on creating solutions that will change our lives.
There will be a time when AI is more mainstream. We will see it at the fuel station, in schools and at the local shop. When it reaches this level of ubiquity, it will have created a new IT requirement and many new features will have become essential. Today's innovators, however, need performance, connectivity and stability, at a price that does not divert funds from GPU resources.
Many storage vendors have attempted to re-brand and re-purpose existing high-end solutions for the AI market. After failing to justify the cost or value of the features, most have already moved out or have chosen to focus on the small percentage of super-scaled AI solutions. The smart vendors, who have listened to the market, talked to the end users and determined the real needs, have redesigned their offerings and are doing well within this extremely exciting market.
It is a completely new market with completely new visions and needs. Why would we expect storage designed specifically for an established market and model to simply fit in? Just because it was ready and available does not mean it is appropriate.