Infinidat has launched a retrieval augmented generation (RAG) workflow architecture, delivered as a consultancy service to its storage customers, which allows them to feed up-to-date, private data from multiple company data sources into artificial intelligence (AI) applications, drawing on any NFS storage in their organisation.
The move reflects a trend that has seen multiple storage companies address AI workloads and, in particular, the issues RAG tackles in generative AI (GenAI): results that suffer when the data used for training is incomplete, out of date, or lacks the kind of information that can only be gained from private sources, such as data held within an organisation or expert knowledge.
When an organisation wants to develop GenAI, it puts a dataset through a training process in which the model learns to recognise particular attributes that can be used to provide information or to trigger actions in applications.
Those training processes are often built around datasets that are very general, can go out of date or perhaps initially lack specialised or private data. This is often the case with AI projects inside organisations that need to stay up to date over time, said Infinidat chief marketing officer Eric Herzog.
“A lot of organisations are using generative AI as an internal project with private data,” said Herzog. “And as well as wanting to protect their IP, they have concerns about accuracy, avoiding hallucinations, etc.
“For example, a large enterprise that generates vast amounts of data – in sales, support, operations – would want to boost the performance of what it is doing, and that’s very much tied to its storage performance.
“The customer wants to see accurate data in near real time. It can use AI to understand the details – it might be screws in a component, the type, the supplier, any number of details – and be able to update that information on a continual basis.”
What Infinidat now offers is professional services consulting that allows its customers to access data for RAG purposes from its own and other suppliers’ storage, as long as it is presented as NFS file storage.
According to Herzog, that comprises help with configuring the storage system to access data and metadata rapidly for RAG purposes. He said Infinidat is well positioned to do this because of the importance its architecture and InfuzeOS environment place on metadata and the “neural cache”.
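To illustrate the general pattern being described, below is a minimal sketch of a RAG retrieval loop over documents held on an NFS-mounted share. The mount path, file layout and keyword-overlap scoring are hypothetical and chosen for brevity; this is not Infinidat’s tooling, and a production pipeline would typically use an embedding model and a vector store rather than this toy retriever.

```python
# Illustrative sketch only: index documents on an NFS-mounted share, retrieve
# the most relevant ones for a question, and prepend them to the LLM prompt.
# The mount path and scoring scheme are hypothetical examples.

from pathlib import Path
from collections import Counter
import math
import re

NFS_MOUNT = Path("/mnt/corp-share")   # hypothetical NFS export holding private documents


def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z0-9]+", text.lower())


def load_documents(root: Path) -> dict[str, Counter]:
    """Read every .txt file under the share and build a simple term-frequency index."""
    index = {}
    for path in root.rglob("*.txt"):
        index[str(path)] = Counter(tokenize(path.read_text(errors="ignore")))
    return index


def score(query_terms: list[str], doc_terms: Counter) -> float:
    """Crude relevance score: summed log term frequency for query terms found in the doc."""
    return sum(math.log1p(doc_terms[t]) for t in query_terms)


def retrieve(query: str, index: dict[str, Counter], k: int = 3) -> list[str]:
    """Return the paths of the k documents that best match the query."""
    terms = tokenize(query)
    ranked = sorted(index, key=lambda p: score(terms, index[p]), reverse=True)
    return ranked[:k]


def build_prompt(query: str, index: dict[str, Counter]) -> str:
    """Augment the user's question with the most relevant private documents."""
    context = "\n\n".join(Path(p).read_text(errors="ignore") for p in retrieve(query, index))
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"


if __name__ == "__main__":
    idx = load_documents(NFS_MOUNT)
    prompt = build_prompt("Which supplier provides the M4 screws for component X?", idx)
    # `prompt` would now be sent to the organisation's chosen LLM endpoint.
    print(prompt[:500])
```

The point of the sketch is that the retrieval step reads current files straight from the share at query time, which is why the speed at which the underlying storage can serve data and metadata matters to RAG performance.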
Infinidat arrays can be all-flash or hybrid spinning disk and solid state, and are mostly targeted at high-end enterprise and service provider customers. Its hardware products feature triple-active controllers and a so-called neural cache that marshals data to the most appropriate media; the bulk of I/O requests are served from very fast DRAM, with a claimed cache hit rate of more than 90%.
Infinidat’s focus here on RAG capabilities sees it join other storage suppliers that have recently made a push for customers embarking on AI projects.
Pure Storage CEO Charlie Giancarlo was keen to highlight his company’s AI push at its Accelerate event in June, with storage write speed and availability emphasised. Meanwhile, NetApp made its own push towards data management for AI, announcing data classification for AI via its ONTAP operating system at its annual Insight shindig in September.