‘We’re offering several new innovations to help provide an enterprise-grade data platform for AI,’ says Jeff Baxter, NetApp’s vice president of product marketing.
Data platform and storage technology developer NetApp Tuesday expanded its push to help customers derive value from their AI data infrastructure with the introduction of a new disaggregated storage system and a comprehensive new AI data service.
The company also added data breach detection to its cyber resilience capabilities.
The new hardware and software offerings were introduced at the NetApp Insight 2025 conference being held this week in Las Vegas.
[Related: NetApp CEO Says Don’t Call NetApp A Storage Company]
NetApp is really advancing the state of the art in enterprise AI, said Jeff Baxter, vice president of product marketing for the San Jose, Calif.-based company.
“We’re offering several new innovations to help provide an enterprise-grade data platform for AI,” Baxter told CRN. “We’re specifically focused on the challenge of providing AI-ready data so as customers build out their enterprise AI and move toward agentic AI, they have an enterprise-grade data platform that always has AI-ready data for them to greatly increase the chances for success of their enterprise IT initiatives.”
Key to that is NetApp’s new AFX 1K, its first disaggregated storage system, along with the new NetApp AI Data Engine software that runs on the AFX, Baxter said.
By “disaggregated storage,” NetApp means separating the storage controllers from the storage capacity, Baxter said.
“With NetApp’s AFF and other modern storage systems, you can scale out, but each individual controller has its own storage capacity,” he said. “In a disaggregated storage architecture like NetApp AFX, all the storage controllers are attached to a high-speed, low-latency network, resulting in one giant single pool of storage with the ability for every storage controller to connect to every storage enclosure.”
This is critical for AI, Baxter said.
“We see AI workloads with relatively small datasets but a huge amount of throughput for model training and other things,” he said. “And we see RAG inferencing where you may need a lot more data but not quite as much performance. This architecture allows you to fully disaggregate, fully expand and fully grow.”
“Disaggregated storage” is an industry term, not just a NetApp term, Baxter said.
“We believe we built the most modern and advanced disaggregated storage architecture, primarily because we’re able to do it all with Ontap and all its advanced data capabilities, resiliency and enterprise-grade features that NetApp has built over the last decade,” he said.
NetApp AFX pairs well with NetApp’s Keystone Storage-as-a-Service technology, Baxter said.
“You’re paying for a certain service level, and we will automatically expand or even contract your AFX cluster to meet the performance and capacity requirements for your given SLA,” he said. “We’ll automatically add more performance or capacity as needed and manage that entire AI data platform for you.”
NetApp’s new AI Data Engine comprehensive data service has four key pillars, Baxter said:
- A metadata engine that finds and understands all the data across an AFX cluster and keeps it in a catalog for use with semantic searches.
- Data sync that automatically updates all the data collections as source data changes so that AI is always working on the latest version of the data.
- Data guardrails to protect data in real time based on multiple different rules, some of which can be provided by NetApp and others based on a customer’s specific regulatory frameworks or policies.
- Data curator, which pushes data through the data guardrails, keeps the data up to date in real time and creates an embedded vector database directly on the system to replace multiple tools needed to copy or move data.
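The interplay of those four pillars can be sketched in miniature. The toy below uses a bag-of-words counter as a stand-in "embedding" so it stays self-contained; the class and method names are invented for illustration and do not reflect NetApp's actual implementation:

```python
import math
from collections import Counter

# Toy "embedding": bag-of-words term counts. A real data engine would use a
# learned embedding model; this placeholder keeps the sketch self-contained.
def embed(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class DataEngine:
    """Illustrative catalog plus vector index kept in sync with source data."""
    def __init__(self):
        self.catalog = {}   # path -> metadata
        self.vectors = {}   # path -> embedding

    def sync(self, path, text):
        # "Data sync" pillar: re-index whenever the source data changes,
        # so searches always run against the latest version.
        self.catalog[path] = {"length": len(text)}
        self.vectors[path] = embed(text)

    def search(self, query, k=1):
        # "Metadata engine" pillar: semantic search over the catalog.
        q = embed(query)
        ranked = sorted(self.vectors, key=lambda p: cosine(q, self.vectors[p]),
                        reverse=True)
        return ranked[:k]

engine = DataEngine()
engine.sync("/vol1/hr/policy.txt", "employee vacation and leave policy")
engine.sync("/vol1/eng/design.txt", "storage controller network design notes")
print(engine.search("vacation policy"))  # → ['/vol1/hr/policy.txt']
```

The point of the sketch is the shape of the workflow, not the math: source changes flow through `sync` so the index never goes stale, and queries hit the embedded index directly rather than a copy of the data moved into a separate tool.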
Also new with NetApp AFX are the company’s first storage controllers that can be configured with Nvidia GPUs to accelerate a wide range of AI applications, Baxter said.
The new NetApp AI Data Engine software, an optional licensed application for the NetApp AFX, runs on the GPUs in NetApp’s DX-50 Nvidia L40 GPU-based storage controllers attached directly to the cluster, Baxter said.
“This means that it can immediately see all of your data and perform AI operations on it all within that storage cluster, which dramatically cuts down on data movement and improves efficiency,” he said. “[You can] pair this with your choice of AI model and AI hardware. You could build that on Nvidia DGX, as part of FlexPod with Cisco, or with any server vendor working with Nvidia GPUs. Just connect to the GPUs within the storage cluster to provide AI-ready data at speed and scale.”
NetApp is not in the accelerated compute business, Baxter said.
“Our goal is, over time, to allow channel partners to work with the vast swath of excellent compute partners out there to utilize the constantly growing number of the latest Nvidia GPUs and connect them directly into the AFX cluster to accelerate compute directly to the data,” he said.
NetApp also made it easier for businesses to connect to the Azure public cloud with a couple of new services, Baxter said.
The first is a new object REST API that lets businesses access their Azure NetApp Files data, he said. It lets NFS and SMB datasets connect directly to Microsoft Fabric, Azure OpenAI, Azure Databricks, Azure Synapse, Azure AI Search, Azure Machine Learning and other services. As a result, businesses can analyze data, train AI models, perform intelligent searches and build modern applications on existing Azure NetApp Files datasets without moving or copying file data into a separate object store.
The second is an enhanced unified global namespace in Microsoft Azure that lets businesses unify their global data estate across cloud and on-premises environments, Baxter said. Using Azure NetApp Files’ new FlexCache capabilities, data stored in NetApp Ontap-based storage on-premises or across multiple clouds becomes visible and writeable in Azure NetApp Files environments.
NetApp also updated its NetApp Shift Toolkit, a free piece of software for NetApp Ontap customers that uses the company’s cloning technologies to move workloads from one hypervisor to another without having to copy the data, Baxter said.
The Shift Toolkit previously could quickly turn a VMware VMDK virtual machine into a Microsoft Hyper-V VHD virtual machine, he said.
“Now we’ll do that with KVM as well to support Red Hat OpenShift, Oracle Linux Virtualization Manager, Proxmox and other KVM variants so that customers will have even more freedom of choice to freely switch between hypervisors without having to do whole massive data migrations,” he said.
On the cyber resilience front, NetApp is expanding NetApp Ransomware Resilience, its autonomous ransomware protection suite formerly known as BlueXP Ransomware Protection, Baxter said.
This includes new data breach detection that will now be available in addition to ransomware detection, he said.
Ransomware detection is fundamentally about looking for anomalous data writes, while data breach detection is fundamentally about looking for anomalous reads, Baxter said. Examples include a user who normally accesses a few files a day in a couple of directories suddenly accessing 100 files at a time in directories they’ve never touched before, or suddenly copying data at gigabytes per second.
“Now if we see any of those, we will alert the customer and their choice of SIEM [security information and event management] software that we see this going on, and we’ll give them a very easy option directly within the interface to block that end user or block that end system at least temporarily from reads while they figure out what’s going on,” he said.
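The baseline-versus-spike logic Baxter describes can be sketched roughly as follows. The thresholds, class, and method names here are assumptions for illustration only, not NetApp's actual heuristics:

```python
from collections import defaultdict

class ReadAnomalyDetector:
    """Illustrative anomalous-read detector: flags users whose read volume
    or directory footprint suddenly departs from a learned baseline."""

    def __init__(self, spike_factor=10):
        self.spike_factor = spike_factor         # assumed threshold multiplier
        self.baseline_reads = defaultdict(int)   # user -> typical reads/window
        self.known_dirs = defaultdict(set)       # user -> directories seen

    def train(self, user, reads_per_window, dirs):
        # Learn a per-user baseline from normal activity.
        self.baseline_reads[user] = reads_per_window
        self.known_dirs[user].update(dirs)

    def check(self, user, reads_this_window, dirs):
        # Return alerts when reads spike far above baseline or hit
        # directories the user has never touched before.
        alerts = []
        base = self.baseline_reads[user] or 1
        if reads_this_window > base * self.spike_factor:
            alerts.append("read-volume spike")
        new_dirs = set(dirs) - self.known_dirs[user]
        if new_dirs:
            alerts.append(f"reads in untouched dirs: {sorted(new_dirs)}")
        return alerts

det = ReadAnomalyDetector()
det.train("alice", reads_per_window=8, dirs={"/vol1/docs", "/vol1/reports"})
# Normal behavior: no alerts.
print(det.check("alice", 9, {"/vol1/docs"}))              # → []
# Mass reads from directories never touched before: two alerts.
print(det.check("alice", 500, {"/vol1/finance/payroll"}))
```

In a production system the alerts would be forwarded to the customer's SIEM and could trigger a temporary read block on the user or system, as described above; here they are simply returned to the caller.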
Flexibility in storage is important for customers, said Ned Engelke, CTO of Evotek, a San Diego-based solution provider and NetApp channel partner.
“I love the idea of having more control over performance and more control over performance relative to density,” Engelke told CRN. “For instance, customers want the lowest possible power consumption for an archive. So I want to get the least amount of compute and the most amount of slower drives. Maybe we can start to build those kinds of architectures.”
Engelke said Evotek’s customers are all over the board in terms of how they are using AI.
“We have some customers that are AI hosting facilities, and they are doing a lot because that’s all they do,” he said. “We have some customers testing to evaluate the viability of artificial intelligence in hyperscaler environments. Some are repatriating those workloads. We don’t have any customers yet coming to us and saying, ‘We’re building out a specific big environment for AI,’ but we expect one exception to be defense contractors. Their main focus is around getting their environment to be flexible and adapt to workloads on demand. We’ve seen that with people who are developing AI and testing and then going to deployment. But in general, we’re seeing baby steps but not huge adoptions.”
Meanwhile, adding data breach detection to the NetApp Ransomware Resilience suite is an important step forward for the company, he said.
“A customer might normally have 10 file accesses per hour from this group, but then it increases to 10,000, and they see, ‘We might have a problem here,’” he said. “So from my perspective, anything we can get to alert a customer when something goes wrong is additive. When we look at companies like Rubrik or Cohesity, they’ve had a similar detection feature for years. So I’m glad to see that type of visibility, that type of approach, incorporated into a primary storage platform like NetApp’s.”