NetApp CEO On ‘Bringing AI To Your Data,’ Changing H-1B Visa Rules And The Firm’s Channel Strategy

By CRN
October 17, 2025


‘We’re focused on the part of the AI value chain that is delivering true value. If AI is to deliver on the promise of what [consultant firm] McKinsey and others say is enormous potential, an organization’s data has to be useful alongside AI technologies like large language models, and we are solving precisely that problem,’ says NetApp CEO George Kurian.

The week of October 13 was a busy one for NetApp as it looked to demonstrate to the world the importance of transforming data for use in AI and its latest technologies for helping enterprises do so.

At the company’s NetApp Insight 2025 conference in Las Vegas, NetApp introduced its new AFX disaggregated storage infrastructure, which it says paves the way to bring AI to data rather than the traditional approach of copying and migrating data to the AI applications.

NetApp also showed its new AI Data Engine, or AIDE, an AI data management service for simplifying the tasks of making data AI-ready and then protecting that data while ensuring it is up to date without making new copies.

[Related: 6 Ways NetApp Is Looking To Power AI Flexibility, Ransomware Resilience]

NetApp CEO George Kurian told CRN that his company, rather than focusing on the storage of data, is actually using storage as a base for building data intelligence as part of transforming data to be ready for AI.

“The NetApp data platform is the core,” Kurian said. “It’s the heartbeat of the intelligent data infrastructure. And so we’re really delivering on the promise of the intelligent data infrastructure message that we shared with clients last year. We said two important things. We would bring AI to your data. We’re doing that. With the AFX system that combines the DX engines with the AFX storage platforms, we’re actually bringing AI to your data. And then we said we will help make it much easier to use AI with your data, and we’re delivering on that promise as well.”

While Kurian still declines to talk about specific competitors, he did say that NetApp is very differentiated in its approach to intelligent data management for AI.

“Today, we are not only talking about traditional data formats like file, block, and object,” he said. “We are also talking about vector embeddings and tokenized data formats for LLM access to data. We are talking about semi-structured data formats like Apache Iceberg tables [for large-scale analytical datasets in a data lake] or Apache Parquet files and so on. Or classically you can look at CSV [plain text] files or JSON (JavaScript Object Notation) kinds of structures. We operate at two different levels of the stack. They are talking about storage. We’re talking about storage and data.”
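The spectrum of formats Kurian lists can be made concrete with a short sketch. This is an illustrative toy, not NetApp code: the same record is rendered as plain-text CSV, as self-describing JSON, and as a fixed-length vector, where the hypothetical `toy_embedding` function merely stands in for a real embedding model.

```python
import csv
import io
import json

# One customer record, expressed in three of the formats Kurian mentions.
record = {"id": 7, "name": "Acme Corp", "segment": "enterprise"}

# CSV: flat, positional plain text.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=record.keys())
writer.writeheader()
writer.writerow(record)
csv_text = buf.getvalue()

# JSON: self-describing, semi-structured.
json_text = json.dumps(record)

def toy_embedding(text: str, dims: int = 4) -> list[float]:
    """A stand-in for a real embedding model: fold character codes into a
    fixed-length vector and normalize it to unit length."""
    vec = [0.0] * dims
    for i, ch in enumerate(text):
        vec[i % dims] += ord(ch)
    norm = sum(v * v for v in vec) ** 0.5
    return [v / norm for v in vec]

# Vector form: what an LLM-oriented pipeline would index and search.
embedding = toy_embedding(json_text)
```

Columnar formats such as Apache Parquet or Iceberg tables sit between the two extremes: schema-aware like JSON, but laid out for large-scale analytical scans.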

Kurian also discussed doing business with the U.S. government amid its shifting priorities, and how the imposition of $100,000 application fees for H-1B visas not only impacts businesses but might have impacted his own journey to the top.

There’s a lot going on in the storage, or rather the data intelligence, industry as data comes to the foreground with the journey to AI. For more, read CRN’s exclusive conversation with Kurian, which has been lightly edited for clarity.


How do you define NetApp in 2025?

NetApp enables our clients to extract value from their data by transforming data into knowledge. We do that by delivering to them a data platform that is the foundation for the most modern data center environments, with our leading all-flash systems that are proven in the world’s largest hyperscalers and bring hyperscaler efficiencies to customers’ data centers. We’re the most secure storage on the planet, with the only operating system that supports the entire NIST (National Institute of Standards and Technology) life cycle for secure data management. We offer the seamless integration of public cloud with on-premises environments, and the leading architecture to accelerate data pipeline and data transformation for AI.

In that definition, you said the word ‘storage’ once, referring to secure storage. Do you no longer consider NetApp a storage company?

Storage is a part of our value proposition. It’s the foundation on which the other elements are built. But we have for a long time infused intelligence into our storage. Data management, data security, data governance, and hybrid cloud data management have always been built on storage.

Now we are doing two really important things. One is, we are changing the definition of a data storage infrastructure to a new category. [Our new] AFX system is a composable system architecture that combines data access nodes that are really classical storage paradigms with data processing and transformation nodes like our DX50s that are really about making enterprise data AI-ready. That’s important at the system infrastructure level.

At the platform level, we are infusing multiple new elements of software intelligence: first and most importantly, metadata intelligence, which is intelligence about the data, and then a suite of services that uses metadata intelligence to make it much easier to organize data, to implement guardrails, to curate the data as it gets transformed, to track the lineage of data and models, and so on.


Last year, NetApp introduced the idea of intelligent data infrastructure, and that was really the theme of Insight 2024. NetApp this year introduced the AFX and AIDE, the AI Data Engine, but is there any overall theme or anything as foundational as the intelligent data infrastructure this year?

The NetApp data platform is the core. It’s the heartbeat of the intelligent data infrastructure. And so we’re really delivering on the promise of the intelligent data infrastructure message that we shared with clients last year. We said two important things. We would bring AI to your data. We’re doing that. With the AFX system that combines the DX engines with the AFX storage platforms, we’re actually bringing AI to your data. And then we said we will help make it much easier to use AI with your data, and we’re delivering on that promise as well.

What comes next after intelligent data infrastructure?

We have a whole series of innovations to really deliver on that promise. I think fundamentally what we see is the entire technology stack is getting redefined. If you look at the application stack in the future, a lot of the business logic that was enumerated by humans will be replaced by the business logic that’s discovered by the LLMs. They will look through your data and discover the relationships within your data, and then create the business logic that results from that. And then, similarly, what you see is the data platforms, meaning things like data warehouses and data lakes, which sit one level below the application stack, are also getting transformed, because you need to now integrate much more scalable data sets, like unstructured data with the classically structured data sets.

We are now at the heartbeat of that technology stack transformation. And the reason I say that is, to really get scale and efficiency, you want to implement many of those functions right where the data is created or written. We said you want to be able to share data across systems, and we created network file storage. We said you want to implement security controls, and it’s much better to do it right where the data is created or deleted or encrypted, because then you can trap a malicious attack much, much faster than sitting up in an application stack or sitting far away in a backup window.

And now we’re saying that it’s much better to do transformation and enrichment of that data right where the data is created. And so we’re excited about the possibilities that holds for NetApp, because we will create so much more value for our clients than we’ve ever been able to create. And there’s a value exchange with every client that results from what you create for them, which in turn gives us more opportunities than we ever had before.


This year we saw your rival Pure Storage introduce what they call the Enterprise Data Cloud. What are your first takes on the Pure Storage Enterprise Data Cloud, and how does that differ from what NetApp is doing?

I don’t want to comment specifically about the Pure Cloud offering. I think what many people will be describing as a data cloud, or a data platform, first operates really at the storage layer. It’s really about storage primitives like files and blocks and objects. I think we have several advantages, and it’s much more far-reaching. Today, we are not only talking about traditional data formats like file, block, and object. We are also talking about vector embeddings and tokenized data formats for LLM access to data. We are talking about semi-structured data formats like Apache Iceberg tables [for large-scale analytical datasets in a data lake] or Apache Parquet files and so on. Or classically you can look at CSV [plain text] files or JSON (JavaScript Object Notation) kinds of structures. We operate at two different levels of the stack. They are talking about storage. We’re talking about storage and data.

Second, we have embedded in that a metadata engine that’s highly scalable, that allows what we call active metadata, meaning you can customize the metadata by tagging it and annotating it in the way that you want to create communities of data. So metadata is something that we do much more differentiated and unique compared to most of the storage vendors.

And then the last is in terms of the way we implement the technologies for some of these data pipelines. We have several technologies that are proven at scale that nobody else has. We talk about how you can create copy-less vector embedding, meaning you can keep your source data and your AI-ready data in the same volume without having duplicates. And it’s because of the efficient data copy technology we created many, many, many years ago that tens of thousands of customers use. We talk about a change detection mechanism where you can notify an AI model that a data set changed, and then we run the model only for that piece of data because of a technology that we introduced many, many decades ago called SnapDiff, which is a highly scalable, proven change detection engine.

So there are three things in summary. One is we are operating at many more layers of the storage stack with metadata and real data. Second, we have a proven set of technologies that we’re building on. And then the last is, we have a much larger data estate than anybody else, especially of unstructured data, and we’re bringing value to that giant data estate in a way that nobody else can.
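The change-detection pattern Kurian describes — re-running the expensive AI step only for data that changed — can be sketched generically. This is a hypothetical illustration using content hashes, not NetApp’s SnapDiff, which tracks deltas between snapshots inside the storage system itself; the `incremental_embed` helper and its arguments are invented for the example.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Content hash used to detect whether a document changed."""
    return hashlib.sha256(content).hexdigest()

def incremental_embed(documents: dict[str, bytes],
                      seen: dict[str, str],
                      embed) -> list[str]:
    """Run `embed` only on documents that are new or changed since the
    last pass, recording fingerprints in `seen`. Returns the names of
    documents that were (re)embedded."""
    changed = []
    for name, content in documents.items():
        digest = fingerprint(content)
        if seen.get(name) != digest:   # new or modified since last pass
            embed(name, content)       # expensive step, run only on deltas
            seen[name] = digest
            changed.append(name)
    return changed
```

On a first pass every document is embedded; on later passes only the delta is, which is the point of pairing a change-detection engine with an AI pipeline.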

A big focus of the NetApp Insight conference is on making data available for AI. In the AI industry in general, there’s a lot of talk about a potential bubble in part because of all the circular investments between several large vendors. Do you see a possible AI bubble, and how might that impact NetApp?

We’re focused on the part of the AI value chain that delivers true value. If AI is to deliver on the promise of what [consultant firm] McKinsey and others say is enormous potential, an organization’s data has to be useful alongside AI technologies like large language models, and we are solving precisely that problem. I’m not an investment expert to comment about the rest of the ecosystem and topics like bubbles, but I think what we are doing is delivering tangible value to enterprises across the globe.


In another important industry-wide question, there’s a lot of talk about increasing the cost to apply for an H-1B visa to $100,000. Does NetApp take advantage of that visa?

We have provided H-1B visas to employees who come here for long-term assignments. We rely on having access to the world’s best technologists so that we can build the world’s best products and deliver value to not only private and commercial institutions, but also educational institutions as well as national security interests. Having access to and the ability to bring talent from all of the parts of the world to the United States so that we can build the best products is of interest to us.

So how does the recent move by the current administration to impose $100,000 per visa application impact NetApp’s plans in terms of bringing in that talent?

It certainly makes it harder for us to bring talent here. It will cause us, because we’re a global company, to go where the talent is, which is probably unfortunate. And speaking from personal experience, my parents would never have been able to afford $100,000 for a visa, and so I wouldn’t be here if the rules that the current administration are putting up were in place.


NetApp is one of the biggest providers of storage infrastructure to the U.S. government. How have changes in government spending priorities this year impacted your government business?

We have always had to align our technologies to the spending priorities of any administration, and that is something we’re quite familiar with. Each administration brings a new set of budget priorities. We talked about that in our last earnings call. We have been cautious about the part of our business that’s tied to the U.S. public sector, and we’re focused on the opportunities that are spending priorities for this administration.

Several years ago, Dell went from a public company to a private company before it became public again because the investors were focused on Dell as a PC company and not its big enterprise picture. With the storage and data infrastructure industry and NetApp in particular, is there anything that investors don’t understand or don’t follow? What would you like to see them better understand about NetApp?

We are in dialogue with our investors all the time. I reiterate to investors that we have been gaining share in the fastest growing markets in the industry. We’re number one in flash storage. We have a really strong and uniquely differentiated position in the hybrid cloud. And because of our technology innovation as well as our large footprint of installed systems, we have a unique advantage in the race to AI, because we hold a huge part of the world’s unstructured data. I think building on that, our intellectual property is really software, and we have a highly leveraged R&D model because we have one software code base that we are leveraging across all of the places in the world.

From an ecosystem standpoint, just look at the people we’re working with: without question the innovation leaders and the industry leaders in all parts of the world. That creates a leveraged business model for us. Together with the fact that we have been moving our business into more and more profitable lines over the past several years, we have moved our gross margin profile into a higher tier because of more software-rich products. And because we have also increased the percentage of our business in recurring revenue streams, as opposed to a classical revenue model, we are able to deliver strong shareholder returns.


How often do you talk with channel partners, and what do you see as the key issues channel partners face when it comes to not just AI per se or storage, but in terms of building that data infrastructure NetApp is talking about?

I meet with channel partners all the time. On every trip, I meet customers, I meet employees, and of course channel partners. They’re an integral part of our value delivery chain. We don’t think of the channel as something adjacent to our strategy. It is our strategy. We’ve worked with many of these partners for a very long time. It’s amazing: I get to meet people who deployed NetApp systems 28, 30 years ago. It’s an honor to meet the people who have trusted us for that long.

The things that we work on with them are not dissimilar to what we work on with our clients: to really think about the technology architecture, to bring efficiency, to bring commonality across all of these different data patterns and data types, and to build services and consulting revenue streams for our partners. You hear us talk about offerings that allow our partners to engage with our clients on getting data ready for AI. It’s a precursor to AI: get your data ready today, because you will have to use it with AI models tomorrow. There are lots of different ways that we are trying to enable our partners to be successful with our technologies and with clients.

Do you think the majority of your partners are ready for AI or not?

I don’t think the industry as a whole is ready for AI. As we told our investors, we think that enterprise AI, which will ultimately be the largest and most deeply deployed use of AI, is still in the early innings. We said that in the second half of calendar year ’25 you would see more proofs of concept, and then calendar year ’26 would be when the first wave of proofs of concept translates into production. When I talk to clients and partners, I see we are on the right track on that. And I think you can see that people are building data lakes or organizing their data so they can use it.

Anything else you think we need to know about NetApp?

We are innovating together with industry leaders like Nvidia and the hyperscalers to enable our clients to get knowledge from data. We’re doing that with systems. We’re doing that with security. We’re doing that with cloud integration, and now with a range of new capabilities for AI. We’re super excited about the year ahead, sharing all of this innovation with our clients.


