This video is part of the appearance, “NetApp’s Platform for AI Data Innovation”. It was recorded as part of the Tech Field Day Experience at NetApp INSIGHT 2025 at 9:30-10:15 on October 16, 2025.
Watch on YouTube
Watch on Vimeo
Data is the fuel that powers AI. Discover how NetApp uniquely empowers AI Innovators to unleash the full potential of GenAI by securely accessing and managing their enterprise data, regardless of location or scale. Gain insights into real-world examples and use cases demonstrating how NetApp is assisting organizations in overcoming data challenges across data centers and multi-cloud environments, ultimately accelerating AI-driven outcomes. Be among the first to learn about groundbreaking innovations that span the best infrastructure for AI, data discovery, data governance, and how to seamlessly integrate AI and data. Simplify Enterprise AI for Inferencing, Retrieval Augmented Generation (RAG), and model training today, paving the way for your Agentic AI future tomorrow.
In their session at the Tech Field Day Experience at NetApp INSIGHT 2025, Tore Sundelin and Arindam Banerjee introduced the NetApp AI Data Engine (AIDE), covering the state of enterprise AI adoption and the challenges companies commonly face in scaling AI to production use. Despite the tantalizing promise of AI, studies report a high failure rate among enterprise AI projects, driven by fragmented tools, siloed and duplicated data sets, and complex management requirements. NetApp set out to address these issues with a unified platform anchored by ONTAP, its industry-leading data management software, and deep integration with NVIDIA. The AI Data Engine aims to simplify AI operations across data discovery, governance, transformation, and cost efficiency, enabling organizations to move from isolated experiments to production AI systems more easily.
Banerjee highlighted how the AI Data Engine integrates compute and storage by introducing dedicated Data Compute Nodes (DCNs) connected via high-speed networks to ONTAP-based AFX clusters. This tight integration, enhanced with NVIDIA GPUs and co-engineered embedding models, enables efficient vectorization and semantic search for AI workloads, especially for use cases like RAG. The system also features rich metadata indexing, resilient snapshot-based lineage tracking, and automated detection and governance tools to protect sensitive data. With support for hybrid and multi-cloud environments, the platform empowers both infrastructure admins and data scientists via distinct interfaces, allowing for flexible, secure, and scalable AI development and deployment processes, all while leveraging NetApp’s proven storage technologies and ecosystem integrations.
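Banerjee did not walk through code in the session, but the vectorization and semantic search workflow he described follows the familiar RAG pattern: embed documents once, then rank them against an embedded query at retrieval time. The sketch below is illustrative only; it assumes an open sentence-transformers model and a hypothetical NFS mount path for an ONTAP volume, not the co-engineered NVIDIA embedding models or any AIDE API.

```python
# Illustrative RAG-style vectorization and semantic search sketch.
# Assumptions (not from the session): documents live on an NFS-mounted ONTAP
# volume at /mnt/ontap_vol, and embeddings come from an open
# sentence-transformers model rather than the co-engineered NVIDIA models.
from pathlib import Path

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in embedding model

# 1. Vectorize: read documents from the mounted volume and embed them.
docs = [p.read_text(encoding="utf-8", errors="ignore")
        for p in Path("/mnt/ontap_vol").glob("**/*.txt")]
doc_vectors = model.encode(docs, normalize_embeddings=True)

# 2. Semantic search: embed the query and rank documents by cosine similarity
#    (a dot product suffices because the vectors are normalized).
query = "How are snapshot policies applied to project data?"
query_vector = model.encode([query], normalize_embeddings=True)[0]
scores = doc_vectors @ query_vector
top_k = np.argsort(scores)[::-1][:5]

for rank, idx in enumerate(top_k, start=1):
    print(f"{rank}. score={scores[idx]:.3f}  {docs[idx][:80]!r}")
```

In the architecture Banerjee outlined, this embedding and indexing work runs on the Data Compute Nodes close to the AFX storage, so data does not need to be copied out to a separate pipeline before it can be searched.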
Looking ahead, NetApp’s roadmap for AIDE envisions a decentralized knowledge graph architecture that will further extend its scalability and its capability to support complex AI use cases such as Agentic AI. The platform is already compatible with AI tools like LangChain and Domino Data Lab, and plans are underway to accommodate bring-your-own-model scenarios and support advanced AI modalities. Deep collaboration with NVIDIA has resulted in optimized pipelines and hardware compatibility, including support for upcoming GPU generations. Ultimately, AIDE is positioned as a future-ready solution to help enterprises unlock the value of their massive data estates (over 100 exabytes currently under NetApp management) and make them readily usable and governable for advanced AI applications.
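The speakers named LangChain as one of the supported frameworks but did not show integration code. A plausible shape for that integration is sketched below, using the generic langchain-community FAISS vector store and HuggingFace embeddings purely as stand-ins for AIDE’s own index and embedding service; the actual connector and its interfaces were not detailed in the session.

```python
# Hypothetical shape of a LangChain retriever over enterprise documents.
# Assumptions (not from the session): FAISS and HuggingFaceEmbeddings are
# stand-ins for the AI Data Engine's own vector index and embedding models.
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS

texts = [
    "ONTAP snapshot policies are defined per volume.",
    "AFX clusters attach Data Compute Nodes over a high-speed fabric.",
    "RAG pipelines ground model answers in retrieved enterprise documents.",
]

embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
vector_store = FAISS.from_texts(texts, embedding=embeddings)

# Expose the index as a retriever that any LangChain chain or agent can call,
# which is the hand-off point an agentic workflow would use.
retriever = vector_store.as_retriever(search_kwargs={"k": 2})
for doc in retriever.invoke("How do compute nodes reach the storage cluster?"):
    print(doc.page_content)
```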
Personnel: Arindam Banerjee, Tore Sundelin








