This video is part of the appearance, “Nutanix Presents at AI Infrastructure Field Day 2”. It was recorded as part of AI Infrastructure Field Day 2 from 15:30 to 17:00 on April 24, 2025.
Watch on YouTube
Watch on Vimeo
As presented by Laura Jordana, Nutanix Enterprise AI (NAI) is designed to simplify the process of deploying and managing AI models for IT administrators and developers. The presentation begins by demonstrating the NAI interface, a Kubernetes application deployable on various platforms. The primary use case highlighted is enabling IT admins to provide developers with easy access to LLMs by connecting to external model repositories and creating secure endpoints. This allows developers to build and deploy AI workflows while keeping data within the organization’s control.
The demo showcases the dashboard, which offers insights into active endpoints, request metrics, and infrastructure health. This view is crucial for IT admins who need to monitor model usage and its impact on resources. The workflow involves importing models from hubs such as Hugging Face and creating endpoints that expose each model through an inference engine. The presenter emphasized the simplicity of this process, with much of the configuration pre-filled to reduce the admin workload. She also highlighted the platform’s OpenAI API compatibility, which allows integration with existing tools.
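Because the endpoints are OpenAI API-compatible, a developer could point an existing OpenAI client at them with little change. The following is a minimal sketch using the standard OpenAI Python SDK; the endpoint URL, API key, and model name are hypothetical placeholders, not values from the presentation.

```python
from openai import OpenAI

# Hypothetical values for illustration: the real endpoint URL, API key,
# and model name would come from the endpoint created in the NAI console.
client = OpenAI(
    base_url="https://nai.example.com/api/v1",  # assumed NAI endpoint URL
    api_key="YOUR_NAI_API_KEY",                 # key issued by the IT admin
)

# Standard OpenAI-style chat completion request against the hosted model.
response = client.chat.completions.create(
    model="llama-3-8b-instruct",  # example of a model imported from Hugging Face
    messages=[{"role": "user", "content": "Summarize our deployment options."}],
)
print(response.choices[0].message.content)
```

Because only the base URL and credentials change, existing tools and frameworks built against the OpenAI API can be redirected to the in-house endpoint without code rewrites.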
The platform focuses on inferencing rather than model training, providing a secure and streamlined way to deploy and manage models within the organization’s infrastructure. The key takeaway from the presentation is the simplification of AI model deployment, with an emphasis on day-2 operations and ease of use. Because NAI is a Kubernetes application, it can run on Nutanix, EKS, and other cloud Kubernetes environments. It also provides API access and monitoring capabilities for IT admins, and easy access to LLMs for AI developers.
Personnel: Laura Jordana