This video is part of the appearance, “Rafay presents at AI Infrastructure Field Day 3”. It was recorded as part of AI Infrastructure Field Day 3 at 14:00-16:00 on September 10, 2025.
Watch on YouTube
Watch on Vimeo
Haseeb Budhani, CEO of Rafay Systems, discusses how the Rafay platform can be used to address AI use cases. The platform provides a white-label-ready portal through which end users can self-service the provisioning of various compute resources and AI/ML platform services. This enables cloud providers and enterprises to offer services like Kubernetes, bare metal, GPU as a service, and NVIDIA NIM with a simple, standardized experience.
The Rafay platform leverages standardization, infrastructure-as-code (IaC) concepts, and GitOps pipelines to drive consumption across a large number of enterprises. Built on a Git engine for configuration management, the platform handles complex multi-tenancy requirements, integrates with various identity providers, and lets customers offer different services, compute functions, and form factors to their end customers through configurable, white-labeled catalogs. The platform also features a serverless layer for deploying custom code in Kubernetes or VM environments, enabling partners and customers to deliver a wide range of applications and services, from DataRobot to Jupyter notebooks, as part of their offerings.
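The GitOps model described above can be boiled down to a single idea: the desired state lives in Git, and a reconciliation engine continuously computes and applies the difference against what is actually running. The following is a minimal, hypothetical sketch of that reconciliation step; the function and resource names are illustrative and not part of Rafay's actual API.

```python
# Hypothetical sketch of a GitOps-style reconciliation step: desired state
# comes from Git, and the engine computes a diff against live state so that
# only drifted resources are touched. All names here are illustrative.

def reconcile(desired: dict, live: dict) -> dict:
    """Return the changes needed to move `live` toward `desired`."""
    changes = {}
    for key, value in desired.items():
        if live.get(key) != value:
            changes[key] = value          # create or update drifted resources
    for key in live.keys() - desired.keys():
        changes[key] = None               # None marks a resource for deletion
    return changes

# Example: a tenant's catalog entry adds a new GPU pool and retires an old one.
desired = {"gpu-pool-a100": {"replicas": 4}, "jupyter": {"replicas": 1}}
live = {"gpu-pool-v100": {"replicas": 2}, "jupyter": {"replicas": 1}}
print(reconcile(desired, live))
# → {'gpu-pool-a100': {'replicas': 4}, 'gpu-pool-v100': None}
```

Because the diff is computed from declarative state rather than imperative commands, the same pipeline can drive any of the catalog's form factors, from Kubernetes clusters to VM-based services.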
Rafay addresses security concerns through SOC 2 Type 2 compliance for its SaaS product, providing pentest reports and agent reports for customer assurance. For larger customers, particularly cloud providers, an air-gapped product is offered, allowing them to deploy and manage the Rafay controller within their own secure environments. Furthermore, the platform’s unique Software Defined Perimeter (SDP) architecture lets it manage Kubernetes clusters remotely, even on edge devices with limited connectivity: an agent establishes an inside-out connection to the controller, and a proxy service carries secure communication over that channel.
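The inside-out pattern mentioned above is worth spelling out: the edge agent dials out to the controller, so no inbound firewall ports need to be opened at the edge, and the controller then proxies user requests back over that agent-initiated channel. Below is a minimal in-memory sketch of the idea, assuming queues stand in for the network transport; the class and method names are illustrative, not Rafay's actual implementation.

```python
# Minimal sketch of an "inside-out" connection: the edge agent initiates
# the only connection, and the controller reuses that channel to proxy
# requests back to the cluster. In-memory queues stand in for the network.

import queue
import threading

class Controller:
    """SaaS side; proxies user requests over agent-opened channels."""
    def __init__(self):
        self.channels = {}

    def register(self, cluster_id):
        # Called by the agent when it dials out; the controller never
        # initiates a connection toward the edge.
        self.channels[cluster_id] = queue.Queue()
        return self.channels[cluster_id]

    def proxy(self, cluster_id, request):
        reply = queue.Queue()
        self.channels[cluster_id].put((request, reply))
        return reply.get(timeout=5)

class EdgeAgent:
    """Runs next to the cluster behind the firewall."""
    def __init__(self, controller, cluster_id):
        self.inbox = controller.register(cluster_id)

    def serve_once(self):
        request, reply = self.inbox.get()        # command arrives over the
        reply.put(f"handled {request}")          # agent-initiated channel

controller = Controller()
agent = EdgeAgent(controller, "edge-01")
threading.Thread(target=agent.serve_once, daemon=True).start()
print(controller.proxy("edge-01", "kubectl get nodes"))
# → handled kubectl get nodes
```

In a real deployment the queue would be a long-lived, mutually authenticated TLS connection, but the direction of initiation, always from the edge outward, is what lets the controller manage clusters behind NAT or restrictive firewalls.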
Personnel: Haseeb Budhani