Presentation date: April 25, 2025, 10:30 - 12:00.
Presenters: Alex Bortok, Amritam Putatunda, Ankur Sheth
Validate Frontend Networks to Optimize and Secure Low-Latency LLM Data Flow
As large language models scale, new challenges emerge—not only in maximizing GPU performance but also in validating the infrastructure that fuels the data pipeline used for training. On the front end, this includes securely ingesting user data from distributed cloud and customer environments into centralized AI data centers, followed by ensuring high-speed, low-latency data transfer between VMs and hosts within those data centers. This talk focuses on testing products that evaluate the performance, scalability, latency, reliability, and security of these critical data pathways.
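As a rough illustration of the kind of inter-VM latency measurement such testing involves, the sketch below probes TCP round-trip time between two hosts. The peer address, port, and echo service are hypothetical placeholders, not part of any product discussed in the session.

```python
# Minimal sketch of an inter-host latency probe, assuming a plain TCP echo
# service is reachable at TARGET_HOST:TARGET_PORT (hypothetical values).
import socket
import statistics
import time

TARGET_HOST = "10.0.0.2"   # hypothetical peer VM/host inside the data center
TARGET_PORT = 5001         # hypothetical echo service port
SAMPLES = 100
PAYLOAD = b"x" * 64        # small payload to approximate request latency

def measure_rtts(host: str, port: int, samples: int) -> list[float]:
    """Send small payloads over one TCP connection and record round-trip times in ms."""
    rtts = []
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # avoid Nagle delays
        for _ in range(samples):
            start = time.perf_counter()
            sock.sendall(PAYLOAD)
            received = 0
            while received < len(PAYLOAD):
                chunk = sock.recv(len(PAYLOAD) - received)
                if not chunk:
                    raise ConnectionError("echo peer closed the connection")
                received += len(chunk)
            rtts.append((time.perf_counter() - start) * 1e3)
    return rtts

if __name__ == "__main__":
    rtts = measure_rtts(TARGET_HOST, TARGET_PORT, SAMPLES)
    print(f"min/median/p99 RTT (ms): "
          f"{min(rtts):.3f} / {statistics.median(rtts):.3f} / "
          f"{statistics.quantiles(rtts, n=100)[98]:.3f}")
```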
Building Trust at Scale: How Crusoe Validates Network Infrastructure for AI Workloads
In this session, Crusoe shares how they are actively testing frontend networks and inter-VM/host data transfers that feed their GPU clusters. By validating the performance, reliability, and scalability of its infrastructure early, Crusoe aims to identify and resolve issues internally, minimizing dependence on end customers to discover them, and to differentiate itself with a more robust, production-ready AI platform.
Maximize the Performance of AI Backend Fabric with Keysight AI Data Center Builder
This session provides an overview of the Keysight AI (KAI) Data Center Builder solution and how it supports each phase of AI data center design and deployment with actionable data to improve the performance and reliability of AI clusters. You’ll learn how KAI Data Center Builder helps streamline the design process, optimize resource allocation, and enhance the overall efficiency and stability of AI infrastructure to achieve superior performance and reliability.
Demonstrating the Keysight AI Fabric Test Methodology
This session presents an overview of the Keysight AI fabric test methodology, with a demonstration of the key findings and improvements achieved through automated testing and the search for optimal configuration parameters.
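To illustrate the general idea of searching for optimal configuration parameters, the sketch below sweeps a small parameter grid and keeps the best-scoring combination. The parameter names, candidate values, and run_benchmark() hook are hypothetical placeholders and do not represent the Keysight methodology itself.

```python
# Minimal sketch of a configuration-parameter sweep over a hypothetical grid.
import itertools
import random

PARAMETER_GRID = {
    "mtu": [1500, 4200, 9000],
    "congestion_control": ["cubic", "bbr"],
    "queue_depth": [128, 256, 512],
}

def run_benchmark(config: dict) -> float:
    """Placeholder for launching a traffic test and returning a throughput score.
    A real harness would drive the fabric under test and collect measurements."""
    random.seed(str(sorted(config.items())))  # deterministic stand-in result
    return random.uniform(0.5, 1.0) * config["mtu"] / 9000 * 400  # pseudo Gb/s

def sweep(grid: dict) -> tuple[dict, float]:
    """Exhaustively test every combination and keep the highest-scoring config."""
    best_config, best_score = None, float("-inf")
    keys = list(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        config = dict(zip(keys, values))
        score = run_benchmark(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

if __name__ == "__main__":
    config, score = sweep(PARAMETER_GRID)
    print(f"best configuration: {config} (score {score:.1f})")
```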