Enabling Mass Innovation with the VMware Edge Compute Stack
Event: Edge Field Day 3
Appearance: VMware Presents at Edge Field Day 3
Company: VMware by Broadcom
Video Links:
- Vimeo: Enabling Mass Innovation with the VMware Edge Compute Stack
- YouTube: Enabling Mass Innovation with the VMware Edge Compute Stack
Personnel: Alan Renouf, Chris Taylor
Across industries, organizations are adding intelligence at the edge to lower costs, improve quality, and increase sales. But scaling that innovation out across many sites is challenging. VMware’s approach removes edge complexity with zero-touch operations.
VMware’s approach to edge computing focuses on simplifying operations and enabling mass innovation across industries by addressing the unique challenges of managing distributed infrastructure at scale. The VMware Edge Compute Stack is designed to handle the complexities of edge environments, where organizations are increasingly deploying intelligent systems to enhance business operations. These environments, such as retail stores, manufacturing plants, and energy substations, require localized computing power to process large amounts of data in real time, often without reliable network connectivity. VMware’s solution integrates edge compute with networking and security services, offering a full-stack approach that includes SD-WAN and SASE technologies to ensure reliable, secure operations across dispersed locations.
The VMware Edge Compute Stack is built for the specific constraints of edge environments, such as limited on-site IT personnel, constrained and often ruggedized hardware, and the need for real-time processing. The platform supports both virtual machines and containerized applications, allowing organizations to run legacy systems alongside modern applications. VMware’s orchestration platform, Edge Cloud Orchestrator, enables zero-touch provisioning, making it possible to deploy and manage edge infrastructure without IT staff at each location. Its pull-based management model, inspired by the way smartphones update themselves, lets edge devices autonomously check for updates and install them, reducing manual intervention and minimizing downtime.
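To make the pull-based idea concrete, the sketch below shows a minimal agent loop that an edge host could run to poll an orchestrator for its desired state. This is an illustration only, not VMware code: the endpoint URL, manifest fields, and the apply_update helper are hypothetical stand-ins for whatever the orchestrator actually exposes.

```python
# Minimal sketch of a pull-based edge update agent (illustrative only).
# The orchestrator URL, manifest fields, and helpers are hypothetical;
# a real Edge Cloud Orchestrator agent is considerably more involved.
import json
import time
import urllib.request

ORCHESTRATOR_URL = "https://orchestrator.example.com/api/desired-state"  # hypothetical endpoint
DEVICE_ID = "store-0042-edge-01"
POLL_INTERVAL_SECONDS = 300  # check in every five minutes

current_version = "1.0.0"

def fetch_desired_state():
    """Ask the orchestrator what this device should be running."""
    req = urllib.request.Request(f"{ORCHESTRATOR_URL}?device={DEVICE_ID}")
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

def apply_update(manifest):
    """Placeholder for downloading, verifying, and installing the new state."""
    print(f"Applying version {manifest['version']} ...")
    # download image, verify signature, stage workloads, reboot if needed, etc.

while True:
    try:
        desired = fetch_desired_state()
        if desired["version"] != current_version:
            apply_update(desired)
            current_version = desired["version"]
    except OSError as err:
        # No connectivity: keep running current workloads and retry later.
        print(f"Orchestrator unreachable ({err}); will retry.")
    time.sleep(POLL_INTERVAL_SECONDS)
```

A pull model like this only requires outbound connectivity from the device to the orchestrator, which suits sites behind NAT or on intermittent links, and it keeps the site running its current workloads whenever the network is unavailable.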
VMware’s edge computing solutions are already being used in various industries, including retail, manufacturing, and energy. For example, in retail, edge computing is used for loss prevention through computer vision, while in manufacturing, companies like Audi are using edge AI to improve the precision of welding robots and torque wrenches. In the energy sector, virtualizing electrical substations allows for faster response times and reduced operational costs. VMware’s flexible and scalable platform is designed to meet the evolving needs of edge environments, ensuring that organizations can innovate and optimize their operations while maintaining security and reliability.
Presented by Alan Renouf, Technical Product Manager, Software-Defined Edge, and Chris Taylor, Product Marketing, Software-Defined Edge
See how VMware Edge Compute Stack works: https://www.youtube.com/watch?v=LiJ3YAWDASw