This Showcase streamed on September 17, 2025.
ZEDEDA showcased its combination of a central console, hardened software, and edge hardware for securely managing Kubernetes and AI applications across a fleet of edge locations.
ZEDEDA Automated Orchestration for the Distributed Edge
Watch on YouTube
Watch on Vimeo
In this Edge Field Day showcase, ZEDEDA’s Padraig Stapleton, SVP and Chief Product Officer, provides a comprehensive overview of ZEDEDA, its origins, and its vision for bringing the cloud experience to the unique and often hostile environment of the edge. The video highlights how ZEDEDA’s platform enables businesses to securely and scalably run their applications at the edge. The discussion covers how the platform addresses the complexities of diverse hardware, environments, and security challenges, allowing customers to focus on their core business applications.
This presentation also introduces the ZEDEDA edge computing platform, which provides visibility, security, and control over edge hardware and applications. The presentation details a unique partnership with OnLogic to provide zero-touch provisioning and discusses real-world use cases, including container shipping, global automotive manufacturing, and oil and gas.
Personnel: Padraig Stapleton
Understanding Containers at the Edge with ZEDEDA
Watch on YouTube
Watch on Vimeo
In this Edge Field Day Showcase, ZEDEDA’s Consulting Solutions Architect, Manny Calero, demonstrates how the ZEDEDA platform addresses the diverse needs of edge computing workloads. While Kubernetes is ideal for large, complex, and distributed applications, Docker Compose is often a better fit for smaller, lightweight, and resource-constrained edge sites. A key strength of the ZEDEDA platform is its flexibility, allowing users to deploy both legacy VMs and modern containerized applications side by side on the same edge node. This provides a unified orchestration and management experience and a simple, repeatable, scalable, and secure edge architecture. The presentation includes a demo of using the ZEDEDA platform to deploy Docker Compose workloads to multiple edge nodes, highlighting features like zero-touch provisioning and API-driven automation with Terraform.
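For readers who have not used Docker Compose at the edge, the sketch below shows the general shape of a Compose file such a workload might use. The service name, image, port, and environment values are hypothetical assumptions for illustration, not artifacts from the demo.

```yaml
# Hypothetical docker-compose.yml for a lightweight edge workload.
# The image, port, and environment values are illustrative only.
version: "3.8"
services:
  sensor-gateway:
    image: registry.example.com/edge/sensor-gateway:1.0.0
    restart: unless-stopped
    ports:
      - "8080:8080"
    environment:
      - SITE_ID=edge-node-01
```

In the demo, a definition of this kind is deployed to multiple edge nodes from the ZEDEDA console, with Terraform driving the same workflow through the platform’s API.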
Next, Solutions Architect Kris Clark presents the ZEDEDA Edge Kubernetes Service. While Kubernetes brings operational complexity, it is essential for highly scalable, distributed applications. Kris provides a brief overview of the Kubernetes service’s architecture, emphasizing its ease of use and its integration with familiar developer tools like kubectl and Git repositories. The demo shows how to quickly create a Kubernetes cluster and deploy applications from the ZEDEDA marketplace or from a custom Helm chart. The presentation concludes with a discussion of how the ZEDEDA platform provides a cohesive solution for both containerized and VM-based workloads, supporting enterprises in their digital transformation journey at the edge.
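To make the Kubernetes side of the demo concrete, here is a minimal manifest of the sort a Helm chart might render onto such a cluster. The workload name, image, and replica count are assumptions for illustration and do not come from the presentation.

```yaml
# Minimal Deployment a chart might render for a demo workload.
# Name, image, and replica count are illustrative assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-web
spec:
  replicas: 2
  selector:
    matchLabels:
      app: edge-web
  template:
    metadata:
      labels:
        app: edge-web
    spec:
      containers:
        - name: web
          image: nginx:1.27
          ports:
            - containerPort: 80
```

Because the cluster exposes a standard Kubernetes API, a deployed chart’s resources can be inspected with familiar tools such as kubectl.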
Personnel: Kristopher Clark, Manny Calero
Manage Edge AI Using ZEDEDA Kubernetes Service
Watch on YouTube
Watch on Vimeo
In this Edge Field Day Showcase, ZEDEDA’s Distinguished Engineer, Hariharasubramanian C. S., discusses how ZEDEDA is tackling the growing importance and challenges of deploying AI at the edge. He highlights that factors like insufficient bandwidth, high latency, and data privacy concerns make it impractical to send all sensor data to the cloud for analysis. ZEDEDA’s solution is to bring AI to the edge, closer to the data source. This, however, introduces its own challenges, such as managing a wide range of hardware, ensuring autonomy in disconnected environments, and updating AI models at scale. Hari argues that Kubernetes, with its lightweight nature and robust ecosystem, is the ideal solution for packaging and managing complex AI pipelines at the edge.
This presentation demonstrates how ZEDEDA’s Kubernetes service simplifies the deployment of an Edge AI solution for car classification. Using a Helm chart, he shows how to deploy a multi-component application, including an OpenVINO inference server, a model-pulling sidecar, and a demo client application. The demo showcases how the ZEDEDA platform provides a unified control plane for zero-touch provisioning and lifecycle management of these components, all while keeping models in a private, on-premise network without exposing them to the cloud. He concludes by demonstrating the application’s real-time inference capabilities and encouraging developers to leverage ZEDEDA’s open-source repositories to build their own edge AI solutions.
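As a rough illustration of the multi-component pattern described above, the sketch below pairs an OpenVINO Model Server container with a model-pulling helper that stages models from a private, on-premise store into a shared volume. The images, arguments, and paths are assumptions, not the contents of ZEDEDA’s actual Helm chart.

```yaml
# Illustrative Deployment for the pattern described above; images,
# arguments, and paths are assumptions, not ZEDEDA's chart contents.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: car-classifier
spec:
  replicas: 1
  selector:
    matchLabels:
      app: car-classifier
  template:
    metadata:
      labels:
        app: car-classifier
    spec:
      containers:
        - name: model-puller                       # hypothetical sidecar that syncs models
          image: registry.local/model-puller:0.1   # from a private, on-premise store
          volumeMounts:
            - name: models
              mountPath: /models
        - name: ovms                               # OpenVINO Model Server serving the staged model
          image: openvino/model_server:latest
          args:
            - --model_name=car_classifier
            - --model_path=/models/car_classifier
            - --rest_port=8000
          ports:
            - containerPort: 8000
          volumeMounts:
            - name: models
              mountPath: /models
              readOnly: true
      volumes:
        - name: models
          emptyDir: {}
```

The point of the pattern, as in the demo, is that models stay on the local network while the serving container exposes inference to edge applications such as the demo client.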
Personnel: Hariharasubramanian C. S.
ZEDEDA Edge AI – Object Recognition Use Case
Watch on YouTube
Watch on Vimeo
In this ZEDEDA Edge Field Day Showcase, Sérgio Santos, Account Solutions Architect, shows how ZEDEDA manages edge AI for a practical object recognition use case, specifically computer vision. His presentation walks through deploying a stack of three applications (an AI inference container, a Prometheus database, and a Grafana dashboard) using the Docker Compose runtime across a fleet of three devices, one equipped with a GPU and two without. The demo highlights the ability to deploy and manage applications at scale from a single control plane, leveraging ZEDEDA’s automated deployment policies. The process starts from a clean slate, moves through provisioning the edge nodes, and automatically pushes the application stack based on predefined policies, including GPU-specific logic.
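For orientation, a Compose stack matching that description might look roughly like the sketch below. The inference image, ports, and GPU reservation are illustrative assumptions rather than the files used in the demo.

```yaml
# Rough sketch of the three-service stack described above.
# Image names, ports, and the GPU reservation are illustrative.
version: "3.8"
services:
  inference:
    image: registry.example.com/edge/object-recognition:1.0.0  # hypothetical inference container
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia        # only meaningful on the GPU-equipped node
              count: 1
              capabilities: [gpu]
  prometheus:
    image: prom/prometheus:latest
    ports:
      - "9090:9090"
  grafana:
    image: grafana/grafana:latest
    ports:
      - "3000:3000"
```

In the demo, ZEDEDA’s deployment policies decide which variant of the stack lands on which node, so the GPU-specific logic lives in the platform rather than in a per-device Compose file.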
A key part of the demonstration is the live update and rollback process. Santos shows how to remotely update the inference container to a new version and then roll it back to the original without restarting the runtime. This highlights ZEDEDA’s lightweight, efficient updates and the use of its Zix infrastructure to push configuration changes. The demo also shows the ability to monitor application logs and device metrics (CPU, memory, network traffic) from the central ZEDEDA controller, proving the platform’s comprehensive management capabilities. The session concludes by demonstrating how to easily wipe the entire application stack by simply moving the edge nodes to a different project.
Personnel: Sérgio Santos