AI is Driving all Infrastructure Change – Delegate Roundtable at AI Infrastructure Field Day 4
Event: AI Infrastructure Field Day 4
Appearance: Delegate Roundtable at AI Infrastructure Field Day
Company: Tech Field Day
Video Links:
- Vimeo: AI is Driving all Infrastructure Change – Delegate Roundtable at AI Infrastructure Field Day 4
- YouTube: AI is Driving all Infrastructure Change – Delegate Roundtable at AI Infrastructure Field Day 4
Personnel: Alastair Cooke
This roundtable discussion explores how the reality of artificial intelligence is driving profound shifts in infrastructure, moving beyond mere marketing labels to necessitate new, distinct approaches. Participants noted this transformative power in vendor presentations, citing Xsight Labs’ massively scalable Arm architectures embedded within network interfaces and VAST Data’s innovative use of BlueField DPUs, both driven by the evolving demands of AI. The conversation highlighted AI’s role as a significant driver of innovation in networking, pushing Ethernet ahead of InfiniBand for HPC-style fabrics and accelerating the development of smarter NICs and HBAs to support AI workflows.
A significant shift observed was the increasing emphasis on AI inferencing over training. This pivot indicates the practical application of AI in real-world scenarios, with enterprises actively deploying AI solutions. However, delegates recognized that deploying inference is not the final stage; it requires sophisticated application delivery and load balancing that, while familiar in concept, now demands context switching based on specific AI prompts or models. Parallels were drawn to historical architectural migrations, suggesting that AI is reaching a maturity where it is integrated into applications for mainstream business value, moving away from being a “solution in search of a problem.” This evolution also sees a mix of large language models for general tasks and specialized, smaller language models (SLMs) for specific business applications, as exemplified by Forward Networks’ approach to distributing intelligence.
The discussion also touched on the critical role of human oversight and trust in AI systems, particularly in regulated environments, likening it to the gradual adoption of automation seen in systems such as VMware vSphere’s Distributed Resource Scheduler (DRS). While AI is undeniably accelerating the scale and speed of innovation in networking and storage, some elements resonate with “everything old is new again,” as past concepts like offload engines and advanced storage architectures are being repurposed at an unprecedented scale. Delegates debated whether AI *drives* innovation or simply provides a compelling use case for existing “cool tech” that previously lacked widespread application. Looking ahead, AI is poised to become the “killer app for the edge,” driven by the high cost and time required to move large datasets, pushing processing closer to where data is generated. This necessitates new infrastructure designs for smaller, distributed AI clusters, creating opportunities for greenfield builds and challenging architects to bridge the gap between massive data center deployments and efficient, localized AI.