AI has driven your datacenter designs and is now moving outwards through your whole network. This episode of the Tech Field Day podcast features Lee Peterson from Cisco discussing AI and networks with Andy Banta, Jack Poller, and Alastair Cooke. The discussion explores how AI is “escaping the data center” and becoming pervasive across the network, necessitating a dual focus on “networking for AI” and “AI for networking.” The former involves building robust, high-performance, and secure infrastructure, particularly at the edge, to support AI workloads like real-time inference. The goal is to support new applications such as robotics, fraud detection, and small language models, moving beyond traditional cloud-centric deployments to a more federated model. The latter leverages AI to manage, optimize, troubleshoot, and secure the network itself, with Cisco utilizing deep network learning models, historical data, and expertise to create AI assistants that enable intent-based networking and streamline operations. Additionally, the conversation emphasizes the critical role of advanced security, including hardware-accelerated post-quantum cryptography, to protect data in this evolving, AI-driven environment from future decryption threats.
Billion-Dollar AI Headlines Obscure Real Business Value
The big headlines that we’re seeing around the massive funding of large AI companies are a distraction from the reality that AI is being built and used in business applications. This episode of the Tech Field Day podcast features Frederic Van Haren, Chris Grundemann, Brian Martin, and Alastair Cooke reflecting after AI Infrastructure Field Day in Santa Clara. Popular news often covers the creation of large, general-purpose AI models, yet the real-world application of AI through inference is where most companies see a return on their investment. Similarly, “AI” is commonly treated as a single topic, without a more granular view that differentiates between rules-based systems, traditional machine learning, and emergent generative models like Large Language Models (LLMs). Specialized AI models will be vital for cost-effective, efficient applications and for integrating diverse AI capabilities into agentic architectures. Advanced security protocols and regulatory frameworks are vital to mitigate novel vulnerabilities, and organizations must adapt to an extraordinarily rapid pace of technological evolution. AI has already had a profound impact on software development, potentially enabling widespread custom application creation.
AI Needs Resource Efficiency
As we build out AI infrastructure and applications we need resource efficiency; continuously buying more horsepower cannot go on forever. This episode of the Tech Field Day podcast features Pete Welcher, Gina Rosenthal, Andy Banta, and Alastair Cooke hoping for a more efficient AI future. Large language models are trained using massive farms of GPUs and massive amounts of Internet data, so we expect to use large farms of GPUs and unstructured data to run those LLMs. Those large farms have led to GPU scarcity, and now RAM price increases, both of which are impeding businesses building their own large AI infrastructure. Task-specific AIs that use smaller, more efficient models should be the future of Agentic AI and of AI embedded in applications. More efficient and targeted AI may be the only way to get business value from the investment, especially in resource-constrained edge environments. Does every AI problem need a twenty billion parameter model? More mature use of LLMs and AI will focus on reducing the cost of delivering inference to applications, your staff, and your customers.
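As a rough illustration of that efficiency question, the sketch below compares the memory needed just to hold model weights for a twenty-billion-parameter model against a smaller task-specific one. The parameter counts and precisions are assumptions chosen for illustration, not figures from the episode.

```python
# A rough sketch of why model size matters for inference infrastructure.
# All numbers are illustrative assumptions, not vendor figures.

def weight_memory_gb(parameters: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to hold model weights."""
    return parameters * bytes_per_param / 1e9

# A 20-billion-parameter model at 16-bit precision...
general_model = weight_memory_gb(20e9, 2)   # ~40 GB before KV cache and runtime overhead

# ...versus a 3-billion-parameter task-specific model quantized to 4 bits.
task_model = weight_memory_gb(3e9, 0.5)     # ~1.5 GB, feasible on modest edge hardware

print(f"General model weights: ~{general_model:.0f} GB")
print(f"Task-specific model weights: ~{task_model:.1f} GB")
```

Even at this back-of-the-envelope level, the gap suggests why targeted models are attractive in resource-constrained edge environments.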
Your Security Strategy Needs Graphs
Modern security needs more than checklists. Instead of working down a process, you need to start thinking like the attackers trying to get into your systems. In this episode of the Tech Field Day Podcast, Jay Cuthrell and Girard Kavelines join Tom Hollingsworth to discuss how Microsoft Sentinel helps bring this new security strategy to your environment.
They discuss the advances that have been made in data lake technology, including the increased retention time offered by Microsoft. They also talk about how graphs help train new security analysts to understand the way attackers think. Finally, they cover how you can adapt your plans in the new year to take advantage of new offerings and the questions you should be asking to make sure you’re not missing out.
Modern Data Mobility is Challenging the Laws of Physics with Hammerspace
Modern data mobility is challenging the laws of physics; the speed of light is a fundamental limit for moving signals. This episode of the Tech Field Day podcast features Kurt Kuckein from Hammerspace discussing data movement and management with Jim Jones, Jack Poller, Andy Banta, and Alastair Cooke. The challenge is that the distributed nature of data, spread across the globe, creates significant obstacles for AI, particularly regarding the speed of light and power consumption. We delve into overcoming these limitations through technologies that facilitate data access and movement, touching on concepts such as efficient storage solutions (Open Flash Platform), the importance of centralized data management, and the agility required for evolving AI workloads. While the underlying principles of data management are not new, the scale and complexity of AI necessitate innovative approaches to ensure data can be accessed and utilized effectively, regardless of its physical location.
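For context on the speed-of-light constraint mentioned above, here is a back-of-the-envelope sketch of the best-case round-trip time over long-haul fiber. The distance and fiber speed are rounded assumptions, not figures from the episode.

```python
# A back-of-the-envelope sketch of the physical limit on moving data long distances.
# Light travels at roughly 200,000 km/s in optical fiber (about 2/3 of c).

FIBER_SPEED_KM_S = 200_000

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time over fiber, ignoring routing, queuing, and retransmits."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000

# An assumed ~9,000 km fiber path, roughly US West Coast to Western Europe.
print(f"Minimum RTT: ~{min_round_trip_ms(9000):.0f} ms")  # ~90 ms before any protocol overhead
```

No amount of protocol tuning removes that floor, which is why placing data near the compute that uses it matters so much for AI workloads.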
2026 Is A Bright Year for HPE and Juniper
2025 was a big year for both HPE and Juniper, but 2026 is the year when the real integration starts. In this episode of the Tech Field Day podcast, Tom Hollingsworth is joined by wireless experts Jonathan Davis and Keith Parsons as they look at the acquisition of Juniper Networks by HPE and what the future holds for the new combined company.
They discuss how the cultures of the two companies are coming together as well as the unification of their management platforms. They look at the differences between Juniper Mist and Aruba Central and which platform will be the one users log into going forward. The guests also look at the competition between Cisco and HPE, how the race will develop as customers look to refresh their installations, and what those customers are likely to choose.
Tech Field Day Takeaways for 2025: Making AI Less Manic
2025 was the year of AI Mania. Everyone wants you to know where they stand with AI in their product. Tech Field Day has a different approach. In this special year end episode, Tom Hollingsworth, Stephen Foskett, and Alastair Cooke look back at the discussions and deep dives into AI and how Tech Field Day grounded them all in practical real terms. Our event leads discuss the boring reality of AI tools and how AI has a dual nature that must be understood to get real value. They also debate the changing landscape of security where AI is concerned, including the importance of data sovereignty. They wrap up with a focus on the fundamentals and how Tech Field Day continues to make those important for the community at large.
Generative AI Coding Tools Make Enterprise Applications Worse
AI is writing a large proportion of modern software, and generative AI coding tools make enterprise applications worse. This episode of the Tech Field Day podcast looks at AI-generated applications with Calvin Hendryx-Parker, Jim Czuprynski, Jay Cuthrell, and Alastair Cooke. Satya Nadella says that up to 30% of the code Microsoft writes is AI generated, and AWS is at about 25% AI-generated code. We ponder whether there is a link between this AI-generated code and the quality of the Windows 11 codebase, and possibly even the recent AWS outage. Calvin has hands-on experience with a range of AI coding tools, finding he uses different AI tools for specialist tasks in his development projects. The easy tasks for AI coding are translating existing applications from one platform version to another, or rewriting existing application code in new languages. Both these tasks are onerous for human developers and ideal for an AI assistant. The unanswered question is whether generative AI tools can handle creating new functionality in enterprise applications: can AI fulfill the role of the senior developer or software architect?
DNS Must Be Secured Presented by Infoblox
DNS security is no longer optional. This service is not only being attacked by nefarious actors but is also being leveraged to compromise users and exfiltrate data. In this episode of the Tech Field Day podcast, brought to you by Infoblox, Tom Hollingsworth is joined by Jack Poller and Cricket Liu. They talk about the historical openness of DNS and how that has made it easy to see what users are doing and to manipulate them. They discuss ways to secure the protocol and how companies like Infoblox are extending its capabilities for future security.
Well Managed Kubernetes Means Infrastructure Finally Doesn’t Matter
In a world of well-managed Kubernetes, we hoped that infrastructure finally wouldn’t matter. This episode of the Tech Field Day podcast features John Willis and Guy Currier wishing that infrastructure didn’t matter, with Alastair Cooke. Every new infrastructure revolution claims to make infrastructure invisible, from virtualization through HCI and cloud to containers and Kubernetes. The reality has always been that these revolutions shift the definition of infrastructure and bring some new aspect to be managed. Developers building features and applications want to focus on satisfying some business need, not considering storage devices and network configurations. Virtualization and Kubernetes both made delivering infrastructure easier, but neither eliminated infrastructure architecture and management. The dream of self-deploying and self-organizing infrastructure is as distant as it ever was. Agentic AI is the latest new hope to eliminate infrastructure challenges, yet it brings its own complex infrastructure requirements. Will we ever stop caring about IT infrastructure?
Private Cloud is Not just Self-Service Virtualization
Private cloud is not just virtualization 4.0; self-service VM deployment doesn’t fulfill the same need as the Public Cloud. This episode of the Tech Field Day podcast features Mike Graff, Jon Hildebrand, and Alastair Cooke. Private cloud has evolved from simple virtualization to a more comprehensive, cloud-like experience, emphasizing the need for on-premises infrastructure to offer the same developer-friendly tools and APIs as public clouds. Some application repatriation is driven by cost concerns and enabled by the rise of technologies like Kubernetes and OpenShift for managing containerized workloads. A unified control plane for hybrid cloud environments is vital, as is accurate cost accounting for on-premises resources. Enterprises will search for a hybrid approach where developers can deploy applications without needing to worry about the underlying infrastructure.
Your Edge Projects will Fail Without Fleet Lifecycle Management with ZEDEDA
Projects to deliver applications to edge locations will fail without comprehensive fleet lifecycle management. This episode of the Tech Field Day podcast features Sachin Vasudeva from Zededa discussing the importance of long-term edge management with Guy Currier and Alastair Cooke. Managing edge deployments brings unique challenges compared to cloud or on-premises environments. Focusing on business logic and application outputs, while leveraging infrastructure providers to handle the complexities of packaging, deploying, and monitoring AI models, enables teams to support diverse edge environments. Edge locations might have different hardware deployed and intermittent connectivity, requiring a balance between standardization and flexibility in managing edge devices and applications. Teams that can respond and adapt quickly will better enable their business to react to changing conditions, especially with the rapid pace of AI innovation.
Every AI Strategy Needs a Data Protection Strategy with Commvault
Every company lives in fear of a ransomware attack, whether they have suffered one or not, and this is even more critical in the era of AI. This episode of the Tech Field Day Podcast looks forward to Commvault SHIFT in November with a discussion of the importance of data protection to AI applications with Tim Zonca from Commvault, frequent delegate Gina Rosenthal, and host Stephen Foskett. AI applications are reliant on good data, and yet this same technology makes it easier for attackers to breach corporate controls. Today’s social engineering and phishing are more convincing than ever thanks to generative AI, and this has helped ransomware crews to adopt larger and more powerful attacks. Ransomware is a massive business, and it isn’t going away any time soon. At the same time, GenAI applications offer a new attack surface, as agentic AI is empowered to take action based on untrusted inputs. Not only can we not stop ransomware, but the pace and technical capabilities of these attacks keep accelerating. There is reason for optimism, however, as data protection tools keep getting better. Today’s AI-optimized tools can effectively categorize data, burst or migrate to different locations, and roll back or recover from corruption or compromise. In the future, we will see increasing use of AI to monitor systems and data, detecting patterns and hardening the attack surface.
Moving Enterprise AI Applications From Experiments to Production with NetApp
Running enterprise AI applications in production is a lot different from the AI experiments many of us have been involved with so far. This episode of the Tech Field Day podcast, recorded prior to NetApp Insight 2025, features Ingo Fuchs from NetApp along with Gina Rosenthal, Glenn Dekhayser, and Stephen Foskett. AI applications often start as experiments with a limited data set, but once these are moved to production there are many critical decisions to be made. Data must be classified and cleaned, removing personal and financial data and proprietary information before it even reaches an LLM. Data also must be structured for embedding and vectorization prior to use by an LLM. And we have to ensure that data is up to date or the application will not serve the customer properly. Finally, we have to consider whether it is proper and ethical to share and act on this data. Many of the challenges facing modern AI applications are similar to the historic issues faced by enterprise storage, and this is an area in which NetApp and their customers have decades of experience.
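As a loose illustration of the cleaning and vectorization steps described above, here is a minimal sketch of a generic preparation pipeline: scrub obvious personal data, chunk documents, and produce embeddings for a vector store. The library, model name, and helper functions are assumptions chosen for the example, not NetApp’s tooling or APIs.

```python
# A minimal, hypothetical data-preparation sketch: redact, chunk, embed.
import re
from sentence_transformers import SentenceTransformer  # assumes this library is installed

EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")

def scrub(text: str) -> str:
    """Remove obvious personal data before the text ever reaches an LLM pipeline."""
    return EMAIL.sub("[REDACTED]", text)

def chunk(text: str, size: int = 500) -> list[str]:
    """Split cleaned text into fixed-size chunks suitable for embedding."""
    return [text[i:i + size] for i in range(0, len(text), size)]

model = SentenceTransformer("all-MiniLM-L6-v2")  # an example open embedding model

documents = ["Contact jane.doe@example.com about the Q3 forecast..."]
chunks = [c for doc in documents for c in chunk(scrub(doc))]
embeddings = model.encode(chunks)  # vectors ready to load into a vector database
print(embeddings.shape)
```

A production pipeline would add far more thorough PII detection, freshness checks, and governance, which is exactly where the episode argues enterprise data management experience pays off.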
Passkeys are the Future
Passwords create friction, and therefore users find ways around them. New technology such as secure enclaves and PKI allows us to create better solutions like passkeys. In this episode of the Tech Field Day Podcast, Alan Shimel and Kate Scarcella join Tom Hollingsworth to discuss the problems with traditional passwords and how passkeys overcome them. They also talk about why it has taken so long to adopt passkeys and what barriers remain to full implementation. They wrap up with a look at what might lie ahead on the horizon for the future of user security.
Unified Flash Memory and Reduced HBM are Reshaping AI Training and Inference with Phison
AI will need less HBM (high bandwidth memory) because flash memory unification is changing training and inference. This episode of the Tech Field Day podcast features Sebastien Jean from Phison, Max Mortillaro, Brian Martin, and Alastair Cooke. Training, fine-tuning, and inference with Large Language Models traditionally use GPUs with high-bandwidth memory to hold entire models and data sets. Phison’s aiDaptiv+ framework offers the ability to trade lower infrastructure cost against training speed, or to allow larger data sets (context) for inference. This approach enables users to balance cost, compute, and memory needs, making larger models accessible without requiring top-of-the-line GPUs, and giving smaller companies more access to generative AI.
Networks Need Agentic AI with HPE Juniper Networking
Agentic AI is reshaping the IT landscape and networking is no exception. Building upon the previous research into machine learning means we have a head start on harnessing that power. In this episode of the Tech Field Day podcast, brought to you by HPE Juniper Networking, Tom Hollingsworth is joined by Keith Parsons and Sunalini Sankhavaram. They talk about how agentic AI is driving new methods for operating networks and helping humans concentrate on real problems instead of menial tasks. They also discuss how agentic AI can power self-driving networks where configuration and provisioning are done automatically or with a minimum of effort to ensure resiliency and enhance the user experience.
SASE Makes Convergence Simple with HPE Aruba Networking
Converged networking is creating complexity for network operations and engineering teams. Ensuring that users are able to access resources and complete workflows means being up-to-date with the latest technology behind the scenes. You need a trusted partner to make it all happen with minimal issues.
In this episode of the Tech Field Day Podcast, brought to you by HPE Aruba Networking, Tom Hollingsworth is joined by Chris Grundemann, Jeff Olson, and Adam Fuoss as they discuss the growing challenges with networking and how HPE Aruba Networking is addressing them. They talk about SASE and SSE and how tools such as Cloud-Native NAC and SASE AI Copilot work together to address the needs of the modern network. These features, debuted at Black Hat 2025, help teams by making sense of the network and keeping it secure from intruders.
They discuss how AI is going to accelerate operations teams while providing context and understanding for challenges. They also talk about how cloud native principles can apply to both online and on-prem configurations. The panel wraps up with a discussion of the importance of a sole-source provider for these solutions and how HPE Aruba Networking is addressing the integration of recent acquisitions.