2026 Is A Bright Year for HPE and Juniper

2025 was a big year for both HPE and Juniper, but 2026 is the year when the real integration starts. In this episode of the Tech Field Day podcast, Tom Hollingsworth is joined by wireless experts Jonathan Davis and Keith Parsons as they look at the acquisition of Juniper Networks by HPE and what the future holds for the new combined company.

They discuss how the cultures of the two companies are coming together, as well as the unification of their management platforms. They look at the differences between Juniper Mist and Aruba Central and which platform users will log into going forward. The guests also examine the competition between Cisco and HPE, how the race will develop as customers refresh their installations, and what those customers are likely to choose.

Tech Field Day Takeaways for 2025: Making AI Less Manic

2025 was the year of AI Mania. Everyone wants you to know where they stand with AI in their product. Tech Field Day has a different approach. In this special year-end episode, Tom Hollingsworth, Stephen Foskett, and Alastair Cooke look back at the discussions and deep dives into AI and how Tech Field Day grounded them all in practical, real-world terms. Our event leads discuss the boring reality of AI tools and how AI has a dual nature that must be understood to get real value. They also debate the changing landscape of security where AI is concerned, including the importance of data sovereignty. They wrap up with a focus on the fundamentals and how Tech Field Day continues to emphasize them for the community at large.

Generative AI Coding Tools Make Enterprise Applications Worse

AI is writing a large proportion of modern software, and Generative AI coding tools make enterprise applications worse. This episode of the Tech Field Day podcast looks at AI-generated applications with Calvin Hendryx-Parker, Jim Czuprynski, Jay Cuthrell, and Alastair Cooke. Satya Nadella says that up to 30% of the code Microsoft writes is AI generated, while AWS reports about 25%. We ponder whether there is a link between this AI-generated code and the quality of the Windows 11 codebase, or possibly even the recent AWS outage. Calvin has hands-on experience with a range of AI coding tools, finding that he uses different tools for specialist tasks in his development projects. The easy tasks for AI coding are translating existing applications from one platform version to another and rewriting existing application code in new languages; both are onerous for human developers and ideal for an AI assistant. The unanswered question is whether generative AI tools can handle creating new functionality in enterprise applications: can AI fulfill the role of the senior developer or software architect?
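
As a quick illustration of the translation use case, the sketch below asks a general-purpose LLM to port a legacy snippet to a newer language version. This is a minimal example assuming the OpenAI Python client; the model name, prompt, and snippet are illustrative and not a description of any vendor's actual workflow.

```python
# Minimal sketch: using an LLM to translate legacy code to a newer platform.
# Assumes the OpenAI Python client (pip install openai) and an API key in the
# OPENAI_API_KEY environment variable; the model name is an illustrative choice.
from openai import OpenAI

legacy_snippet = """
print "Report for %s" % customer                      # Python 2 style
totals = dict((k, sum(v)) for k, v in rows.iteritems())
"""

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "You are a code migration assistant. Translate Python 2 "
                    "code to idiomatic Python 3. Return only code."},
        {"role": "user", "content": legacy_snippet},
    ],
)

print(response.choices[0].message.content)  # the proposed Python 3 version
```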

Network Automation Needs Standards

Automation is the best way to scale network deployment and operations, but the lack of formal standards in the field causes delays and wasted resources. In this episode of the Tech Field Day Podcast, Tom Hollingsworth is joined by Denise Donohue, Steve Puluka, and Kevin Myers. They discuss how common practices have emerged from consensus rather than from formal standards bodies. The panel also covers the value of using open ideas and not being forced into a single-vendor solution, as well as regulations and the likelihood that an organization like the IETF or IEEE will pick up the task of formalizing automation standards.
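
To show why a shared data model matters, here is a small, purely illustrative sketch: one vendor-neutral "intent" record is rendered into two hypothetical vendor syntaxes loosely modeled on common CLI styles. The intent schema and templates are invented for the example; they are not OpenConfig, IETF YANG, or any vendor's actual configuration language.

```python
# Illustrative only: one vendor-neutral intent rendered into two hypothetical
# vendor syntaxes. Without an agreed data model, every team invents its own
# version of the "intent" dictionary below, and tooling cannot be shared.
intent = {
    "interface": "uplink1",
    "description": "to-core",
    "vlan": 100,
    "mtu": 9000,
}

TEMPLATES = {
    "vendor_a": (
        "interface {interface}\n"
        " description {description}\n"
        " switchport access vlan {vlan}\n"
        " mtu {mtu}\n"
    ),
    "vendor_b": (
        "set interfaces {interface} description {description}\n"
        "set interfaces {interface} vlan members {vlan}\n"
        "set interfaces {interface} mtu {mtu}\n"
    ),
}

for vendor, template in TEMPLATES.items():
    print(f"--- {vendor} ---")
    print(template.format(**intent))
```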

Simplification of IT is Really an Illusion

Simplification in IT is an illusion; increasing complexity outpaces every effort to simplify. This episode of the Tech Field Day podcast, recorded on-site at Cloud Field Day 24, features Camberley Bates, Nathan Nielsen, Guy Currier, and Alastair Cooke. Cloud services and centralized management platforms offer simplified interfaces but also introduce a multitude of choices and underlying complexities. History matters: advancements from mainframes to PCs demonstrate continuously shifting goalposts, while the more recent integration of cloud and AI contributes to increased complexity. AI may bring a new level of simplicity, yet it may also carry the unintended consequence of people becoming “ignorant” of how IT works. CIOs and CTOs need to think strategically to manage increasingly complex environments, striking a balance between patchwork fixes and long-term strategic approaches.

DNS Must Be Secured, Presented by Infoblox

DNS security is no longer optional. The service is not only being attacked by nefarious actors but also leveraged to compromise users and exfiltrate data. In this episode of the Tech Field Day podcast, brought to you by Infoblox, Tom Hollingsworth is joined by Jack Poller and Cricket Liu. They talk about the historical openness of DNS and how that openness makes it easy to see what users are doing and to manipulate them. They discuss ways to secure the protocol and how companies like Infoblox are extending its capabilities for future security.
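
One of the protocol-level protections in this space, encrypting queries so on-path observers cannot see or tamper with them, can be seen in a quick DNS-over-HTTPS lookup. The sketch below uses Cloudflare's public JSON DoH endpoint and the requests library; it illustrates the transport change only and is not a description of Infoblox's products.

```python
# Minimal DNS-over-HTTPS lookup: the query travels inside HTTPS, so on-path
# observers see only a TLS connection to the resolver, not the hostname being
# resolved. Requires the requests library (pip install requests).
import requests

def doh_lookup(name: str, record_type: str = "A") -> list[str]:
    response = requests.get(
        "https://cloudflare-dns.com/dns-query",        # public DoH resolver
        params={"name": name, "type": record_type},
        headers={"accept": "application/dns-json"},    # ask for the JSON form
        timeout=5,
    )
    response.raise_for_status()
    answers = response.json().get("Answer", [])
    return [record["data"] for record in answers]

print(doh_lookup("example.com"))  # prints the resolved A records
```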

Well-Managed Kubernetes Means Infrastructure Finally Doesn’t Matter

In a world of well-managed Kubernetes, we hoped that infrastructure finally wouldn’t matter. This episode of the Tech Field Day podcast features John Willis and Guy Currier wishing that infrastructure didn’t matter, with Alastair Cooke. Every new infrastructure revolution claims to make infrastructure invisible, from virtualization through HCI and cloud to containers and Kubernetes. The reality has always been that these revolutions shift the definition of infrastructure and bring some new aspect to be managed. Developers building features and applications want to focus on satisfying some business need, not considering storage devices and network configurations. Virtualization and Kubernetes both made delivering infrastructure easier, but neither eliminated infrastructure architecture and management. The dream of self-deploying and self-organizing infrastructure is as distant as it ever was. Agentic AI is the latest new hope to eliminate infrastructure challenges, yet it brings its own complex infrastructure requirements. Will we ever stop caring about IT infrastructure?

NetAIOps Has Its Challenges

The industry has embraced AI for every possible problem. Operations will eventually embrace it as well, but questions remain about how it will be implemented. In this episode, Tom Hollingsworth sits down with Pete Welcher, Rita Younger, and Jonathan Davis to discuss the issues that remain with implementing AI in an operations workflow. They discuss licensing and procurement, the need for institutional knowledge, and how this will all work in a multivendor world. They wrap up with some guidance about how to approach your next big AIOps project.

Ready or Not, AI is Coming to the Enterprise

Despite widespread skepticism, AI is already widely used in the enterprise, often in the form of so-called shadow applications outside traditional IT. This episode of the Tech Field Day Podcast, recorded on the eve of AI Field Day, features delegates Ryan Booth and Dave Graham discussing the real state of AI adoption in the enterprise with host Stephen Foskett. Just like the advent of the PC, generative AI is widely used across businesses, typically on a bring-your-own basis rather than as a coordinated effort by the IT department. The same process played out in the Software-as-a-Service world, where each department, and even individuals, adopted multiple tools that met their needs. There will soon be a reckoning, where businesses try to get their arms around all of the AI applications being used across the enterprise. The next step is to develop a plan to control the sprawl of tools, models, data, and subscriptions to ensure that this shadow AI doesn’t become a risk to the company. Then companies need to be prepared as AI agents become critical to their operations, likely also deployed by individuals without corporate control.

Private Cloud is Not just Self-Service Virtualization

Private cloud is not just virtualization 4.0; self-service VM deployment doesn’t fulfill the same need as the public cloud. This episode of the Tech Field Day podcast features Mike Graff, Jon Hildebrand, and Alastair Cooke. Private cloud has evolved from simple virtualization to a more comprehensive, cloud-like experience, emphasizing the need for on-premises infrastructure to offer the same developer-friendly tools and APIs as public clouds. Some application repatriation is driven by cost concerns and enabled by the rise of technologies like Kubernetes and OpenShift for managing containerized workloads. A unified control plane for hybrid cloud environments is vital, as is accurate cost accounting for on-premises resources. Enterprises will search for a hybrid approach where developers can deploy applications without needing to worry about the underlying infrastructure.

Your Edge Projects will Fail Without Fleet Lifecycle Management with ZEDEDA

Projects to deliver applications to edge locations will fail without comprehensive fleet lifecycle management. This episode of the Tech Field Day podcast features Sachin Vasudeva from ZEDEDA discussing the importance of long-term edge management with Guy Currier and Alastair Cooke. Managing edge deployments presents unique challenges compared to cloud or on-premises environments. Focusing on business logic and application outputs, while leveraging infrastructure providers to handle the complexities of packaging, deploying, and monitoring AI models, enables teams to support diverse edge environments. Edge locations might have different hardware deployed and intermittent connectivity, requiring a balance between standardization and flexibility in managing edge devices and applications. Teams that can respond and adapt rapidly will better enable their business to react to changing conditions, especially given the rapid pace of AI innovation.

Every AI Strategy Needs a Data Protection Strategy with Commvault

Every company lives in fear of a ransomware attack, whether they have suffered one or not, and this is even more critical in the era of AI. This episode of the Tech Field Day Podcast looks forward to Commvault SHIFT in November with a discussion of the importance of data protection to AI applications with Tim Zonca from Commvault, frequent delegate Gina Rosenthal, and host Stephen Foskett. AI applications are reliant on good data, and yet this same technology makes it easier for attackers to breach corporate controls. Today’s social engineering and phishing are more convincing than ever thanks to generative AI, and this has helped ransomware crews mount larger and more powerful attacks. Ransomware is a massive business, and it isn’t going away any time soon. At the same time, GenAI applications offer a new attack surface, as agentic AI is empowered to take action based on untrusted inputs. Not only can we not stop ransomware, but the pace and technical capabilities of these attacks keep accelerating. There is reason for optimism, however, as data protection tools keep getting better. Today’s AI-optimized tools can effectively categorize data, burst or migrate to different locations, and roll back or recover from corruption or compromise. In the future, we will see increasing use of AI to monitor systems and data, detecting patterns and hardening the attack surface.

Moving Enterprise AI Applications From Experiments to Production with NetApp

Running enterprise applications in production is a lot different from the AI experiments many of us have been involved with so far. This episode of the Tech Field Day podcast, recorded prior to NetApp Insight 2025, features Ingo Fuchs from NetApp along with Gina Rosenthal, Glenn Dekhayser, and Stephen Foskett. AI applications often start as experiments with a limited data set, but once they move to production there are many critical decisions to be made. Data must be classified and cleaned, removing personal and financial data and proprietary information before it even reaches an LLM. Data also must be structured for embedding and vectorization prior to use by an LLM. And we have to ensure that data is up to date, or the application will not serve the customer properly. Finally, we have to consider whether it is proper and ethical to share and act on this data. Many of the challenges facing modern AI applications are similar to the historic issues faced by enterprise storage, and this is an area in which NetApp and their customers have decades of experience.
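
The cleaning and structuring steps described here can be sketched in a few lines: redact obvious personal identifiers, then split the text into fixed-size chunks ready for embedding. The regexes and chunk size below are illustrative placeholders; a production pipeline would use far more robust classification, and the embedding call itself is omitted.

```python
# Illustrative data-preparation sketch: redact obvious PII, then chunk text
# for embedding. Real pipelines use proper classifiers and tokenizer-aware
# chunking; the patterns and sizes here are placeholders.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact(text: str) -> str:
    text = EMAIL.sub("[EMAIL]", text)
    return SSN.sub("[SSN]", text)

def chunk(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

document = "Contact jane.doe@example.com, SSN 123-45-6789, about the Q3 renewal."
chunks = chunk(redact(document))
# Each chunk would next be passed to an embedding model and stored in a
# vector database; that step is left out of this sketch.
print(chunks[0])
```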

Passkeys are the Future

Passwords create friction, and therefore users find ways around them. New technologies such as secure enclaves and PKI allow us to create better solutions like passkeys. In this episode of the Tech Field Day Podcast, Alan Shimel and Kate Scarcella join Tom Hollingsworth to discuss the problems with traditional passwords and how passkeys overcome them. They also talk about why it has taken so long to adopt passkeys and what barriers remain to full implementation. They wrap up with a look at what might lie ahead for the future of user security.
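
The public-key idea behind passkeys, where the device holds a private key and the server only ever stores the public key and verifies signed challenges, can be illustrated in a few lines. This is a conceptual sketch using the Python cryptography library, not the actual WebAuthn/FIDO2 protocol, which adds attestation, origin binding, and user verification.

```python
# Conceptual sketch of the challenge-response behind passkeys: the private key
# never leaves the device; the server stores only the public key and verifies
# signatures over fresh challenges. Requires the cryptography package
# (pip install cryptography). Not the real WebAuthn protocol.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# "Registration": the device generates a key pair and shares the public key.
device_private_key = ec.generate_private_key(ec.SECP256R1())
server_stored_public_key = device_private_key.public_key()

# "Authentication": the server issues a random challenge...
challenge = os.urandom(32)

# ...the device signs it with the private key (unlocked locally by biometrics
# or a PIN in a real passkey flow)...
signature = device_private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# ...and the server verifies the signature; there is no shared secret to phish.
server_stored_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
print("Challenge verified: user authenticated without a password")
```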

Unified Flash Memory and Reduced HBM are Reshaping AI Training and Inference with Phison

AI will need less HBM (high bandwidth memory) because flash memory unification is changing training and inference. This episode of the Tech Field Day podcast features Sebastien Jean from Phison, Max Mortillaro, Brian Martin, and Alastair Cooke. Training, fine-tuning, and inference with Large Language Models traditionally use GPUs with high bandwidth memory to hold entire models and data sets. Phison’s aiDaptiv+ framework offers the ability to trade training speed for lower infrastructure cost, or to allow larger data sets (context) for inference. This approach enables users to balance cost, compute, and memory needs, making larger models accessible without requiring top-of-the-line GPUs and giving smaller companies more access to generative AI.

Software is Automating Your AI Data Centre Infrastructure

Hardware always matters, especially in AI, and now software is automating your AI data centre infrastructure. This episode of the Tech Field Day podcast features Gina Rosenthal, Barton George, Andy Banta, and Alastair Cooke. Generative AI brought new hardware into enterprise data centres: GPUs, TPUs, NPUs, and XPUs all offload AI processing from CPUs for more performance and efficiency. Feeding these accelerators requires fast networks and fast storage, common topics at AI Infrastructure Field Day events. In parallel, sophisticated software to automate the deployment and operation of this new hardware is vital to deliver returns quickly and maximize the value of the hardware investment. Automation platforms are moving up the stack towards delivering multiple AI applications on shared XPU infrastructure, where AI inference delivers the business value.

Networks Need Agentic AI with HPE Juniper Networking

Agentic AI is reshaping the IT landscape, and networking is no exception. Building on previous research into machine learning gives networking a head start on harnessing that power. In this episode of the Tech Field Day podcast, brought to you by HPE Juniper Networking, Tom Hollingsworth is joined by Keith Parsons and Sunalini Sankhavaram. They talk about how agentic AI is driving new methods for operating networks and helping humans concentrate on real problems instead of menial tasks. They also discuss how agentic AI can power self-driving networks, where configuration and provisioning are done automatically or with minimal effort, to ensure resiliency and meet user expectations.

SASE Makes Convergence Simple with HPE Aruba Networking

Converged networking is creating complexity for network operations and engineering teams. Ensuring that users are able to access resources and complete workflows means staying up to date with the latest technology behind the scenes. You need a trusted partner to make it all happen with minimal issues.

In this episode of the Tech Field Day Podcast, brought to you by HPE Aruba Networking, Tom Hollingsworth is joined by Chris Grundemann, Jeff Olson, and Adam Fuoss as they discuss the growing challenges with networking and how HPE Aruba Networking is addressing them. They talk about SASE and SSE and how tools such as Cloud-Native NAC and SASE AI Copilot work together to address the needs of the modern network. These features, debuted at Black Hat 2025, help teams by making sense of the network and keeping it secure from intruders.

They discuss how AI is going to accelerate operations teams while providing context and understanding for challenges. They also talk about how cloud-native principles can apply to both online and on-prem configurations. The panel wraps up with a discussion of the importance of a sole-source provider for these solutions and how HPE Aruba Networking is addressing the integration of recent acquisitions.

Early Adoption of Generative AI Helps Control Costs with Signal65

If you haven’t already, start working with Generative AI now and make sure to control your ongoing costs. This episode of the Tech Field Day podcast features Russ Fellows, Mitch Lewis, and Brian Martin, all from Signal65, and is hosted by Alastair Cooke. Generative AI is delivering value to businesses of all sizes, but significant evolution in models and technologies remains before maturity is achieved. Experimentation is essential to understand the value of new technologies, starting with cloud resources or small-scale on-premises servers. Business value is derived from the inference stage, where AI tools generate actionable information for users. Generative AI is like a knowledgeable and well-intentioned intern; someone more senior must ensure AI is given good instructions and check their work. In production, grounding and guard rails are vital to keep your AI an asset, not a liability.
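
As a tiny illustration of the "check their work" point, the sketch below wraps a model answer in a trivial grounding check: the answer is only released if its key terms appear in the retrieved source text. Real guardrail frameworks are far more sophisticated; this merely shows where such a check sits in the flow, and the threshold and word-matching logic are invented for the example.

```python
# Toy guardrail: only release an answer whose key terms are grounded in the
# retrieved source text. Real systems use citation checks, policy filters,
# and human review; this just shows where the check sits.
def is_grounded(answer: str, sources: list[str], threshold: float = 0.6) -> bool:
    terms = {w.lower().strip(".,") for w in answer.split() if len(w) > 4}
    if not terms:
        return False
    corpus = " ".join(sources).lower()
    supported = sum(1 for t in terms if t in corpus)
    return supported / len(terms) >= threshold

sources = ["The quarterly report shows revenue grew 12% year over year."]
draft_answer = "Quarterly revenue grew 12% year over year."

if is_grounded(draft_answer, sources):
    print(draft_answer)
else:
    print("Answer withheld: insufficient grounding in the provided sources.")
```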