Networks Need Agentic AI with HPE Juniper Networking

Agentic AI is reshaping the IT landscape, and networking is no exception. Building on previous machine learning research gives networking a head start on harnessing that power. In this episode of the Tech Field Day podcast, brought to you by HPE Juniper Networking, Tom Hollingsworth is joined by Keith Parsons and Sunalini Sankhavaram. They talk about how agentic AI is driving new methods for operating networks and helping humans concentrate on real problems instead of menial tasks. They also discuss how agentic AI can power self-driving networks, where configuration and provisioning are done automatically or with minimal effort to ensure resiliency and meet user expectations.


A Look at Mainframe Innovation at Tech Field Day Extra at SHARE Cleveland 25

Tech Field Day Extra at SHARE Cleveland on August 19 promises a deep dive into mainframe innovation, and viewers are encouraged to subscribe to the Tech Field Day YouTube channel and follow the LinkedIn page for ongoing Field Day coverage.


Early Adoption of Generative AI Helps Control Costs with Signal65

If you haven’t already, start working with Generative AI now and make sure to control your ongoing costs. This episode of the Tech Field Day podcast features Russ Fellows, Mitch Lewis, and Brian Martin, all from Signal65, and is hosted by Alastair Cooke. Generative AI is delivering value to businesses of all sizes, but significant evolution in models and technologies remains before maturity is achieved. Experimentation is essential to understand the value of new technologies, starting with cloud resources or small-scale on-premises servers. Business value is derived from the inference stage, where AI tools generate actionable information for users. Generative AI is like a knowledgeable and well-intentioned intern; someone more senior must ensure it is given good instructions and check its work. In production, grounding and guard rails are vital to keep your AI an asset, not a liability.


Network Engineers are Facing an Identity Crisis

Network engineers are the firefighters and knowledge bases of enterprise IT; however, the role of a network engineer is rapidly evolving. With the rise of automation, orchestration, and AI, the familiar image of an engineer hunched over a command-line interface (CLI) is giving way, leading many to question the future of their profession. In this episode, Tom Hollingsworth is joined by Ryan Harris, Chris Grundemann, and Nathan Nielsen as they discuss how the perception of their role has shifted, the continuous need for learning and adaptation, and whether the CLI is truly dead.

The conversation explores the challenges and opportunities presented by these technological advancements, highlighting how network engineers are embracing new tools like chatbots and GUIs for enhanced visualization. While some aspects of the job, like manual CLI work, may be diminishing, understanding how networks function remains core to the role of the network engineer. The panel talks about the identity crisis in a field where continuous learning is essential, contrasting it with professions like doctors and lawyers who deal with slower-changing fundamentals. They discuss the value of specialization versus being a generalist, the concept of the “pitchfork engineer,” and ultimately, how redefining their identity as lifelong learners can help network engineers thrive in this ever-changing landscape.


The DoJ Just Devalued Juniper Mist

The proposed remedies for the HPE acquisition of Juniper Networks did a real disservice to Juniper Mist. The confusion around what’s going on with the proposed Juniper AIOps for Mist auction has professionals asking a lot of questions. In this episode, recorded on the eve of the close of the acquisition, Tom Hollingsworth sits down with Sam Clements, Jake Snyder, and Ed Weadon to make sense of it all. They discuss what exactly is included in the auction and what benefit will come from the license to use Juniper AIOps for Mist. Also discussed is who might be a good bidder for the solution and how long it will take for them to get any real value from it.


Enterprises Shouldn’t Be Outsourcing Their IT Anymore

Enterprise networks are complicated, but outsourcing the entire operations team doesn’t lead to better outcomes. It’s important to remember that “enterprise” covers a wide range of network definitions. In this episode, Ed Weadon, Chris Grundemann, and Jody Lemoine join Tom Hollingsworth as they discuss how businesses see the network, and IT in general, as a cost center instead of a value generator. They also talk about the various sizes of networks and why each has issues with the most popular outsourcing methods. Finally, they discuss the human factor and why not all managed providers can give you the same level of service.


MLO is a Lie

One of the most anticipated features of Wi-Fi 7 isn’t ready for the public. Worse yet, it may never deliver on the promise of fast, reliable wireless connectivity. In this episode, Tom Hollingsworth is joined by Allyn Crowe, Peter Mackenzie, and Chris Reed as they discuss the way that multi-link operation (MLO) has been included in the specification for Wi-Fi 7 yet not quite implemented. They highlight the technical difficulties of deploying such a complicated protocol and how vendors are trying to squeeze every drop of performance out of their hardware. They wrap up with advice on whether or not to plan your next deployment around a technology that isn’t quite ready yet.


A Different Type of Datacenter is Needed for AI

AI demands specialized data center designs due to its unique hardware utilization and networking needs, which require a new type of infrastructure. This Tech Field Day Podcast episode features Denise Donohue, Karen Lopez, Lino Telera, and Alastair Cooke. Network design has been a consistent part of the AI infrastructure discussions at Tech Field Day events. The need for a dedicated network to interconnect GPUs differentiates AI training and fine-tuning networks from general-purpose computing. The vast power demand of high-density GPU servers highlights a further need for different data centers with liquid cooling and massive power distribution. Model training is only one part of the AI pipeline; business value is delivered by AI inference, which has a different set of needs and a closer eye on financial management. Inference will likely require servers with GPUs and high-speed local storage, but not the same networking density as training and fine-tuning. Inference will also need servers adjacent to the general-purpose infrastructure running existing business applications. Some businesses may be able to fit their AI applications into their existing data centers, but many will need to build or rent new infrastructure.


User-Centric Connectivity Has to Innovate

Modern networking is being disrupted in the data center, but user-facing networking has largely stagnated. Users are getting slightly faster connections, but everything feels mostly the same. In this episode, Tom Hollingsworth is joined by Sam Clements and Ed Weadon as they discuss innovation at the edge of the network. They talk about how companies like Cisco have been trying to bring users into the modern era, as well as the centralization of management in the cloud and how competition has driven those moves. They also look ahead to Cisco Live and discuss the releases they would most like to see at the event.


Have A Classy Time with Tech Field Day Extra at Cisco Live US 2025

Hello, San Diego! We’re thrilled to be back once again with great content headed your way courtesy of Tech Field Day Extra. We’re hoping the June Gloom stays away so we can shine a light on some wonderful presenters and get some great questions from our amazing delegates. You’re not going to want to miss […]


Scaling Smarter Optimizes Cloud Costs in the Age of Data Abundance

Keeping every application and every scrap of data on the public cloud becomes very expensive; we need to improve our cloud economics. This episode of the Tech Field Day podcast features Vriti Magee, Mitch Lewis, and Alastair Cooke. The belief that data is the new oil has led many companies to retain every piece of data they generate, often in object storage on public cloud platforms. The continuous growth of this data leads to a growing bill from the cloud provider, often with no clear plan in place for recouping the value of the money spent. Generative AI requires training data, which is another reason to retain everything; again, there needs to be value returned to the business. New designs for cloud applications must include data management and managed retention as key criteria. Sustainable, honest designs that enable business change are vital for delivering value back to the business.


Exploring Cloud Resilience, AI, and Data at Cloud Field Day 23

Cloud Field Day is making its highly anticipated return to San Francisco on June 4th and 5th, bringing together some of the biggest names in cloud technology for two days of in-depth insights and live demos. You can catch every moment of the action live on the Tech Field Day LinkedIn page and Techstrong TV. […]


Compliance Does Not Equal Security

Compliance reinforces the state of security in your organization. However, compliance in and of itself is not security. In this episode of the Tech Field Day Podcast, Tom Hollingsworth is joined by Jack Poller and Milou Meier as they discuss the nuance between securing your organization and ensuring compliance. They discuss the challenges with audits and the “checkbox” mentality that has become common. They also discuss how organizations face different challenges globally and how you can help ensure that you aren’t being exposed to problems in the future.


Exploring the Future of Cybersecurity at Security Field Day 13

The first Security Field Day event of the year is finally here! We’re excited to bring you Security Field Day 13 live from Silicon Valley. This event combines a number of trends in the cybersecurity and data protection spaces to bring you information you need to keep your users safe and sound. Security Field Day […]


Quality Data is the Foundation for AI with Qlik

AI ought to be able to help businesses derive value from their data, but not all AI applications have a solid foundation. This episode of the Tech Field Day podcast looks forward to Qlik Connect 2025, featuring delegates Gina Rosenthal and Jim Czuprynski discussing the importance of data with Nick Magnuson of Qlik and host Stephen Foskett. Last year Qlik introduced Answers, a RAG AI product that delivers intelligence from unstructured data. This year we expect to see much more integration with structured data, analytics, business intelligence, and agentic AI, as Qlik’s customers seek to deliver innovative solutions. Mature organizations are focused on building a solid governance foundation for their data, ensuring responsible and ethical use in AI applications. The advent of agentic AI raises more concerns, as autonomous agents are empowered to take action without human involvement. Responsible use must include strict limits and human supervision to make sure AI agents remain controlled. We’re looking forward to customer stories, technical takeaways, and maybe some new product introductions at Qlik Connect this year!


AI Needs to Be Boring

Mature technologies deliver business value through integration into boring production applications, so AI needs to be boring. This Tech Field Day Podcast episode features Max Mortillaro, Guy Currier, Jay Cuthrell, and Alastair Cooke. AI has frequently been in the public news, many organizations are busy building AI infrastructure and pipelines, and vendors have tagged their applications with AI to ride the hype. Yet, business value is usually delivered in applications that serve customers rather than generating headlines. The first steps towards AI being a functional but boring part of production applications have emerged, with interoperability mechanisms like MCP and A2A serving as vital steps towards pervasive AI. Options for Small Language Models (SLMs) are opening up more cost-effective use of generative AI, while predictive AI continues to be the standard boring production AI. Data and output safety are other areas for development; avoiding GenAI hallucinations, model poisoning, and data leakage is vital for AI to become boring. Eventually, Generative AI will be as invisible as it is valuable in mainstream business applications, leading to a return on all the current investments.


Virtual Networks are Air Gapped

The definition of traditional security technologies must evolve to meet new use cases. Networks that use virtual constructs to segregate traffic are just as air gapped as those with physical separation. In this episode, Tom Hollingsworth is joined by Carole Warner Reece, John Osmon, and Jason Gintert to discuss why the standard for hyper-secure systems has always been physical separation. They look at how the terminology is being changed to support new use cases with virtual separation and whether or not those new networks can meet the high standards of the older versions. They also discuss the need for precision in terminology and how to avoid falling back on marketing terms that can create confusion among unsuspecting consumers.


Servers Are Still Relevant in the Age of Cloud with HPE

Although we live in a world of software, server hardware still matters from datacenter to cloud to edge. This episode of the Tech Field Day Podcast features Scott Shaffer, VP and Chief Technologist at HPE, discussing the evolution of the server with Jack Poller, Vuong Pham, and Stephen Foskett. Although servers might appear to be commoditized, companies like HPE are building optimized designs for various purposes. Edge servers, for example, are a unique form factor and have special requirements for mounting, air filtering, security, power efficiency, management, and more. Datacenter and cloud servers have special optimizations as well, with efficiency, expansion, and cooling in special focus. As we turn to AI and HPC, servers have to support unprecedented levels of electrical power and cooling, with air cooling still very much in demand but incredible advances being made on liquid cooling as well. The Tech Field Day delegates will be on-site with HPE on the same day this episode airs, learning more about the ProLiant line. Tune in and watch the Tech Field Day presentations on YouTube.


Hybrid and Multicloud Applications are a Headache

Hybrid multi-cloud applications have become part of the enterprise IT landscape, but they give me a headache. This episode of the Tech Field Day Podcast features Barton George, Mike Graff, Mitch Lewis, and Alastair Cooke discussing how the dream of a single cloud provider being home to every application in the enterprise has long since faded. The goal is to optimize applications by using both cloud and on-premises resources, but integrating different platforms is challenging. Services like AWS and Azure have distinct configurations, making a unified management console difficult to achieve. Some companies simplify by using a single cloud, but mergers or unique business needs often disrupt this approach. While hybrid multi-cloud setups improve efficiency, they also bring ongoing IT challenges.


Not All AI Infrastructure Is The Same

Enterprises require vastly different infrastructure for AI. When building your next network, you need to understand what is required in order to achieve specific outcomes. In this episode of the Tech Field Day podcast, Tom Hollingsworth is joined by Scott Robohn, Brad Gregory, and Ron Westfall to discuss the different types of AI infrastructure. They talk about inferencing and models, as well as how to effectively utilize what you currently have. They also discuss what to look for when buying new equipment and how best to put it to use in order to maximize return on investment.