Passwords have served their purpose in the enterprise, and we need to start moving away from simple passwords as an authentication mechanism. In this episode of the Tech Field Day podcast, Tom Hollingsworth is joined by Tony Efantis, Karen Lopez, and Fernando Montenegro as they discuss the premise that we are long past passwords, exploring the complexities and frustrations of relying on them for myriad online accounts. The conversation highlights the concept of economic externalities: developers can easily implement simple passwords, but the burden of managing hundreds of unique credentials falls on the individual user. While passwords were initially designed for basic authentication, there has been a shift towards alternative mechanisms like one-time codes sent to email or passkeys because of user laziness and the security risks associated with password reuse and compromised credentials. Ultimately, a balanced, risk-based authentication approach is needed, tailoring security levels to the sensitivity of the data being protected and leveraging technologies like biometrics and background risk assessments to create a more convenient and secure user experience, even as attackers continue to evolve their methods.
MLO is a Lie
One of the most anticipated features of Wi-Fi 7 isn’t ready for the public. Worse yet, it may never deliver on the promise of fast, reliable wireless connectivity. In this episode, Tom Hollingsworth is joined by Allyn Crowe, Peter Mackenzie, and Chris Reed as they discuss the way that multi-link operation (MLO) has been included in the specification for Wi-Fi 7 yet not quite implemented. They highlight the technical difficulties of deploying such a complicated protocol and how vendors are trying to squeeze every drop of performance out of their hardware. They wrap up with advice on whether or not to plan your next deployment around a technology that isn’t quite ready yet.
A Different Type of Datacenter is Needed for AI
AI demands specialized data center designs due to its unique hardware utilization and networking needs, which require a new type of infrastructure. This Tech Field Day Podcast episode features Denise Donohue, Karen Lopez, Lino Telera, and Alastair Cooke. Network design has been a consistent part of the AI infrastructure discussions at Tech Field Day events. The need for a dedicated network to interconnect GPUs differentiates AI training and fine-tuning networks from general-purpose computing. The vast power demand for high-density GPU servers highlights a further need for different data centers with liquid cooling and massive power distribution. Model training is only one part of the AI pipeline; business value is delivered by AI inference with a different set of needs and a closer eye on financial management. Inference will likely require servers with GPUs and high-speed local storage, but not the same networking density as training and fine-tuning. Inference will also need servers adjacent to existing general-purpose infrastructure running existing business applications. Some businesses may be able to fit their AI applications into their existing data centers, but many will need to build or rent new infrastructure.
User-Centric Connectivity Has to Innovate
Modern networking is being disrupted in the data center but user-facing networking has largely stagnated. Users are getting slightly faster connections but everything feels mostly the same. In this episode, Tom Hollingsworth is joined by Sam Clements and Ed Weadon as they discuss innovation in the edge of the network. They talk about how companies like Cisco have been trying to bring users into the modern era. They talk about the centralization of management in the cloud and how competition has driven those moves. They also look ahead to Cisco Live and discuss the releases they would most like to see at the event.
Scaling Smarter Optimizes Cloud Costs in the Age of Data Abundance
Keeping every application and every scrap of data on the public cloud becomes very expensive; we need to improve our cloud economics. This episode of the Tech Field Day podcast features Vriti Magee, Mitch Lewis, and Alastair Cooke. The belief that data is the new oil has led many companies to retain every piece of data they generate, often in object storage on public cloud platforms. The continuous growth of this data leads to a growing bill from the cloud provider, often with no clear plan in place for recouping the value of the money spent. Generative AI requires training data, which is another reason to retain everything; again, there needs to be value returned to the business. New designs for cloud applications must include data management and managed retention as key criteria. Sustainable, honest designs that enable business change are vital for delivering value back to the business.
Compliance Does Not Equal Security
Compliance reinforces the state of security in your organization. However, compliance in and of itself is not security. In this episode of the Tech Field Day Podcast, Tom Hollingsworth is joined by Jack Poller and Milou Meier as they discuss the nuance between securing your organization and ensuring compliance. They discuss the challenges with audits and the “checkbox” mentality that has become common. They also discuss how organizations face different challenges globally and how you can help ensure that you aren’t being exposed to problems in the future.
The Unknown Unknowns of Cloud Providers with Catchpoint
Your Internet application is full of unknowns, which will affect its performance and availability for your customers. This episode of the Tech Field Day Podcast features Catchpoint CEO and co-founder Mehdi Daoudi, Eric Wright, Jon Myer, and Alastair Cooke. Internet applications are seldom self-contained, relying on other web services for specialized functions and needing responses from those services before a final response reaches the user. Functions such as DDoS protection, tracking, embedded advertising, and other valuable services enable faster application feature development, but at what cost? Any delayed response from these services can slow down your application for your users, leading to dissatisfaction, even when your servers perform beautifully. Remember that the services you choose to use may, in turn, use other external services. Catchpoint champions user-centric monitoring and Internet Performance Monitoring (IPM) to complement existing APM tools. Visibility of issues outside your data center is vital to identifying issues before they become helpdesk tickets or application outages. If this Tech Field Day Podcast episode piques your interest, watch the Catchpoint appearance at Cloud Field Day on YouTube.
Managing Hybrid Cloud Networks Complexity with Infoblox
Managing hybrid-cloud networks is complex due to differing architectures and naming between on-premises and the multiple public cloud platforms. This Tech Field Day Podcast episode features Glenn Sullivan, Senior Director of Product Management at Infoblox, Eric Wright, and Alastair Cooke. Each public cloud has a unique management console and network management paradigm; none provides deep integration with each other or with on-premises networking. It is left to individual customers to assemble a jigsaw of pieces into a coherent whole. Customers may not plan to use multiple public clouds, but through different project requirements or mergers and acquisitions, most large organizations find themselves in a hybrid multi-cloud environment. Combining fast-changing public cloud applications with on-premises applications further complicates network management, requiring an automation-based approach. Infoblox UDDI (Universal DNS, DHCP, and IPAM) provides a consistent, automatable interface to manage and operate basic network infrastructure across all enterprise locations. UDDI includes bi-directional operation where changes using cloud-native consoles are visible in UDDI and vice versa.
Wi-Fi is Fast Enough
Modern Wi-Fi connections rely on more than just raw throughput to measure performance. The complexity of wireless as a medium makes the user experience more varied and creates difficulties in troubleshooting. In this episode, Tom Hollingsworth is joined by Keith Parsons, Rocky Gregory, and Ron Westfall as they discuss the state of Wi-Fi and how performance works. They talk about the challenges with properly designed wireless networks and how data sheets make assumptions about the environment. They also discuss user expectations for performance and how workflows involve many moving parts that can impact overall user experiences.
Quality Data is the Foundation for AI with Qlik
AI ought to be able to help businesses derive value from their data, but not all AI applications have a solid foundation. This episode of the Tech Field Day podcast looks forward to Qlik Connect 2025, featuring delegates Gina Rosenthal and Jim Czuprynski discussing the importance of data with Nick Magnuson of Qlik and host Stephen Foskett. Last year Qlik introduced Answers, a RAG AI product that delivers intelligence from unstructured data. This year we expect to see much more integration with structured data, analytics, business intelligence, and agentic AI, as Qlik’s customers seek to deliver innovative solutions. Mature organizations are focused on building a solid governance foundation for their data, ensuring responsible and ethical use in AI applications. The advent of agentic AI raises more concerns, as autonomous agents are empowered to take action without human involvement. Responsible use must include strict limits and human supervision to make sure AI agents remain controlled. We’re looking forward to customer stories, technical takeaways, and maybe some new product introductions at Qlik Connect this year!
AI Needs to Be Boring
Mature technologies deliver business value through integration into boring production applications, so AI needs to be boring. This Tech Field Day Podcast episode features Max Mortillaro, Guy Currier, Jay Cuthrell, and Alastair Cooke. AI has frequently been in the public news, many organizations are busy building AI infrastructure and pipelines, and vendors have tagged their applications with AI to ride the hype. Yet, business value is usually delivered in applications that serve customers rather than generating headlines. The first steps towards AI being a functional but boring part of production applications have emerged; interoperability mechanisms like MCP and A2A are vital steps towards pervasive AI. Options for Small Language Models (SLMs) are opening up more cost-effective use of generative AI, while predictive AI continues to be the standard boring production AI. Data and output safety are other areas for development; avoiding GenAI hallucinations, model poisoning, and data leakage is vital for AI to become boring. Eventually, generative AI will be both invisible and valuable in mainstream business applications, leading to a return on all the current investments.
Virtual Networks are Air Gapped
The definition of traditional security technologies must evolve to meet new use cases. Networks that use virtual constructs to segregate traffic are just as air gapped as physical separation. In this episode, Tom Hollingsworth is joined by Carole Warner Reece, John Osmon, and Jason Gintert to discuss why the standard for hyper-secure systems has always been physical separation. They look at how the terminology is being changed to support new use cases with virtual separation and whether or not those new networks can meet the high standards of the older versions. They also discuss the need for precision in terminology and how to avoid falling back on marketing terms that can create confusion with unsuspecting consumers.
You Already Have the Platform and Skills for On-premises AI Applications
Alastair Cooke explores the potential for businesses to leverage their existing on-premises infrastructure and skills to develop AI applications, bypassing the complexities and costs associated with cloud services. He emphasizes the strategic use of current hardware and team capabilities to enable innovation while maintaining control over data and processes. Watch the full episode of this Tech Field Day podcast on YouTube or at Techstrong IT.
Servers Are Still Relevant in the Age of Cloud with HPE
Although we live in a world of software, server hardware still matters from datacenter to cloud to edge. This episode of the Tech Field Day Podcast features Scott Shaffer, VP and Chief Technologist at HPE, discussing the evolution of the server with Jack Poller, Vuong Pham, and Stephen Foskett. Although servers might appear to be commoditized, companies like HPE are building optimized designs for various purposes. Edge servers, for example, are a unique form factor and have special requirements for mounting, air filtering, security, power efficiency, management, and more. Datacenter and cloud servers have special optimizations as well, with efficiency, expansion, and cooling in special focus. As we turn to AI and HPC, servers have to support unprecedented levels of electrical power and cooling, with air cooling still very much in demand but incredible advances being made on liquid cooling as well. The Tech Field Day delegates will be on-site with HPE on the same day this episode airs, learning more about the ProLiant line. Tune in and watch the Tech Field Day presentations on YouTube.
Production AI Applications with VMware Private AI on VCF
You already have the people and the platform to run production AI applications in your on-premises data center. This episode of the Tech Field Day podcast, presented by Broadcom, features Tasha Drew, Gina Rosenthal, Jay Cuthrell, and Alastair Cooke. The public cloud is a great place to innovate and test new technologies or for bursty workloads where on-demand access to near-limitless resources is essential. Predictable and steady-state production workloads are often more cost-effective on-premises, and AI applications are no different. Your existing on-premises compute platform, based on VMware Cloud Foundation, is a great place to run production AI applications with more direct cost control while keeping your data on-premises. Running your AI applications on your existing platform capitalizes on your investment in software, hardware, and your staff, who won’t need to learn a new paradigm.
AI Innovation Inevitably Drives Massive Energy Consumption
In this article, Stephen Foskett examines the significant rise in energy demand directly correlated with advances in artificial intelligence technologies. He highlights concerns about sustainability and the potential environmental impacts of this trend. Watch for more insights from AI Field Day 6 on Techstrong AI.
Not All AI Infrastructure Is The Same
Enterprises require vastly different infrastructure for AI. When building your next network, you need to understand what is required to achieve specific outcomes. In this episode of the Tech Field Day podcast, Tom Hollingsworth is joined by Scott Robohn, Brad Gregory, and Ron Westfall to discuss the different types of AI infrastructure. They talk about inferencing and models as well as how to effectively utilize what you currently have. They also discuss what to look for when buying new equipment and how best to put it to use in order to maximize return on investment.
AI Innovation Inevitably Drives Massive Energy Consumption
The increasing reliance on technology has led to a surge in energy consumption, raising concerns about sustainability. In this episode of the Tech Field Day Podcast, recorded live at AI Field Day in San Jose, California, Jim Czuprynski, Jack Poller, Andy Banta, and Stephen Foskett discuss the challenges of powering data centers and the environmental impact of AI and modern computing. They explore potential solutions, including nuclear power, solar energy, and small modular reactors, while also addressing the issue of heat dissipation from high-density computing. The conversation highlights the need for innovation in energy efficiency, with advancements in semiconductor technology and battery storage playing a crucial role. While concerns about power availability persist, the panel remains optimistic that technological progress and cross-industry collaboration will drive sustainable solutions for the future.
Trade Restrictions will Allow China to Out Innovate US AI Companies
China will out-innovate US AI companies because of the trade restrictions imposed on it. This episode of the Tech Field Day Podcast features Ned Bellavance, Eric Wright, Justin Warren, and Alastair Cooke. They say that necessity is the mother of invention, and US restrictions on AI chip exports have driven China to develop a sophisticated generative AI solution with older technology. Are the restrictions making Chinese companies more innovative than their US counterparts? DeepSeek was trained with far fewer resources than previous Large Language Models. On the other hand, DeepSeek isn't groundbreaking, apart from its apparent censorship of topics taboo to the Chinese establishment.