Billion-Dollar AI Headlines Obscure Real Business Value

The big headlines about massive funding rounds for large AI companies are a distraction from the reality of how AI is actually being built and used in business applications. This episode of the Tech Field Day podcast features Frederic Van Haren, Chris Grundemann, Brian Martin, and Alastair Cooke reflecting after AI Infrastructure Field Day in Santa Clara. Popular news often covers the creation of large, general-purpose AI models, yet the real-world application of AI through inference is where most companies see a return on their investment. Similarly, “AI” is commonly treated as a single topic, without a more granular view that differentiates between rules-based systems, traditional machine learning, and emergent generative models such as Large Language Models (LLMs). Specialized AI models will be vital for cost-effective, efficient applications and for integrating diverse AI capabilities into agentic architectures. Advanced security protocols and regulatory frameworks are needed to mitigate novel vulnerabilities, and organizations must adapt to an extraordinarily rapid pace of technological evolution. AI has already had a profound impact on software development, potentially enabling widespread custom application creation.

AI Needs Resource Efficiency

As we build out AI infrastructure and applications, we need resource efficiency; continuously buying more horsepower cannot go on forever. This episode of the Tech Field Day podcast features Pete Welcher, Gina Rosenthal, Andy Banta, and Alastair Cooke hoping for a more efficient AI future. Large language models are trained on massive farms of GPUs and massive amounts of Internet data, so we expect to use large farms of GPUs and unstructured data to run those LLMs. Those large farms have led to a scarcity of GPUs, and now to RAM price increases, impeding businesses that want to build their own large AI infrastructure. Task-specific AIs, built on smaller, more efficient models, should be the future of agentic AI and of AI embedded in applications. More efficient and targeted AI may be the only way to get business value from the investment, especially in resource-constrained edge environments. Does every AI problem need a twenty-billion-parameter model? More mature use of LLMs and AI will focus on reducing the cost of delivering inference to applications, your staff, and your customers.

Cutting-Edge AI Networking and Storage Kick Off 2026 at AI Infrastructure Field Day 4

We’re kicking off 2026 with one of our most popular events, AI Infrastructure Field Day 4, running from January 28th through January 30th. The event will stream live on LinkedIn, Techstrong TV, the Tech Field Day website, and, for the first time ever, on our YouTube channel, offering a front-row view of the latest in […]

The Rise of DDoS Attacks and Nokia’s $4B AI Networking Bet | TSG Ep. 974

The November 25 episode of Techstrong Gang focuses on massive DDoS attacks, the emergence of Tennessee as an AI datacenter hub, and Nokia’s rise as an AI infrastructure leader, backed by a newly announced $4 billion investment. Stephen Foskett discussed Nokia’s recent appearance at Networking Field Day 39, where the company made the case that it deserves a seat at the table in AI networking, as well as its new management and investments. Learn more about Nokia’s AI infrastructure at AI Infrastructure Field Day 4 in January.