The Shifting Focus of AI at AI Field Day 5

AI Field Day 5 is coming September 11-13, reflecting the shift in enterprise AI from large-scale models to practical applications, including vector databases and agentic AI. The event will gather industry leaders like Cisco, VMware, Keysight, and Arista to explore real-world applications, AI-ready infrastructure, and private AI solutions. Attendees will also hear from companies like Integrail, Elastic, and Enfabrica about next-generation AI capabilities. Tune in live on LinkedIn or Techstrong TV!


AI is Not a Fad

The current hype around building massive generative AI models with enormous hardware investment is just one aspect of AI. This episode of the Tech Field Day podcast features Frederic Van Haren, Karen Lopez, Marian Newsome, and host Stephen Foskett taking a different perspective on the larger world of AI. Our last episode suggested that AI as it is currently being hyped is a fad, but the bigger world of AI is absolutely real. Large language models are maturing rapidly and even generative AI is getting better by the month, but the realistic use cases for this technology are quickly coming into focus. All neural networks use patterns in historical data to infer results, so any AI engine can hallucinate. But traditional AI is much less susceptible to errors than the much-hyped generative AI models capturing the headlines today. AI is a tool that augments our knowledge and decision making, but it doesn't replace human intelligence. There is a whole world of AI applications that are productive, responsible, and practical, and these are most certainly not a fad.


AI as We Know It is Just a Fad

Although AI is certain to transform society, not to mention computing, what we know of it is unlikely to last much longer. This episode of the Tech Field Day podcast brings together Glenn Dekhayser, Alastair Cooke, Allyson Klein, and Stephen Foskett to discuss the real and changing world of AI. Looking at AI infrastructure today, we see massive clusters of GPUs being deployed in the cloud and on-premises to train ever-larger language models, but how much long-term business value do these clusters have? It seems that the true transformation promised by LLMs and GenAI will be realized once models are applied across industries with retrieval-augmented generation (RAG) or fine-tuning rather than by developing new models. Ultimately, AI is a feature of a larger business process or application rather than a product in itself. We can certainly see that AI-based applications will be transformative, but the vast investment required to build out AI infrastructure to date might never be recouped. There is a future for AI, but not the way we have been doing it so far.