As we build out AI infrastructure and applications, we need resource efficiency; continuously buying more horsepower cannot go on forever. This episode of the Tech Field Day podcast features Pete Welcher, Gina Rosenthal, Andy Banta, and Alastair Cooke hoping for a more efficient AI future. Large language models are trained on massive farms of GPUs and massive amounts of Internet data, so we expect to use similarly large GPU farms and unstructured data to run those LLMs. Those farms have led to GPU scarcity, and now RAM price increases, which are impeding businesses that want to build their own large AI infrastructure. Task-specific AIs, built on smaller, more efficient models, should be the future of agentic AI and of AI embedded in applications. More efficient and targeted AI may be the only way to get business value from the investment, especially in resource-constrained edge environments. Does every AI problem need a twenty-billion-parameter model? More mature use of LLMs and AI will focus on reducing the cost of delivering inference to applications, your staff, and your customers.
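As a rough illustration of the task-specific approach the panel advocates (not taken from the episode itself), the sketch below runs inference with a small, single-purpose classifier rather than a general-purpose LLM; the Hugging Face transformers library and the DistilBERT sentiment model are assumed here purely as an example of a compact, task-specific model.

```python
# Minimal sketch: task-specific inference with a small model instead of a
# general-purpose LLM. Assumes the `transformers` library is installed and uses
# a publicly available DistilBERT sentiment classifier (~67M parameters) as an
# example of a compact, single-task model.
from transformers import pipeline

# Load a classifier fine-tuned for one narrow task; it runs on a CPU or a
# modest edge device, with no GPU farm required.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

if __name__ == "__main__":
    # One narrow task: score customer feedback, rather than open-ended chat.
    print(classifier("The support team resolved my issue quickly."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```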
Cutting-Edge AI Networking and Storage Kick Off 2026 at AI Infrastructure Field Day 4
We’re kicking off 2026 with one of our most popular events, AI Infrastructure Field Day 4, running from January 28th through January 30th. The event will stream live on LinkedIn, Techstrong TV, the Tech Field Day website, and for the first time ever, on our YouTube channel, offering a front-row view of the latest in […]
Modern Data Mobility is Challenging the Laws of Physics with Hammerspace
Modern data mobility is challenging the laws of physics; the speed of light is a fundamental limit on how fast signals can move. This episode of the Tech Field Day podcast features Kurt Kuckein from Hammerspace discussing data movement and management with Jim Jones, Jack Poller, Andy Banta, and Alastair Cooke. The challenge is that data is distributed across the globe, which creates significant obstacles for AI, particularly around speed-of-light latency and power consumption. We delve into overcoming these limitations through technologies that facilitate data access and movement, touching on efficient storage solutions such as the Open Flash Platform, the importance of centralized data management, and the agility required for evolving AI workloads. While the underlying principles of data management are not new, the scale and complexity of AI demand innovative approaches to ensure data can be accessed and utilized effectively, regardless of its physical location.
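To make the speed-of-light point concrete, here is a back-of-the-envelope sketch (not from the episode); the fibre refractive index and route distances are illustrative assumptions.

```python
# Back-of-the-envelope sketch: the speed of light puts a hard floor under any
# data movement. Refractive index and route distances are assumptions chosen
# for illustration, not figures from the episode.
C_VACUUM_KM_S = 299_792        # speed of light in vacuum, km/s
FIBRE_FACTOR = 1 / 1.47        # light travels at roughly 68% of c in optical fibre

def min_rtt_ms(route_km: float) -> float:
    """Theoretical minimum round-trip time over a fibre route, in milliseconds."""
    one_way_s = route_km / (C_VACUUM_KM_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000

if __name__ == "__main__":
    for name, km in [("same metro", 100),
                     ("US coast to coast", 4_500),
                     ("California to Singapore", 14_000)]:
        print(f"{name:>24}: >= {min_rtt_ms(km):6.1f} ms RTT before any processing")
```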
Google Cloud Provides a Complete AI Portfolio
Andy Banta highlights the comprehensive AI portfolio provided by Google Cloud, which is designed to cater to various business needs ranging from foundational models to tailored solutions for enterprise challenges. This coverage showcases the depth and flexibility of Google Cloud’s offerings in the AI space, affirming its position as a significant player in the industry. For additional insights into AI Infrastructure Field Day 2, explore more articles on Techstrong AI.
We Are Still in the Early Innings of AI
Alastair Cooke explores the nascent stage of AI development following a delegate discussion at AI Field Day 6. The panel emphasized that while significant progress has been made, the technological journey is still in its early phases. They also discussed various advancements and challenges in the AI field, underscoring the potential and limitations currently faced by the industry. For more insights and detailed analysis, watch Techstrong AI.
AI Innovation Inevitably Drives Massive Energy Consumption
The increasing reliance on technology has led to a surge in energy consumption, raising concerns about sustainability. In this episode of the Tech Field Day Podcast, recorded live at AI Field Day in San Jose, California, Jim Czuprynski, Jack Poller, Andy Banta, and Stephen Foskett discuss the challenges of powering data centers and the environmental impact of AI and modern computing. They explore potential solutions, including nuclear power, solar energy, and small modular reactors, while also addressing the issue of heat dissipation from high-density computing. The conversation highlights the need for innovation in energy efficiency, with advancements in semiconductor technology and battery storage playing a crucial role. While concerns about power availability persist, the panel remains optimistic that technological progress and cross-industry collaboration will drive sustainable solutions for the future.
The Evolution of CXL
Andy Banta recently shared insights on the evolving landscape of Compute Express Link (CXL) technology and its significant implications for memory architecture and interconnect standards in compute environments. By highlighting the benefits and efficiencies CXL brings to data-intensive applications, he underscores its role in advancing the performance and scalability of enterprise systems. In this article, Banta mentions the Enfabrica presentation at AI Field Day 5.
Data Infrastructure Is A Lot More Than Storage
The rise of AI and the importance of data to modern businesses have driven us to recognize that data matters, not just storage. This episode of the Tech Field Day podcast focuses on AI data infrastructure and features Camberley Bates, Andy Banta, David Klee, and host Stephen Foskett, all of whom will be attending our AI Data Infrastructure Field Day this week. We’ve known for decades that storage solutions must provide the right access method for applications, not just performance, capacity, and reliability. Today’s enterprise storage solutions have specialized data services and interfaces to enable AI workloads, even as capacity has been driven beyond what we’ve seen in the past. Power and cooling are another critical element, since AI systems are optimized to make the most of expensive GPUs and accelerators. AI also requires extensive preparation and organization of data, as well as traceability and metadata records for compliance and reproducibility. Interfaces are another question, with modern storage turning to object stores or even vector database interfaces rather than traditional block and file. AI is driving a profound transformation of storage and data.
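As a small illustration of the object-store interface pattern mentioned in this episode, the sketch below uses the widely available boto3 S3 API; the endpoint, bucket, and key names are hypothetical placeholders rather than references to any vendor discussed.

```python
# Minimal sketch of object-store access: data is addressed by bucket + key and
# carries its own metadata, which is what makes lineage and compliance tagging
# practical at AI scale. Endpoint, bucket, and key names are placeholders.
import boto3

s3 = boto3.client("s3", endpoint_url="https://object-store.example.internal")

# Write an object with metadata recording where it came from.
s3.put_object(
    Bucket="training-data",
    Key="datasets/2025/q4/curated.parquet",
    Body=b"...parquet bytes...",  # placeholder payload
    Metadata={"source": "crm-export", "pipeline-run": "2025-11-30"},
)

# Readers retrieve by the same key, with no filesystem mount or LUN mapping.
obj = s3.get_object(Bucket="training-data", Key="datasets/2025/q4/curated.parquet")
print(obj["Metadata"])
```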
Solidigm and Supermicro Put “Green” in Greenfield
Supermicro and Solidigm are actively tackling the intensifying demands of AI workloads with their efficient, rack-sized storage solutions, designed to streamline AI data pipelines, as discussed at a recent AI Field Day event. They are engineering a three-tiered platform specifically to accommodate the diverse needs of AI processes—from data ingestion and transformation to training and inference—enhancing performance with high-speed, high-density SSDs from Solidigm. As AI data centers seek to become more environmentally sustainable, the partnership underscores their commitment to delivering solutions that not only meet the technical demands of AI but also pave the way for greener computing practices. Read more in this article by Andy Banta, sponsored by Solidigm.
Composing a Harmonized Infrastructure With Solidigm and Supermicro
As data center workloads become more intensive, storage struggles to keep pace with CPU and memory in terms of speed and density. Composable infrastructure offers an answer by presenting resources as and when they are needed, and Solidigm’s D5-P5430 data center SSD is built for that role. It communicates via NVMe over PCIe Gen 4.0 and is available in U.2, E1.S, and E3.S form factors, the latter two using the Enterprise and Datacenter Standard Form Factor (EDSFF), a design that provides better density than standard drive form factors while remaining front-loading and hot-pluggable. This sponsored article by Andy Banta looks deeper at the Solidigm P5430 SSD family.
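For readers curious how a drive like this presents itself to a host, the short sketch below reads standard Linux sysfs paths to list NVMe controllers and their negotiated PCIe link; it is a generic illustration, not Solidigm-specific tooling.

```python
# Minimal sketch: list NVMe controllers and their negotiated PCIe link by
# reading standard Linux sysfs paths. Generic illustration; works with any
# NVMe drive, not just the P5430.
from pathlib import Path

def list_nvme_controllers():
    """Yield (controller name, model string, PCIe link) for each NVMe device."""
    for ctrl in sorted(Path("/sys/class/nvme").glob("nvme*")):
        model = (ctrl / "model").read_text().strip()
        pci = ctrl / "device"   # symlink to the underlying PCIe device
        speed = (pci / "current_link_speed").read_text().strip()
        width = (pci / "current_link_width").read_text().strip()
        yield ctrl.name, model, f"{speed} x{width}"

if __name__ == "__main__":
    for name, model, link in list_nvme_controllers():
        # A PCIe Gen 4 drive on a Gen 4 slot reports "16.0 GT/s PCIe".
        print(f"{name}: {model} ({link})")
```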
CXL: Composable Infrastructure’s Missing Link
“CXL probably isn’t for every configuration, especially anything other than top, high-end compute systems. But it’s no hoax. It ain’t the Piltdown Man.” Andy Banta gives an entertaining explanation of CXL: its history, its current state, and his thoughts on where it’s going. Read his full thoughts on his website.
The IT Industry Is Doing Better Than It Seems
Is the IT industry in trouble? This episode of On-Premise IT features Andy Banta, Nico Stein, and Geoff Burke, who will be attending Tech Field Day this week, discussing the state of the industry with Stephen Foskett. Check out the full podcast on Gestalt IT’s YouTube channel or your favorite podcast platform.
Developer Advocacy Isn’t Exactly What We Think It Will Be
Check out the latest Gestalt IT Roundtable podcast! Stephen Foskett, podcast moderator, was joined by Tech Field Day delegates Andy Banta, Gina Rosenthal, and Josh Warcop to discuss developer advocacy. Listen to their thoughts here!
Andy Banta
Storage Janitor – seasoned technology professional