AI Solves All Our Problems

Although AI can be quite useful, it seems that the promise of generative AI has led to irrational exuberance on the topic. This episode of the Tech Field Day podcast, recorded ahead of AI Field Day, features Justin Warren, Alastair Cooke, Frederic van Haren, and Stephen Foskett considering the promises made about AI. Generative AI was so impressive that it escaped from the lab, being pushed into production before it was ready for use. We are still living with the repercussions of this decision on a daily basis, with AI assistants appearing everywhere. Many customers are already frustrated by these systems, leading to a rapid pushback against the universal use of LLM chatbots. One problem the widespread misuse of AI has solved already is the search for a driver of computer hardware and software sales, though this effect already seems to be wearing off. But once we take stock of the huge variety of tools being created, it is likely that we will have many useful new technologies to apply.


AI as We Know It is Just a Fad

Although AI is certain to transform society, not to mention computing, what we know of it is unlikely to last much longer. This episode of the Tech Field Day podcast brings together Glenn Dekhayser, Alastair Cooke, Allyson Klein, and Stephen Foskett to discuss the real and changing world of AI. Looking at AI infrastructure today, we see massive clusters of GPUs being deployed in the cloud and on-premises to train ever-larger language models, but how much business value do these clusters have long-term? It seems that the true transformation promised by LLMs and GenAI will be realized once models are applied across industries with RAG or tuning rather than through the development of new models. In the end, AI is a feature of a larger business process or application rather than a product in itself. We can certainly see that AI-based applications will be transformative, but the vast investment required to build out AI infrastructure to date might never be recouped. Ultimately there is a future for AI, but not the way we have been doing it to date.


Morpheus Unified Management for Multi-Cloud Platform Choice

Alastair Cooke, writing for The Futurum Group, discusses the Morpheus Data presentation at Cloud Field Day 20. Morpheus offers a unified management solution that simplifies the complexities of handling multiple cloud platforms. Its comprehensive approach empowers businesses to seamlessly manage applications, orchestrate containers, and automate workflows across diverse cloud environments. This capability significantly enhances operational efficiency, ensuring that organizations can leverage the flexibility of multi-cloud strategies without compromising on performance or security.


Row-Scale On-Premises Cloud Infrastructure From Oxide Computer

Alastair Cooke at Futurum Research discusses Oxide Computer’s initiative to deliver row-scale, on-premises cloud infrastructure, addressing the growing demand for scalable and manageable IT resources within the enterprise. The article examines how Oxide’s innovative approach aims to combine the convenience and agility of the public cloud with the security and control of on-premises systems. Highlighting a shift in cloud computing, the analysis explores Oxide’s potential to redefine data center operations by offering a comprehensive, customer-centric solution. Read more Cloud Field Day 20 coverage at The Futurum Group!


Google Delivers Titanium Hardware Offload for Performance

In a detailed analysis for Futurum Research, Alastair Cooke discusses Google’s Titanium, a hardware offload technology aimed at enhancing compute performance. The article explains how Titanium addresses efficiency by offloading tasks from the CPU, thereby boosting processing speeds and reducing latency for demanding workloads. It also highlights this strategic move by Google as a solution that could redefine performance optimization in data centers, marking a significant advancement in cloud computing capabilities. Look for more Cloud Field Day 20 coverage from Alastair soon!


Containerization is Required to Modernize Applications at the Edge

Modern applications are widely deployed in the cloud, but they’re coming to the edge as well. This episode of the Tech Field Day podcast features Alastair Cooke and Paul Nashawaty from The Futurum Group, Erik Nordmark from ZEDEDA, and host Stephen Foskett discussing the intersection of application modernization and edge computing.


At AI Field Day, Qlik Shows AI-Based Analysis Added to Its Platform

At AI Field Day, Qlik unveiled a wizard-based AI feature that simplifies the process of leveraging on-premises data for insightful analytics, integrating smoothly with Qlik’s cloud services. This enhancement to their analytics platform aims to democratize AI’s benefits, making advanced data analysis accessible to a broader range of users with varying expertise. Qlik’s initiative reflects a commitment to user-friendly, AI-powered analytics, facilitating deeper insights while streamlining the experience for its customers. Read more in this analyst note for The Futurum Group by Alastair Cooke.


Deciding When to Use Intel Xeon CPUs for AI Inference

At AI Field Day, Intel offered insights into strategic decision-making for AI inference, highlighting scenarios where Intel Xeon CPUs outshine traditional GPU solutions on both on-premises and cloud servers. By evaluating the specific requirements of AI inference workloads, Intel guides users to make informed choices that enhance value while optimizing their existing server infrastructure. This approach emphasizes efficiency and practicality in deploying AI capabilities, ensuring that organizations can navigate the complex landscape of hardware selection for their AI initiatives. Read more in this Futurum Research Analyst Note by Alastair Cooke.


Hammerspace Shows Storage Acceleration for AI Training

At AI Field Day, Hammerspace showcased its innovative storage acceleration solution, demonstrating how Hyperscale NAS can be leveraged to enhance the performance of current scale-out NAS systems, particularly in training large language models (LLMs) efficiently. This storage boost not only improves speed but also optimizes resource allocation during the intensive LLM training process. Hammerspace’s advancement offers organizations the opportunity to amplify their AI training capabilities without the need to overhaul their existing storage infrastructure. Read more in this Futurum Research Analyst Note by Alastair Cooke.


VMware Private AI at AI Field Day

VMware’s presentation with Intel at AI Field Day centered on optimizing on-premises AI workloads, highlighting the capability of Intel Sapphire Rapids CPUs with Advanced Matrix Extensions (AMX) to efficiently perform large language model (LLM) AI inference, traditionally a task for GPUs. Demonstrating that AI can be resource-effective on CPUs, the discussion covered the technical prerequisites for harnessing AMX in vSphere environments and the ongoing integration of these accelerators into popular AI frameworks. With CPUs increasingly capable of handling AI tasks through built-in matrix math acceleration, VMware showcases a sustainable, cost-effective approach, potentially reshaping the hardware strategies for mixed workload servers. Read more in this analyst note for The Futurum Group by Alastair Cooke.


Gemma and Building Your Own LLM AI

At AI Field Day 4, Intel invited the Google Cloud AI team to showcase their Gemma large language model (LLM), revealing insights into the advanced infrastructure used for building such models on Google Cloud. The presentation underlined Gemma’s efficiency with fewer parameters for inference, highlighting Google Cloud’s strength in analytics and AI, particularly in managing differing resource needs between model training and application inference phases. Google Cloud’s integration of AI in products was illustrated with Google Duet, an AI-based assistant that aids in software development, exemplifying the potential future where AI handles more coding tasks, freeing up developers for high-level problem-solving and design. Read more in this analyst note for The Futurum Group by Alastair Cooke.


Scale Computing Removes Problems at the Edge

“Scale Computing’s approach of removing the burden of routine tasks and fault remediation from customers is widely liked.” In this article presented by Scale Computing, Alastair Cooke discusses how Scale Computing is solving problems at the edge. Read his full thoughts on the Gestalt IT website, and watch Scale Computing’s full presentation from Edge Field Day on the Tech Field Day YouTube channel.


Thousands of Containers Without Kubernetes?

“If you need to manage container applications at your distributed edge, look at Avassa’s videos and the other presenters at Edge Field Day.” Alastair Cooke had great things to say about Avassa after attending their presentation at Edge Field Day in February. Read his full thoughts on his website, and watch Avassa’s presentation on the Tech Field Day YouTube channel.


Unraveling the Key Themes of Edge Computing – a Roundtable Discussion

In this article Sulagna Saha summarizes the thoughts and impressions of delegates on the key themes that dominated the presentations at the recent Edge Field Day event. Check out her article at Gestalt IT, or watch the roundtable discussion here at TechFieldDay.com.


Edge Is Not Just Cloud or Datacenter

Even though most of the technologies and infrastructure elements are the same, the nature of edge computing is entirely different. This episode of On-Premise IT features Edge Field Day delegates Enrico Signoretti, Allyson Klein, and Alastair Cooke discussing these differences with Stephen Foskett. Check out the full podcast on Gestalt IT’s YouTube channel.


Deploying Applications Across Edge Locations With Mako Networks and Avassa

In this roundtable discussion with Mako Networks and Avassa, we hear from co-founders of both companies about deploying applications at scale at the edge. Watch how the conversation unfolded between Stephen Foskett, Simon Gamble, Carl Moberg, Ben Young, and Alastair Cooke on Gestalt IT’s YouTube channel.


Mako Networks, I Haven’t Heard That Name in Years

In this article, Alastair Cooke gives his pre-presentation thoughts on Mako Networks. Read his take on past and present Mako Networks products on his website, and catch the full presentation on the Tech Field Day website this week!


The Scale Computing Edge

Alastair Cooke is thrilled to talk to Scale Computing about their “complete transformation from an HCI provider to small businesses into a provider of edge computing for massive companies,” at the first Edge Field Day this week! Read his thoughts on his website.


What Is Your Edge?

Alastair Cooke and Charles Uneze got to talking about edge compute ahead of our upcoming Edge Field Day! In Alastair’s post, he covers his thoughts on near-edge, far-edge, IoT, and Edge Population. Be sure to check out his thoughts before tuning into February’s Edge Field Day!


VVols, Oh VVols, Wherefore Art Thou VVols?

The Pure Storage presentation at VMware Explore 2022 inspired Alastair Cooke to start thinking more about VVols. During the presentation, Alastair wondered whether VVols were still a thing, and soon realized that many storage vendors are actively supporting VVols with vSphere 7. He goes into detail on VVols and VMFS; check out his thoughts and how they can help your storage design.