At AI Field Day, Qlik Shows AI-Based Analysis Added to Its Platform

At AI Field Day, Qlik unveiled a wizard-based AI feature that simplifies the process of leveraging on-premises data for insightful analytics, integrating smoothly with Qlik’s cloud services. This enhancement to their analytics platform aims to democratize AI’s benefits, making advanced data analysis accessible to a broader range of users with varying expertise. Qlik’s initiative reflects a commitment to user-friendly, AI-powered analytics, facilitating deeper insights while streamlining the experience for its customers. Read more in this analyst note for The Futurum Group by Alastair Cooke.


Why Storage Matters for AI: Solidigm Shares POV at AI Field Day

During AI Field Day, Solidigm’s Ace Stryker and Alan Bumgarner illustrated the pivotal role of SSDs in AI applications, showcasing how they cater to the high data demands of models and workflows with increased efficiency. They highlighted the superiority of SSDs over HDDs in performance and total cost of ownership (TCO), emphasizing tangible benefits from greater data density to reduced physical infrastructure needs. The presentation homed in on the importance of storage in AI, linking Solidigm’s advanced SSD solutions with scalable and power-efficient AI server operations, resonating with sustainability goals and operational cost reduction.
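
To make the density point concrete, here is a small back-of-the-envelope calculation; the drive capacities and chassis slot counts are illustrative assumptions, not figures from Solidigm’s presentation.

```python
import math

# Back-of-the-envelope rack-density comparison for 1 PB of raw capacity.
# All drive capacities and slot counts are illustrative assumptions,
# not figures quoted by Solidigm at AI Field Day.
target_tb = 1000                      # 1 PB raw

hdd_tb, ssd_tb = 24, 61.44            # assumed per-drive capacities (TB)
hdd_slots_2u, ssd_slots_1u = 24, 32   # assumed drive slots per chassis

hdd_drives = math.ceil(target_tb / hdd_tb)
ssd_drives = math.ceil(target_tb / ssd_tb)

hdd_ru = math.ceil(hdd_drives / hdd_slots_2u) * 2   # 2U HDD chassis
ssd_ru = math.ceil(ssd_drives / ssd_slots_1u) * 1   # 1U all-flash chassis

print(f"HDD: {hdd_drives} drives, {hdd_ru}U of rack space")
print(f"SSD: {ssd_drives} drives, {ssd_ru}U of rack space")
```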


Google Cloud, the Preferred Platform for Building Competitive AI Models

At AI Field Day, Google Cloud’s Brandon Royal showcased the giant’s comprehensive strategy for meeting today’s burgeoning AI demands, leveraging one of the industry’s most extensive digital infrastructures. Emphasizing the significance of AI infrastructure in conjunction with generative AI (GenAI), Google Cloud highlighted its commitment to innovation, positioning its platform as the superhighway for AI-forward companies. With Google Cloud providing robust compute power on its own infrastructure, businesses can harness AI’s opportunities without the traditionally high entry barriers of infrastructure cost and expertise. Read more in this article by Sulagna Saha for Gestalt IT.


Deciding When to Use Intel Xeon CPUs for AI Inference

At AI Field Day, Intel offered insights into strategic decision-making for AI inference, highlighting scenarios where Intel Xeon CPUs outshine traditional GPU solutions on both on-premises and cloud servers. By evaluating the specific requirements of AI inference workloads, Intel guides users to make informed choices that enhance value while optimizing their existing server infrastructure. This approach emphasizes efficiency and practicality in deploying AI capabilities, ensuring that organizations can navigate the complex landscape of hardware selection for their AI initiatives. Read more in this Futurum Research Analyst Note by Alastair Cooke.


The Continuing Evolution of Forward Networks – Networking Field Day 34

Reflecting on Networking Field Day 13, Rob Coote discusses the Forward Networks platform and its groundbreaking digital twin technology, which gives a detailed grasp of network configurations and the potential impact of changes. That evolution continued at Networking Field Day 34, where Forward Networks revealed the integration of AI and large language models (LLMs) into its established platform, delivering a refined natural-language query experience and heightened network visibility. The trajectory not only illustrates Forward Networks’ commitment to innovation but also exemplifies how AI, when thoughtfully applied, can significantly enhance the utility and sophistication of network management tools.


Hammerspace Shows Storage Acceleration for AI Training

At AI Field Day, Hammerspace showcased its innovative storage acceleration solution, demonstrating how Hyperscale NAS can be leveraged to enhance the performance of current scale-out NAS systems, particularly in training large language models (LLMs) efficiently. This storage boost not only improves speed but also optimizes resource allocation during the intensive LLM training process. Hammerspace’s advancement offers organizations the opportunity to amplify their AI training capabilities without the need to overhaul their existing storage infrastructure. Read more in this Futurum Research Analyst Note by Alastair Cooke.


VAST Data Soars With Industry Heavyweights

As Allyson Klein writes, VAST Data is revealing a major shift in AI strategy, joining forces with NVIDIA and Supermicro to bolster its role as a forward-thinking AI data platform. By embracing a novel architecture that eschews traditional x86 design for a powerful GPU-centric platform with NVIDIA DPUs, VAST Data is poised to redefine data storage, promising significant energy efficiency and enhanced performance for AI workloads. The move shifts the AI training landscape toward GPU-native frameworks and positions VAST Data as a key innovator in an infrastructure industry ripe for disruption.


Does Storage Matter in AI Inferencing? What About the SSD?

Keith Townsend reacts to Solidigm’s presentation at AI Field Day, considering the role of storage systems in AI inferencing and the impact of SSD selection on AI system design. This video underscores the significance of considering storage performance and reliability when devising robust AI inferencing architectures. Solidigm’s discussion reflected a deeper industry focus on the intricate relationship between storage solutions and AI capabilities, suggesting that the choice of SSDs could be a pivotal factor in optimizing AI inferencing operations.


VMware Private AI at AI Field Day

VMware’s presentation with Intel at AI Field Day centered on optimizing on-premises AI workloads, highlighting the capability of Intel Sapphire Rapids CPUs with Advanced Matrix Extensions (AMX) to efficiently perform large language model (LLM) AI inference, traditionally a task for GPUs. Demonstrating that AI can be resource-effective on CPUs, the discussion covered the technical prerequisites for harnessing AMX in vSphere environments and the ongoing integration of these accelerators into popular AI frameworks. With CPUs increasingly capable of handling AI tasks through built-in matrix math acceleration, VMware showcased a sustainable, cost-effective approach that could reshape hardware strategies for mixed-workload servers. Read more in this analyst note for The Futurum Group by Alastair Cooke.
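
To give a sense of what that looks like in practice, below is a minimal sketch of bfloat16 LLM inference on an AMX-capable Xeon, assuming a Linux guest that can see the AMX CPU flags and has PyTorch, Hugging Face Transformers, and Intel Extension for PyTorch installed; the model ID is only an example, not one VMware or Intel demonstrated.

```python
# Minimal sketch: CPU-only LLM inference that can use AMX on 4th Gen Xeon
# (Sapphire Rapids) and later. Assumes PyTorch, Transformers, and
# intel_extension_for_pytorch are installed; the model ID is an example.
import torch
import intel_extension_for_pytorch as ipex
from transformers import AutoModelForCausalLM, AutoTokenizer

# AMX shows up as CPU flags (amx_tile, amx_bf16, amx_int8) on Linux when the
# hypervisor passes them through to the guest.
with open("/proc/cpuinfo") as f:
    if "amx_bf16" not in f.read():
        raise SystemExit("AMX bfloat16 not visible to this VM")

model_id = "meta-llama/Llama-2-7b-hf"   # substitute any causal LM you have access to
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16).eval()

# ipex.optimize() swaps in oneDNN kernels that dispatch to AMX for bf16 matmuls.
model = ipex.optimize(model, dtype=torch.bfloat16)

prompt = "List three reasons to run inference on CPUs:"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```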


Gemma and Building Your Own LLM AI

At AI Field Day 4, Intel invited the Google Cloud AI team to showcase their Gemma large language model (LLM), revealing insights into the advanced infrastructure used for building such models on Google Cloud. The presentation underlined Gemma’s efficiency with fewer parameters for inference, highlighting Google Cloud’s strength in analytics and AI, particularly in managing the differing resource needs of model training and application inference. Google Cloud’s integration of AI into its products was illustrated with Duet AI, an AI-based assistant that aids in software development, exemplifying a potential future where AI handles more coding tasks, freeing developers for high-level problem-solving and design. Read more in this analyst note for The Futurum Group by Alastair Cooke.
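
For readers who want to try a smaller checkpoint themselves, this is a rough sketch of local Gemma inference using the standard Transformers API; it assumes the transformers library is installed and that you have accepted the license for the gated google/gemma-2b-it weights.

```python
# Sketch: local inference with the 2B instruction-tuned Gemma checkpoint.
# Assumes the transformers library is installed and the gated
# google/gemma-2b-it weights are accessible to your Hugging Face account.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2b-it"   # a 7B variant also exists, at a higher memory cost
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Gemma ships a chat template, so prompts are formatted the way the model expects.
chat = [{"role": "user", "content": "Why do fewer parameters make inference cheaper?"}]
inputs = tokenizer.apply_chat_template(chat, return_tensors="pt", add_generation_prompt=True)

out = model.generate(inputs, max_new_tokens=80)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```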


Intel Xeon CPUs on VMware vSphere – A Powerful and Cost-Effective Twosome for AI/ML Workloads

With AI ingrained in our daily routines, Forward Networks delivered a strategic approach at Networking Field Day, demonstrating how even complex networking data can be made manageable through AI integration. Their platform uses a data-first principle, enabling AI to interact effectively with a digital twin of network infrastructure, simplifying tasks for network engineers. The innovative AI Assistant within Forward Networks’ ecosystem assists in constructing queries for the Network Query Engine, fostering trust through verifiable, human-readable outputs, and providing a gateway for more intuitive network management. Read more in this article by Sulagna Saha on Gestalt IT.


Forward Networks Now With More AI

Amidst the ubiquitous influence of artificial intelligence in our daily lives, Forward Networks stood out to Josh Warcop by integrating AI with a robust data-first approach, as showcased at Networking Field Day. Their creation of a digital twin for network infrastructure ensures precise data, allowing AI to work with it effectively and bypassing the challenges of manual data modeling. Forward Networks’ Network Query Engine, equipped with an AI Assistant, generates human-readable queries, bridging the gap between complex data systems and operator expertise while fostering trust in AI’s capabilities within network management.
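
The workflow Warcop describes — an assistant drafts a human-readable query, the operator reviews it, and only an approved query runs against the digital twin — can be sketched generically; everything in the snippet below is a hypothetical stand-in for that pattern, not Forward Networks’ actual API or query syntax.

```python
# Hypothetical sketch of the human-in-the-loop workflow described above:
# an assistant drafts a query, the operator reviews the human-readable text,
# and only an approved query is executed. None of these functions are part
# of Forward Networks' product; they stand in for whatever client you use.
from dataclasses import dataclass

@dataclass
class DraftQuery:
    intent: str        # the operator's natural-language request
    query_text: str    # human-readable query produced by the assistant

def draft_query(intent: str) -> DraftQuery:
    """Placeholder for the AI assistant that turns intent into a query."""
    return DraftQuery(intent, query_text=f"-- query generated for: {intent}")

def run_against_digital_twin(query_text: str) -> list[dict]:
    """Placeholder for executing an approved query against the network model."""
    return [{"device": "edge-rtr-01", "result": "example row"}]

draft = draft_query("find devices with telnet still enabled")
print("Proposed query:\n", draft.query_text)

if input("Run this query? [y/N] ").strip().lower() == "y":
    for row in run_against_digital_twin(draft.query_text):
        print(row)
```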


Let Someone Else Install Your Campus Network

As Aaron Conaway posted, Nile showcased their comprehensive campus network solutions at Networking Field Day, emphasizing their end-to-end service that extends beyond hardware delivery to include detailed network planning and on-site evaluations. Their approach ensures not only installation but also up-to-date firmware and secure configurations managed via a cloud-based system, challenging the traditional “install now, secure later” methodology. Nile further distinguishes itself with ongoing management, monitoring, and proactive problem-resolution and replacement services, offering an innovative model for organizations lacking significant capital expenditure budgets.


Nile – Network as a Service – Considerations for Adoption

Attending Networking Field Day gave Ryan Lambert a first-hand look at Nile’s innovative Network as a Service (NaaS) offering, a service model that manages and maintains a customer’s network infrastructure. Nile stands out by not only managing and monitoring networking equipment but also engaging with partners for on-site surveys to ensure top-notch performance, guaranteeing service levels for the hassle-free operation of customer networks. However, it’s important for customers to understand the shared responsibility in such managed solutions, particularly regarding the physical environment and specific requirements, ensuring Nile’s service guarantee aligns with their needs.


No Inventory, No Proper Management

At Networking Field Day 34, Forward Networks, Inc. showcased the substantial advantages of implementing a digital twin for network management. The digital twin provides a dynamic and thorough inventory by constantly analyzing the network to reflect its current state, automatically updating documentation, and enhancing compliance and automation efforts. Aaron Conaway emphasized the crucial need for up-to-date network visibility, asserting that reliable network management is impossible without a comprehensive and current understanding of one’s network infrastructure.


Defeating Data Gravity? – Hammerspace

According to Keith Townsend, Hammerspace presented a compelling argument for a shift in overcoming data gravity by moving data closer to accelerated computing resources at AI Field Day. Their solution, a parallel file system, acts as a bridge between dispersed data sources, offering a unified metadata view that streamlines data preparation for AI tasks. While Hammerspace’s technology appears to enhance user experience, it also requires strategic GPU placement and considerations around data governance and movement across geopolitical boundaries.


Forward Networks – The Tale of an Aptly Named Company

In his LinkedIn Pulse article, Ryan Lambert delves into the capabilities of Forward Networks, a company he says lives up to its name. The company creates a Digital Twin of network data, encompassing router configurations, route tables, and firewall policies. Lambert emphasizes the practicality of this approach, enabling users to gain insights into network behavior, potential vulnerabilities, and operational breakdowns.


Taking on AI Inferencing With 5th Gen Intel Xeon Scalable Processors

Intel’s 5th Generation Xeon Scalable Processor, known as Emerald Rapids, offers an advantageous solution for AI inferencing, providing a compelling alternative to GPUs in certain applications. Highlighted during the AI Field Day event, Intel showcased the processor’s suitability for general-purpose AI workloads, especially for private AI deployments requiring lower latency and mixed workloads. In his presentation, Ro Shah illustrated that Xeon CPUs are well-equipped to handle AI models with fewer than 20 billion parameters, making them a cost-effective and efficient choice for many enterprises. Read more in this article from Gestalt IT.
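
A quick bit of arithmetic shows why roughly 20 billion parameters is a reasonable dividing line: the memory consumed by the weights alone at common inference precisions fits comfortably in server DRAM. The numbers below are simple calculations, not Intel benchmark results.

```python
# Weight-memory footprint for a 20B-parameter model at common inference
# precisions. Simple arithmetic only -- not Intel benchmark data.
params = 20e9
bytes_per_param = {"fp32": 4, "bf16/fp16": 2, "int8": 1, "int4": 0.5}

for precision, nbytes in bytes_per_param.items():
    gib = params * nbytes / 2**30
    print(f"{precision:>9}: {gib:6.1f} GiB of weights")

# At bf16 that is roughly 37 GiB -- small enough to sit comfortably in a
# dual-socket Xeon server's DRAM, which is far larger than a single GPU's HBM.
```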


Replacing Human-Driven Automation With One That Is Event-Driven, With Cisco

In a move to enhance network management efficiency, Cisco is replacing the existing human-driven automation with an innovative event-driven approach. Network controllers centralize operations by maintaining an inventory of network states, and event-driven automation streamlines this further by automating responses to network changes, reducing manual intervention and speeding up the resolution process. Cisco’s event-driven solutions were explored in detail at Tech Field Day Extra at Cisco Live EMEA 2024, promising elevated operational capabilities with a focus on observability and swift auto-configuration. Read more in this article from Gestalt IT.
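
The general shape of event-driven automation — an event arrives from the network and a registered handler reacts without a human in the loop — can be sketched in a few lines; the snippet below is a generic illustration of the pattern and does not use Cisco’s APIs.

```python
# Hypothetical illustration of the event-driven pattern described above:
# instead of a human noticing a problem and running a script, a handler is
# registered for a network event and a remediation is triggered automatically.
# Nothing here uses Cisco APIs; the event source and actions are stand-ins.
import json
import queue

events: "queue.Queue[dict]" = queue.Queue()

def on_interface_down(event: dict) -> None:
    """Stand-in remediation: a real system would call the network controller."""
    print(f"auto-remediating: bouncing {event['interface']} on {event['device']}")

HANDLERS = {"INTERFACE_DOWN": on_interface_down}

# Simulate a telemetry/webhook feed delivering one event.
events.put({"type": "INTERFACE_DOWN", "device": "core-sw-02", "interface": "Ethernet1/7"})

while not events.empty():
    event = events.get()
    handler = HANDLERS.get(event["type"])
    if handler:
        handler(event)
    else:
        print("no handler registered for", json.dumps(event))
```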


Policy Management Gets Easier With Cisco AI Assistant

Cisco’s AI Assistant, unveiled at Tech Field Day Extra at Cisco Live EMEA 2024, promises to revolutionize policy management within Secure Access, offering swift and reliable creation and enforcement of security policies. Emphasizing simplicity and risk reduction, the AI tool handles English language prompts to generate comprehensive policies, demonstrating its capability to securely augment and automate IT administrator tasks. Showcasing AI’s monitoring role, Cisco Secure Access also ensures protected use of AI tools within organizations, preventing sensitive data breaches and incorporating intelligent user access control.