During AI Field Day 4, Google Cloud positioned AI as the next major platform shift, following the internet and mobile, and highlighted its vast data stores, including every search query since the launch of its search engine. Google Cloud’s adoption of generative AI and the introduction of Google Gemini and DeepMind’s open model Gemma signal a strong continuation of the company’s open-source ethos, enhancing AI scalability and accessibility. As Allyson Klein notes in this article, the presentation also underscored Google’s commitment to supporting a diverse AI ecosystem, showcasing broad compatibility across major silicon platforms, even as the event focused on Intel’s technology.
VAST Data Operationalizing AI
At AI Field Day 4, Keith Townsend engaged with VAST Data’s John Mao and Neeloy Bhattacharyya to discuss the company’s innovative architecture, which separates the persistent data layer from stateless logic, optimizing access patterns and improving data preparation efficiency for AI applications. Gina Rosenthal highlights their global namespace approach and the unique VAST DataBase system, which together streamline data availability and scalability across the pipeline.
Google Cloud at Intel Day @ #AIFD4
At AI Field Day, Google Cloud’s Brandon Royal presented insights into AI platforms and infrastructure, emphasizing the platform shifts from the internet to mobile and now to AI. He introduced Gemini, Google’s multi-modal model, and unveiled Gemma, a family of lightweight open models. Royal discussed safety measures and highlighted GKE’s role in AI, covering CPUs, GPUs, and TPUs. A demo showcased Gemma running on GKE with support for 8K-token contexts. Ameer Abbas, Senior Product Manager at Google Cloud, focused on developer and operational productivity with Duet AI, demonstrating code development and transformation capabilities for enterprise builders and developers. Read more in this LinkedIn Pulse article by Gina Rosenthal.
The Bedrock of AI Is Data With Nick Magnuson and Clive Bearman of Qlik
The latest Utilizing Tech podcast episode dives into the critical role of enterprise data integration for the success of generative AI, featuring insights from Qlik’s Nick Magnuson and Clive Bearman prior to their presentation at AI Field Day 4. They discuss the challenges enterprises face, such as preparing quality data for AI models and using machine learning to enhance data organization and tagging. The conversation also explores the potential of large language models to democratize data querying, underscoring the importance of curated data in the AI-driven future.
VMware by Broadcom Presents Private AI With Intel at AI Field Day 4
During AI Field Day 4, VMware stepped up to showcase how Intel AMX-enabled CPUs can be leveraged for large language models (LLMs) on vSphere, presenting a CPU-centric approach to AI tasks commonly handled by GPUs. As discussed by Gina Rosenthal, Earl Ruby of Broadcom (VCF) demonstrated the potential of Intel’s AMX technology in both older Ice Lake and newer Sapphire Rapids systems, achieving model fine-tuning and inference without the use of GPUs. This approach champions using CPUs for AI when feasible, reserving GPUs for scenarios demanding lower latency, and highlights the compatibility requirements for effectively implementing AI with Intel’s hardware in a vSphere environment.
Storage for AI: Data Professional Overview
The use of the term ‘data management’ is subject to varying interpretations between data professionals and storage providers, leading to some confusion when discussing the scope of services. Solidigm highlights the unique storage requirements for AI, as Karen Lopez wrote, emphasizing that AI servers need significantly more capacity and have specific data demands across different stages: data ingest, data preparation, training, checkpointing, and inference. As data professionals, our understanding of these distinct data workloads enables us to contribute valuably to discussions on data architecture, especially in collaboration with companies like Supermicro, who integrate these storage solutions into their AI server offerings.
Intel Modestly Lays Its Case for AI — Tech Arena
As Allyson Klein writes, Intel laid out its strategy for AI workloads at AI Field Day 4, discussing the strengths of CPUs in AI inference and the growing role of accelerators—a notably humble stance for the tech giant in the face of intensified competition in the AI silicon sector. With a focus on where CPUs excel and integrated AI acceleration technologies like Intel AMX, Intel is positioning itself to cater to mid-market organizations looking for an accessible AI solution. Amid a rapidly evolving landscape, Intel’s efforts to balance core CPU advancements with accelerator development highlight its pursuit of continued relevance and leadership in the AI optimization space.
Podcast – Season 6 – Season Opener
The Utilizing Tech podcast returns for another season focused on AI. Frederic Van Haren is co-hosting this season with Stephen Foskett, and discussions will delve into how enterprises integrate AI across various sectors, focusing on their infrastructural stack and data pipeline management. As AI cements its omnipresence in technology, this season promises to unpack its applications and influence in current and future markets.
Private AI Foundation With NVIDIA: Data Professional Overview
VMware by Broadcom, in partnership with NVIDIA, has introduced the Private AI Foundation, focusing on enhancing in-house data management and AI processing through privacy, choice, cost management, performance optimization, and compliance agility. As highlighted by Karen Lopez, data quality and protection are essential for accurate AI results, prompting data professionals to stay alert to components like vector databases for fast, complex data retrieval. Key takeaways emphasize the importance of not overlooking data fundamentals amidst AI advancements, reminding professionals that AI cannot replace the need for robust data management.
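The fast, complex data retrieval mentioned above rests on similarity search over embeddings. As a minimal sketch (not any vendor’s implementation), the corpus, labels, and three-dimensional vectors below are hypothetical stand-ins for real embeddings:

```python
# Minimal sketch of the core operation a vector database performs:
# nearest-neighbor lookup by cosine similarity. All data is hypothetical.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query_vec, index):
    """Return the (label, embedding) pair most similar to the query vector."""
    return max(index, key=lambda item: cosine(query_vec, item[1]))

# Toy index: label -> embedding (a real system would store millions of these).
index = [
    ("gpu sizing guide", [0.9, 0.1, 0.0]),
    ("privacy policy",   [0.1, 0.8, 0.2]),
]
label, _ = nearest([0.85, 0.15, 0.0], index)  # query embedding
```

Production vector databases replace the linear scan above with approximate nearest-neighbor indexes, but the similarity computation is the same idea.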
AI in the Marketing Arena With Digital Sunshine’s Gina Rosenthal
In the latest episode of TechArena, host Allyson Klein engages with Digital Sunshine’s Gina Rosenthal in a thought-provoking discussion on the transformative impact of AI on the marketing landscape, setting the stage for AI Field Day 4. The conversation delves into current trends and future implications of AI in marketing strategies, offering insights on how businesses are adopting these technologies. As AI Field Day 4 approaches, this dialogue serves as a tantalizing glimpse into the potential and challenges AI presents to both marketers and technologists.
AI Field Day 4 Kicks Off With VMware’s Private AI
Gina Rosenthal is delivering real-time insights on each AI Field Day presentation, starting with her take on VMware’s advancements in AI. In their presentation, VMware highlighted the ease of deploying AI applications on existing infrastructure with a focus on robust security and privacy, leveraging their Private AI offering and partnerships with industry leaders like NVIDIA. VMware’s session delved into how their solutions, geared toward the generative AI market, are enabling customers to significantly improve operational tasks such as documentation search, with a reported 500% increase in effectiveness.
Intel Day Kicks Off at AI Field Day 4
As AI Field Day 4 continues into day two, Gina Rosenthal turns her attention to the capabilities of Intel Xeon CPUs in AI, particularly in inference workflows, as presented by Ro Shah, AI Product Director at Intel. Shah delineated the growing trend towards generative AI with large models, while recognizing that enterprises are more inclined to adopt smaller language models, positioning Xeon as a suitable solution for these scenarios. Intel’s commitment to serving the AI market extends beyond hardware, showcasing their strategy of enabling AI across the board through developer support, tool extensions, and a strong partnership ecosystem.
Why Storage Matters for AI – Solidigm
During AI Field Day 4, Solidigm, alongside partner Supermicro, spotlighted the pivotal role of storage in AI, as discussed by Gina Rosenthal. Ace Stryker of Solidigm emphasized the need to shift from HDDs to solid-state drives, aligning with the trends of chip spending growth and the demand for higher storage in AI servers. Supermicro’s Wendell Wenjen and Paul McLeod further discussed the integration with WEKA and the importance of storage in AIOps, indicating that a substantial portion of Supermicro’s revenue is derived from AI-related ventures.
The Year of AI at AI Field Day 4
AI Field Day returns on February 21st-23rd, giving a broad perspective on AI’s foundational technologies in a year touted to be pivotal for artificial intelligence. Attendees can expect in-depth sessions with industry giant Intel as well as key players VMware by Broadcom, Qlik, Hammerspace, Solidigm, VAST Data, and many more. This event explores revolutionary AI applications and their infrastructure demands, and will be broadcast live for a global audience. Watch live on LinkedIn and the Tech Field Day website and catch the recordings on YouTube!
Improving the Accuracy of Domain-Specific Information of Large Language Models With Ben Young
Ben Young’s Ignite Talk at Edge Field Day delved into the challenges of large language models (LLMs) and introduced a potential solution through retrieval-augmented generation (RAG). Using RAG, Young crafted a chatbot that leverages the Veeam knowledge base to accurately deliver domain-specific information without the need for retraining. His work demonstrated how this AI framework can refine LLM responses, keeping answers accurate and current by incorporating external context and authoritative data sources. Give the article a read at Gestalt IT and watch the Ignite Talk here on the Tech Field Day website.
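The RAG pattern described above can be sketched in a few lines. This is a toy illustration, not Young’s implementation: the two-entry knowledge base is hypothetical, and a simple word-overlap ranking stands in for a real embedding search and LLM call:

```python
# Hedged sketch of retrieval-augmented generation (RAG): retrieve relevant
# documents, then prepend them to the prompt so the model answers from
# authoritative context instead of its training data alone.
def retrieve(query, documents, k=1):
    """Rank documents by word overlap with the query (stand-in for vector search)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, context_docs):
    """Assemble the augmented prompt: retrieved context plus the user's question."""
    context = "\n".join(context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical two-entry knowledge base; a real one would hold whole articles.
kb = [
    "Veeam backup jobs can be scheduled via the console.",
    "Restore points are created after each successful backup.",
]
question = "How are restore points created?"
prompt = build_prompt(question, retrieve(question, kb))
# The prompt would then be sent to an LLM, which answers from the supplied context.
```

Because the context is fetched at query time, updating the knowledge base updates the chatbot’s answers immediately, with no retraining.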