
Real-World Use of Private AI at VMware by Broadcom



AI Field Day 4


This video is part of the appearance, “VMware by Broadcom Presents at AI Field Day 4”. It was recorded at AI Field Day 4 from 8:00 to 10:00 on February 21, 2024.


Watch on YouTube
Watch on Vimeo

This session offers a deep dive into the internal AI services used by VMware employees, including services for coding assistance, document search using retrieval-augmented generation (RAG), and an internal LLM API.

In this presentation, Ramesh Radhakrishnan of VMware discusses the company’s internal use of AI, particularly large language models (LLMs), for various applications. He leads the AI Platform and Solutions team and shares insights into VMware’s AI services, which were developed even before the advent of LLMs.

Large language models (LLMs) are versatile tools that can address a wide range of use cases with minimal modification. VMware has developed internal AI services for coding assistance, document search using Retrieval-Augmented Generation (RAG), and an internal LLM API. Content generation, question answering, code generation, and the use of AI agents are some of the key use cases for LLMs at VMware.
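To make the LLM API use case concrete, here is a minimal sketch of how an internal, OpenAI-compatible chat endpoint might be called from Python; the URL, model name, and request schema are assumptions for illustration, not details given in the session.

    import requests

    # Hypothetical internal endpoint and model name; real values depend on how
    # an organization exposes its LLM API (assumed OpenAI-compatible here).
    LLM_API_URL = "https://llm-api.internal.example/v1/chat/completions"
    MODEL = "internal-llm-13b"  # placeholder identifier

    def ask_internal_llm(prompt: str, api_token: str) -> str:
        """Send a single question to the internal LLM API and return the answer text."""
        response = requests.post(
            LLM_API_URL,
            headers={"Authorization": f"Bearer {api_token}"},
            json={
                "model": MODEL,
                "messages": [{"role": "user", "content": prompt}],
                "temperature": 0.2,
            },
            timeout=60,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]

    # Example: question answering, one of the use cases mentioned above.
    # print(ask_internal_llm("Summarize the release notes for our latest build.", "TOKEN"))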

VMware has implemented a Cloud Smart approach, leveraging open-source LLMs trained on the public cloud to avoid the environmental impact of running their own GPUs. The company has worked with Stanford to create a domain-adapted model for VMware documentation search, which significantly improved search performance compared to traditional keyword search.
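As a rough illustration of why embedding-based retrieval can outperform keyword search for documentation, the sketch below uses the open sentence-transformers library, with a generic checkpoint standing in for the domain-adapted model described in the session.

    from sentence_transformers import SentenceTransformer
    import numpy as np

    # Generic public checkpoint used as a stand-in; the session describes a
    # domain-adapted model for VMware documentation, which this merely illustrates.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    docs = [
        "vMotion requirements: how to live-migrate virtual machines between hosts.",
        "DRS balances compute workloads across hosts in a cluster.",
        "vSAN aggregates local disks into a shared datastore.",
    ]
    query = "How do I move a running VM to another host?"

    # Dense retrieval: cosine similarity between normalized embeddings.
    doc_vecs = model.encode(docs, normalize_embeddings=True)
    query_vec = model.encode([query], normalize_embeddings=True)[0]
    best = int(np.argmax(doc_vecs @ query_vec))
    print("Best match:", docs[best])

    # A naive keyword match would likely miss this document, since the query says
    # "move a running VM" while the document says "vMotion" and "live-migrate".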

The VMware Automated Question Answering System (Wacqua) is an information retrieval system based on language models, which allows users to ask questions and get relevant answers without browsing through documents. The system’s implementation involves complex processes, including content gathering, preprocessing, indexing, caching, and updating documentation.
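The stages described for Wacqua (content gathering, preprocessing, indexing, caching, and updating) can be pictured as a small pipeline. The skeleton below is an illustrative outline with hypothetical stub functions, not VMware's implementation.

    import hashlib

    def gather_content(sources):
        """Fetch raw documentation pages from the given sources (illustrative stub)."""
        return [{"url": u, "html": f"<html>...{u}...</html>"} for u in sources]

    def preprocess(pages):
        """Strip markup and split pages into passages small enough to embed."""
        return [{"url": p["url"], "text": p["html"]} for p in pages]

    def build_index(passages, embed):
        """Embed each passage and keep the vectors alongside the source text."""
        return [{"vec": embed(p["text"]), "passage": p} for p in passages]

    class AnswerCache:
        """Cache answers keyed by a hash of the question; cleared on re-indexing."""
        def __init__(self):
            self._store = {}
        def get(self, question):
            return self._store.get(hashlib.sha256(question.encode()).hexdigest())
        def put(self, question, answer):
            self._store[hashlib.sha256(question.encode()).hexdigest()] = answer
        def clear(self):
            self._store.clear()

    def refresh(sources, embed, cache):
        """When documentation changes, rebuild the index and invalidate cached answers."""
        index = build_index(preprocess(gather_content(sources)), embed)
        cache.clear()
        return index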

VMware has scaled up its GPU capacity to accommodate the increased demand from software developers empowered by AI tools. The AI platform at VMware provides a GPU pool resource, developer environments, coding use cases, and LLM APIs, all running on a common platform.

Data management is highlighted as a potential bottleneck for AI use cases, and standardizing on a platform is critical for offering services to end-users efficiently. Collaboration between AI teams and infrastructure teams is essential to ensure that both the models and the infrastructure can support the workload effectively.

Ramesh encourages organizations to start small with open-source models, identify key performance indicators (KPIs), and focus on solving business problems with AI. The session concludes with Ramesh emphasizing the importance of a strategic approach to implementing AI and the benefits of leveraging a shared platform for AI services.
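In that start-small spirit, here is a minimal sketch, assuming the Hugging Face transformers library and a small open checkpoint as a stand-in, of pairing an open-source model with one simple KPI (latency); the model name and metric are illustrative choices, not recommendations from the session.

    import time
    from transformers import pipeline

    # Small open-source model as a starting point; swap in whichever open
    # checkpoint fits the business problem being evaluated.
    generator = pipeline("text-generation", model="gpt2")

    def answer_with_kpis(prompt: str) -> dict:
        """Generate a completion and capture latency as an example KPI."""
        start = time.perf_counter()
        output = generator(prompt, max_new_tokens=50, do_sample=False)[0]["generated_text"]
        latency_s = time.perf_counter() - start
        return {"answer": output, "latency_seconds": round(latency_s, 2)}

    print(answer_with_kpis("A hypervisor is"))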

Personnel: Ramesh Radhakrishnan


