Tech Field Day

The Independent IT Influencer Event


Google Cloud AI Platforms and Infrastructure



AI Field Day 4


This video is part of the appearance, “Google Cloud Presents Cloud Inferencing with Intel at AI Field Day 4”. It was recorded as part of AI Field Day 4 on February 22, 2024, from 10:45 to 12:15.


Watch on YouTube
Watch on Vimeo

In this session, we’ll explore how Vertex AI, Google Kubernetes Engine (GKE), and Google Cloud’s AI infrastructure provide a robust platform for AI development, training, and inference. We’ll discuss hardware choices for inference (CPUs, GPUs, and TPUs) and showcase real-world examples. We’ll also cover distributed training and inference with GPUs and TPUs, and optimizing AI performance on GKE using tools like autoscaling and dynamic workload scheduling.

Brandon Royal, product manager at Google Cloud, discusses deploying AI workloads on Google Cloud’s AI infrastructure. The session focuses on how Google Cloud applies AI to solve customer problems and on current trends in AI, particularly the platform shift toward generative AI. Brandon covers the AI infrastructure designed for generative AI, including inference, serving, training, and fine-tuning, and how each is applied in Google Cloud.

Brandon explains the evolution of AI models, particularly open models, and their importance for flexibility in deployment and optimization. He highlights that many AI startups and unicorns choose Google Cloud for their AI infrastructure and platforms. He also introduces Gemma, a new open model released by Google DeepMind, which is lightweight, state-of-the-art, and built on the same technology as Google’s Gemini model. Gemma is available with open weights on platforms like Hugging Face and Kaggle.
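
As a rough illustration of what “open weights on Hugging Face” means in practice (this is not code from the session), the sketch below loads a Gemma checkpoint with the Hugging Face transformers library. The model ID, prompt, and generation settings are assumptions chosen for the example.

    # Hedged sketch: load Gemma's open weights from Hugging Face and generate text.
    # Assumptions: the "google/gemma-2b" model ID and that you have accepted the
    # model's license / authenticated with Hugging Face if the repo is gated.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "google/gemma-2b"  # assumed variant; other sizes exist

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)  # add device_map/GPU as needed

    inputs = tokenizer("What is Google Kubernetes Engine?", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))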

The session then shifts to a discussion about AI platforms and infrastructure, with a focus on Kubernetes and Google Kubernetes Engine (GKE) as the foundation for open models. Brandon emphasizes the importance of flexibility, performance, and efficiency in AI workloads and how Google provides a managed experience with GKE Autopilot.

He also touches on the hardware choices for inference, including CPUs, GPUs, and TPUs, and how Google Cloud offers the largest selection of AI accelerators in the market. Brandon shares customer stories, such as Palo Alto Networks’ use of CPUs for deep learning models in threat detection systems. He also discusses the deployment of models on GKE, including autoscaling and dynamic workload scheduling.
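
The talk does not spell out a specific configuration, but one common way to autoscale a model-serving Deployment on GKE is a HorizontalPodAutoscaler. The sketch below uses the Kubernetes Python client as a hypothetical example; the deployment name, namespace, replica bounds, and CPU target are placeholders, not values from the presentation.

    # Hypothetical sketch: attach a HorizontalPodAutoscaler to a model-serving
    # Deployment, roughly the kind of autoscaling discussed for GKE.
    # The "gemma-server" name, namespace, and CPU target are placeholders.
    from kubernetes import client, config

    config.load_kube_config()  # uses your current kubectl context (e.g. a GKE cluster)

    hpa = client.V2HorizontalPodAutoscaler(
        metadata=client.V1ObjectMeta(name="gemma-server-hpa"),
        spec=client.V2HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V2CrossVersionObjectReference(
                api_version="apps/v1", kind="Deployment", name="gemma-server"
            ),
            min_replicas=1,
            max_replicas=4,
            metrics=[
                client.V2MetricSpec(
                    type="Resource",
                    resource=client.V2ResourceMetricSource(
                        name="cpu",
                        target=client.V2MetricTarget(type="Utilization", average_utilization=60),
                    ),
                )
            ],
        ),
    )

    client.AutoscalingV2Api().create_namespaced_horizontal_pod_autoscaler(
        namespace="default", body=hpa
    )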

Finally, Brandon gives a live demo of deploying the Gemma model on GKE, showing how to use the model to generate responses and how it can be augmented with retrieval-augmented generation to produce more grounded answers. He also demonstrates Gradio, an open-source library used in the demo to provide a chat interface for interacting with the model, and discusses scaling and managing AI workloads on Google Cloud.
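
For readers unfamiliar with Gradio, the sketch below shows the general shape of such a chat front end in Python. It is not the demo’s code: the query_model helper and its endpoint URL are hypothetical stand-ins for whatever serving layer (for example, a Gemma deployment on GKE) actually answers the request.

    # Minimal sketch of a Gradio chat UI in front of a model endpoint, similar in
    # spirit to the chat interface shown in the demo. SERVING_URL and query_model
    # are placeholders for an assumed HTTP serving layer.
    import gradio as gr
    import requests

    SERVING_URL = "http://localhost:8080/generate"  # placeholder endpoint

    def query_model(message, history):
        # history holds the prior chat turns supplied by Gradio; unused here
        resp = requests.post(SERVING_URL, json={"prompt": message}, timeout=60)
        resp.raise_for_status()
        return resp.json().get("response", "")

    gr.ChatInterface(fn=query_model, title="Gemma on GKE (demo sketch)").launch()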

Personnel: Brandon Royal


  • Bluesky
  • LinkedIn
  • Mastodon
  • RSS
  • Twitter
  • YouTube

Event Calendar

  • May 28-May 29 — Security Field Day 13
  • Jun 4-Jun 5 — Cloud Field Day 23
  • Jun 10-Jun 11 — Tech Field Day Extra at Cisco Live US 2025
  • Jul 9-Jul 10 — Networking Field Day 38
  • Jul 16-Jul 17 — Edge Field Day 4
  • Sep 10-Sep 11 — AI Infrastructure Field Day 3
  • Oct 29-Oct 30 — AI Field Day 7

Latest Links

  • Compliance Does Not Equal Security
  • Meraki Campus Gateway: Cloud-Managed Overlay for Complex Networks
  • Exploring the Future of Cybersecurity at Security Field Day 13
  • 5G Neutral Host: Solving Enterprise Cellular Coverage Gaps
  • Qlik Connect 2025: Answers For Agentic AI

Return to top of page

Copyright © 2025 · Genesis Framework · WordPress · Log in