VMware Private AI at AI Field Day

VMware’s joint presentation with Intel at AI Field Day centered on optimizing on-premises AI workloads, highlighting how Advanced Matrix Extensions (AMX) in Intel Sapphire Rapids CPUs can efficiently perform large language model (LLM) inference, a task traditionally reserved for GPUs. Demonstrating that AI can be resource-effective on CPUs, the discussion covered the technical prerequisites for using AMX in vSphere environments and the ongoing integration of these accelerators into popular AI frameworks. With CPUs increasingly capable of handling AI tasks through built-in matrix math acceleration, VMware showcased a sustainable, cost-effective approach that could reshape hardware strategies for mixed-workload servers. Read more in this analyst note for The Futurum Group by Alastair Cooke.
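One of the prerequisites mentioned is confirming that the CPU actually exposes AMX to the guest. As a minimal sketch (assuming a Linux guest, where the kernel advertises Sapphire Rapids AMX support via the `amx_tile`, `amx_bf16`, and `amx_int8` flags in `/proc/cpuinfo`), a quick check might look like:

```python
# Sketch: detect Intel AMX support on a Linux guest by inspecting CPU
# feature flags. The flag names below are those the Linux kernel reports
# for AMX-capable CPUs; the path and helper name are illustrative only.

AMX_FLAGS = {"amx_tile", "amx_bf16", "amx_int8"}

def detect_amx(cpuinfo_path="/proc/cpuinfo"):
    """Return the subset of AMX feature flags advertised by the CPU."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    return AMX_FLAGS & set(line.split())
    except OSError:
        pass  # not Linux, or cpuinfo unavailable
    return set()

if __name__ == "__main__":
    found = detect_amx()
    print("AMX flags present:", sorted(found) if found else "none")
```

If the flags are absent inside a vSphere VM even though the host is Sapphire Rapids, the virtual hardware version or EVC mode may be masking them, which is the kind of configuration detail the session addressed.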
