Red Hat partners with Nvidia to deliver secure, high-performance, open source AI infrastructure, enabling enterprises to scale AI from experimentation to production.
Red Hat is accelerating enterprise AI adoption by leveraging its open source technologies—Red Hat Enterprise Linux (RHEL), OpenShift, and Red Hat AI—to power rack-scale AI on Nvidia’s Vera Rubin platform. The collaboration provides Day 0 support for Nvidia architectures, signalling Red Hat’s commitment to open source as the backbone of next-generation AI infrastructure.
The Nvidia Vera Rubin platform, a co-designed hardware and software stack, promises up to 10× reduction in inference token cost and requires 4× fewer GPUs for training mixture-of-experts models compared with the Blackwell platform. By moving beyond individual servers to high-density, unified systems, the platform is designed to support production-grade AI, enabling organisations to transition from experimentation to centralised, scalable AI strategies.
“Nvidia’s architectural breakthroughs have made AI an imperative, proving that the computing stack will define the industry’s future,” said Matt Hicks, President and CEO, Red Hat. “To meet these tectonic shifts at launch, Red Hat and Nvidia aim to provide Day 0 support for the latest Nvidia architectures across Red Hat’s hybrid cloud and AI portfolios.”
Red Hat Enterprise Linux will also support Nvidia Confidential Computing, offering cryptographic proof that sensitive AI workloads remain secure. Improvements from RHEL for Nvidia will be integrated into the main RHEL build, allowing organisations to transition seamlessly while maintaining performance and application compatibility.
With Red Hat optimising its hybrid cloud portfolio for Nvidia, enterprises can scale AI initiatives confidently, backed by enterprise-grade reliability, security, and operational consistency. Support for the Vera Rubin platform is scheduled for general availability in the second half of 2026, marking a key step in the open source-driven evolution of AI infrastructure.














































































