
Red Hat has expanded its open-source AI strategy with new agentic AI, enterprise inference, sovereign cloud and automation capabilities aimed at making hybrid AI deployments more scalable, governed and secure for enterprises.
Red Hat has unveiled a broad expansion of its AI, Linux, automation and hybrid cloud portfolio at Red Hat Summit in Atlanta, positioning open-source infrastructure as the operational backbone for enterprise AI deployments.
The announcements centre on agentic AI, enterprise inference, sovereign cloud, AI governance and open-source developer infrastructure, led by the launch of Red Hat AI 3.4. The updated platform is designed for large-scale inferencing and agentic AI deployments across hybrid cloud environments.
A key addition is a new model-as-a-service capability that allows enterprises to expose internally approved AI models through governed interfaces, with centralised policy enforcement and usage tracking.
Joe Fernandes, Vice President and General Manager of Red Hat AI, said the company expects AI inferencing, rather than model training, to dominate enterprise AI workloads. “What’s really going to drive inference demand exponentially is AI agents,” he said. “We provide a platform where customers can deploy and manage their AI agents across a hybrid infrastructure environment.”
Red Hat also introduced agent management, observability and AI safety features, alongside speculative decoding support in the vLLM inference server, which the company said can improve response speeds by up to three times while lowering inference costs.
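Speculative decoding gets its speed-up by having a small, cheap "draft" model propose several tokens ahead, which the large "target" model then verifies in a single batched pass, accepting tokens up to the first mismatch. The toy sketch below illustrates the accept/verify loop only; the string-based `draft_next`/`target_next` stand-ins and the pass-counting are illustrative assumptions, not vLLM's actual API.

```python
# Toy illustration of speculative decoding. A cheap "draft" model proposes
# up to k tokens; the expensive "target" model checks the whole proposal in
# one pass, accepting the longest correct prefix and fixing the first miss.
# Both "models" are stand-in functions over a fixed string, not real LLMs.

TARGET_TEXT = "open source infrastructure"

def target_next(prefix: str) -> str:
    """Expensive 'target' model: always emits the correct next character."""
    return TARGET_TEXT[len(prefix)]

def draft_next(prefix: str) -> str:
    """Cheap 'draft' model: usually right, but wrongly guesses a space
    after vowels, so some proposals get rejected."""
    ch = TARGET_TEXT[len(prefix)]
    return " " if prefix and prefix[-1] in "aeiou" and ch != " " else ch

def speculative_decode(k: int = 4) -> tuple[str, int]:
    out = ""
    target_passes = 0
    while len(out) < len(TARGET_TEXT):
        # Draft proposes up to k tokens autoregressively (cheap).
        proposal = []
        prefix = out
        for _ in range(min(k, len(TARGET_TEXT) - len(out))):
            t = draft_next(prefix)
            proposal.append(t)
            prefix += t
        # Target verifies the whole proposal in one batched pass.
        target_passes += 1
        prefix = out
        for t in proposal:
            correct = target_next(prefix)
            if t == correct:
                prefix += t          # draft token accepted
            else:
                prefix += correct    # target's correction, stop here
                break
        out = prefix
    return out, target_passes

text, passes = speculative_decode()
print(text, "in", passes, "target passes (vs", len(text), "naive)")
```

Because the target model still checks every token, the output is identical to plain autoregressive decoding; the gain is simply that accepted draft tokens are amortised into fewer expensive forward passes, which is where the cited up-to-threefold speed-up comes from.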
Additional announcements included Fedora Hummingbird Linux for AI-native development, Red Hat Hardened Images for “zero-CVE” container security strategies and expanded collaboration with Nvidia spanning Blackwell systems, confidential computing and AI agent sandboxing.
“Open source has shown to be the best innovation model at a global scale,” said Red Hat CEO Matt Hicks. “And we believe that AI will only amplify that capability.”