CIQ, the founding support and services partner of Rocky Linux, has announced the general availability of Rocky Linux from CIQ Pro AI (RLC Pro AI), an enterprise Linux distribution built for AI inference and GPU-accelerated production workloads. The distribution bundles PyTorch with Nvidia's CUDA and DOCA OFED software stacks, ships with components tuned for higher performance from day one, and supports deployments across bare metal, Kubernetes, and cloud environments. CIQ has also outlined additional hardware partners and frameworks on its roadmap.
As organizations increasingly move machine learning and AI workloads into production, CIQ argues that the choice of operating system determines how effectively GPU hardware is used, particularly at scale. RLC Pro AI incorporates the CIQ Linux Kernel along with GPU drivers, libraries, and frameworks tuned and validated specifically for AI workloads. It supports deployments from bare metal to Kubernetes and on-premises infrastructure, targets the hardware enterprises are buying today, and provides immediate support for Nvidia GPU accelerators.
The distribution ships with pre-tuned kernel settings, PyTorch flags, and CUDA configurations, reducing manual tuning and preventing configuration drift after updates. CIQ states that organizations can achieve higher throughput on existing GPU deployments from day one, with performance gains validated across use cases, though no specific benchmark figures were provided. The same tuning is intended to deliver consistent performance across public clouds such as AWS, Google Cloud Platform, and Microsoft Azure, as well as on bare metal and in sovereign on-premises environments.
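CIQ has not published the actual tuning values it ships, but the kind of pre-applied kernel settings the announcement describes can be sketched as a sysctl fragment. The parameter names below are real Linux kernel sysctls; the values are illustrative assumptions, not CIQ's shipped configuration:

```ini
# Illustrative /etc/sysctl.d/99-gpu-tuning.conf fragment.
# Hypothetical values for a GPU inference host -- NOT CIQ's actual settings.

vm.swappiness = 10             ; keep model working sets in RAM rather than swapping
vm.zone_reclaim_mode = 0       ; avoid NUMA-local reclaim stalls on multi-socket hosts
net.core.rmem_max = 268435456  ; larger socket buffers for high-throughput fabrics
net.core.wmem_max = 268435456
```

Baking settings like these into the distribution, rather than leaving them to per-host scripts, is what allows them to survive package updates without drifting.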
RLC Pro AI joins the broader Rocky Linux from CIQ Pro product family alongside RLC+NVIDIA, RLC Pro, and RLC Pro Hardened. CIQ also offers related tools such as Ascender Pro for IT automation, Fuzzball for cloud HPC orchestration, Warewulf Pro for cluster provisioning, and Apptainer for high-performance computing containers. The announcement underscores a focus on sovereign infrastructure, prioritizing local control, data residency, and policy compliance.
Gregory Kurtzer, CEO of CIQ and founder of Rocky Linux, commented, "The OS is where GPU ROI is won or lost, and the industry has ignored it for too long. Organizations are committing hundreds of millions of dollars to GPU infrastructure and running it on operating systems that were never designed for it. RLC Pro AI simplifies and de-risks AI infrastructure investments while driving cutting edge performance and simplicity."
Bjorn Hovland, president of CIQ, added, "GPU compute is the most constrained and expensive resource in AI infrastructure today. RLC Pro AI gives organizations more from the infrastructure they have already paid for, and those economics hold whether you are a startup running a single GPU node or an enterprise managing a thousand."