CIQ announces general availability of Rocky Linux Pro AI

CIQ, the founding support and services partner of Rocky Linux, has announced the general availability of Rocky Linux from CIQ Pro AI (RLC Pro AI). This enterprise Linux distribution is designed for AI inference and GPU-accelerated production workloads, offering tuned components for higher performance from day one. It supports deployments across bare metal, Kubernetes, and various cloud environments.

CIQ has launched RLC Pro AI, a version of its Rocky Linux distribution aimed at AI inference and other GPU-accelerated production workloads. The product bundles PyTorch with Nvidia's CUDA and DOCA OFED software stack, and CIQ has outlined additional hardware partners and frameworks on its roadmap.

As organizations increasingly shift machine learning and AI workloads into production, CIQ emphasizes that operating system choices affect how effectively GPU hardware is used, particularly at scale. RLC Pro AI incorporates the CIQ Linux Kernel along with GPU drivers, libraries, and frameworks that have been tuned and validated specifically for AI workloads. It is built to support deployments spanning bare metal, Kubernetes, and on-premises infrastructure, targeting the hardware enterprises are currently purchasing, with immediate support for Nvidia GPU accelerators.

The distribution features pre-tuned kernel settings, PyTorch flags, and CUDA configurations to minimize manual tuning and prevent configuration drift after updates. CIQ states that organizations can achieve higher throughput on existing GPU deployments from day one, with validated performance gains across use cases, though no specific benchmark figures were provided. This approach also enables consistent performance across public clouds such as AWS, Google Cloud Platform, and Microsoft Azure, as well as bare metal and sovereign on-premises setups.
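CIQ has not published the specific tunables it pins, but the drift-prevention idea described above is straightforward to sketch: keep a baseline of expected settings and diff the live values against it after every update. The snippet below is a minimal, hypothetical illustration; the parameter names and values are invented for the example and are not CIQ's actual tuning profile.

```python
# Hypothetical sketch: comparing live kernel tunables against a pinned
# baseline to flag configuration drift after an update. All parameter
# names and values below are illustrative, not CIQ's real profile.

def detect_drift(baseline: dict, current: dict) -> dict:
    """Return {parameter: (expected, actual)} for every setting that
    deviates from the baseline (a missing setting reports None)."""
    return {
        key: (expected, current.get(key))
        for key, expected in baseline.items()
        if current.get(key) != expected
    }

# Baseline a tuned image might pin (made-up values):
baseline = {
    "vm.swappiness": "10",
    "net.core.rmem_max": "134217728",
    "kernel.numa_balancing": "0",
}

# Settings as read back after an update (simulated here; on a real
# Linux system these could be read from `sysctl -a` or /proc/sys):
current = {
    "vm.swappiness": "60",           # reverted by the update
    "net.core.rmem_max": "134217728",
    "kernel.numa_balancing": "0",
}

print(detect_drift(baseline, current))
# → {'vm.swappiness': ('10', '60')}
```

A tuned distribution would reapply such a baseline automatically on boot or update, which is the gist of the "prevent configuration drift" claim.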

RLC Pro AI forms part of the broader Rocky Linux from CIQ Pro product family, which includes RLC+NVIDIA, RLC Pro, and RLC Pro Hardened. CIQ positions itself as the founding support partner for Rocky Linux and offers related tools like Ascender Pro for IT automation, Fuzzball for cloud HPC orchestration, Warewulf Pro for cluster provisioning, and Apptainer for high-performance computing containers. The announcement underscores a focus on sovereign infrastructure, prioritizing local control, data residency, and policy compliance.

Gregory Kurtzer, CEO of CIQ and founder of Rocky Linux, commented, "The OS is where GPU ROI is won or lost, and the industry has ignored it for too long. Organizations are committing hundreds of millions of dollars to GPU infrastructure and running it on operating systems that were never designed for it. RLC Pro AI simplifies and de-risks AI infrastructure investments while driving cutting edge performance and simplicity."

Bjorn Hovland, president of CIQ, added, "GPU compute is the most constrained and expensive resource in AI infrastructure today. RLC Pro AI gives organizations more from the infrastructure they have already paid for, and those economics hold whether you are a startup running a single GPU node or an enterprise managing a thousand."

Related articles


Red Hat releases RHEL 10.1 and 9.7 with AI and security features


Red Hat has launched Red Hat Enterprise Linux (RHEL) 10.1 and 9.7, introducing enhancements for AI integration, quantum threat mitigation, and operational efficiency. These updates build on RHEL 10 to create a more intelligent computing foundation. The releases aim to bridge skills gaps between AI and Linux while simplifying management.

CIQ, a key supporter of Rocky Linux and provider of high-performance software infrastructure, has elevated Bjorn Hovland from chief operating officer to president. The move underscores the company's role in decentralized AI infrastructure. Hovland will report to founder and CEO Gregory Kurtzer.


Announced on January 5, 2026, at CES 2026, Red Hat and NVIDIA have launched a collaboration to synchronize enterprise open source software, including Red Hat Enterprise Linux (RHEL), with NVIDIA's rack-scale AI systems like the Rubin platform. The partnership provides Day 0 support, validated interoperability, enhanced security, and plans for expansion to OpenShift and Red Hat AI.

Red Hat has introduced a no-cost trial for its AI Inference Server, designed to optimize model inference in hybrid cloud environments. The offering aims to enable faster and more cost-effective AI model deployments for users.


A Los Angeles-based startup, Quilter, has used artificial intelligence to design a functional Linux single-board computer in just one week, requiring under 40 hours of human input. The device, featuring 843 components across two printed circuit boards, successfully booted Debian Linux on its first power-up. This Project Speedrun demonstrates AI's potential to drastically shorten hardware development timelines.

Red Hat has concluded another week in open source with its Friday Five roundup. The post highlights discussions on day-zero support for NVIDIA GPUs, practical AI quickstarts, and an invitation to register for the upcoming RHSummit 2026.


Phoronix has benchmarked the Arc B390 Xe3 graphics integrated into Intel's Panther Lake processors, finding strong performance on the open-source Intel Compute Runtime under Linux. The tests compare the new hardware against previous Intel generations and AMD's Ryzen AI competition using OpenCL and GPU compute workloads. Results highlight the graphics' out-of-the-box compatibility with Linux drivers, though some gaps remain compared to Windows.
