OpenAI and Cerebras Systems Forge $10B Partnership for Unprecedented 750MW AI Computing Power
On January 14, 2026, OpenAI announced a multiyear partnership with AI chipmaker Cerebras Systems to supply up to 750 megawatts of computing power through 2028, in a deal estimated to be worth more than $10 billion, according to people familiar with the matter.
The contract gives OpenAI dedicated low-latency inference capacity to support its rapidly growing AI infrastructure needs, particularly for latency-sensitive workloads such as agentic AI applications and real-time services like ChatGPT. According to OpenAI's Sachin Katti, integrating Cerebras' technology will deliver faster responses, more natural interactions, and a foundation for scaling real-time AI to a broader audience.
Cerebras is known for its wafer-scale engine, which combines compute, memory, and interconnect on a single massive chip; the company says its systems can deliver responses up to 15 times faster than traditional GPU-based systems. The deployment will roll out in multiple phases starting in 2026 and is billed as the largest high-speed AI inference deployment in the world.
The deal is strategically significant for both companies. For OpenAI, it diversifies compute infrastructure beyond Nvidia GPUs; for Cerebras, it reduces reliance on UAE-based G42, which accounted for 87% of the chipmaker's revenue in the first half of 2024. Cerebras CEO Andrew Feldman said the partnership, which grew out of technical discussions dating back to 2017, positions the company for a renewed attempt at an initial public offering after it withdrew its IPO paperwork in October 2024.
Source: CNBC
