OpenAI Partners with Cerebras to Supercharge AI Performance
OpenAI has announced a major partnership with AI chip maker Cerebras Systems to incorporate 750 MW of ultra-low-latency compute into its platform, expanding its AI infrastructure in phased deployments running through 2028.
Cerebras is known for its wafer-scale AI processors, which combine massive compute, memory, and bandwidth on a single chip, reducing the bottlenecks that slow traditional hardware. OpenAI says this will make AI responses faster and more natural, improving real-time performance on tasks like answering difficult questions, generating code, and running AI agents.
The deal is part of OpenAI’s broader compute strategy to build a resilient mix of systems tailored to different workloads, and sources suggest the partnership could be worth over $10 billion.