AI Digest
OpenAI · 1 min read

# OpenAI Partners with Cerebras to Dramatically Speed Up ChatGPT

OpenAI announced a new partnership with AI chip maker Cerebras that will add 750 megawatts of high-speed computing power to its infrastructure, aimed at making ChatGPT significantly faster for users.

The collaboration, shared by OpenAI on social media, focuses on reducing "inference latency" – the time it takes for AI models to generate responses after receiving a prompt. This means ChatGPT and other OpenAI services should feel more responsive, especially for real-time applications where speed matters most.
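To make the latency idea concrete, here is a minimal Python sketch (with a stubbed, hypothetical model function standing in for a real endpoint) of the two numbers usually tracked for a streaming chatbot: time to first token, which is the delay users actually feel, and total generation time.

```python
import time


def simulate_model_response(prompt, first_token_delay=0.05,
                            tokens=5, per_token_delay=0.01):
    """Hypothetical stand-in for a streaming model endpoint.

    Sleeps to mimic the delay before the first token (inference latency)
    and a smaller delay between subsequent tokens.
    """
    time.sleep(first_token_delay)
    for i in range(tokens):
        if i:
            time.sleep(per_token_delay)
        yield f"tok{i}"


def measure_latency(prompt):
    """Return (time_to_first_token, total_time) for one request."""
    start = time.perf_counter()
    first_token_time = None
    for _ in simulate_model_response(prompt):
        if first_token_time is None:
            # This is the "inference latency" a user perceives as lag.
            first_token_time = time.perf_counter() - start
    total_time = time.perf_counter() - start
    return first_token_time, total_time


ttft, total = measure_latency("hello")
print(f"time to first token: {ttft:.3f}s, total: {total:.3f}s")
```

Faster inference hardware lowers both numbers, but shrinking time to first token is what makes a chatbot feel responsive in real-time use.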

Cerebras specializes in wafer-scale processors designed specifically for AI workloads, offering an alternative to traditional GPU-based systems. The 750MW of additional compute capacity represents a substantial infrastructure investment – on the order of a large power plant's output.

**Why it matters:** As AI chatbots become embedded in more workflows, response speed becomes a key differentiator – faster inference makes real-time applications, where delays are most noticeable, far more practical.