AI Digest

# OpenAI Introduces WebSockets to Speed Up AI Agent Workflows

OpenAI announced a significant technical improvement to its Responses API that makes AI agents faster and more efficient through WebSocket support.

The company shared details about upgrades to the Codex agent loop, demonstrating how the new WebSocket implementation works alongside connection-scoped caching to reduce overhead when AI agents interact with the API.

**What Changed**

Previously, agentic workflows—where AI systems perform multiple steps autonomously—required repeated HTTP requests that added latency. The new WebSocket approach maintains a persistent connection, allowing faster back-and-forth communication between agents and OpenAI's models. Combined with caching that persists during the connection, this reduces redundant data transmission.
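The effect of connection-scoped caching can be pictured with a minimal sketch. The `AgentConnection` class below is hypothetical, not OpenAI's actual API or wire protocol; it only models the idea that context already sent over an open connection need not be retransmitted on later turns.

```python
# Illustrative sketch only: models connection-scoped caching,
# not OpenAI's actual implementation.

class AgentConnection:
    """Hypothetical persistent connection whose cache lives
    exactly as long as the connection does."""

    def __init__(self):
        self._seen = set()   # cleared when the connection closes
        self.bytes_sent = 0

    def send_turn(self, context: str, new_message: str) -> None:
        # Transmit shared context only if the server hasn't
        # already received it on this connection.
        if context not in self._seen:
            self.bytes_sent += len(context)
            self._seen.add(context)
        self.bytes_sent += len(new_message)

conn = AgentConnection()
conn.send_turn("system prompt + tool definitions", "step 1")
first = conn.bytes_sent
conn.send_turn("system prompt + tool definitions", "step 2")
second = conn.bytes_sent - first
# The second turn transmits only the new message, not the shared context.
```

With repeated stateless HTTP requests, every turn would pay the full cost of `first`; over a persistent connection, later turns cost only `second`.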

**Why It Matters**

For developers building AI agents that need to make multiple API calls in sequence—like coding assistants, research tools, or automated workflows—these improvements mean lower latency per step, less redundant data transmission, and faster end-to-end runs.
