# OpenAI Introduces WebSockets to Speed Up AI Agent Workflows
OpenAI announced a significant technical improvement to its Responses API that makes AI agents faster and more efficient through WebSocket support.
The company shared details about upgrades to the Codex agent loop, demonstrating how the new WebSocket implementation works alongside connection-scoped caching to reduce overhead when AI agents interact with the API.
**What Changed**
Previously, agentic workflows (where AI systems perform multiple steps autonomously) required repeated HTTP requests that added latency. The new WebSocket approach maintains a persistent connection, allowing faster back-and-forth communication between agents and OpenAI's models. Combined with caching that persists for the duration of the connection, this reduces redundant data transmission.
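The latency benefit comes from amortizing connection setup: with per-request HTTP, every step pays the connection handshake again, while a persistent WebSocket pays it once. A back-of-the-envelope sketch of that arithmetic follows; the millisecond figures are illustrative assumptions, not measured values from OpenAI.

```python
# Illustrative latency model for N sequential agent steps.
# All timing constants are hypothetical assumptions for the sketch.

HANDSHAKE_MS = 100   # assumed TCP + TLS setup cost per new connection
REQUEST_MS = 50      # assumed model round-trip cost per step

def http_total(steps: int) -> int:
    """Per-request HTTP: each step opens a fresh connection."""
    return steps * (HANDSHAKE_MS + REQUEST_MS)

def websocket_total(steps: int) -> int:
    """Persistent WebSocket: handshake once, then only round trips."""
    return HANDSHAKE_MS + steps * REQUEST_MS

print(http_total(10))       # 1500
print(websocket_total(10))  # 600
```

Under these assumptions, a ten-step agent loop spends 1500 ms on the per-request model versus 600 ms on the persistent connection, and the gap widens as the number of steps grows.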
**Why It Matters**
For developers building AI agents that need to make multiple API calls in sequence, like coding assistants, research tools, or automated workflows, these improvements mean lower latency on each step and less redundant data sent over the wire across a multi-step run.
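Connection-scoped caching can be pictured as a cache whose lifetime is tied to the connection itself: shared context is transmitted once, reused by later calls on the same connection, and discarded when the connection closes. The class and method names below are a hypothetical illustration of that idea, not OpenAI's actual API.

```python
# Hypothetical sketch of connection-scoped caching: cached context
# survives across calls on one connection and is dropped on close.
# Names and behavior are illustrative, not OpenAI's implementation.

class AgentConnection:
    def __init__(self) -> None:
        self._cache: dict[str, str] = {}  # lives only as long as the connection
        self.bytes_sent = 0

    def send(self, context_id: str, context: str, prompt: str) -> None:
        # Transmit the shared context only the first time it is seen.
        if context_id not in self._cache:
            self._cache[context_id] = context
            self.bytes_sent += len(context)
        self.bytes_sent += len(prompt)

    def close(self) -> None:
        self._cache.clear()  # cache scope ends with the connection

conn = AgentConnection()
big_context = "x" * 10_000          # e.g. a large codebase summary
conn.send("ctx1", big_context, "step 1")
conn.send("ctx1", big_context, "step 2")  # context is not re-sent
print(conn.bytes_sent)  # 10012: context once, plus two 6-byte prompts
```

With per-request HTTP, each step in this example would have re-sent the 10,000-byte context, so the savings scale with both context size and step count.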