
OpenAI’s new Spark model codes 15x faster than GPT-5.3-Codex – but there’s a catch

The Codex team at OpenAI is on fire. Less than two weeks after releasing a dedicated agent-based Codex app for Macs, and only a week after shipping the faster and more steerable GPT-5.3-Codex language model, OpenAI is counting on lightning striking a third time. Today, the company announced a research preview of GPT-5.3-Codex-Spark, a smaller version of GPT-5.3-Codex built for real-time coding in Codex. The company reports that it generates code 15 times faster while "remaining highly capable for real-world coding tasks." There is a catch, and I'll talk about that in a minute. Codex-Spark will initially be available only to $200/mo Pro tier users, with separate rate limits during the preview period. If OpenAI follows its usual rollout strategy for Codex, Plus users will be next, with other tiers gaining access fairly quickly.

Full report: OpenAI debuts a research preview of GPT-5.3-Codex-Spark, a smaller version of GPT-5.3-Codex that it claims generates code 15 times faster, for ChatGPT Pro users.