
OpenAI's 80% price drop for the o3 API has no impact on performance

OpenAI's o3 API is now much cheaper for developers, with no visible impact on performance.

On Wednesday, OpenAI announced it’s cutting the price of its best reasoning model, o3, by 80%.


This means o3’s input price is now just $2 per million tokens, while the output price has dropped to $8 per million tokens, down from $10 and $40, respectively.

“We optimized our inference stack that serves o3. Same exact model—just cheaper,” OpenAI noted in a post on X.
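At the new rates, a rough back-of-the-envelope cost estimate looks like the sketch below. The token counts are hypothetical and the prices are the per-million-token figures quoted above; actual billing depends on your usage and account.

```python
# Rough cost estimate for a single o3 API request at the new prices.
# Token counts are hypothetical; prices are USD per one million tokens.
O3_INPUT_PRICE_PER_M = 2.00   # $2 per 1M input tokens
O3_OUTPUT_PRICE_PER_M = 8.00  # $8 per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one o3 request."""
    return (input_tokens / 1_000_000) * O3_INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * O3_OUTPUT_PRICE_PER_M

# Example: 5,000 prompt tokens and 2,000 completion tokens
print(f"${estimate_cost(5_000, 2_000):.4f}")  # -> $0.0260
```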

While regular users typically don’t access ChatGPT models through the API, the price drop makes API-based tools such as Cursor and Windsurf much cheaper to operate.

In a post on X, the independent benchmark community ARC Prize confirmed that the o3-2025-04-16 model’s performance didn’t change after the price reduction.

“We compared the retest results with the original results and observed no difference in performance,” the organization said.

This confirms that OpenAI did not swap out the o3 model to reduce the price. Instead, the company genuinely optimized the inference stack that powers the model.

In addition, OpenAI rolled out the o3-pro model in the API, which uses more compute to deliver better results.
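As a hedged illustration of how a developer might select o3-pro with the official openai Python SDK: the use of the Responses API and the example prompt below are assumptions for the sketch, not details from the announcement, and model availability depends on your account.

```python
# Minimal sketch: calling o3-pro through the OpenAI Python SDK.
# Assumes the Responses API and an OPENAI_API_KEY in the environment;
# the prompt is purely illustrative.
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="o3-pro",
    input="In two sentences, when is o3-pro worth the extra compute over o3?",
)

print(response.output_text)
```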

