
OpenAI Rolls Back ChatGPT’s Model Router System for Most Users



OpenAI has quietly reversed a major change to how hundreds of millions of people use ChatGPT.

On a low-profile blog that tracks product changes, the company said that it rolled back ChatGPT’s model router—an automated system that sends complicated user questions to more advanced “reasoning” models—for users on its Free and $5-a-month Go tiers. Instead, those users will now default to GPT-5.2 Instant, the fastest and cheapest-to-serve version of OpenAI’s new model series. Free and Go users will still be able to access reasoning models, but they will have to select them manually.

The model router launched just four months ago as part of OpenAI’s push to unify the user experience with the debut of GPT-5. The feature analyzes each user question before choosing whether ChatGPT answers it with a fast-responding, cheap-to-serve AI model or a slower, more expensive reasoning model. Ideally, the router directs users to OpenAI’s smartest AI models exactly when they need them. Previously, users accessed advanced systems through a confusing “model picker” menu, a feature that CEO Sam Altman said the company hates “as much as you do.”
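To make the concept concrete, here is a minimal, purely hypothetical sketch of what a model router does at a high level: score an incoming prompt's complexity and dispatch it to a fast default model or a slower reasoning tier. The heuristic, threshold, and tier names here are illustrative assumptions, not OpenAI's actual implementation.

```python
# Hypothetical model-router sketch. The complexity heuristic, keyword list,
# and threshold are invented for illustration; they do not reflect how
# OpenAI's router actually classifies prompts.

def estimate_complexity(prompt: str) -> float:
    """Toy heuristic: longer prompts and 'hard task' keywords score higher."""
    score = min(len(prompt.split()) / 100, 1.0)
    hard_keywords = ("prove", "derive", "debug", "step by step")
    if any(k in prompt.lower() for k in hard_keywords):
        score += 0.5
    return min(score, 1.0)

def route(prompt: str, threshold: float = 0.5) -> str:
    """Pick a model tier: 'reasoning' for complex prompts, else 'instant'."""
    if estimate_complexity(prompt) >= threshold:
        return "reasoning"  # slower, costlier, better on hard questions
    return "instant"        # fast default, analogous to a GPT-5.2 Instant tier

print(route("What's the capital of France?"))                 # instant
print(route("Derive the gradient of softmax step by step."))  # reasoning
```

The rollback described in this article effectively flips the default for Free and Go users: instead of running a classifier like this on every prompt, those tiers now always take the "instant" branch unless the user manually selects a reasoning model.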

In practice, the router seemed to send many more free users to OpenAI’s advanced reasoning models, which cost more to serve. Shortly after its launch, Altman said the router increased usage of reasoning models among free users from less than 1 percent to 7 percent. It was a costly bet aimed at improving ChatGPT’s answers, but the model router was not as widely embraced as OpenAI expected.

One source familiar with the matter tells WIRED that the router negatively affected the company’s daily active users metric. While reasoning models are widely seen as the frontier of AI performance, they can spend minutes working through complex questions at significantly higher computational cost. Most consumers don’t want to wait, even if it means getting a better answer.

Fast-responding AI models continue to dominate in general consumer chatbots, according to Chris Clark, the chief operating officer of AI inference provider OpenRouter. On these platforms, he says, the speed and tone of responses tend to be paramount.

“If somebody types something, and then you have to show thinking dots for 20 seconds, it’s just not very engaging,” says Clark. “For general AI chatbots, you’re competing with Google [Search]. Google has always focused on making Search as fast as possible; they were never like, ‘Gosh, we should get a better answer, but do it slower.’”


