We Ran the Numbers: ChatGPT Plus Should Cost $0.54, Not $20

Samuel Mens, our head of engineering, flew to Dubai to attend one of our product launches. I met him in person for the first time in many months. While we were mapping out the roadmap for the rest of the year, we decided to do some fun number crunching.

“How much does ChatGPT really cost to run?”

Not the marketing cost. Not the brand premium. The raw cost of silicon and power and bandwidth that turns a message into a reply.

ChatGPT Plus costs about $20 per month. Gemini is almost the same. SuperGrok is $30 per month. We wanted to compare their SaaS pricing against the actual underlying infrastructure costs.

We chose an open source model in the GPT oss 120B class. A decent model running completely on OpenxAI infrastructure, specifically on Openmesh's upcoming compute node, Xnode1. We ran the numbers and were shocked by the margins and the costs added on top of consumer AI chatbots.

We even calculated the cost of buying a full H100 with 80 GB. Including the hardware and very expensive electricity, it still comes to about $5.74 per user per month at our load. Without the hardware purchase, it is about 54 cents per user per month.

If you compare this on an annual basis:
> ChatGPT Plus SaaS: $20 × 12 = $240
>
> Openmesh/OpenxAI: $0.54 × 12 = $6.48

That is a staggering 37 times more expensive. Even if you include the full hardware investment, the per user cost is still roughly 3.5x cheaper than the $20 subscription, which is still mind boggling.
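The arithmetic above is easy to verify. A minimal check in Python, taking the $0.54 and $5.74 per-user figures from our worksheet as given:

```python
# Monthly per-user costs (USD); the OpenxAI figures come from the worksheet
chatgpt_plus = 20.00    # ChatGPT Plus subscription
openxai_opex = 0.54     # Openmesh/OpenxAI, running costs only
openxai_full = 5.74     # including the amortized H100 hardware purchase

annual_saas = chatgpt_plus * 12     # $240.00 per year
annual_openxai = openxai_opex * 12  # $6.48 per year

print(f"Annual SaaS:    ${annual_saas:.2f}")
print(f"Annual OpenxAI: ${annual_openxai:.2f}")
print(f"Multiple (running costs only): {annual_saas / annual_openxai:.1f}x")
print(f"Multiple (incl. hardware):     {chatgpt_plus / openxai_full:.1f}x")
```

The running-cost-only comparison works out to about 37x; including the hardware purchase, about 3.5x.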

So, ChatGPT’s $240/year price tag locks out billions, making AI a luxury for the few, not a tool for all.

The setup

Model
An open source model in the GPT oss 120B class, optimized for chat. Served on OpenxAI and deployed on Openmesh Xnode1.

Goal
Estimate real unit economics for consumer chat at steady state.

Hardware and energy references

Annual and monthly run rates from our worksheet

Note
We kept Other costs as a single bundle covering internet transit, IP allocation, basic security, and basic monitoring, as per the original worksheet. We did not expand it further, to keep this article consistent with the calculation readers will compare against.

Usage profile from our logs

Per user cost

That is the core finding. Serving a capable open model on our stack costs roughly 54 cents per user per month. Even if you amortize the full hardware purchase into the period, it is about $5.74 per user per month at this load.
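The amortization math can be sketched as follows. All parameter values here are illustrative assumptions for the sake of the example, not the worksheet figures behind our $5.74 and $0.54 numbers:

```python
def per_user_monthly_cost(hardware_usd, amort_months, monthly_opex_usd, users):
    """Spread the hardware purchase over its useful life, add monthly
    operating costs, and divide by the number of users served."""
    return (hardware_usd / amort_months + monthly_opex_usd) / users

# Illustrative assumptions only -- the article's figures come from the
# original worksheet, not from these inputs.
cost = per_user_monthly_cost(
    hardware_usd=30_000,    # assumed H100 80 GB purchase price
    amort_months=36,        # assumed 3-year depreciation window
    monthly_opex_usd=400,   # assumed electricity, bandwidth, and other costs
    users=250,              # assumed steady-state users per node
)
print(f"${cost:.2f} per user per month")
```

The structure of the formula is the point: once the hardware is paid off, the amortization term drops out and only the operating costs remain, which is why the run-rate figure is so much lower than the all-in figure.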

What this means

The $20 subscription is not about raw compute cost. It is about distribution, brand power, and habit. The underlying cost of intelligence at the chat interface is far lower when you remove the middle layers and run on efficient infrastructure.

Open models are already good. Serving stacks keep improving. Quantization and caching keep improving. The cost curve is moving down and the price that consumers pay is not tracking it.

Why we care

Our thesis is simple: intelligence and tools should be affordable. If users can run strong models on decentralized compute owned by the community, the economics flip. Instead of expensive subscriptions and lock-in, you get portability.

The bottom line

AI does not need subscriptions. AI needs freedom.
The next great AI company is not a website with a paywall. It is a protocol with shared ownership.

When someone asks what it costs to run intelligence at scale, tell them the truth: about fifty-four cents per month.

Ashton
