The Uber-ization of ChatGPT
OpenAI wants us addicted to ChatGPT now, so they can raise the bill once we can’t live without it
OpenAI isn’t the scrappy nonprofit it started out as in 2015.
It’s in the middle of transforming into a Public Benefit Corporation — a Delaware legal form where directors have to balance profit with a stated mission — while also renegotiating its long-standing deal with Microsoft.
The reason is simple: ChatGPT may be the most popular software product in the world, but the economics don’t add up.
Adoption: ~700–800 million weekly users.
Revenue: About $2 billion annually in late 2023, growing to an annualized run rate of roughly $12 billion by mid-2025. That’s just $1–1.50 per user per month.
Costs: Analysts estimate annual infrastructure costs at $20–40 billion (GPUs, energy, and cloud hosting). That means OpenAI is effectively spending $2–3 for every $1 it earns.
Subsidy: Microsoft’s billions and other outside capital are covering the gap, alongside the small minority of paying users who cushion the blow.
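Those per-user figures fall straight out of the numbers above. A quick back-of-envelope check, using the midpoints of the ranges cited here (the figures are the article’s estimates, not official numbers):

```python
# Back-of-envelope check of the unit economics cited above.
# All inputs are this article's estimates (midpoints of the cited ranges).
weekly_users = 750e6      # ~700-800 million weekly users
annual_revenue = 12e9     # ~$12 billion annualized revenue, mid-2025
annual_costs = 30e9       # $20-40 billion estimated infrastructure costs

revenue_per_user_month = annual_revenue / weekly_users / 12
spend_per_dollar_earned = annual_costs / annual_revenue

print(f"Revenue per user per month: ${revenue_per_user_month:.2f}")  # $1.33
print(f"Spent per $1 earned:        ${spend_per_dollar_earned:.2f}")  # $2.50
```

Even shifting the inputs to the optimistic ends of their ranges, revenue per user stays under $2 a month while costs remain a multiple of revenue.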
Which means the cheap, abundant AI we’ve all been enjoying isn’t sustainable. It’s a subsidy phase. Once AI is fully woven into our daily work and lives, OpenAI will have no choice but to raise prices and shrink the free tier. By then, walking away won’t be realistic.
How we got here
2015 — Nonprofit launch. OpenAI began as a nonprofit lab, backed by $1 billion in pledges. Mission credibility, but financially handcuffed.
2019 — Microsoft lifeline. Created a capped-profit company. Microsoft invested $1 billion and got exclusive hosting rights on Azure. Solved infrastructure costs; created governance headaches.
2022 — ChatGPT explosion. Launched November 30. 1 million users in five days. By 2025: ~750 million weekly users. Costs skyrocketed.
2023 — GPT-4 and board drama. Annualized revenue: ~$2 billion. In November, Sam Altman was fired and rehired within five days — exposing the fragile governance model.
2024 — Bigger models, bigger bills. OpenAI previewed video generation, dramatically more expensive to run. Annualized revenue rose to $5.5 billion by year-end.
2025 — Restructure or stall. Annualized revenue climbed to $10–12 billion, but infrastructure costs were estimated at $20–40 billion per year — several times higher than revenue. OpenAI announced its Public Benefit Corporation conversion and signed a memorandum of understanding with Microsoft to loosen legacy deal terms.
Today’s Bargain, Tomorrow’s Bill
OpenAI’s reported revenue has grown quickly — from about $2 billion a year in late 2023 to an annualized run rate of roughly $12 billion by mid-2025. On the surface that sounds huge. But spread across ~750 million weekly users, it works out to only about $16 per user per year — because most people aren’t paying anything at all.
The real question is where that $12 billion actually comes from. Here’s how it breaks down:
Almost everyone is free. ~98% of ChatGPT users don’t pay anything.
Only a tiny fraction pay. Roughly 1–2% have a Plus plan at $20/month ($240/year). That’s about 10 million people, which adds up to about $2.4 billion annually in subscription revenue. Add in some business deals and Microsoft’s support, and you get to the ~$12 billion figure.
The imbalance. Most users pay $0, a small minority are paying $240+ each year, and outside money is still doing the heavy lifting.
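The breakdown above is easy to sanity-check. A minimal sketch, assuming the article’s figure of ~10 million Plus subscribers at $20/month:

```python
# Rough check of the subscriber math in the breakdown above.
# Assumes ~10M Plus subscribers at $20/month, per this article's estimate.
plus_subscribers = 10e6
plus_price_per_month = 20

subscription_revenue = plus_subscribers * plus_price_per_month * 12
print(f"Plus subscription revenue: ${subscription_revenue / 1e9:.1f}B/yr")  # $2.4B/yr

total_revenue = 12e9  # ~$12B annualized
gap = total_revenue - subscription_revenue
print(f"Covered by business deals and outside support: ${gap / 1e9:.1f}B")  # $9.6B
```

In other words, consumer subscriptions explain only about a fifth of the revenue — the rest leans on enterprise deals and Microsoft.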
Even at a small-business level, this bites. I run a company with fewer than 20 employees, and I’m already spending thousands of dollars each month on ChatGPT Pro. At $200/month per seat, that’s real money — and I’m just a tiny drop in the user base. If businesses like mine are paying that much while the overwhelming majority of users pay nothing, you can see how distorted the economics are.
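To make that concrete, here is a hypothetical illustration of the small-business math described above (the headcount is an assumption — the article says only “fewer than 20 employees”):

```python
# Hypothetical small-business AI spend: a sub-20-person team on
# ChatGPT Pro at $200/seat/month. Seat count is an assumed example.
seats = 15
pro_price_per_month = 200

monthly_spend = seats * pro_price_per_month
annual_spend = monthly_spend * 12
print(f"Monthly: ${monthly_spend:,}")  # $3,000
print(f"Annual:  ${annual_spend:,}")   # $36,000
```

A 30–50% price hike on top of that — the kind of increase discussed below under budget creep — would add tens of thousands of dollars a year for even a tiny firm.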
The bigger picture is unavoidable: serving hundreds of millions of free users costs an estimated $20–40 billion per year. Against $12 billion in revenue, the model bleeds money. That’s why the free ride cannot last.
This raises some uncomfortable questions:
Do paying users end up footing an even larger bill? As costs grow, will the subscription price climb to $30–$40 per month, essentially making the minority of paying users subsidize hundreds of millions of free riders?
Or does the free tier disappear? OpenAI could eventually cut or sharply restrict free access, forcing everyone to pay at least something to offset infrastructure costs.
And the broader issue: if free access disappears entirely, does that create an access-to-justice problem — a world where only those who can pay get to participate in the new baseline of work and knowledge?
Right now, our cheap (or free) ChatGPT subscriptions are being underwritten by Microsoft’s billions and investors’ patience.
Just like Uber rides were once subsidized by venture dollars.
And just like Uber, the plan is to raise prices once we can’t imagine life without it.
Why you should be concerned now
Dependency is the strategy. Free plans and low-cost access are meant to build habits. Once you’re dependent, the price goes up.
Personal lock-in. AI is already writing your emails, drafting presentations, summarizing notes. That dependence makes price hikes stick.
Workplace lock-in. Companies are retraining staff and rebuilding workflows around ChatGPT. Reversing that is expensive and disruptive.
Budget creep. A 30–50% increase in the next 12–18 months is plausible. At enterprise scale, that’s millions in unplanned spend.
Risk concentration. Hundreds of millions depend on one provider’s uptime, policies, and prices. Outages or changes ripple instantly through the economy.
The concern isn’t just that AI is expensive to run. It’s that once it becomes unavoidable, you won’t be able to walk away.
What’s next
Microsoft–OpenAI final terms. Will exclusive hosting rights really loosen? Multi-cloud hosting matters for both reliability and cost.
The Public Benefit Corporation charter. What “public benefit” gets written into law will show how seriously OpenAI balances mission and margin.
Pricing sheets. Expect usage caps on expensive features (like video), new bundles, and higher monthly prices.
Regulatory review. The attorneys general of California and Delaware may impose safety or transparency conditions.
The takeaways
ChatGPT isn’t really free. About 98% of users pay nothing, while the 1–2% who do pay cushion the costs — but even that isn’t enough to make the model sustainable.
The model doesn’t work yet. OpenAI brings in about $12 billion, but it costs $20–40 billion a year to operate. Put simply: they spend $2–3 for every $1 they earn.
The free ride will end. Expect the free tier to shrink or vanish, and for paid access to get more expensive.
Prepare now. Budget for higher AI spend, negotiate protections in contracts, and learn alternative systems so you’re not locked in.
🧨 The real issue isn’t today’s bill. It’s what happens when AI is woven into every email, contract, and customer interaction — and then the bill adjusts to reality. By then, the Uber curve has bent upward, and stepping away won’t be an option.
⸻
🤖 Subscribe to AnaGPT
Every week, I break down the latest legal developments in AI and tech — minus the jargon. Whether you’re a founder, creator, or lawyer, this newsletter will help you stay two steps ahead of the lawsuits.
➡️ Forward this post to someone working on AI. They’ll thank you later.
➡️ Follow Ana on Instagram @anajuneja
➡️ Add Ana on LinkedIn @anajuneja

