OpenAI Is the Next IBM
OpenAI no longer has an edge, while the fruits of its research are enjoyed by others.
OpenAI hit an annualized revenue run rate of around $5 billion in December 2024. In other words, if OpenAI had kept earning at its December 2024 pace for a full year, it would have generated about $5 billion (Reuters); that is not the same as the total revenue recognized during the year. Over the same year, it incurred a net loss of around $5 billion (LessWrong).
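To make the run-rate arithmetic concrete, here is a minimal sketch; the implied monthly figure is simply a back-calculation from the reported ~$5 billion run rate, not a disclosed number.

```python
# Back-of-the-envelope: what an "annualized run rate" means.
# The ~$5B run rate is the reported figure (Reuters); the monthly
# revenue below is an implied illustration, not a disclosed number.

annualized_run_rate = 5_000_000_000  # ~$5B at the Dec 2024 pace

implied_monthly_revenue = annualized_run_rate / 12
print(f"Implied Dec 2024 monthly revenue: ~${implied_monthly_revenue / 1e6:.0f}M")

# A run rate extrapolates the latest month to twelve months. If revenue
# was still ramping up earlier in 2024, total recognized revenue for the
# year would land well below $5B even with a $5B December run rate.
```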
OpenAI expects to reach profitability around 2030, but I doubt it. The bet is that, like Google or Facebook, it can flip to profitability once its user base reaches a critical mass.
Here are the reasons:
No Network Effect: Google and Facebook products become more valuable as more people use them. ChatGPT doesn’t benefit much from more users; its value isn’t tied to users interacting with one another. OpenAI has yet to build a sticky product ecosystem where users and developers are locked in long-term (hackernews).
Unclear Long-Term Business Model: Ad revenue? (Difficult without user data or ecosystem control). Enterprise SaaS? (Highly competitive, with Microsoft, Google, Salesforce pushing hard). API usage? (Subject to commoditization pressures and pricing wars). OpenAI’s model is still evolving, while Google and Meta have long-proven monetization engines (wheresyoured).
Fast-Follower Risk: Rivals (e.g. Anthropic, Meta, Google DeepMind) are quickly catching up with OpenAI’s capabilities and, on some benchmarks, surpassing them. There is little unique defensibility in foundation models, especially as open-source LLMs improve rapidly. Unlike Google’s PageRank or Facebook’s social graph, GPT models are not fundamentally uncopyable.
Platform Dependency: OpenAI’s partnership with Microsoft (Azure) means it’s not fully in control of its infrastructure. It’s also now partly reliant on Apple for user acquisition (via Siri integration), a channel that could shift. In contrast, Google and Meta control the entire stack, from data centers to the user interface.
High Costs Without High Margins: Training and inference are very expensive. Margins are thin compared to Google Ads or Facebook Ads, which scale at almost no marginal cost, whereas OpenAI’s costs scale with usage (a rough sketch of this dynamic follows below). That’s risky unless it can monetize at massive scale or lock in enterprise deals.
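Here is that sketch. Every figure below is an assumption for illustration (the fixed and per-user costs are not actual Google, Meta, or OpenAI numbers); the structural point is that ad-serving costs are mostly fixed, so margins widen with scale, while metered inference cost grows with every active user.

```python
# Hypothetical figures for illustration only; not actual Google, Meta,
# or OpenAI costs. The contrast is structural: ad serving cost is mostly
# fixed, while inference cost grows with every request served.

AD_FIXED_COST = 50e6            # assumed fixed annual serving cost for an ad platform
AD_COST_PER_USER = 0.01         # assumed near-zero marginal cost per additional user
INFERENCE_COST_PER_USER = 15.0  # assumed annual GPU cost per active LLM user

def serving_cost(users: float, fixed: float, per_user: float) -> float:
    """Total annual serving cost for a given number of users."""
    return fixed + users * per_user

for users in (1e6, 10e6, 100e6):
    ads = serving_cost(users, AD_FIXED_COST, AD_COST_PER_USER)
    llm = serving_cost(users, 0.0, INFERENCE_COST_PER_USER)
    print(f"{users / 1e6:>5.0f}M users -> ads: ${ads / 1e6:,.0f}M, LLM: ${llm / 1e6:,.0f}M")
```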
Valuation vs. Revenue vs. Cost Trajectory
$90 Billion Valuation… Backed by What?
Microsoft invested $13 billion in OpenAI’s for-profit arm, reportedly for a ~49% share of its profits, and OpenAI’s valuation has been put at around $80–90 billion.
That’s more than the market cap of Zoom, Spotify or even some major global banks.
And yet OpenAI’s 2023 revenue was reportedly just $1.3 billion — giving it a revenue multiple of ~70× (theinformation).
(For context: Even high-growth SaaS companies rarely justify more than 10–20× revenue.)
Let’s do the math (a quick back-of-the-envelope check follows these figures):
Valuation: ~$90B
Revenue: ~$1.3B
Burn rate: ~$1B–$2B per year (in 2023)
Annual profit: Likely negative
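A quick sanity check on those figures, using rough midpoints where only ranges were reported and a hypothetical $10 billion of raised capital for the runway line:

```python
# Rough sanity check on the figures above. All inputs are reported,
# approximate numbers; treat the outputs as order-of-magnitude only.

valuation = 90e9        # ~$90B valuation
revenue_2023 = 1.3e9    # ~$1.3B reported 2023 revenue
burn_rate = 1.5e9       # midpoint of the ~$1B-$2B/year burn estimate

print(f"Revenue multiple: ~{valuation / revenue_2023:.0f}x")      # ~69x

# For comparison: at a generous 15x revenue multiple, supporting a $90B
# valuation would take roughly this much annual revenue.
print(f"Revenue needed at 15x: ~${valuation / 15 / 1e9:.1f}B")    # ~$6B

# And capital disappears quickly at this burn rate.
print(f"Runway per $10B raised: ~{10e9 / burn_rate:.1f} years")   # ~6.7 years
```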
That’s a worse financial profile than IBM in its decline years — when it still had legacy cash flow to fall back on.
“Investors are pricing OpenAI like a monopoly, but it’s behaving like a commodity API provider with a giant GPU bill.”
Optimists point to future revenue from agents, developer tools or AGI licensing.
However, these are speculative and untested business lines. Meanwhile, the current product stack is being commoditized by open-source and cheaper rivals.
In other words: OpenAI is being valued as if it’s Apple, but its business model looks more like AWS resold through a chatbot.
The Illusion of AI Monopoly
Unlike software companies with 80%+ gross margins, OpenAI sells access to expensive compute. Every API call eats real GPU time and energy. Hosting GPT-4 across ChatGPT, enterprise offerings, and the API requires massive server farms, which in turn drives up unit costs. What happens if OpenAI lowers prices to compete with DeepSeek or Mistral?
Margins evaporate.
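A minimal sketch of why, with purely hypothetical unit costs (the $4 per million tokens serving cost is an assumption for illustration, not a real OpenAI figure): when cost of goods sold is mostly GPU time, modest price cuts compress gross margin fast.

```python
# Hypothetical unit costs for illustration only; not OpenAI's real numbers.
# The structural point: when COGS is mostly GPU time, price cuts eat margin fast.

COST_PER_M_TOKENS = 4.00  # assumed compute cost to serve one million tokens

def gross_margin(price_per_m_tokens: float) -> float:
    """Gross margin as a fraction of revenue at a given API price."""
    return (price_per_m_tokens - COST_PER_M_TOKENS) / price_per_m_tokens

for price in (10.00, 6.00, 5.00, 4.50):
    print(f"price ${price:.2f}/M tokens -> gross margin {gross_margin(price):.0%}")

# A classic software business with ~80% gross margin can absorb a price war;
# here, cutting from $10 to $5 per million tokens drops margin from 60% to 20%.
```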
OpenAI Operates in a Nascent, Highly Competitive Market
Let’s be blunt. Google and Meta are able to pour billions into AI and other R&D projects because their core businesses are monopolistic cash machines. Google Search enjoys over 90% global market share. Meta owns three of the top five social media platforms in the world. These companies print profits with ad revenue models built on data dominance and user lock-in.
OpenAI? It sells API access and $20 ChatGPT subscriptions. It has no dominant distribution channel. It doesn’t control the mobile OS layer (like Apple or Google), the enterprise cloud (like AWS), or a global social graph (like Meta).
In other words, OpenAI has no monopoly buffer and without that, its sky-high operating costs—mostly driven by compute and infrastructure—aren’t sustainable without constant external capital.
Its “Moat” Is Eroding Daily
OpenAI once had a lead in generative AI. That lead is narrowing fast.
Competitors like Anthropic, Mistral, Google DeepMind, and Meta are catching up, often with cheaper, faster, or more open models. Mistral’s and Meta’s open-source strategies mean that developers can self-host large models for a fraction of OpenAI’s API costs. Meanwhile, Google is integrating Gemini deeply into Android and Search, giving it automatic scale that OpenAI cannot replicate.
What’s worse: OpenAI doesn’t even control its own infrastructure. It runs on Microsoft Azure, which means part of every dollar it earns is already spoken for. Contrast this with Google or Amazon, which own the full tech stack, from chips to end-user apps.
AI Hype ≠ Profitable Business Model
Yes, ChatGPT has brand recognition. Yes, usage numbers have spiked. But eyeballs don’t equal profitability, as the early 2000s dot-com crash reminded us. Monetizing LLMs at scale is not trivial—especially when the technology is being commoditized before business models mature.
Unlike Google, which monetized intent via search ads, OpenAI has no dominant revenue model. Subscriptions and usage fees can’t match the explosive growth of ad dollars that fueled Big Tech. And unlike SaaS companies, it lacks strong customer retention or switching costs.
The Cost Curve Is Working Against It
Training frontier models costs hundreds of millions—or even billions—of dollars. However, the marginal value of each new model is decreasing. GPT-4 impressed. GPT-4o improved speed and multimodal capability, but not dramatically enough to justify another round of massive investment without a clear ROI.
At the same time, the open-source ecosystem is getting cheaper and better, enabling companies to build their own in-house models or fine-tune open alternatives—cutting OpenAI out of the equation.
OpenAI is locked in a cycle: burn cash on compute, release marginal improvements, fight for attention, and hope that enterprise demand scales fast enough to turn red into black. So far, that hope is mostly speculative.
OpenAI Is Becoming the IBM of the AI Era
OpenAI lit the spark that ignited the modern AI revolution, but if history is any guide, that’s not the same as owning the future.
DeepSeek, the Chinese AI lab, has openly stated that it used GPT-style training strategies and research to bootstrap its models. It didn’t need to license anything from OpenAI; it just read the papers, borrowed the ideas, and built its own stack.
Meta’s LLaMA models directly benefited from the transformer innovations that OpenAI helped popularize.
Anthropic, founded by ex-OpenAI employees, took the safety and alignment research developed at OpenAI and spun it into a rival company with major backing.
OpenAI’s foundational work is not defensible. Others can replicate it, and are doing so, faster and cheaper.
OpenAI is making the same mistakes as IBM:
It builds the models, but Microsoft owns the distribution through Azure and Office integrations.
It offers APIs, but developers are increasingly flocking to open-weight alternatives.
Innovation Alone Is Not a Moat: IBM didn’t stop innovating. It still produces cutting-edge chips, patents, and mainframes. But innovation without control over the ecosystem doesn't yield profits. OpenAI is heading down that road.
Sam Altman is insufferably egotistical and opportunistic. People don't like to work with him, and this field of endeavour is all about collaboration.
Maybe.