Despite its immense popularity, OpenAI is allegedly burning through cash at an unsustainable rate and could face a staggering $5 billion loss by the end of 2024.
That’s according to a shock report from The Information, which cites unreleased internal financial statements and industry figures exposing how OpenAI has already spent roughly $7 billion on training models and as much as $1.5 billion on staffing.
Dylan Patel from SemiAnalysis had earlier told The Information that OpenAI allegedly forked out some $700,000 a day to run its models in 2022, posting losses of almost $500 million that year alone.
Despite generating substantial revenue, estimated at $3.5 billion to $4.5 billion annually, OpenAI’s expenses far outpace its income.
The company has already raised over $11 billion through seven rounds of funding and is currently valued at $80 billion.
However, despite ChatGPT being a household name with millions of global users, OpenAI might prove a real money pit for investors if nothing changes.
Microsoft, OpenAI’s biggest backer by far, has already poured billions into the company in recent years.
Its most recent injection of cash, $10 billion in early 2023, was rumored to include a 75% slice of OpenAI’s profits and a 49% stake in the company, as well as the integration of ChatGPT into Bing and other Microsoft products.
In return, OpenAI receives access to Azure cloud servers at a substantially reduced rate.
But in the world of generative AI, there are never enough chips, cloud hardware, or groundbreaking, world-changing ideas that require billions to get off the ground.
OpenAI is heavily invested in being the first to achieve artificial general intelligence (AGI), an ambitious and incredibly expensive endeavor.
CEO Sam Altman has already hinted that he simply will not stop until this is achieved.
He’s backing nuclear fusion development and has discussed an international chip-fabrication project, with UAE and US government support, reportedly worth trillions.
Competition is red-hot
Competition in the generative AI space is also intensifying, with big players like Google, Amazon, and Meta all vying for a slice of the pie.
While ChatGPT remains the most widely recognized AI chatbot, it’s capturing an ever-smaller share of the total revenue up for grabs.
Plus, the open-source camp, led largely by Mistral and Meta, is building increasingly powerful models that are cheaper and more controllable than the closed models from OpenAI, Google, and others.
As Barbara H. Wixom, a principal research scientist at the MIT Center for Information Systems Research, aptly puts it, “Like any tool, AI creates no value unless it’s used properly. AI is advanced data science, and you need to have the right capabilities in order to work with it and manage it properly.”
And therein lies a critical point. If an organization has the cash and technical know-how to harness generative AI, it doesn’t necessarily need to partner with closed-source companies like OpenAI. Instead, it can create its own more bespoke, sovereign solutions.
Salesforce recently proved that by releasing a cutting-edge compact model for API calls that smashed frontier models from OpenAI, Anthropic, etc.
OpenAI and others are trying to push the envelope with enterprise products like ChatGPT Enterprise, but it’s tough going: generative AI is costly, and its return on investment remains uncertain.
Adam Selipsky, CEO of Amazon Web Services (AWS), said himself in 2023, “A lot of the customers I’ve talked to are unhappy about the cost that they are seeing for running some of these models.”
AI companies are responding by cutting the costs of their models and releasing lighter-weight versions like GPT-4o mini, but that, too, presents a conundrum: when should companies take the plunge into AI if the options keep changing?
2023 provided few answers for AI monetization
The year 2023 acted as a testing ground for various AI monetization approaches, but none proved a silver bullet for the industry’s mounting costs.
One of the greatest challenges of AI monetization is that it doesn’t share the economics of conventional software, where serving an extra user costs next to nothing.
Each user interaction with a model like ChatGPT triggers fresh computation, which consumes energy and drives ongoing costs that grow as more users join the system.
This poses a massive challenge for companies offering AI services at flat rates, as expenses can quickly outpace revenues.
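The flat-rate problem can be sketched in a few lines. The numbers below are purely illustrative assumptions (not OpenAI’s actual prices or compute costs), but they show how a fixed subscription price collides with a per-query cost that scales with usage:

```python
# Hypothetical unit-economics sketch: flat-rate subscription revenue vs.
# per-query inference costs. All figures are illustrative assumptions.

SUBSCRIPTION_PRICE = 20.00  # $/user/month (assumed flat rate)
COST_PER_QUERY = 0.01       # $ of compute per user interaction (assumed)

def monthly_margin(queries_per_month: int) -> float:
    """Profit (or loss) per subscriber at a given usage level."""
    return SUBSCRIPTION_PRICE - COST_PER_QUERY * queries_per_month

# Usage level at which a subscriber stops being profitable.
break_even = SUBSCRIPTION_PRICE / COST_PER_QUERY  # queries/month

print(f"Break-even usage: {break_even:.0f} queries/month")
print(f"Casual user (300 queries): ${monthly_margin(300):+.2f}")
print(f"Power user (5000 queries): ${monthly_margin(5000):+.2f}")
```

Under these assumptions, a casual user is profitable while a heavy user generates a loss, and every new power user makes the hole deeper, which is exactly the opposite of traditional software’s near-zero marginal cost.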
If subscription costs are raised too much, people will simply bail. Consumer spending surveys suggest that subscriptions are among the first expenses to be culled when people want to cut back.
Microsoft’s recent collaboration with OpenAI on GitHub Copilot, an AI coding assistant, served as a prime example of how subscriptions can backfire.
Microsoft charged a $10 monthly subscription for the tool but reported an average monthly loss of more than $20 per user. Some power users inflicted losses of up to $80 per month.
It’s likely a similar situation with other generative AI tools. Many casual users subscribe to just one of the many available tools at a time and may readily cancel and switch to a competitor. At the other extreme are power users whose heavy usage costs more than their subscriptions bring in.
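The reported Copilot figures can be run through the same per-user arithmetic. The implied compute costs below are inferences from the reported price and losses, not disclosed figures:

```python
# Back-of-the-envelope arithmetic from the reported GitHub Copilot
# figures: $10/month price, >$20/month average loss per user, and
# up to $80/month lost on the heaviest users.
price = 10.00     # $/user/month subscription
avg_loss = 20.00  # reported average monthly loss per user
max_loss = 80.00  # reported worst-case monthly loss per user

# Implied compute cost = what the user pays plus what Microsoft loses.
avg_cost = price + avg_loss  # implied average cost per user
max_cost = price + max_loss  # implied cost for the heaviest users

print(f"Implied average cost/user: ${avg_cost:.2f}/month")
print(f"Implied peak cost/user:    ${max_cost:.2f}/month")
print(f"Price covers {price / avg_cost:.0%} of the average user's cost")
```

In other words, the $10 price tag covered only about a third of what the average user cost to serve, and an order of magnitude less for power users.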
Some believe OpenAI has resorted to dirty tricks to keep the cash flowing. For example, the GPT-4o demo, timed to coincide with Google I/O, showed off real-time speech features that seemed to break new ground and outshine Google’s announcements.
We’re still waiting for these much-hyped voice features to roll out. OpenAI has yet to release them to anyone, citing safety issues.
“We’re improving the model’s ability to detect and refuse certain content,” OpenAI declared about the delay.
“We’re also working on improving the user experience and preparing our infrastructure to scale to millions while maintaining real-time responses. As part of our iterative deployment strategy, we’ll start the alpha with a small group of users to gather feedback and expand based on what we learn.”
Premium sign-ups spiked because people were looking forward to using those new features. Was OpenAI seeking a short-term revenue boost driven by features that were never ready?
Energy costs are another roadblock
There’s yet another snag in generative AI’s monetization mission: power and water consumption.
By 2027, the energy consumed by the AI industry could be equivalent to that of a small nation. Recent spikes in water usage from Microsoft and Google are largely attributed to intensive AI workloads.
Google recently disclosed that AI was throwing its sustainability strategies off course. The company’s CO2 emissions have surged by 48% since 2019, and executives have all but admitted that AI workloads are to blame.
AI-induced water shortages recently gripped Taiwan, which began redirecting water from agriculture to chip manufacturing amid a drought in a bid to keep AI-critical production online. Water shortages hit parts of the US in 2023, too, so there are genuine environmental impacts to contend with.
Speaking at the World Economic Forum, Altman said, “We do need way more energy in the world than we thought we needed before. We still don’t appreciate the energy needs of this technology.”
This all comes at a cost, both at the company level for Microsoft, Google, etc., and also for local and national economies.
The coming years will be pivotal in shaping generative AI’s trajectory, both in terms of its return on investment and its sustainability, and the tension between the two.
As Barbara H. Wixom from MIT warns, “You have to find a way to pay for this. Otherwise, you can’t sustain the investments, and then you have to pull the plug.”
Will generative AI ever grind to a halt? You’ve got to think that it’s too big to fail. But it does seem stuck in monetization purgatory right now, and something, from somewhere, needs to deliver another jolt of progress.
It might not take much to push generative AI towards a necessary flashpoint where progress comes cheaply and naturally.
Fusion power, analog low-power AI hardware, lightweight architectures: it’s all in the pipeline. We can only wait and watch to see when it all clicks into place.