During a recent Q&A session at OpenAI’s developer conference, CEO Sam Altman offered insight into the development of the company’s next-generation AI model, GPT-5.
Altman highlighted ongoing challenges in AI development, emphasizing that more computing power and solutions to complex scientific problems are needed before GPT-5 can become a reality.
He traced the rapid progression of OpenAI’s models: from GPT-2, which few people outside the industry had heard of, to GPT-3, focused on text generation, to GPT-4, which is proficient across a much broader range of tasks.
With OpenAI’s release of GPT-4 Turbo and the “GPTs” function, which enables users to create their own agents using natural language prompts, 2023 has seen impressive evolution within OpenAI’s ecosystem.
Altman expressed that GPT-5 would make current AI look “quaint” and would support “most things you might want to build.”
Sam Altman: “We hope that you’ll come back next year. What we launch today is going to look very quaint relative to what we’re busy creating for you now.”
— Smoke-away (@SmokeAwayyy) November 6, 2023
The nature of GPT-5 remains a topic of speculation, with rumors about multimodal capabilities and potential features such as self-correction and a degree of self-awareness.
Fueled by industry insiders like Brian Roemmele, these speculations paint an intriguing picture of what GPT-5 might offer, though OpenAI has yet to confirm any of it.
Bill Gates fears AI plateau
Microsoft founder Bill Gates, in a recent interview with Handelsblatt, expressed skepticism about the advancements from GPT-4 to GPT-5.
He suggested that the current generative AI might have reached a plateau despite contrary beliefs from many at OpenAI, including CEO Sam Altman.
Gates acknowledged the significant jump in quality from GPT-2 to GPT-4 as “incredible,” but he remains uncertain if such a leap will be replicated with GPT-5.
AI hardware is still evolving, but the brute-force practice of amassing thousands of GPUs is far from infinitely scalable.
The cooling operations required to train powerful AI models already consume as much water as a “small country,” according to some estimates, which presents another hurdle.
While it’s unclear when OpenAI will commence training on GPT-5 or release the model, the company is reportedly focusing on improving model efficiency to reduce costs, which is of fundamental importance to the next generation of frontier models.
Altman has previously said the company “hasn’t started” training GPT-5, but there is widespread speculation that training has begun in secret. At the very least, some of the groundwork has likely already been laid.
OpenAI has comprehensively ‘rebranded’ as an AGI company, as indicated by changes to its website’s “About” and other pages.
However, with Altman noting a shortage of compute and an intent to make AI less costly, one of the industry’s primary aims amid escalating costs, we may have a while to wait before GPT-5 rolls out.