Zuckerberg says Meta is joining the race to build AGI

January 19, 2024
Meta's LLaMA is a foundational, 65-billion-parameter large language model (LLM).

Mark Zuckerberg announced that Meta is joining the race for Artificial General Intelligence and that the company plans to make it open-source once it is achieved.

Google and OpenAI stated their AGI ambitions last year, with both companies persisting with their proprietary approaches while continuing to raise safety concerns over open-source AI.

Zuckerberg’s announcement is a big shift for Meta, which has taken a more product-focused rather than generalized approach to AI. Jerome Pesenti, who was Meta’s vice president of AI until mid-2022, famously dismissed AGI as “technobabble”.

Meta’s approach with its models, like Llama 2, has been to make them open-source, and Zuckerberg said this approach would continue once the company develops AGI. Could an AGI really be released as open-source in a safe and responsible manner?

Zuckerberg’s belief that it can is likely informed by Meta’s chief AI scientist, Yann LeCun, who has been openly critical of effective altruists and their AI doomsday messages.

Changes and investment

Zuckerberg said that Meta would combine its two AI research groups to pursue its AGI goal. Meta’s Fundamental AI Research (FAIR) division currently focuses on foundational research while its GenAI division builds the company’s AI-powered apps and experiences.

Combining these teams makes sense for a few reasons. Zuckerberg said that the next generation of apps, assistants, and services would require AGI. So it makes sense for the researchers building the apps to work alongside those developing the underlying models.

Meta is also facing the same resource shortage as other big tech companies: really smart people. Combining its research teams to focus on a common goal makes sense if it wants to beat Google and OpenAI in achieving AGI.

Zuckerberg said that Meta is building an “absolutely massive amount of infrastructure.” By the end of 2024, Meta will have around 350,000 NVIDIA H100 GPUs. If you combine this with the computing resources Meta currently has, the company will have the equivalent of 600,000 H100 GPUs.

A lot of its current computing resources are being directed toward training Llama 3, although there’s no word on a release date for the model.

Hardware

Meta’s open-source approach is likely also motivated by the company’s goal to drive maximum adoption of its models and hardware.

Zuckerberg continues to reaffirm his focus on the Metaverse and said that people will need new devices to take advantage of the advances in AI. He highlighted the Ray-Ban Meta smart glasses, which feature prominently on Meta’s homepage alongside its mixed-reality Quest headsets.

Google, OpenAI, and Meta have yet to offer a firm definition of exactly what AGI is, but we assume their researchers will know it when they see it. Will it be as smart as humans? Smarter?

LeCun doubts we’ll see AGI anytime soon and said last month that the industry would achieve “cat-level” or “dog-level” AI years before human-level AI.

Let’s hope he’s underestimated the timeline to AGI but not the risks of releasing it in the open.

Eugene van der Watt

Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.
