Is regulating hardware the answer to AI safety? These experts think so

February 16, 2024

AI hardware

Experts suggest that the most effective way to ensure AI safety might be to regulate its “hardware” – the chips and data centers, or “compute,” that power AI technologies. 

The report, a collaboration among notable institutions including the Center for AI Safety (CAIS), the University of Cambridge’s Leverhulme Centre for the Future of Intelligence, and OpenAI, proposes a global registry to track AI chips and “compute caps” to keep R&D balanced across different nations and companies.

This novel hardware-centric approach could prove effective because chips and data centers are physical objects, making them far more practical to regulate than intangible data and algorithms.

Haydn Belfield, a co-lead author from the University of Cambridge, explains the role of computing power in AI R&D, stating, “AI supercomputers consist of tens of thousands of networked AI chips… consuming dozens of megawatts of power.”

The report, with 19 authors in total, including ‘AI godfather’ Yoshua Bengio, highlights the colossal growth in the computing power AI requires, noting that the largest models now demand 350 million times more compute than they did thirteen years ago.

The authors argue that the exponential increase in demand for AI hardware offers an opportunity to prevent both centralization and AI getting out of control. Given the enormous power consumption of some data centers, it could also reduce AI’s burgeoning impact on energy grids.

Drawing parallels with nuclear regulation, which others, including OpenAI CEO Sam Altman, have used as an example for regulating AI, the report proposes policies to enhance the global visibility of AI computing, allocate compute resources for societal benefit, and enforce restrictions on computing power to mitigate risks.

Professor Diane Coyle, another co-author, points out the benefits of hardware monitoring for maintaining a competitive market, saying, “Monitoring the hardware would greatly help competition authorities in keeping in check the market power of the biggest tech companies, and so opening the space for more innovation and new entrants.”

Belfield encapsulates the report’s key message, “Trying to govern AI models as they are deployed could prove futile, like chasing shadows. Those seeking to establish AI regulation should look upstream to compute, the source of the power driving the AI revolution.”

Multilateral agreements like this require global cooperation, which, in the case of nuclear power, was brought about by large-scale disasters.

A string of incidents led to the formation of the International Atomic Energy Agency (IAEA) in 1957, and further accidents culminated in the Chernobyl disaster in 1986.

Now, planning, licensing, and building a nuclear reactor can take ten years or longer because the process is rigorously monitored at every juncture. Every part is scrutinized because nations understand the risks, both individual and collective.

Might we similarly need a significant disaster to manifest AI safety sentiments into reality?

As for regulating hardware, who will lead a central agency that limits chip supply? Who is going to mandate the agreement, and can it be enforced?

And how do you prevent those with the strongest supply chains from benefitting from restrictions on their competitors?

What about Russia, China, and the Middle East?

It’s easy to restrict chip supply while China relies on US chip designers like Nvidia, but this won’t be the case forever. China aims to become self-sufficient in AI hardware within this decade.

The 100+ page report provides some clues, and this seems like an avenue worth exploring, though it will take more than convincing arguments to enact such a plan.


Sam Jeans

Sam is a science and technology writer who has worked in various AI startups. When he’s not writing, he can be found reading medical journals or digging through boxes of vinyl records.
