At the international conference “Humanity at the Crossroads: Autonomous Weapons Systems and the Challenge of Regulation” in Vienna, calls were made to regulate the use of AI in autonomous weapon systems (AWS) while the technology is still in its infancy.
Over centuries, advancements in technology have driven dramatic changes in how wars are fought. Developments like steel, gunpowder, and eventually the atomic bomb all found their initial applications in warfare before making their way into civilian use.
AI has bucked this trend. Its initial applications have largely been commercial, but defense forces have quickly seen the potential for AI to transform the battlefield once again.
Referencing the first atomic bomb, Austrian Foreign Minister Alexander Schallenberg told the attendees from 143 countries that the world is facing an “Oppenheimer Moment” in deciding if and how AI should be used in autonomous weapons.
Humanity at the crossroads: Autonomous weapons systems will soon fill the world’s battlefields. We have to take action & agree on int’l rules to ensure that decisions over life or death are never taken by machines!
➡️ Kicking off #AWS2024Vienna with 900+ guests from 142 states pic.twitter.com/AtwumLu4OP
— Alexander Schallenberg (@a_schallenberg) April 29, 2024
The impetus behind the urgent need for regulation was not just the anticipation of potential future threats but also AI’s use in current conflicts.
Autonomous drones are being used by both sides in the war in Ukraine. Israeli forces are using AI in multiple defense applications, allegedly including the identification of human targets in the war in Gaza.
Schallenberg said “Autonomous weapons systems will soon fill the world’s battlefields,” warning that now was the “time to agree on international rules and norms to ensure human control”.
He urged the need to restrict the autonomy of AI weapons, saying, “At least let us make sure that the most profound and far-reaching decision — who lives and who dies — remains in the hands of humans and not of machines.”
A statement from the Austrian government said, “Autonomous weapons systems (AWS) raise profound questions from a legal, ethical, humanitarian and security perspective. Humanity is at a crossroads and must come together to address the fundamental challenge of regulating these weapons.”
Defense dollars vs humanity
Ongoing conflicts have seen defense budgets increase globally, with share prices of several AI-powered defense tech companies surging in response. AWS technologies may be too lucrative to ban.
Jaan Tallinn, an early investor in Google’s DeepMind Technologies, said that “Silicon Valley’s incentives might not be aligned with the rest of humanity.”
In his keynote address at the conference, Tallinn said, “I implore you to be wary of those who promise precision and predictability in systems using AI. We have already seen AI making selection errors in ways both large and small – from misrecognizing a referee’s bald head as a football, to pedestrian deaths caused by self-driving cars, unable to recognize jaywalking.”
“We must be extremely cautious about relying on the accuracy of these systems, whether in the military or civilian sectors. Accidental errors caused by autonomous weapons have the potential to spark the kinds of wars that should never be waged.”
Tallinn pointed out that making AI weapons more reliable isn’t the solution. He explained that “even when autonomous weapons become able to perfectly distinguish between humans, they will make it significantly easier to carry out genocides and targeted killings that seek specific human characteristics.”
“Stepping out of an arms race requires courage and foresight. We have done it before, and we can do it again.”
From the opening of proceedings at the historic Vienna Conference on Autonomous Weapons #AWS2024 yesterday, here’s FLI co-founder Jaan Tallinn’s full keynote speech ⬇️ pic.twitter.com/wFYxWpDl1S
— Future of Life Institute (@FLI_org) April 30, 2024
In a final statement to be sent to the UN Secretary General, the group affirmed its “strong commitment to work with urgency and with all interested stakeholders for an international legal instrument to regulate autonomous weapons systems”.
The statement added, “We have a responsibility to act and to put in place the rules that we need to protect humanity… Human control must prevail in the use of force.”
More than 115 UN member states agree on the need for binding regulations governing AWS, but avoiding a veto from Russia, China, or the US seems unlikely.
Anthony Aguirre, cosmologist and co-founder of the Future of Life Institute, summed up the situation by saying, “The future of slaughter bots is here.”