Life without autonomous weapons
In an open letter organized by the Future of Life Institute, more than 1700 AI/robotics researchers call for a ban on offensive autonomous weapons beyond meaningful human control.
At a press conference at this year’s International Joint Conference on Artificial Intelligence (IJCAI) in Buenos Aires, Stuart Russell and Toby Walsh announced an open letter on autonomous weapons. The letter warns of a military AI arms race, and calls for a ban on autonomous weapons.
The letter can be found (and signed) at the web site of the Future of Life Institute. My Norwegian readers may remember reading about the Future of Life Institute on this blog before. This is not the first open letter warning of AI-related dangers that the Institute has helped organize. Earlier this year, the Institute organized a letter calling for more research on how to make AI systems robust and beneficial – in other words, how to make sure that AI continues to make the lives of human beings better, and does not turn into an existential threat, as Elon Musk seems to fear. Even though I am not among those fearing an imminent and inevitable destruction of mankind at the hands of a superhuman artificial intelligence, I signed that letter myself.
I do not believe that artificial superintelligence is lurking just around the corner. As far as I know, even the brightest minds in the global AI research community still have no idea what it would take to create such an entity. Nor do I think that a superhuman AI would necessarily be hostile or harmful to human beings. Still, I signed the letter. Why? Because, as the saying goes, an ounce of prevention is worth a pound of cure. (Super)human AI might be decades or even centuries away. Perhaps it will never happen. Still, I think it is a really good idea to take some cautionary steps right now, long before the nightmare of the Terminator movies looks even remotely plausible.
A clear and present danger
Of course, I signed the letter on autonomous weapons as well. As most of my readers are probably aware by now, I have been a supporter of an international ban on such weapons – killer robots, if you will – for quite some time. And while artificial superintelligence still belongs to the realms of fiction, autonomous weapons are dangerously close to reality.
The open letter states that
“Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades”.
Sadly, I would go one step further. The open letter defines autonomous weapons as weapons systems that select and engage targets without human intervention. As far as I know, it is already possible to build and program robots fully capable of locating a presumed enemy, aiming their weapons and firing, without a human being pulling the trigger. While most armed robots today are remotely controlled drones with little intelligence of their own, some military robots are already quite autonomous, and many states are working on even more autonomous – and deadly – robots.
A global AI arms race
“If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow”, the open letter states.
The letter argues that if the militaries of the world start producing and deploying autonomous weapons, it is only a matter of time before dictators, terrorists and other bad guys lay their bloody hands on their very own killer robots. This would of course be a rather horrible thing, as, according to the letter, such weapons would be ideal for assassinations, destabilizing nations, subduing populations or selectively killing a particular ethnic group.
Therefore, the letter concludes, “starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”
While I find calling a military AI arms race a bad idea a bit of an understatement, I wholeheartedly support the call for a ban on autonomous weapons, and I think the time is more than ripe for such a ban.
Because killer robots do not have to be very smart to be very, very dangerous.