Is a ban on autonomous weapons going to work?
No.
Why not?
Team 13 believes, after a prolonged discussion, that it is too late. We’re doomed. We are convinced there is no way countries like the US, Russia, or China don’t already have this technology, or aren’t secretly working on it. International politics is largely driven by suspicion of the other side, which has fueled endless wars, cold and less cold, arms races, nuclear strikes, missile crises, and doomsday clock countdowns.

While we are sure AI is already being used in autonomous weapons, such as the Israeli-developed Harpy, we want to focus on technology that is accessible to a broader population, not only armies. We are concerned with tech that enables DIY autonomous weaponry, because its ubiquity makes it the most likely choice for perpetrators. We are also concerned with the anonymity inherent in these weapons: there is no flag marking who is flying the drone, and no standard technology that gives the perpetrator away as belonging to a certain group, the way an M16 would mark an American, or a Kalashnikov would mark, well, Russians and every terrorist group in every movie ever.
Our main concern is that civilians already have access to most of the technology needed to build a lethal autonomous weapon (LAW). Take the recent assassination attempt on Nicolás Maduro, the current president of Venezuela. This past August 4th, during a speech, two drones packed with explosives flew toward the podium where he was standing. Luckily, they detonated far enough away not to hurt anyone. Reportedly, the devices (DJI M600) were human-controlled and failed because they were too far from the controller. These units cost about 5,000 dollars each. But it is not far-fetched to imagine the same attempt being carried out with a cheaper drone with facial recognition and following capabilities, such as the Skydio AI-powered drone. That device costs 1,999.95 dollars and is programmed to target and follow a specific person while avoiding objects and other people in a crowded environment.
To make matters worse, this kind of equipment is becoming increasingly cheap to produce. The drone market is considerable (127 billion dollars, according to PwC), and increased competition tends to bring higher quality and lower prices for the goods in question.
The following video, however hyperbolic, highlights valid concerns about the current state of LAWs:
What can be done?
At this point, our best and most realistic option is to further regulate access to explosives and firearms in general, along with the technology used to produce them. Bad guys already have access to lethal weaponry and could soon integrate it into autonomous devices, as evidenced by the attack on Maduro mentioned above. But restricting firearms worldwide would make it harder for those bad guys to maintain a steady supply of weapons, and hence would make integrating those weapons into current autonomous technology logistically difficult.
Lawmakers could also introduce regulations limiting the payload capacity of drones. Although this feature could be tweaked and amplified, doing so would be much more difficult for the general public, as it would require a strong background in electronics. Even so, there are other ways to make drones lethal without including heavy explosives, but at least this would rule out the possibility of several casualties caused by a single device. Use an ECE degree as a deterrent: 100 percent foolproof.
Who should do it?
Since this technology is now easily accessible, at least to a large chunk of the population on Earth, we are all responsible for not pursuing the development of autonomous weapons. Legislators around the world are responsible for controlling the development of LAWs capable of mass murder, “mass” being more than one person. We say control, not prevention, because, again, this technology already exists.
In the scientific community...
With these regulations in place, there should be impartial oversight of compliance. Developers of products deemed usable for building an autonomous weapon should be monitored to ensure they abide by these hypothetical regulations. We believe the UN should create a committee that deals specifically with the ethical challenges of LAWs and enforces laws that heavily sanction countries that do not comply.
How does this involve IPS R&D?
Assuming such a committee is created to oversee these issues, it also falls to IPS developers to establish the limits of their creations and check their own moral compass. Researchers and developers are in total control of how their technology works and are aware of the capabilities of their products. There are so many positive ways to apply IPS and make the world better. However romantic, our view is that IPSers should not be involved in the development of any offensive autonomous weapons.
References
AUTONOMOUS WEAPONS: AN OPEN LETTER FROM AI & ROBOTICS RESEARCHERS
Other sources are linked where they are mentioned in the text.