Some activists warn that armies of fully autonomous robotic weapons could be only three or four years away. As a result, some countries are attempting to ban the weapons before they become an established fact.
The EU and UN both support a ban on robotic weapons
Many peace activists also support the ban, including Jody Williams, who won the Nobel Peace Prize in 1997 for leading efforts to ban land mines. Williams urged Germany to lead an international campaign to ban the so-called killer robots. She said Germany should take steps to ensure that humans remain in control of all lethal weapons and that such weapons never become autonomous.
A recent article notes that delegates have been meeting at the UN in Geneva to discuss potential restrictions under international law on autonomous weapons systems that employ AI to decide when and where to kill. The article reports: "Most states taking part – and particularly those from the global south – support either a total ban or strict legal regulation governing their development and deployment, a position backed by the UN secretary general, António Guterres, who has described machines empowered to kill as 'morally repugnant'."
As the appended image and video from 2015 show, warnings about autonomous weapons have been circulating for years.
The US and Russia are leading opponents of the ban
Both countries oppose the ban or any limitations that would prevent them from building the robots. The UK, Israel and Australia also oppose the ban. Opponents claim a ban is premature. However, without one there is likely to be an AI arms race.
While the UK Defence Ministry said it had no plans to develop fully autonomous killer robots, it has announced that it is already developing killer drone swarms that could theoretically have full autonomy. There are fears that the US, which already operates a huge number of attack drones, could decide to eliminate the human button pusher and simply let the drones decide whether to fire, based on their observations and the algorithms that determine their actions. Already, drone operators may in effect leave the firing decision up to what a drone suggests on the basis of its observations.
A recent article notes: "Critics fear that the increasingly autonomous drones, missile defence systems and tanks made possible by new artificial intelligence could turn rogue in a cyber-attack or as a result of programming errors."
Previously published in Digital Journal