Should AI-Powered Autonomous Weapons Be Regulated?

Iran claims that its top nuclear scientist, Mohsen Fakhrizadeh, who was assassinated last month, was killed using an AI- and satellite-powered machine gun.

According to the claim, the weapon’s ‘advanced camera’ zoomed in on the target and identified the scientist using facial recognition before firing. His wife, who was sitting just 25 centimetres away in the car, was left unhurt.

Lethal Autonomous Weapon Systems (LAWS), which use sensors and artificial intelligence to identify and destroy targets, have been a topic of debate since 2013, when they were first discussed at the Human Rights Council. Since then, countries have met every year at the Convention on Certain Conventional Weapons (CCW) to discuss the concerns related to LAWS.

This article discusses the outcomes of the convention and how it regulates LAWS. It also analyses the proposal for a treaty or instrument that could help regulate the use of autonomous weapons.

Where Does the Current Regulation Stand?

At the CCW meetings, many countries have spoken about the importance of ‘human control and judgement’ for weapons to be legally acceptable. In fact, around 30 countries have called for a ban on fully autonomous systems, and the 125 member states of the Non-Aligned Movement have called for a ‘legally binding international instrument’ on LAWS.

The topic has also been discussed at the Paris Peace Forum, where the UN secretary-general called for a ban on LAWS because of the damage they can cause, terming them ‘politically unacceptable’ and ‘morally repugnant’. Germany’s foreign minister likewise urged a complete ban on the use of LAWS ‘before it is too late’.

Despite this, countries including China, Israel, Russia, South Korea, the UK and the US are investing heavily in LAWS, as are Australia, Turkey and India.

For instance, the US Navy is developing AI-controlled submarines that could have the ability to kill without human control, a prototype of which could be deployed this year. Chinese scientists, meanwhile, are expected to deploy unmanned military submarines soon.

The UK’s chief of the defence staff stated in an interview last month that robot soldiers could make up a quarter of the British army.

The French government has also recently permitted its armed forces to develop ‘augmented soldiers’, following approval from the country’s military ethics committee. This development took place in spite of the country’s recent opposition to LAWS.

Explaining this, Trisha Ray, an associate fellow at the Observer Research Foundation whose research focuses on LAWS, said, “The CCW is unlikely to call for a ban, but is calling for safeguards in line with International Humanitarian Law, including meaningful human control.”

“Regulation of LAWS is also tricky given that there is no common definition, so there’s no clarity on what exactly they’re governing. Also, the tech component of LAWS, including the likes of computer vision, sensor fusion, etc., has use cases beyond conflict,” said Ray.

Thus, the CCW has so far not produced any concrete multilateral outcomes. Even the ‘guiding principles’ the CCW published are not considered an adequate or appropriate response to the threats posed by LAWS. The American Association for the Advancement of Science also expressed dissatisfaction last year with the progress made by the CCW’s Group of Governmental Experts (GGE).

What Can Be Done to Keep LAWS in Check?

In response to this regulatory inadequacy, the Campaign to Stop Killer Robots, an international coalition of 160 NGOs across 66 countries, has called for an international legally binding instrument. The campaign has also published a paper outlining the key elements that should be included in such a treaty.

While a complete ban is an unlikely outcome, the paper builds on the concept of ‘meaningful human control’, based on which it suggests three core obligations on LAWS.

The first is a general obligation to maintain meaningful human control over the use of force. The second is a prohibition, a negative obligation, on the use of weapons systems that pose a fundamental moral or legal problem. The third is a set of positive obligations to ensure meaningful human control in all systems that select and engage targets.

Although this paper is a good starting point, many states have suggested that the complex nature of LAWS will significantly complicate the treaty process. However, a feasibility study conducted by the NGO Human Rights Watch concluded that the current legal framework and existing research could be leveraged to inform the treaty and overcome challenges in the process.

“States that wish to preserve meaningful human control over the use of force and prevent the next dangerous revolution in warfare should not be swayed by sceptics…” stated the study. 

It further stated, “States have successfully governed unacceptable weapons in the past. They can and, given the high stakes, should do so again.”

Thus, an effort to regulate LAWS through international treaties, leveraging established legal frameworks, can help overcome the challenges they pose.

Wrapping Up

A handful of military powers, including Russia and the US, rejected the proposal to negotiate a new treaty in 2019, calling it ‘premature’.

“If the biggest militaries in the world go ahead with the development of LAWS, a ban will be a tough sell or at the very least toothless,” said Ray. “A binding treaty requires enforcement, and the P-5 are essential.”

“Concerns raised by countries calling for a ban centre on three major issues: reducing the cost of conflict through automation, the absence of human control as a violation of humanitarian law, and arms-racing dynamics which can lead to instability,” stated Ray.

Given the accuracy and power with which AI systems and LAWS can cause damage, ignoring these concerns could result in unprecedented repercussions for the world.

Kashyap Raibagi
Kashyap currently works as a Tech Journalist at Analytics India Magazine (AIM). Reach out at kashyap.raibagi@analyticsindiamag.com
