The State Of Autonomous Weapons In Today’s World

In November 2020, Iran’s top nuclear scientist Mohsen Fakhrizadeh was assassinated by Israeli operatives using an AI-assisted sniper rifle, according to a New York Times report. The report corroborates earlier claims from the Iranian Revolutionary Guard’s investigation, according to which a ‘smart satellite-controlled machine gun’ was used to kill the scientist while he was driving with his wife. Fakhrizadeh was shot four times; his wife remained unharmed.

Fakhrizadeh’s Assassination

The gun used to kill Fakhrizadeh was mounted on a robotic apparatus that weighed roughly a ton. The entire system was fitted into the bed of a truck, with multiple cameras giving the assassins a complete view of the surroundings. The truck was also packed with explosives so that the evidence could be blown up once the mission was over or compromised.

The gun was connected to an Israeli command centre via a satellite communication relay, and an operative behind the gun aimed at the target using a computer screen. An AI system was developed to track the movement of Fakhrizadeh’s car and account for the roughly 1.6-second lag in the satellite link. The facial recognition software built into the AI system was designed to target only Fakhrizadeh and leave his wife unscathed.

Although the attackers carried out the mission and detonated the truck after the assassination, the smart rifle system was not completely destroyed. The Iranian Revolutionary Guard used the remains of the rifle to investigate the attack, and the investigation revealed some interesting facts about modern-day warfare.

In a similar but failed attempt in 2018, AI-controlled drones nearly killed the then-president of Venezuela, Nicolas Maduro. He was attending an event when two drones detonated explosives near him.

Advancements in AI have also accelerated the development of autonomous weapons. In the future, these weapons are expected to become more precise, faster, and cheaper. If this development is carried out ethically and responsibly, such machines could reduce the number of casualties, help soldiers target only combatants, and serve as a defensive tool against attackers.

In an article for The Atlantic, Taiwanese-born American computer scientist Kai-Fu Lee described autonomous weaponry as the third revolution in warfare, after gunpowder and nuclear arms. He wrote that true AI-enabled autonomy entails the full engagement of killing: “searching for, deciding to engage, and obliterating another human life, completely without human involvement.”

This year, the Pentagon’s Defense Advanced Research Projects Agency (DARPA) tested fully autonomous, weapon-bearing AI drones. In August, a drill with AI-controlled drones and tank-like robots was held in Seattle. The drones received specific instructions from human operators but operated autonomously for tasks such as locating and destroying targets. The exercise demonstrated the benefit of using AI systems in battle situations where conditions are too complex and dangerous for human intervention.

The US is not alone; many other countries are actively researching the use of AI in warfare, and China arguably leads the race. According to a Brookings Institution report, the Chinese military has been making significant investments in robotics, swarming, and other AI-based weaponry. While it is difficult to ascertain how sophisticated these systems are, the report says the weapons may possess varying levels of autonomy.

Inherent Dangers

Activists and experts across the board believe that the use of autonomous weapons carries many dangers, which sometimes far outweigh the advantages. Kai-Fu Lee, in a recent interview, said, “The single largest danger is autonomous weapons.” He noted that warfare is the only context in which AI is trained to kill humans. As autonomous weapons become more advanced and affordable, Lee warned, they could wreak havoc and could even be used by terrorists to commit genocide. “We need to figure out how to ban or regulate it,” he added.

In 2015, tech and business leaders such as Elon Musk and Steve Wozniak, along with 200 AI researchers, signed an open letter proposing a complete ban on autonomous weapons. The proposal drew support from over 30 countries; however, a Congress-commissioned report advised the US to oppose the ban.

Is regulation an option?

Human Rights Watch and other non-governmental organisations launched the Campaign to Stop Killer Robots in 2013. Since then, concern about fully autonomous weapons has climbed up the international agenda, and they have been recognised as a grave threat to humanity that demands urgent multilateral action.

Since 2018, United Nations Secretary-General António Guterres has been urging states to prohibit autonomous weapons that could, by themselves, target and attack human beings, calling them “morally repugnant and politically unacceptable.”

Lethal Autonomous Weapon Systems (LAWS) have been under discussion since 2014 within the framework of the Convention on Certain Conventional Weapons (CCW), a legally binding instrument whose member states meet every year to address these concerns. Close to 30 countries have called for a ban on fully autonomous systems, and the 125 member states of the Non-Aligned Movement have called for a ‘legally binding international instrument’ on LAWS. Critics, however, doubt that a full ban will come into force any time soon.

In an earlier interview with Analytics India Magazine, Trisha Ray, an associate fellow at the Observer Research Foundation whose research focuses on LAWS, said, “The CCW is unlikely to call for a ban, but is calling for safeguards in line with International Humanitarian Law, including meaningful human control.”

