
US Air Force Uses AI On Military Jet — A Watershed Moment For AI In Defence



The US Air Force recently used artificial intelligence on a Lockheed U-2 spy plane during a training flight, where it controlled the aircraft's sensors and navigation systems. This is believed to be the first time artificial intelligence has been used aboard a US military aircraft. Although the plane was steered by a human pilot and no weapons were involved, experts in the defence sector see this as a watershed moment for defence, as well as a subject of intense debate in arms-control communities.

When asked about the flight, Assistant Air Force Secretary Will Roper said that leveraging AI in this way across the US military ushers in a "new age of human-machine teaming and algorithmic competition." He believes that failing to realise the full potential of artificial intelligence will mean "ceding decision advantage to our adversaries."

According to sources, the AI system used on the spy plane was deliberately designed without a manual override so that its capability could be assessed in a controlled test environment. It was, however, limited to highly specific tasks and kept separate from the plane's flight controls. Because of the sensitive nature of the work, the Air Force kept the pilot anonymous, sharing only his call sign, 'Vudu.'

The pilot who carried out the test told the media that he was technically the pilot in command and that the role of the artificial intelligence was relatively narrow. Still, "for the task the AI was designed, it performed well," he said.

The AI algorithm, dubbed ARTUµ in an apparent Star Wars reference, was responsible for sensor employment and tactical navigation, according to a news release from the Air Force.

For this initial test flight, the AI system was trained to look for incoming missiles and missile launchers. While directing the plane's sensors, the AI was designed to have the final call.

The initiative aims to bring the Air Force, and the military more broadly, closer to a model in which machines handle technical tasks while remaining under direct human control. Roper stated that humans will ultimately stay in control of life-or-death decisions such as flight control and targeting.

ARTUµ is based on open-source algorithms, chiefly µZero, developed by the AI research company DeepMind for strategy games such as chess and Go, and adapted by the U-2 Federal Laboratory. The system was deployed using Kubernetes, the open-source container-orchestration software originally developed by Google, which enabled the AI to run on the plane's onboard computer systems.
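ARTUµ's reported lineage from µZero can be made concrete with a rough sketch of how such an algorithm plans: it learns a representation function, a dynamics model, and a value prediction, then searches over imagined futures instead of consulting a hand-coded simulator. The snippet below is only an illustration of that idea; the toy sensor actions, hashed "latent states", random stand-in networks, and greedy rollout in place of Monte Carlo tree search are all hypothetical, and none of it reflects the Air Force's or DeepMind's actual code.

```python
# Illustrative, toy sketch of µZero-style planning with learned components.
# Everything here is a hypothetical stand-in for real neural networks.
import random

ACTIONS = ["scan_left", "scan_right", "track_target"]  # hypothetical sensor actions

def representation(observation):
    """h: map a raw observation to a latent state (toy hash instead of a network)."""
    return hash(tuple(observation)) % 1000

def dynamics(latent_state, action):
    """g: predict the next latent state and immediate reward for an action."""
    next_state = hash((latent_state, action)) % 1000
    reward = random.Random(next_state).uniform(0.0, 1.0)  # stand-in learned reward
    return next_state, reward

def prediction(latent_state):
    """f: predict a value estimate for a latent state."""
    return random.Random(latent_state).uniform(0.0, 1.0)

def plan(observation, depth=2):
    """Pick the action whose short model-based rollout looks best.
    Real µZero uses Monte Carlo tree search here; a greedy rollout keeps the sketch small."""
    root = representation(observation)
    best_action, best_return = None, float("-inf")
    for action in ACTIONS:
        state, ret = dynamics(root, action)
        for _ in range(depth - 1):
            # unroll further through the learned model, greedily by predicted value
            greedy = max(ACTIONS, key=lambda a: prediction(dynamics(state, a)[0]))
            state, reward = dynamics(state, greedy)
            ret += reward
        ret += prediction(state)  # bootstrap with the value estimate
        if ret > best_return:
            best_action, best_return = action, ret
    return best_action

if __name__ == "__main__":
    print(plan([0.2, 0.9, 0.1]))
```

The point is the shape of the loop: in µZero-style planning, every decision comes from searching a learned model of the environment rather than from explicit hand-written rules.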

The Lockheed U-2 was never designed for AI-enabled flight; it was built in the early 1950s for the CIA to conduct Cold War surveillance from staggeringly high altitudes of 60,000 to 70,000 feet. The planes are now operated by the Defense Department.

Given the planes' surveillance role, AI is well suited to analysing the complex data they collect. An earlier programme, the Pentagon's Project Maven, was created to rapidly analyse data from drone footage; Google contributed to it but, after backlash from its employees, declined to renew its contract. The tech giant later released a set of AI principles that bar its algorithms from being used in any weapons system.

That said, Eric Schmidt, who led Google until 2011, believes it will be tough for the US military to fully embrace autonomous weapons anytime soon, because of the uncertainty over how AI will perform across all possible scenarios, "including those in which human life is at stake." According to him, while humans killing civilians by mistake is a tragedy, AI doing the same is a disaster.

No one, after all, would want to take responsibility for such an uncertain system. For now, this initiative was simply meant to show that the military can work with AI.


Sejuti Das

Sejuti currently works as Associate Editor at Analytics India Magazine (AIM). Reach out at sejuti.das@analyticsindiamag.com