February 15, 2023
The war in Ukraine is fast approaching its first anniversary. Along with showing the harsh realities of war, the conflict has painted a stark picture of the effectiveness of drones and autonomous systems on the battlefield.
In a bid to put the topic of responsible AI in the military domain higher on the political agenda, the Ministry of Foreign Affairs of The Netherlands put on an event dedicated to increasing knowledge of autonomous advancements in defense.
REAIM, which took place in The Hague, saw experts discuss issues and considerations that have arisen from the use of drones on the battlefield as well as at home.
Both sides in the Ukraine conflict have deployed drones for counter-missile defense and for striking strategic targets such as fuel depots using loitering munitions, more commonly known as ‘kamikaze’ drones.
The use of autonomous weapons has raised concern among some lawmakers, though plans to ban such weapons were shelved by the United Nations Convention on Certain Conventional Weapons after most major powers voted against them.
But according to Professor Frans Osinga from Leiden University, “the train has left the station” for these systems, and they will remain a feature of the battlefield ecosystem.
Osinga said that although 70-90% of these drones have been shot out of the sky, the controversy around such weapons stems from their ability to strike targets from long distances, potentially far from the battlefield itself.
He argued that smaller drones can empower ground troops: “We've seen in Ukraine, these little drones help the individual. You give them a little artillery, a little sensor and a little defensive system, it might actually reduce the vulnerability of the individual soldiers in urban and trench warfare.”
Professor Osinga is an active serving officer with the rank of Air Commodore.
Nations have held back from sending troops to support Ukrainian forces over fears of being dragged into the conflict. Osinga said that autonomous systems could “reduce the risk for peacekeepers” and provide an effective alternative to sending humans.
But according to Ingvild Bode, Associate Professor of International Relations at the University of Southern Denmark, the integration of autonomous technologies into weapons systems has created “novel uncertainties.”
Bode contended that the technology’s inherent lack of situational awareness and increased system complexity mean questions can arise over the conditions under which force will be used.
Loitering munitions can engage multiple target types. For example, Russia’s Zala Lancet-3 can strike military objects, vehicles and personnel. More stationary weapon systems, like gun batteries on ships, can only engage a limited range of targets.
Bode is also the principal investigator of an ERC research project on autonomous weapons systems and international norms (AUTONORMS).
Bode said that both system types pose different challenges to human operators.
She said that for stationary systems, like the U.S. Navy’s use of Raytheon’s Phalanx CIWS automated gun battery, humans were “circumscribed”, moving from being reactive controllers to passive supervisors.
For loitering munitions, by contrast, manufacturers are touting deep learning and facial recognition to enable drones to attack targets without human intervention. Bode contends that drone makers still keep humans in the loop; however, issues remain around the quality of that control and the role the human is supposed to play.
While drones and other automated battlefield solutions have been used in Syria, Libya and Nagorno-Karabakh, the conflict in Ukraine has proven an eye-opening moment for wider interest in such deployments.
According to Professor Osinga, superpowers are watching one another closely, each racing to keep pace. He said the U.S. fears experiencing “another Sputnik moment.”
And for NATO, the technology poses a huge threat, as swarms of automated drones can be used to target SAM (surface-to-air missile) sites, like those along the bloc’s eastern flank. Osinga said that NATO is pushing to acquire such technologies for deterrence purposes.
“Tactical operational level logic and strategic level logic will drive Western militaries to look carefully at these technologies,” said Osinga.
Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.