Ukrainian drone development could create killer robots
Kyiv, Ukraine –
Advances in Ukraine’s drones have accelerated a long-awaited technology trend that could soon bring the world’s first fully autonomous combat robots to the battlefield, ushering in a new era of warfare.
According to military analysts, warfighters and artificial intelligence researchers, the longer the war lasts, the more likely it is that drones will be used to identify, select and attack targets without the help of humans.
It would be a revolution in military technology as profound as the introduction of the machine gun. Ukraine already has semi-autonomous attack drones and AI-enabled counter-drone weapons. Russia also claims to possess AI weaponry, though those claims are unproven. But there are no confirmed instances of a nation putting into battle robots that have killed entirely on their own.
Experts say it may only be a matter of time before Russia or Ukraine or both deploy them.
“A lot of states are developing this technology,” said Zachary Kallenborn, a weapons innovation analyst at George Mason University. “Clearly, it’s not all that difficult.”
The sense of inevitability extends to activists who have tried to ban killer drones for years, but now believe they should be content with trying to limit the weapons’ offensive use.
Ukraine’s Digital Transformation Minister Mykhailo Fedorov agrees that fully autonomous killer drones are a “logical and inevitable next step” in weapons development. He said that Ukraine has done “a lot of scientific and research work in this direction”.
“I think the potential for that in the next six months is huge,” Fedorov told The Associated Press in a recent interview.
Ukrainian Lt. Col. Yaroslav Honchar, co-founder of Aerorozvidka, a nonprofit promoting combat drone innovation, said in a recent interview near the front line that human fighters simply cannot process information and make decisions as quickly as machines.
Ukraine’s military leaders currently prohibit the use of fully autonomous lethal weapons, he said, although that could change.
“We haven’t crossed that line yet — and I say ‘yet’ because I don’t know what the future holds,” said Honchar, whose group has spearheaded drone innovation in Ukraine by converting cheap commercial drones into deadly weapons.
Russia may acquire autonomous artificial intelligence from Iran or elsewhere. Iranian-supplied Shahed-136 long-range explosive drones have crippled Ukrainian power plants and terrorized civilians, but they’re not that smart. Iran has other drones in its developing arsenal that it says are equipped with artificial intelligence.
According to Western manufacturers, Ukraine could easily make its semi-autonomous armed drones fully autonomous so they could better survive battlefield jamming.
These drones include the US-made Switchblade 600 and the Polish Warmate, both of which currently require a human to choose targets over a live video feed; AI completes the job. Technically known as “loitering munitions,” the drones can hover over a target for several minutes, waiting for a clear shot.
“The technology to achieve a fully autonomous mission with the Switchblade pretty much exists today,” said Wahid Nawabi, CEO of AeroVironment, the Switchblade’s maker. That will require a policy change, removing the human from the decision-making loop, that he estimates is three years away.
Drones can already recognize targets such as armored vehicles using cataloged images. But there is disagreement over whether the technology is reliable enough to ensure the machines won’t err and take the lives of noncombatants.
The AP asked the defense ministries of Ukraine and Russia whether they have used autonomous weapons offensively, and whether they would agree not to use them if the other side agreed to the same. Neither responded.
If either side were to attack with full AI, it might not even be a first.
Last year, an inconclusive UN report suggested that killer robots debuted in Libya’s internal conflict in 2020, when Turkish-made Kargu-2 drones in fully automatic mode killed an unknown number of combatants.
A spokesman for the manufacturer, STM, said the report was based on “speculative and unverified” information and “should not be taken seriously.” He told the AP that the Kargu-2 cannot attack a target until the operator tells it to.
Fully autonomous AI is already helping to defend Ukraine. Utah-based Fortem Technologies has supplied the Ukrainian military with drone-hunting systems that combine small radars and unmanned aerial vehicles. The radars are designed to identify enemy drones, which the interceptor drones then disable by firing at them — no human assistance required.
The number of AI-equipped drones keeps growing. Israel has been exporting them for decades. Its radar-killing Harpy can hover over anti-aircraft radar for up to nine hours, waiting for it to power up.
Other examples include Beijing’s Blowfish-3 unmanned armed helicopter. Russia is working on an underwater artificial intelligence drone called Poseidon with a nuclear warhead. The Dutch are currently testing a ground robot with a .50 caliber machine gun.
Honchar believes that Russia, whose attacks on Ukrainian civilians have shown little regard for international law, would have used autonomous killer drones by now if the Kremlin had them.
“I don’t think they’ll have any concerns,” said Adam Bartosiewicz, vice president of WB Group, which makes Warmate.
Artificial intelligence is a priority for Russia. President Vladimir Putin said in 2017 that whoever dominates this technology will dominate the world. In his speech on December 21, he expressed confidence that the Russian arms industry can integrate artificial intelligence into combat vehicles, emphasizing that “the most effective weapon systems are those that work quickly and practically in automatic mode.” Russian officials already claim that their Lancet drone can operate fully autonomously.
“It’s not going to be easy to know if and when Russia will cross that line,” said Gregory J. Allen, former director of strategy and policy at the Pentagon’s Joint Artificial Intelligence Center.
The switch from remote piloting to full autonomy could go unnoticed. To date, drones capable of operating in both modes have performed better when piloted by a human, Allen said.
A top artificial intelligence researcher, Professor Stuart Russell of the University of California-Berkeley, said the technology is not that complicated. Colleagues he interviewed in the mid-2010s agreed that graduate students could produce an autonomous drone “capable of finding and killing an individual inside a building, for example,” within a short period of time.
Attempts to establish international ground rules for military drones have so far failed. Nine years of informal United Nations talks in Geneva have made little progress, with major powers including the United States and Russia opposing the ban. The last session in December ended without a new round scheduled.
Washington politicians say they won’t agree to a ban because rivals developing drones can’t be trusted to use them ethically.
Toby Walsh, an Australian academic who, like Russell, campaigns against killer robots, hopes to see consensus on certain limits, including a ban on systems that use facial recognition and other data to identify or attack individuals or categories of people.
“If we’re not careful, they’ll spread more easily than nuclear weapons,” said Walsh, author of Machines Behaving Badly. “If you can make a robot kill one person, you can kill a thousand.”
Scientists also worry about AI weapons being repurposed by terrorists. In one feared scenario, the US military spends hundreds of millions of dollars writing code to power killer drones, only for it to be stolen and copied, handing terrorists the same weapon.
The global community is concerned. A 2019 Ipsos poll for Human Rights Watch found that 61% of adults in 26 countries oppose the use of lethal autonomous weapons systems.
Allen, the former Defense Department official, said that to date, the Pentagon has neither formally defined an “autonomous weapon” nor authorized a single such weapon for use by US troops. Any proposed system must be approved by the chairman of the Joint Chiefs of Staff and two deputy secretaries.
This does not prevent the development of weapons in the United States. Projects are ongoing at the Defense Advanced Research Projects Agency, military laboratories, academic institutions, and the private sector.
The Pentagon’s emphasis is on using artificial intelligence to augment human warfighters. The Air Force is exploring ways to pair pilots with drone wingmen. Former Deputy Secretary of Defense Robert O. Work, a proponent of the idea, said in a report last month that it would be “crazy not to go to an autonomous system” once AI-enabled machines outperform humans — a threshold he says was crossed in 2015, when computer vision surpassed that of humans.
Humans have already been removed from the loop in some defensive systems. Israel’s Iron Dome missile shield is authorized to open fire automatically, although it is supposed to be monitored by a person who can intervene if the system goes after the wrong target.
Several countries and every branch of the US military are developing drones that can attack in deadly synchronized swarms, according to Kallenborn, the George Mason researcher.
So will future wars become a battle to the last drone?
This is what Putin predicted in a televised talk with engineering students in 2017: “When one side’s drones are destroyed by the other’s drones, it will have no choice but to surrender.”
——
Frank Bajak reported from Boston. Associated Press journalists Tara Copp in Washington, Garance Burke in San Francisco and Suzan Fraser in Turkey contributed to this report.