Some weeks ago I predicted that autonomous drones, without a "human-in-the-loop" to make kill/no-kill decisions, would soon be deployed. Reader Vincent H. just sent me the link to this article. (Thanks, Vincent!)
Saker is a young Ukrainian company founded in 2021 to produce commercial AI and drone solutions. The company decided to develop a military product after Russia invaded in February 2022.
. . .
Developed with remarkable speed and initially deployed in September 2023, the Saker Scout is a tough, barebones-looking UAV with a prominent camera assembly hanging from its belly and an explosive charge strapped to its back. What distinguishes the Saker platform, its creators say, is its AI control system. The drones operate as an autonomous network in flight: reconnaissance units hunt down camouflaged enemy vehicles, then bomb-laden drones are sent in to make the kill.
Defense Express noted when the Saker Scout was unveiled in September that its advanced artificial intelligence was not cheap or easy to produce, but the drones themselves cost a relative pittance. Swarms of drones working together to find and eliminate targets without human intervention are very cost-effective compared to traditional infrared or radar-homing missiles. Taking human operators out of the equation also makes the Saker airborne network very difficult to jam, since there is no operator control link to disrupt.
New Scientist noted that Saker Scouts can be piloted by human operators in the traditional manner, with the AI picking out 64 different types of Russian “military objects” and inviting the operator to commit bomb drones to attack them. Input from the drone groups is collated into Delta, the Ukrainian situational awareness supercomputer, which creates highly detailed real-time battlefield maps from data supplied by a wide range of sensors and devices.
According to a Saker representative who spoke with New Scientist, the company’s drones have now been deployed in autonomous mode “on a small scale,” with human operators taken out of the loop.
Military analysts greeted the long-anticipated arrival of killer robots with trepidation. Humanitarian groups worry that autonomous systems might be less scrupulous about avoiding civilian casualties, as they could act with lethal force against “false positive” targets without human oversight. Military experts fear the coming of autonomous weapons that might escalate a conflict very rapidly, leaving their human masters behind as they duel with each other at lightning speed.
There's more at the link.
I'm sure this isn't the only weapon of its kind out there. By now there may be a dozen models or more, spread over several countries, that we don't know about. I'll be very surprised indeed if major weapons producers such as Russia, China, the USA and Israel aren't among those building them. Such weapons are just too useful to be ignored, and if one's enemies are using them, one has little choice but to respond in kind.
Tragically, this will mean that before long they'll be in terrorist hands as well. Groups such as Hamas and Hezbollah are already using reconnaissance and explosive drones of their own, and Mexican drug cartels are using them to fly illegal narcotics across the border. If you think that domestic terrorist groups in the USA (including Palestinian sympathizers) won't soon have them, too, or that the cartels will fail to see the possibilities of sending autonomous armed drone escorts along with their cargo carriers, to target Border Patrol and other agents trying to intercept them, you're living in a dream world.
That also means that our lives - literally everybody's lives - have become a lot less safe. With no human being to make decisions as to whether or not something or somebody is a legitimate target, just being in the wrong place at the wrong time (like, for example, driving on a road that leads to a place where a drug cartel has arranged a pickup of narcotics) can be enough for an autonomous drone to make its own kill decision and take us out. That's anything but a happy thought . . .
Peter
7 comments:
Ukraine also has the "IKEA cardboard drones" to combine with these AI killer drones, or to be controlled by them. The attraction is that they're cheap, and I wondered how long it would be before other countries & groups started creating them.
How do they hold up to #4 lead shot (fired out of a 12 gauge 3" chamber)?
jes askin' fer a fren.
SKYNET is not united, but it is here.
No comment.
"Smile!" they told me. "Things could be worse!"
So I smiled - and shure enuff, things got worse.
There have been autonomous systems making kill decisions for years, from smart mines such as CAPTOR to anti-aircraft systems such as AEGIS and "brilliant" munitions like Viper Strike.
This isn't even new in the air; we've had loitering cruise missiles choosing targets since the 1980s.
The only new part of this is the low cost.
Note that this system doesn't attack until an operator verifies the target... It's actually a step back from some existing systems.
Jonathan
Ukraine has a super computer?