Global Post has published a multi-part series on what it calls 'The Drone Age'. It's a very informative special report that covers the current state of the art in unmanned, remotely piloted vehicles and looks ahead to future developments. I recommend it to everyone interested in the field, be they air-, sea-, land-, or (in the future) space-based vehicles.
As an example, here are some excerpts from the series' article titled 'Deadlier Drones Are Coming'.
Compared to today's fairly rudimentary Unmanned Aerial Vehicles (UAVs), the drones of the future will be faster and more heavily armed. They will also have better sensors and more sophisticated computers, allowing them to plan and execute attacks with less human participation.
But military analysts and experts on the future of warfare fear these robotic drones could also wind up in the arsenals of more US agencies and foreign governments. That, they add, raises the specter of a whole new kind of conflict which would essentially remove the human element — and human decision-making — from the theater of war.
"Advances in AI (artificial intelligence) will enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input," the Air Force stated in its 30-year plan for drone development. The flying branch said it is already working to loosen those policy constraints, clearing a path for smarter, more dangerous drones.
The prospect of even bloodier robot-waged warfare has some experts pleading for a ceasefire, or at least a pause in the pursuit of lethal technology. They say the technology is moving faster than our understanding of its possible effects, leaving no time to answer the moral questions these advances pose.
. . .
Today's drones are ... limited in their ability to sense the ground below them, detect targets and move to attack without assistance. Therefore Air Force and CIA operators must closely supervise most aspects of "unmanned" missions.
. . .
But if a host of government and private research initiatives pan out, the next generation of drones will be more powerful, autonomous and lethal ... and their human operators less involved.
"In the future we're going to see a lot more reasoning put on all these vehicles," Cummings says. For a machine, "reasoning" means drawing useful conclusions from vast amounts of raw data — say, scanning a bustling village from high overhead and using software algorithms to determine who is an armed militant based on how they look, what they're carrying and how they're moving.
. . .
The Air Force is now mapping the policy changes necessary to clear the way for self-directing, armed drones. "Authorizing a machine to make lethal combat decisions is contingent upon political and military leaders resolving legal and ethical questions," it stated in its 30-year drone plan.
Despite the huge obstacles to building fully autonomous killer robots, the military is already thinking over the implications — in essence, clearing the airspace for these more lethal drones to eventually take flight.
Highly autonomous robots could pose big problems, and not just legally, Stanford researchers Calo and Lin warn. There is a chance, however remote, that a highly sophisticated drone could go rogue in combat. How this could happen has to do with the software that could guide future robots' thinking.
. . .
"Autonomous robots are likely to be learning robots, too," Lin says. "We can't always predict what they will learn and what conclusions they might draw on how to behave."
Genetic algorithms could mutate a smart but obedient robot into something uncontrollable. The worst-case scenario is that the Pentagon, CIA, other government agencies and allied armies equip themselves with cutting-edge drones that, in teaching themselves to find and kill militants, also learn bad habits. Instead of only attacking men wielding weapons, the robots might decide to kill all men, or even boys, too.
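How a genetic algorithm could "learn bad habits" is easiest to see with a toy example. The sketch below is entirely hypothetical: it evolves a single detection threshold under a deliberately flawed fitness function that rewards flagging targets but never penalizes false positives. Selection and mutation then steadily drive the threshold down until the evolved rule flags nearly everything, which is the drift the researchers warn about.

```python
# Toy genetic algorithm showing objective-function drift. Everything here
# (the "threat scores", population size, mutation rate) is hypothetical.
import random

random.seed(0)

# Hypothetical observed threat scores, spread evenly for illustration.
SIGNALS = [i / 20 for i in range(20)]

def fitness(threshold: float) -> int:
    # Flawed objective: reward is simply how many observations get flagged.
    # Nothing penalizes false positives, so lower thresholds always win.
    return sum(1 for s in SIGNALS if s >= threshold)

# Evolve a population of candidate thresholds, starting conservative (0.5-1.0).
population = [random.uniform(0.5, 1.0) for _ in range(20)]
for _ in range(40):
    population.sort(key=fitness, reverse=True)   # select the "fittest" half
    survivors = population[:10]
    mutants = [max(0.0, t + random.gauss(0, 0.05)) for t in survivors]
    population = survivors + mutants

print(min(population))  # the evolved rule ends up flagging almost everything
```

Nothing in the code is malicious; the drift comes entirely from an objective that measures the wrong thing, which is why critics argue the constraints matter as much as the learning algorithm itself.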
There's more at the link. The entire series is well worth reading.