A nightmare the world has no cause to invent

A still from the science fiction film ‘I, Robot’

Mankind is a bloodthirsty species. According to Steven Pinker, the academic, for much of history being murdered by a fellow human was the leading cause of death. Civilisation is largely a tale of man’s violent instincts being progressively muffled. A part of this is the steady withdrawal of actual human flesh from the battle zone, with front lines gradually pulled apart by the advent of long-range artillery and air power, and the decline in the public’s tolerance for casualties.

Arguably, America’s principal offensive weapon is now the drone, firing on targets thousands of miles from where its controller safely sits. Given the pace of advance, it takes no imaginative leap to foresee machines displacing human agency altogether from the act of killing. Artificial brains already perform well at tasks hitherto regarded as the province of humans. Computers are increasingly trusted to drive a car or diagnose an illness. Algorithmic intelligence could in time be judged superior to the human sort at making the decision to kill.



FT View

This prospect has prompted more than 1,000 artificial intelligence experts to write an open letter calling for the development of “lethal autonomous weapons systems” to cease forthwith. Act now, they urge, or what they inevitably dub “killer robots” will be as widespread, and as deadly, as the Kalashnikov rifle.

It is easy to understand military enthusiasm for robotic warfare. Soldiers are precious, expensive and fallible. Every conflict exacts a heavy toll from avoidable human error. Machines in contrast neither grow weary nor lose patience. They can be sent into places unsafe or even impossible for ordinary soldiers. Rapid improvements in computational power are giving machines “softer” skills, such as the ability to identify an individual, flesh-and-blood target. Robots could eventually prove safer than even the most experienced soldier, for example by being capable of picking out a gunman from a crowd of children — then shooting him.

The case against robotic warfare is the same one that applies to all advances in weaponry: the avoidance of unforeseeable consequences that inflict unlimited damage on the innocent. Whatever precautions are taken, there is no foolproof way to stop weapons falling into the wrong hands. For a glimpse of what could go wrong, recall how Chrysler, the US carmaker, had to patch software in 1.4m vehicles after finding they could be remotely hacked. Now imagine those vehicles came equipped with guns.

Technological futurists also fret about the exponential nature of advances in artificial intelligence. The scientist Stephen Hawking recently warned of the “technological catastrophe” that would follow artificial intelligence vastly exceeding the human sort. Whether this is an inevitability or a fantasy, science itself cannot decide; but in light of the risk, how sensible can it be to arm such super-intelligences?

The moral argument is more straightforward. The abhorrence of killing has been as important to its decline as any technological breakthrough. Inserting artificial intelligence into the causal chain would muddle the responsibility that must underpin any decision to kill. Without clear responsibility, not only might the means to wage war be enhanced, but so too might the appetite for doing so.

Uninventing weapons is impossible: consider anti-personnel landmines — autonomous weapons in their way — which are still killing 15,000-20,000 people annually. The nature of artificial intelligence renders it impossible to foresee where the development of autonomous weapons would end. No amount of careful programming could limit the consequences. Far better not to embark on such a journey.

Copyright The Financial Times Limited 2015.

