Smart bombs that can pick whom to kill

Los Angeles: On a bright fall day last year off the coast of Southern California, an Air Force B-1 bomber launched an experimental missile that may herald the future of warfare.

Initially, pilots aboard the plane directed the missile, but halfway to its destination it severed communication with its operators. Alone, without human oversight, the missile decided which of three ships to attack, dropping to just above the sea surface and striking a 260-foot (80-metre) unmanned freighter.

Warfare is increasingly guided by software. Today, armed drones can be operated by remote pilots peering into video screens thousands of miles from the battlefield. But now, some scientists say, arms-makers have crossed into troubling territory: They are developing weapons that rely on artificial intelligence, not human instruction, to decide what to target and whom to kill.

As these weapons become smarter and nimbler, critics fear they will become increasingly difficult for humans to control or to defend against. And while pinpoint accuracy could save civilian lives, weapons without human oversight could, critics warn, make war more likely, as easy as flipping a switch.

Britain, Israel and Norway are already deploying missiles and drones that carry out attacks against enemy radar, tanks or ships without direct human control.

After launch, so-called autonomous weapons rely on artificial intelligence and their own sensors to select targets and to initiate an attack.

Britain's fire-and-forget Brimstone missiles, for example, can distinguish among tanks, cars and buses without human assistance, and can hunt targets in a predesignated region without oversight. The Brimstones also communicate with one another, sharing their targets.

Armaments with even more advanced self-governance are on the drawing board, although the details usually are kept secret.

"An autonomous weapons arms race is already taking place," said Steve Omohundro, a physicist and artificial intelligence specialist at Self-Aware Systems, a Palo Alto, California, research centre. "They can respond faster, more efficiently and less predictably."

Concerned by the prospect of a robotics arms race, representatives from dozens of nations will meet on Thursday in Geneva to consider whether development of these weapons should be restricted by the Convention on Certain Conventional Weapons. Christof Heyns, the United Nations special rapporteur on extrajudicial, summary or arbitrary executions, last year called for a moratorium on the development of these weapons altogether.
