Wednesday, February 27, 2008

The Coming Robotic Apocalypse

Gadzooks! My evil plan has been discovered!

From the AFP:

Automated killer robots 'threat to humanity': expert

Increasingly autonomous, gun-toting robots developed for warfare could easily fall into the hands of terrorists and may one day unleash a robot arms race, a top expert on artificial intelligence warned. "They pose a threat to humanity," said University of Sheffield professor Noel Sharkey ahead of a keynote address Wednesday before Britain's Royal United Services Institute.

But up to now, a human hand has always been required to push the button or pull the trigger.

If we are not careful, he said, that could change.

Military leaders "are quite clear that they want autonomous robots as soon as possible, because they are more cost-effective and give a risk-free war," he said.

Several countries, led by the United States, have already invested heavily in robot warriors developed for use on the battlefield.

South Korea and Israel both deploy armed robot border guards, while China, India, Russia and Britain have all increased the use of military robots.

Captured robots would not be difficult to reverse engineer, and could easily replace suicide bombers as the weapon-of-choice. "I don't know why that has not happened already," he said.

But even more worrisome, he continued, is the subtle progression from the semi-autonomous military robots deployed today to fully independent killing machines.

Sharkey might seem to be an alarmist kook, and five years ago he would have been just that. However, considering that Metal Storm Ltd. has since unveiled an automated weapons system capable of firing 1,000 40mm grenades per second, his concern is completely justified. (Although, to the best of my limited knowledge, they haven't yet managed to invent a loading system capable of sustaining that rate of fire.) We're starting to reach the point where our advances in weapons and robotics are on the verge of creating the same quandaries that our advances in genetics are. In other words, just because we can do it, should we?

Besides, there's always the off chance that some madman will insert malicious code into the networks that control these robots and eventually take over the world.