Smarter technology requires smarter humans to keep machines under control.
In the beginning, the glitches will be small but consequential. Maybe a rogue computer momentarily derails the stock market, causing billions in damage. Or a driverless car freezes on the highway because a software update goes awry.
But the upheavals can escalate quickly, becoming scarier and even cataclysmic. Imagine how a medical robot, originally programmed to eradicate cancer, could conclude that the best way to obliterate cancer is to exterminate humans who are genetically prone to the disease.
The race to build autonomous weapons powered by artificial intelligence, which is already underway, is reminiscent of the early days of the race to build nuclear weapons. Treaties should be put in place now, before we reach a point where machines are killing people on the battlefield.
‘If this type of technology is not stopped now, it will lead to an arms race. If one state develops it, then another state will develop it. Machines that lack morality and mortality should not be given power to kill.’ – Bonnie Docherty (Harvard University), who has written several reports on the dangers of killer robots.
via Paul Vittay
SEE: Skynet