Stephen Hawking and Elon Musk say "killer robots" must be banned before it is too late
A group of scientists, researchers and academics, including Elon Musk and Stephen Hawking, has warned that preventive measures against these killer robots must be taken as soon as possible, arguing that a military artificial-intelligence arms race could have serious consequences if it is not addressed.
The group issued its warning in an open letter presented at the International Joint Conference on Artificial Intelligence in Buenos Aires. The letter cautions that a global AI arms race is practically inevitable if any major military power pushes forward with AI weapons development.
Signatories include two major tech figures, Apple co-founder Steve Wozniak and Google DeepMind chief executive Demis Hassabis, underscoring how high the stakes are. The letter describes autonomous weapons as the third revolution in warfare, after gunpowder and nuclear arms. It sketches examples such as pistol-toting Terminator-style robots, smart vehicles mounted with machine guns and self-piloted bomber drones, all of which could be achievable within years, if not decades.
Moreover, these weapons are not costly and do not require rare raw materials to build. The letter adds that mass-producing them would be relatively easy, so they would soon appear in the arsenals of every major military power. They could then be used for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group, the letter continues. It concludes that a military AI arms race would not be beneficial for the human race.
Much of modern warfare already relies on remote-controlled machines. The U.S. military's MADSS, for example, is a 1,400-pound rover that carries gear and a machine gun, while the 1,000-pound Protector rover can scan for bombs and ultimately fire a bazooka.
More worryingly, a few militaries are already testing automated systems. Israel's "Iron Dome" system, for instance, detects and shoots down incoming rockets, and the "Phalanx CIWS" used by the U.S. Navy and the C-RAM system both rely on the same rotating Gatling gun.
In April, Harvard Law School and Human Rights Watch jointly called for a ban on these autonomous weapons.
Via: Google Alerts for AI