Can We Control Killer Robots?

Autonomous weapons are cheap to produce, and they can be fast. Often referred to as killer robots, these machines raise serious concerns about whether they can value human life. Many people are starting to wonder how we can control them, or whether a robot with artificial intelligence could take it upon itself to kill innocent humans.

There was a situation recently in Sudan where soldiers were ordered to fire on protestors outside the military headquarters in the center of Khartoum, at the same time as police and secret service personnel unleashed tear gas into the crowds. Instead, the soldiers fired their guns into the air and did not shoot anyone, something that was much appreciated by the many people gathered there. How would the protestors have fared if those soldiers had been robots? Most of them would have died or been injured as the robots carried out their orders.

A Panel of Experts Discusses Whether We Can Control Killer Robots

Killer robots are intelligent machines that can find, select and kill their targets without human intervention. Many countries are racing to be the first to develop such robots, but can they be regulated, and can their use really be justified? If killer robots target and kill the wrong people, who will be responsible for those deaths?

A panel of experts recently met to discuss the social implications of so many countries adopting the use of lethal autonomous weapons.

The panel included top lawyers and arms specialists alongside computer security experts. They all agreed that lethal autonomous weapons are not capable of fulfilling the requirements of international humanitarian law. This is because they go against the Martens Clause, which says that emerging technologies are to be judged by the principles of humanity and the dictates of public conscience.

The members of the panel also agreed that the automated function of selecting and engaging human targets needs to remain under some form of human control. The lack of such control, they decided, dishonors human life and dignity. Emotions help humans to make the right decisions, and that is something killer robots do not have.

Loss of Human Life Will Be Greater If We Cannot Control Killer Robots

With fewer human soldiers and a lower cost, it becomes easier to wage war on another nation. International law says that before a strike takes place, the number of innocent civilian lives that would be lost has to be weighed; it should not be excessive compared to the anticipated military advantage of the strike. These ethical calculations are not built into killer robots, and such machines would attack indiscriminately. This would result in many more innocent people being killed.

When states do not have to risk their own soldiers, it becomes much simpler for them to go to war. Killer robots lower the barriers to conflict when there is no fear of many dead bodies returning home. When a machine is programmed to select and kill its own targets, who is responsible if it kills innocent people along the way? Legally, a robot cannot be held responsible, and as for the programmers, coders, developers and military commanders in charge, there are many legal obstacles to suing them as well.

One of the real risks is that authoritarian governments could use these killer robots on their own people. In a scenario such as the one in Sudan, many innocent lives would have been lost.
