
Human rights activists urge adoption of the First Law of Robotics





Science fiction fans are well aware of the Three Laws of Robotics formulated by Isaac Asimov:



  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.


These laws apply in books, but not in real life. Fortunately, there are as yet no robots capable of engaging an enemy autonomously; they fire only under an operator's control. However, some experts say the technology has come close to a dangerous threshold.






For example, the Samsung Techwin SGR-1 (pictured) is used to guard the demilitarized zone on the border with North Korea. It is armed with a 5.56 mm machine gun and a 40 mm automatic grenade launcher. The robot operates in semi-automatic mode and fires only on the operator's command.



SGR-1 in action


Governments in a number of countries, including the United States, welcome the opportunity to save soldiers' lives by replacing them with robots on the battlefield, says Steve Goose of the Human Rights Watch Arms Division. At first glance this alternative looks like the humane choice, but experts warn of the inevitable errors that come with imperfect computer vision algorithms. In some situations even a human cannot reliably tell an armed enemy from a civilian, so computer vision systems are certain to produce false positives.
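
To make the false-positive problem concrete, here is a minimal sketch; the crowd size, the share of combatants, and the 99% accuracy figure are all hypothetical numbers chosen only to illustrate the base-rate effect:

```python
# Hypothetical illustration: even a highly accurate classifier produces
# a large share of false positives when most people in view are civilians.

def expected_errors(n_people: int, combatant_share: float,
                    true_positive_rate: float, false_positive_rate: float):
    """Expected civilians flagged as combatants and combatants missed."""
    combatants = n_people * combatant_share
    civilians = n_people - combatants
    false_positives = civilians * false_positive_rate   # civilians flagged
    true_positives = combatants * true_positive_rate    # combatants flagged
    false_negatives = combatants - true_positives       # combatants missed
    return false_positives, true_positives, false_negatives

# Assumed scene: 1000 people observed, 2% of them actual combatants,
# and a classifier that is right 99% of the time in both directions.
fp, tp, fn = expected_errors(1000, 0.02, 0.99, 0.01)
print(f"civilians flagged as combatants: {fp:.1f}")                 # 9.8
print(f"combatants correctly flagged:    {tp:.1f}")                 # 19.8
print(f"share of flagged who are civilians: {fp / (fp + tp):.0%}")  # ~33%
```

The exact numbers do not matter; the point is that when civilians vastly outnumber combatants, even a small error rate means a substantial fraction of the "targets" are civilians.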



Human Rights Watch has published a 50-page report, "Losing Humanity: The Case against Killer Robots," with an overview of military robots that are evolving from operator-controlled toward fully autonomous machines. The human rights advocates call on all countries to comply with international law, including Article 36 of Additional Protocol I to the Geneva Conventions of 12 August 1949, relating to the protection of victims of international armed conflicts:



Article 36

New weapons



In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.


A review of the legality of a new type of military robot should take place at the concept/design stage or later, but in any case before mass production begins; otherwise the Geneva Conventions are violated.



According to experts of the Human Rights Watch Arms Division, the US Army has violated the convention with respect to at least one type of weapon: the Predator unmanned aerial vehicle, which is armed with Hellfire missiles.







These two weapons were evaluated independently of each other, whereas the ICRC's commentary on Article 36 of the Protocol states that a weapon must be re-evaluated for compliance with international law after any "significant modernization." Arming the Predator UAV with Hellfire missiles clearly falls under that definition.



Experts at the human rights organization note that international law contains no direct ban on the use of autonomous combat robots. However, no modern computer vision system is able to comply with Articles 48 and 51(4) of the Protocol.



Article 48

Basic rule



In order to ensure respect for and protection of the civilian population and civilian objects, the Parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives and accordingly shall direct their operations only against military objectives.


Article 51

Protection of the civilian population



4. Indiscriminate attacks are prohibited. Indiscriminate attacks are:

a) those which are not directed at a specific military objective;

b) those which employ a method or means of combat which cannot be directed at a specific military objective; or

c) those which employ a method or means of combat the effects of which cannot be limited as required by this Protocol;

and consequently, in each such case, are of a nature to strike military objectives and civilians or civilian objects without distinction.


The question is whether robots, given sufficiently advanced technology, could even theoretically comply with Articles 48 and 51(4). Distinguishing a civilian from an armed adversary remains one of the fundamental problems.



Opinions differ on this. Some experts believe a strong AI could eventually make such decisions. Others say that artificial intelligence is by definition incapable of it, because the task requires assessing a person's intentions and emotional state. Consider a mother running toward her children and shouting at them not to play with toy guns near the soldiers. To a computer vision system, the scene looks like two armed adversaries and a third approaching with a shout.
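
A toy sketch of that scene shows why detections alone cannot settle the question; the rule-based "threat" logic and all of its features are invented here purely for illustration:

```python
# Hypothetical sketch: a naive rule-based pipeline that labels anyone
# co-located with a gun-shaped detection as a combatant. It has no notion
# of intent, so toy guns and a shouting mother are indistinguishable
# from an armed group.

from dataclasses import dataclass

@dataclass
class Detection:
    kind: str          # "person" or "gun-like object"
    moving_fast: bool  # crude motion feature
    vocalizing: bool   # crude audio feature

def classify_threat(person: Detection, nearby: list[Detection]) -> str:
    # The only available signal: proximity to a gun-shaped object.
    armed = any(o.kind == "gun-like object" for o in nearby)
    # A fast, shouting approach also reads as "aggressive" to a rule
    # with no model of intent.
    agitated = person.moving_fast and person.vocalizing
    return "combatant" if armed or agitated else "civilian"

# The scene from the text: two children with toy guns, mother running and shouting.
toy_gun = Detection("gun-like object", moving_fast=False, vocalizing=False)
child_1 = Detection("person", moving_fast=False, vocalizing=False)
child_2 = Detection("person", moving_fast=False, vocalizing=False)
mother  = Detection("person", moving_fast=True, vocalizing=True)

print(classify_threat(child_1, [toy_gun]))  # "combatant" -- toy gun counted
print(classify_threat(child_2, [toy_gun]))  # "combatant"
print(classify_threat(mother, []))          # "combatant" -- running + shouting
```

Nothing in such a pipeline can represent the fact that the "weapons" are toys or that the shout is a warning rather than a battle cry, which is exactly the gap the skeptics point to.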



Human Rights Watch is concerned that armed forces in various countries may begin using autonomous combat robots before artificial intelligence experts reach a consensus. The organization therefore calls for a new international agreement that expressly prohibits the development and use of weapons capable of operating fully autonomously.



That would mean that Isaac Asimov's First Law of Robotics, 70 years after it was formulated, had finally become reality.



Source: https://habr.com/ru/post/159489/


