The Three Laws of Robotics, formulated by the scientist, science popularizer, and brilliant writer Isaac Asimov, influenced the further development of certain trends in robotics and, so to speak, the philosophy of the field. Only someone with no connection to technology who has never read science fiction could have failed to hear of the "Three Laws."
It is worth noting that the Laws were first formulated in the science-fiction story "Runaround," published in March 1942. As many as 73 years have passed since then, yet the Laws remain relevant and are still considered by modern specialists in robotics, artificial intelligence, and related disciplines.
The story was first translated into Russian 20 years after it was written, in 1963. Asimov himself returned to the Laws many times, using them in various stories of the "I, Robot" cycle. They were later taken up by other science-fiction writers and, after that, by scientists. For now, however, the Laws are viewed more as theory than practice: there is still no true "AI," and the robots that exist today simply cannot "understand" such laws, since their control and information-processing systems are far more primitive than what Asimov described in his works. For the Laws to be applied, a robot would have to be as perfect as Asimov envisioned it.
Interestingly, Asimov believed that he had not formulated the Laws in their present form himself; that was done by his friend John Campbell, editor-in-chief of Astounding magazine. Campbell, in turn, said that he had merely isolated the Laws from what Asimov had already written. Asimov himself always conceded the honor of authorship of the Three Laws to Campbell.
The laws themselves are:
- A robot may not harm a human being or, through inaction, allow a human being to come to harm.
- A robot must obey all orders given to it by a human, except where such orders would conflict with the First Law.
- A robot must protect its own existence to the extent that this does not conflict with the First or Second Laws.
Original Text (Eng.)
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.