
On unmanned vehicles, or why I do not want to live in a "smart home"

Lately the news feed has repeatedly brought me optimistic reports about the latest successes of self-driving cars. First they simply learned to drive along a road, then to share it with their own kind, then to tell a moose from a pedestrian running across the road, and to avoid colliding with either just the same. The latest dispatches from this front cheerfully report that in some US states driverless cars are already being allowed onto public roads. Automakers are not lagging behind: they have prepared concepts and even working prototypes, and are already thinking about mass production.

This would be a good moment to slow down and think ...


Unmanned ground vehicles are still taking their first steps in this world. Their older brothers, unmanned aircraft, however, are already plowing our skies. And they do not just plow: they actively intervene in what happens on the ground, shooting terrorists, for example. Smart homes long ago stopped being science fiction, or even a novelty. Yet one simple, essentially legal question, contained in a single clause of the license agreement, has still not been resolved.

Behind all these drones, smart homes, ASIMO and the other achievements of robotics stands a program. Big and smart or small and stupid, but it is always there. This program analyzes incoming data and makes decisions based on it. And this program has a license agreement, which says: the developer is not responsible..., does not guarantee..., does not accept claims...

The closest example is an enterprise management system (1C, Axapta and the like). These systems are exact analogues of the programs that run robots: they collect information about the state of the enterprise, analyze it and, based on their calculations, propose some decision. Notice, I said "propose", not "make". If such a system writes "stop production of model A" on the director's monitor, production of that model will not stop automatically: the director has every opportunity to double-check the system's calculations by hand, consult knowledgeable people, or ignore the proposal entirely on the strength of his long experience and "sixth sense". Enterprise management systems do not actually control anything. Meanwhile, if the system miscalculates, failing, like Excel, to add two numbers correctly, the consequences for the enterprise can be disastrous, up to complete collapse and bankruptcy. And the system's manufacturer will refuse to accept any claims: it guarantees nothing and bears no responsibility. Accountants, whom "advanced admins" so love to laugh at, understand this very well; that is exactly why they re-check the super-mega-cool system with an abacus, because the responsibility, including criminal responsibility, will fall on them alone.

The whole difference between such systems and the systems that control robots is that the latter do not propose decisions but make them. An unmanned fighter jet may fail to add two numbers correctly, but it will not propose that the pilot change course and fire at the children running around the school stadium: it will do it itself, because there is no pilot. An unmanned car may fail to add two numbers correctly, but it will not propose that the driver chase people across the square: it will play that game of tag itself, because there is no driver. A smart home may fail to add two numbers correctly, decide that your bathtub is ten times bigger than it is and must be filled to the brim; it will not give you advice, it will simply open the taps all the way, because the downstairs neighbors are not in its program. And in every case it is you who will answer: after all, it was your car that ran over people in the square, and it is from your apartment that the water keeps flowing.
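The distinction can be made concrete in a few lines of code. This is a minimal sketch with entirely hypothetical function names, not any real control system: both systems share the same (possibly buggy) decision logic, but the advisory one returns a proposal for a human to approve, while the autonomous one simply acts on it.

```python
def decide(sensor_data):
    """Toy decision logic shared by both systems (illustration only)."""
    return "stop" if sensor_data.get("obstacle") else "go"

def advisory_system(sensor_data, human_approves):
    """ERP-style system: proposes an action; a human decides whether to act."""
    proposal = decide(sensor_data)
    # The human in the loop can veto a faulty proposal.
    return proposal if human_approves(proposal) else "no action"

def autonomous_system(sensor_data):
    """Robot-style system: the same decision is executed with nobody to veto it."""
    return decide(sensor_data)

# The human rejects the proposal...
print(advisory_system({"obstacle": False}, human_approves=lambda p: p != "go"))
# ...but the autonomous system, given the same data, just acts.
print(autonomous_system({"obstacle": False}))
```

The code is identical except for one line; what differs is who answers for the result when `decide` adds two numbers incorrectly.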

One may object that modern cars are already stuffed with electronics, and that this stuffing can fail at any moment. True. It is also true that if an expert examination shows the car's failed electronics caused an accident, the manufacturer will be held responsible. But if buggy software caused the accident, the carmaker is off the hook: the car itself is intact. And the software maker is off the hook a priori: after all, it guarantees nothing, bears no responsibility, and simply does not accept claims. So who will answer?

In fact, the question of responsibility ought to stand in the way of mass production of all these UAVs, robots and smart homes. Scientists proudly report ever newer solved problems, cutting-edge improvements, milestones reached and achievements unlocked. But when your smart home turns out to be an idiot, who will answer? Who will be put on trial when your newest driverless car plows into a bus stop full of people?

The habit of pressing the "I agree" button without reading may one day serve us very badly ...

Source: https://habr.com/ru/post/155473/

