
Programs, data and their owners (continued)

If you know cybernetics, you are familiar with a property inherent in all systems: a combination of systems that affect one another automatically becomes a new system.

Last time we talked about the invention made in 1944 by Mauchly, Eckert and their team. They invented a computer whose program can change itself. This property is preserved in the architecture of the modern computers that make up the Internet. Now, attention: the entire Internet as a whole has inherited this property! It is an auto-programmable system. Executable code located in one part of its huge memory can change the code in another part, triggering further avalanche-like changes in the code.
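To make "auto-programmable" concrete, here is a minimal sketch of such a machine (our own illustration in Python; the opcodes and memory layout are invented). Because instructions and data share one memory, running code can overwrite other code:

```python
# A toy stored-program machine: instructions and data occupy the same
# memory cells, so the running program can rewrite its own instructions.
memory = [
    ("SET", 5, 42),    # cell 0: write the value 42 into cell 5
    ("COPY", 4, 2),    # cell 1: overwrite cell 2 with the contents of cell 4
    ("HALT", 0, 0),    # cell 2: would stop the machine, but gets replaced first
    ("HALT", 0, 0),    # cell 3: the real stop
    ("PRINT", 5, 0),   # cell 4: stored as data, becomes code once copied to cell 2
    0,                 # cell 5: plain data
]

pc = 0  # program counter
while True:
    op, a, b = memory[pc]
    if op == "HALT":
        break
    elif op == "SET":
        memory[a] = b          # a data write
    elif op == "COPY":
        memory[b] = memory[a]  # a code write: the program reprograms itself
    elif op == "PRINT":
        print("cell", a, "holds", memory[a])
    pc += 1

# running this prints: cell 5 holds 42
```

Delete the COPY instruction and the machine halts silently at cell 2; with it, cell 2 is rewritten on the fly and the machine prints. That, in miniature, is the property the 1944 design introduced and the Internet inherited.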

Memory in the 1940s was expensive and scarce. The typical user was a world-class software engineer. The architecture developed at that time was optimal for exactly this combination: a small amount of machine memory and a highly professional user.
Since then, memory has become dramatically cheaper and vastly more plentiful. The developers obviously never designed their architecture for such a scale! At 1940s prices, the memory of today's Internet would cost a monstrous amount: a stack of hundred-dollar bills with a thickness (more precisely, a length) of about 100 astronomical units. That is roughly how far from Earth the Voyager probes are now, having recently left the solar system.
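The stack-of-bills figure is easy to sanity-check. Here is a rough calculation (the bill thickness, the quintillion-byte estimate of the Internet's memory, and the derived 1940s price are all our own assumptions, for illustration only):

```python
# Back-of-the-envelope check of the "100 astronomical units of $100 bills" claim.
BILL_THICKNESS_M = 0.00011   # a US banknote is roughly 0.11 mm thick (assumption)
AU_M = 1.496e11              # one astronomical unit, in meters
INTERNET_BYTES = 1e18        # "quintillions of bytes", order of magnitude (assumption)

stack_m = 100 * AU_M                # length of the stack: 100 AU
bills = stack_m / BILL_THICKNESS_M  # ~1.4e17 banknotes
dollars = bills * 100               # ~1.4e19 dollars

print(f"stack is worth about ${dollars:.1e}")
print(f"implied 1940s price: ${dollars / INTERNET_BYTES:.2f} per byte,"
      f" ${dollars / (INTERNET_BYTES * 8):.2f} per bit")
```

This works out to a few dollars per bit, which is at least the right order of magnitude for 1940s vacuum-tube and delay-line storage.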

If Mauchly, Eckert, von Neumann and the others had known that they were designing the architecture for a worldwide network, they would certainly have chosen a different solution, understanding that in an auto-programmable architecture it would be impossible to keep quintillions of bytes of memory under control.

For reference: in the 1940s, the range of technologies for building RAM included: a) mercury acoustic delay lines; b) storage cathode-ray tubes; c) discrete capacitors; d) flip-flops made of vacuum tubes; e) mechanical relays. Ferrite-core technology came later. In 1946, von Neumann, Burks and Goldstine hoped to get 4096 40-bit words out of 40 cathode-ray tubes. Imagine the size, power consumption and complexity of such a design! The British EDSAC of 1949, on mercury delay lines, managed only 1024 17-bit words.
By the way, Britain plans to recreate EDSAC in the coming years (with one difference: the mercury will be replaced by a safer material); after all, it was the world's first practical stored-program computer.
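For a sense of scale, a quick computation with the numbers above (the modern 16 GB memory module used as a comparison point is our own addition, not from the article):

```python
# Capacities of the 1940s designs mentioned above, in modern units.
ias_bits = 4096 * 40     # the 1946 von Neumann / Burks / Goldstine design goal
edsac_bits = 1024 * 17   # EDSAC, 1949, mercury delay lines

print(f"1946 design goal: {ias_bits} bits = {ias_bits // 8} bytes (about 20 KB)")
print(f"EDSAC:            {edsac_bits} bits = {edsac_bits // 8} bytes (about 2 KB)")

# One ordinary 16 GB memory module holds as much as millions of EDSAC memories:
modern_bits = 16 * 2**30 * 8
print(f"one 16 GB module holds about {modern_bits // edsac_bits:,} EDSAC memories")
```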

So: nowadays there is vastly more memory, while the typical user's skills are far more modest. Put the two factors together and add a third: the computer no longer has one owner. It has a billion owners.

The network is the computer


Yes, yes, that's right: the Internet is one big stored-program computer. A set of computing units, joined into a common network and possessing almost unlimited possibilities for influencing one another, has become a single system. This is the computer, taken as a whole, that you are working on right now. Your system unit is only a small part of it. And many highly skilled programmers (let's call them "hackers") regularly work to reprogram your system unit: to change the contents of its memory and, therefore, the algorithms by which it operates. Incidentally, it is quite possible that one day you will initiate a dangerous reprogramming with your own hands, believing that you are performing some other, harmless action. And it is possible that you have already done so.

The mechanism of interaction between individual system units and individual code streams in such a large system is extremely complex. Among the causes of this interaction are millions of programmer errors scattered across the quintillions of bytes of the hypercomputer's memory. As we said earlier, no one has yet learned to write error-free programs, at least not commercial ones; it is believed to be impossible. But besides errors, of course, there are many other causes of interaction among the network's elements that make it a single whole. A good example: some people voluntarily install the WebGL module on their computers, which has already been written about on Habrahabr. Every module of this kind sharply expands the opportunities for interaction. Creating such programs for a more rigid hardware architecture would, of course, be impossible.

It is precisely because of this interaction architecture that executable code in one part of the hypercomputer (for example, in China) is fully capable of changing the code in another part (for example, in the USA). Each of the billion owners systematically programs the huge shared machine, solely in their own interests and to the best of their understanding, and as a rule without grasping the essence of their actions. Few people realize that their mouse clicks often result in reprogramming the Internet.

A typical example involving a typical user. From his point of view and in his own terms, he "downloaded a game", "pressed a button", "installed an update", "found a key" or "registered on a site". In fact, the virus he launched at that moment formed, within seconds, a botnet in his organization's local network. This means that the contents of the executable memory of a giant planetary machine changed, and in a most intricate way. Changing the contents of executable memory is reprogramming.

When making any decision related to computers and information, try to minimize the degree to which your computers are integrated into the global network. Remember that disconnecting from the Internet costs much more than connecting to it. Do not become dependent. If some task of your business can be solved without the Internet (or at least without deepening integration with it), then, other things being equal, choose that solution.

Do not put all your eggs in one basket: connect only a small part of your computer infrastructure to the Internet. For a small company with one office, the norm is no more than a third of the system units. Connecting all computers to the same network is highly undesirable: the network will immediately absorb them into itself, into its logical scheme, and turn them into cogs that must forever obey its laws as those laws change.

Never forget: the network is the computer.

This thesis, formulated more than a quarter of a century ago by top-class experts at Sun Microsystems, is generally accepted in the world of information technology and is not questioned.

Technology "BB"


When people began connecting computers into networks, it seemed at first that ownership rights would not change: each computer would still have its own owner. But the network suddenly became a single machine, and in this environment the rights of individual owners to their areas of memory lost their meaning. The plots were no longer fenced off, and programs from some plots gradually began to influence others. Something similar happens in an apartment building where everyone formally owns their own part: when a problem arises with the common foundation, it turns out there is no one to solve it. No one wants to negotiate with the others or let them into his apartment. No one wants to bear the cost. But now imagine that, at the same time, 50 million strangers walk wherever they please and do whatever they see fit in other people's apartments :) That is exactly how hackers operate on the tens of millions of computers they have infected, whose nominal owners blundered with their anti-virus protection and are unaware of the change in property rights that automatically occurs upon connecting to the Network.

Since the hypercomputer has no owner, an unpleasant prospect looms before people. The absence of an owner of any valuable object is usually very dangerous. Both cybernetics specialists and historians know this. The absence of an owner is dangerous because one may appear.

Von Neumann auto-programmability caused no trouble, and could cause none, until the early 1980s, while each computer had its own owner: a state organization, a university, a company, a project manager, a programmer. The owner decided what should be in memory and what should not (strictly speaking, that is why the Voyager computers have been working flawlessly since 1977). If a computer system has an owner, auto-programmability is no longer a problem, at least from the owner's point of view. So says the theory, and practice confirms it. For an auto-programmable system to work reliably, an owner is needed. Such is its inherent property.

What is better: to reduce the level of auto-programmability, or to sit idle and risk waiting until the Owner arrives?

The Owner, by the way, is not whoever controls the traffic. The Owner is the one who controls the contents not only of the traffic but of the entire memory, from the firmware loader to the user data. Applied to the Internet, this means access to hundreds of millions of system units with root privileges. With the power to decide what should be in memory and what should not. The Owner is the one who decides which applications are available to the user. The Owner is the one who decides which code runs on users' machines at any given moment, and changes it as necessary. The Owner sees the entire file system. To be precise, on a large network it is not the Owner personally who does all this, but his loyal servers. They know what to do. Of course, there would be no viruses. But would the price not be too high?..

If someone with the means and the lobbying power wants to bring the Internet under centralized control, he will need strong arguments. The strongest argument is the viral danger, whose rapid growth in recent months has been noticed even by people far removed from computer circles.

We believe that the inhabitants of Habrahabr do not want events to develop this way, so we advise you to invest in the most reliable and secure information systems, using the services of the best specialists. By eliminating the viral danger to your own computers and to those of your friends and acquaintances, you deprive the Evil Force of arguments in favor of centralized control over the memory contents of every system unit on the Internet. Such control would be an absolute evil for society. On the other hand, you must agree: every new DDoS attack on the servers of some Serious Organization (in particular, a DDoS attack registered from the IP address of your home or office computer because of a virus on it) is one more argument in favor of total control.

* * *

To be continued...

Source: https://habr.com/ru/post/122329/

