In the previous issue, we stated that the field of information security is in a severe crisis. Its essence is the mass use of computers of inherently unreliable design for solving vital tasks. From the outside, some manifestations of the crisis are plainly visible: regular incidents in the computers of companies and government organizations around the world, accompanied by scandals and market upheavals. But this is only the tip of the iceberg. The growing list of affected brands suggests that this is not a series of accidents but a systematic pattern.
The unreliability of a standard modern computer lies in its susceptibility to viruses. This means it is prone to sudden, unpredictable and outwardly unnoticeable changes in the executable program (in particular, in the interests of unauthorized parties). This property, unfortunately, is built into the design itself, at the level of the processor, memory and other details of the architecture. And the more tightly the computer is integrated into the global network, the more strongly the property manifests itself.
As a result, the behavior of any individual element of this network can vary significantly depending on the behavior of other elements. From a mathematical point of view, this is a terrible architectural decision: such systems, unless managed centrally, tend toward chaos as the number of mutually influencing elements grows. Yet this is exactly how the Internet works. The flawed architecture of most of the computers it consists of makes it an almost ideal medium for the endless spread of digital infections.
Let's see how this could happen.
The von Neumann architecture is the hereditary curse of modern computers
In the spring of 1944, work on the ENIAC computer was proceeding in an atmosphere of relatively strict secrecy. By then, after several months of trial and error, its designers had gained unique experience, recognized the shortcomings of their design and begun thinking about a new computer, one far more convenient and versatile.
The design team consisted mainly of scientists from the University of Pennsylvania, one of the oldest in the United States. Its intellectual core was formed by John Mauchly and J. Presper Eckert. These pioneers began their work largely by feel, partly influenced by the ideas of John Atanasoff, who before the war had conceived an electronic machine with a fixed program for solving mathematical problems of a single type. ENIAC was intended for a wide range of military calculations, so it was provided with a means of changing the program; but that means was far from convenient.
The switching panel on which programming was performed resembled the first telephone exchanges, where young ladies connected subscribers by hand. The ENIAC program was stored as a set of mechanical switch positions. To change the program, one had to physically rearrange them on the panel, mechanically assembling a new electrical circuit.
This proved so laborious that the designers eventually conceived of a fundamentally new computer whose program could be changed very simply. The idea was to store the sequence of executed commands together with the data, intermingled, in a single format of machine words in a common RAM. Such inconceivable ease of reprogramming was revolutionary for its time. Mauchly and Eckert had invented a computer whose program could change itself! Only four decades later did people begin to suspect the danger of the wide spread of this technology.
The initiator and curator of the ENIAC project was Herman Goldstine, a military mathematician who represented the interests of the US Army at the University of Pennsylvania during the war. In the summer of 1944 he brought John von Neumann into the project; von Neumann was then working as a leading mathematician on the creation of the atomic bomb and was looking for ways to perform complex calculations quickly. The work of Mauchly and Eckert impressed him. He became seriously interested in the idea of the “stored-program computer” and by the summer of 1945, in his academic manner, had generalized, systematized and formalized the concept in the manuscript “First Draft of a Report on the EDVAC”, written within the framework of the new EDVAC project and the University of Pennsylvania's contract with the US Army. Goldstine sent copies of this report, issued in von Neumann's name and bearing his signature alone, to several dozen specialists whom he intended to involve in the project. Perhaps the publication of this strategic know-how was needed precisely so that Mauchly and Eckert could not patent the key principles of the computer (above all, the joint and equal storage of programs and data, an idea, incidentally, expressed by Alan Turing back in 1936). After all, the whole effort was funded by the government, and both the army and the university objected to project participants securing personal rights to its results. And von Neumann, initiated into the secrets of Los Alamos, was trusted by the military more than anyone else.
The report aroused keen interest. It was quickly reproduced in hundreds of copies that spread through scientific and engineering circles. Thus, in July 1945, the concept of the von Neumann architecture appeared. The war was ending, and the colossal scientific and commercial prospects of the amazing new machines loomed ahead. A scientific pilgrimage to the University of Pennsylvania began. By mid-1946, von Neumann, Goldstine and Arthur Burks (another ENIAC developer) had finally brought the concept of the “stored-program computer” up to academic standard by publishing the fundamental article “An Electronic Computing Instrument”. Mauchly and Eckert, whose names remained in von Neumann's shadow, took offense and, after delivering a course of lectures to other scientists, left the university to found their own computer company. Over the next three years, computers with joint storage of commands and data were built in the UK and Australia. By the early 1950s the von Neumann architecture had come to seem the only one possible, and its field of application limitless. Several decades remained before the appearance of the first viruses, and no one could have imagined the nightmare we now see everywhere: the uncontrolled behavior of computers. On the contrary, it was precisely their complete controllability and predictability that was long considered their most valuable quality.
Were the inventors of the computer, scientists of the highest class, really mistaken? Of course not. The von Neumann architecture was brilliant and ideal for its time, and even for the next few decades. But it was designed by professionals for professionals, and it turned out to be far from the best solution for consumer computers, which sit in the hands of millions of users and interact through a huge public network.
The fact is that a computer with the von Neumann architecture possesses a remarkable property of autoprogrammability. By this term we mean here the ability of a program to change itself: a program running in a von Neumann environment can write data into the very memory in which it resides, i.e. it is capable of self-modification.
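To make this property tangible, here is a minimal sketch in C (our own illustration with a hypothetical three-instruction machine, not the code of any real computer): program and data live in one and the same array, so a store aimed at a cell inside the program region quietly rewrites the operand of an instruction that has not yet executed.

```c
/* Toy von Neumann machine: one array holds both instructions and data,
 * so the running program can modify itself. Hypothetical instruction set. */
#include <stdio.h>

enum { HALT, PRINT, STORE };     /* PRINT addr; STORE addr, value */

int mem[16] = {
    PRINT, 12,          /* 0: print mem[12]                                  */
    STORE, 12, 99,      /* 2: mem[12] = 99 -- an ordinary write to data      */
    STORE, 9, 13,       /* 5: mem[9] = 13 -- overwrites the operand of the
                              PRINT below: self-modification                 */
    PRINT, 12,          /* 8: its operand (cell 9) has just been changed     */
    HALT,               /* 10 */
    [12] = 42, [13] = 7777                 /* the "data" region              */
};

int main(void) {
    int pc = 0;
    while (mem[pc] != HALT) {
        if (mem[pc] == PRINT) { printf("%d\n", mem[mem[pc + 1]]); pc += 2; }
        else                  { mem[mem[pc + 1]] = mem[pc + 2];   pc += 3; }
    }
    return 0;       /* prints 42, then 7777 instead of the "expected" 99 */
}
```

Nothing in the machine distinguishes the second STORE from the first: from the processor's point of view both are legitimate writes to memory. A virus exploits exactly this indifference.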
A program usually changes itself for one of two reasons: a programmer's error or a programmer's deliberate decision. Taken together, these two reasons inevitably give rise to viruses in any sufficiently large von Neumann environment that is in the hands of the masses. A von Neumann environment is safe only when it has a competent owner who fully controls everything happening in memory, from the first byte to the last and from the first processor clock cycle to the last. In the 1940s, 1950s, 1960s and 1970s this security condition was met. In 2010 it is not. At best, one computer owner in a hundred is aware, for example, that a certain movement of the mouse (or a finger) launches (or changes) executable code, that is, programs the computer to perform certain actions.
Note that the processor does not care whether the person who pressed the button and launched the code considers that action to be programming. The processor knows nothing about the person or their intentions. Punctually and conscientiously, cycle by cycle, it executes whatever program is in memory. It has the right, and the duty, to write any data into any area of memory, exactly as the program dictates. The von Neumann environment gives a person absolute freedom of action, demanding in return absolute responsibility for every bit stored. This property is still indispensable for professionals, but undesirable for the majority of the population.
What are the alternatives?
Let us return to the example of Atanasoff's machine (which, incidentally, was never finished because of the war). In essence it was a kind of elementary calculator. Such devices belong to the class of computers with a fixed, hard-wired program. By definition, there can be no viruses in them (the formal definition of a virus is the topic of our next article). In the RAM of a fixed-program computer only data can change, and the data is not executable: interpreting the contents of data memory as processor instructions is fundamentally impossible if the computer is designed that way.
Note that a computer with a fixed program need not be primitive at all: its program can be, and often is, very complex and functional. The immutability of a program and its complexity are not connected in any way.
A simple example of a device with a fixed program is the barrel organ. The protrusions on the surface of its drum hold only a few hundred bits, and the speed of access to them is quite low. The pits on the plastic surface of a compact disc can already hold about a billion bits, with a far higher access speed (incidentally, both the pressed CD and the barrel organ store information mechanically: the ones and zeros are encoded by the geometry of physical objects). The capacity and access speed of a read-only memory chip (ROM) can be several orders of magnitude greater still. In other words, the maximum length of a physically immutable program is determined only by the level of technology and its cost, and by nothing else.
[Photo caption: A fixed-program computer on a Hitachi chip. It is not susceptible to viruses and has been working (more precisely, playing) without a hitch for 28 years. Permanent memory (for the program): 2 kilobytes. RAM (for data): 80 bytes. Judge the qualifications of its developers for yourself!]
By rigidly attaching the corresponding processor (an interpreter and executor of the program's commands) to a fixed-program memory device (a barrel-organ drum, a plastic disc or a chip), we obtain a very reliable computer. It also makes sense to equip it with blocks of RAM (for non-executable data, a non-executable stack and so on) and I/O devices. This architecture differs radically from the von Neumann one in lacking both the reprogrammability property and the autoprogrammability property. An information security specialist should know its principles of operation and its capabilities well, and keep them in mind whenever making decisions.
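In contrast to the von Neumann sketch earlier, here is an equally minimal sketch of a fixed-program machine (again our own illustration, with a hypothetical instruction set): the program sits in a separate, constant "ROM" array, data sits in "RAM", and the instruction set simply has no way of addressing program memory as a write target, so self-modification is impossible by construction.

```c
/* Toy fixed-program machine: the program lives in const "ROM", data in a
 * separate "RAM"; no opcode can write into ROM, so the program can never
 * change itself. Hypothetical instruction set, for illustration only. */
#include <stdio.h>

enum { HALT, LOAD_IMM, ADD, PRINT };

static const int rom[] = {          /* program memory: read-only by design */
    LOAD_IMM, 0, 40,    /* ram[0] = 40        */
    LOAD_IMM, 1, 2,     /* ram[1] = 2         */
    ADD, 0, 1,          /* ram[0] += ram[1]   */
    PRINT, 0,           /* print ram[0] -> 42 */
    HALT
};

static int ram[8];                  /* data memory: the only writable store */

int main(void) {
    int pc = 0;
    while (rom[pc] != HALT) {
        switch (rom[pc]) {
        case LOAD_IMM: ram[rom[pc + 1]] = rom[pc + 2];        pc += 3; break;
        case ADD:      ram[rom[pc + 1]] += ram[rom[pc + 2]];  pc += 3; break;
        case PRINT:    printf("%d\n", ram[rom[pc + 1]]);      pc += 2; break;
        }
    }
    return 0;
}
```

The player-piano variant described next differs only in that the rom array can be swapped for another before the machine starts; nothing in the instruction set lets the running program perform that swap itself.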
The most interesting kind of fixed-program device is one that allows hardware replacement of the program, in full or in part. A striking example is the player piano, into which one can always insert a roll of punched tape with the desired program. The replaceable punched tape fundamentally distinguishes the player piano from a barrel organ with a non-removable drum. But by itself the ability to change the program does not produce the effect of autoprogrammability: the program can be changed only by external means, never as a result of the program's own execution. We therefore still assign such a design to the class of fixed-program devices.
Thanks to their reliability and manufacturability, fixed-program computers are produced by industry in large numbers. Examples include calculators, clocks, musical instruments, signal processors and industrial equipment controllers. Such devices are also attractive because they are cheap, but only in large-scale production. A significant component of their cost is well-written software: the one-time cost of creating it is considerable, since highly qualified specialists are needed, and a bad program cannot be fixed after the fact. On the other hand, the developer is freed from the problem of unlicensed use of this software, so the cost of high-quality development of a fixed program is justified. Most importantly, the problem of viruses, which underlies the other problems of information security, disappears completely. A computer protected from viruses is more attractive to the consumer and worth more than an unprotected one.
Note that in industrial production, instead of implementing such an architecture physically, it is often emulated, for example on top of a commercially available microcontroller with a von Neumann architecture. When done properly (hard-wiring the WP (write-protect) pin of a reprogrammable flash memory chip to ground so that it behaves as ROM, and so on), this technological trick changes nothing in essence. Emulating a secure architecture on top of an insecure one is one of the main tools in the arsenal of anti-virus protection.
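Here is a sketch of what such an emulation might look like in software (our own simplified illustration; the write protection of a real flash chip is a hardware matter and is only mimicked here): on a von Neumann chip all memory shares one address space, but every store is routed through a single bus-write routine that refuses writes into the address range declared to be ROM, the software analogue of a permanently grounded WP pin.

```c
/* Emulating write-protected program memory on a flat von Neumann memory:
 * all stores go through bus_write(), which rejects writes into the range
 * we treat as ROM. Addresses and sizes here are arbitrary assumptions. */
#include <stdio.h>
#include <stdint.h>

#define MEM_SIZE  256u
#define ROM_END   0x80u                 /* addresses 0x00..0x7F act as ROM */

static uint8_t memory[MEM_SIZE];        /* one shared address space        */

static int bus_write(uint8_t addr, uint8_t value) {
    if (addr < ROM_END)
        return -1;                      /* "WP pin grounded": write dropped */
    memory[addr] = value;
    return 0;
}

static uint8_t bus_read(uint8_t addr) { return memory[addr]; }

int main(void) {
    memory[0x10] = 0xAA;     /* firmware byte, written once "at the factory",
                                i.e. before write protection is engaged     */

    if (bus_write(0x10, 0x00) != 0)     /* later attempt to patch the ROM   */
        puts("write into ROM rejected");
    bus_write(0xA0, 0x42);              /* ordinary RAM write succeeds      */

    printf("rom[0x10]=0x%02X  ram[0xA0]=0x%02X\n",
           bus_read(0x10), bus_read(0xA0));
    return 0;
}
```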
[Photo caption: Viruses? Never heard of them. A typical example of a fixed-program architecture (implemented through emulation). The design is intended for continuous operation, without a single error, for decades. The processor handles and generates huge volumes of information (several megabytes per second).]
A fixed-program architecture can be viewed as a particular, rather exotic case of the classic Harvard architecture, which is characterized by the separation of program memory and data memory. This separation by itself does not make reprogramming impossible. The Harvard architecture is a useful thing in our arsenal, but not a panacea against viruses, all the more so because its name is applied rather loosely, to a greater or lesser degree, to a wide range of production chips and finished consumer devices.
As you may already have realized, the physical separation of programs and data in a computer, their mutual isolation and their logical inequality within the system are the key to a sound solution of the virus problem. Ideally, programs should be immutable and data non-executable. This is, of course, not a sufficient condition but a necessary one, and it opens the way for further correct measures.
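On a general-purpose von Neumann machine with a memory management unit, these two requirements can at least be approximated with page protections. A minimal POSIX sketch (our own illustration, assuming a Linux or BSD system; this is not the author's proposal): the region holding the loaded program is locked read-only, while the data region is mapped without execute permission, so the hardware rather than the program's good behaviour enforces the rule.

```c
/* "Programs immutable, data non-executable", approximated with mprotect():
 * the program page becomes read-only after loading, the data page never
 * gets execute permission. Error handling omitted for brevity. */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    long page = sysconf(_SC_PAGESIZE);

    /* "Program" region: filled once at load time, then locked. */
    unsigned char *prog = mmap(NULL, page, PROT_READ | PROT_WRITE,
                               MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    memcpy(prog, "\x01\x02\x03", 3);          /* load the fixed program    */
    mprotect(prog, page, PROT_READ);          /* from now on it is "ROM"   */

    /* "Data" region: writable, but never executable (no PROT_EXEC). */
    unsigned char *data = mmap(NULL, page, PROT_READ | PROT_WRITE,
                               MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    data[0] = 42;

    /* prog[0] = 0xFF;  <- would now be stopped by the MMU with SIGSEGV */
    printf("prog[0]=%u  data[0]=%u\n", prog[0], data[0]);
    return 0;
}
```

Modern operating systems apply this discipline (often called W^X or DEP) to ordinary processes; it mitigates, though does not remove, the underlying architectural problem.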
For the participants in the computer industry (chip and computer developers, programmers, system administrators, managers and everyone else), the von Neumann architecture amounts to a de facto ban on separating programs from data. End users, both private and corporate, suffer from this ban. Without the ability to effectively separate programs from data, the task of anti-virus protection becomes very difficult. Almost every computer must be equipped with an antivirus, a software package of sophisticated functionality whose job is to compensate for an unfortunate computer design. Without the wholesale installation of antivirus software on users' computers, today's Internet, unfortunately, could not exist.
This, however, does not mean that the von Neumann architecture does not suit end users. On the contrary, they regularly and gladly buy computers based on it. Demand for them is enormous, while demand for computers of similar functionality based on alternative architectures is practically nonexistent. We will discuss the causes of this social phenomenon in one of the following issues, since it matters in the context of information security.
* * *
To be continued...