
Thin clients, thick servers

The idea of the thin client was finally realized in the mid-2030s, although the term itself had been discussed for half a century before that. The client-server applications that had dominated until then turned out to be a dead-end branch of evolution: the idea of a thin client in the form of a web browser had mutated to the point of absurdity. The client half of the application, the one running on the user's machine, demanded more resources than the server, even though the latter served many users at once. The browser strove to consume every resource the user's machine had, becoming the fattest client in the entire history of information systems.

Attempts to standardize the technologies of this stack, the most sought-after on the market, were ignored by developers, who, driven by demand, thought about the compatibility and longevity of their products last. The free market lives by wild laws. Hesitate for a minute, an hour, a day, pondering lofty engineering values, and a more cynical but shrewd competitor will ship his own version. Yes, it will be incompatible with everything and everyone, past and future, but he will have time to profit from a billion eager users today. Tomorrow it will ship as a new version, incompatible yet again, but backed by billions in investment and expectations. And you will be stuck in yesterday, studying dusty, long-forgotten specifications. No users, no product, no investment, no future, no present.

For a market economy to regulate itself and grow freely, it needs a high degree of freedom and a minimum of restrictions. Institutions of strict control only get in the way; oversight can be reduced to nothing more than the prevention of open lawlessness. In theory.

When the information technology industry became the biggest-budget market on the planet, there was, of course, little room left for engineers in key positions. Should a conservatory graduate produce Eurovision? Would a talented surgeon be of any use as Minister of Health? The world is run not by the most intelligent and competent people, but by the most businesslike.
The lot of experts is far more modest than they would like to believe. The most they can expect is to voice an expert opinion, and even then only when asked. And they will be asked when playing by the wild rules of the market becomes too expensive: when profits are at stake, no investment pays off, and nobody knows what to do.

After decades of ignoring the Consortium's recommendations, the mutation of the thin client reached a critical point. The sharks of the information technology market had to bow to the experts and ask them to devise a fundamentally new solution, one that could cut the Gordian knot of the XXI century: a client, conceived as thin, that weighs an order of magnitude more than any server doing all the useful work, and that grabs from the user's machine everything it can possibly grab, by any means, from the operating system. Application responsiveness was getting worse every year, quality lower, communication channels hopelessly clogged. The market was not just global but, one could say, absolute: the number of Network users had become equal to the population of the planet, within statistical error.

The Crisis Coordination Committee (K3.org) was assembled, and within it the masters among the experts were given full authority and the best of conditions: science had not seen such an infusion of money and people, perhaps, since the days of the nuclear race. But this time the holiday came to the computer scientists' street, not the physicists', and their glory was far more public: no secrecy, no militarist undertones; on the contrary, honor and faith from every person on Earth.

In 2032, the first version of a new network protocol standard was introduced, focused on:
  1. a genuinely thin client
  2. a genuinely thick server
  3. efficient use of modern communication channels

The main content of the packets carried by the new protocol was to be vector (!) hypertext. Conceptually, the new format was a compromise between classic hypertext and a streaming media broadcast.
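The standard itself exists only in this story, but the idea is easy to picture: a packet carries drawing primitives and text runs instead of markup to be executed. A minimal sketch, assuming a JSON encoding and with every field name invented for the example:

```python
# Purely illustrative: the protocol is fictional, so every field name below is
# an invention for this sketch, not part of any real standard.
import json
from dataclasses import dataclass, field

@dataclass
class VectorNode:
    """One primitive of the vector hypertext: a shape, a text run, or a link."""
    kind: str                                    # "path", "text", or "link"
    points: list = field(default_factory=list)   # control points for "path" nodes
    text: str = ""                               # content for "text" nodes
    href: str = ""                               # target for "link" nodes

@dataclass
class Packet:
    """One server-to-client frame: document fragments plus animation commands."""
    session_id: str
    nodes: list[VectorNode]
    animations: list   # e.g. [{"node": 0, "op": "translate", "dx": 4, "dy": 0}]

def encode(packet: Packet) -> bytes:
    """Serialize a packet for the wire (JSON chosen here only for readability)."""
    return json.dumps({
        "session": packet.session_id,
        "nodes": [vars(n) for n in packet.nodes],
        "animations": packet.animations,
    }).encode("utf-8")
```

A real standard would surely prefer a compact binary encoding; JSON merely keeps the sketch readable.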

Clients in such a network were required to render vector graphics correctly, send all user events to the server, and interpret animation commands, applying them to the "vector" document received from the server at the start of the session. No client-side scripts. No executable code on the user's side at all, apart from the browser and the peripheral drivers.
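What such a client's main loop might reduce to, as a hedged sketch: it forwards raw input upstream and draws whatever frames come back, executing nothing. The event source, the renderer, and the one-frame-per-line wire format are all assumptions made for illustration.

```python
# Illustrative only: read_input_event, render, and the newline-delimited
# framing are assumptions for this sketch, not part of any real protocol.
import json
import socket

def thin_client_loop(host: str, port: int, read_input_event, render) -> None:
    """Forward every user event upstream; draw every frame that arrives back."""
    with socket.create_connection((host, port)) as conn:
        wire = conn.makefile("rwb")
        while True:
            event = read_input_event()   # e.g. {"type": "click", "x": 10, "y": 20}
            if event is not None:
                wire.write(json.dumps(event).encode("utf-8") + b"\n")
                wire.flush()
            frame = wire.readline()      # blocks until the server pushes a frame
            if not frame:
                break                    # server closed the session
            render(json.loads(frame))    # draw the vector frame; nothing executes
```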

This concept places high demands on the latency between endpoints in the network: as ping grows, responsiveness falls. But as the technology spread, many problems that were unsolvable for the previous stack simply lost their meaning. The client configuration became minimal, standardized, far cheaper and simpler: no permanent storage, no need for a high-performance processor or large amounts of RAM. In essence, it is a TV with input devices, connected to the network.
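The ping/responsiveness tradeoff is easy to put in numbers. A back-of-the-envelope calculation, with all figures assumed for illustration, shows that even at fiber speeds an intercontinental round trip eats a 100 ms "feels instant" budget:

```python
# Back-of-the-envelope only: all numbers here are assumptions, not measurements.
FIBER_KM_PER_S = 200_000   # light in optical fiber travels at roughly 2/3 c
BUDGET_MS = 100            # a common rough threshold for "feels instant"

def echo_latency_ms(distance_km: float, server_frame_ms: float = 10.0) -> float:
    """Round trip through fiber plus one server-side render pass."""
    propagation_ms = 2 * distance_km / FIBER_KM_PER_S * 1000
    return propagation_ms + server_frame_ms

for km in (50, 1_000, 10_000):
    verdict = "ok" if echo_latency_ms(km) <= BUDGET_MS else "laggy"
    print(f"{km:>6} km -> {echo_latency_ms(km):6.1f} ms ({verdict})")
```

Within a continent the scheme holds; across oceans it would push operators to place server farms close to their users.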

On the server side, by contrast, significant complication was required. But this was an acceptable trade: computing power there is potentially unlimited, and server infrastructure is by definition easier to control and to complicate than a remote client. The star of the "stateful" server architecture had risen: now the server had to not only know each client but react to its every event as quickly as possible.
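What "knowing each client" might mean in practice, as a sketch under those assumptions (every name below is invented): the server holds a live session object per user and answers each input event with a freshly rendered frame.

```python
# Sketch of the stateful side, all names invented for illustration: the server
# keeps a live document per client and answers each event with a frame.
import json

class Session:
    """Per-client state: the current vector document and a cursor position."""
    def __init__(self, session_id: str):
        self.session_id = session_id
        self.document: list = []   # vector nodes currently shown to this client
        self.cursor = (0, 0)

    def handle_event(self, event: dict) -> bytes:
        """Apply one client event and return the frame to push back."""
        if event.get("type") == "move":
            self.cursor = (event["x"], event["y"])
        # ...hit-testing, document mutation, animation scheduling would go here...
        frame = {"session": self.session_id,
                 "cursor": list(self.cursor),
                 "nodes": self.document}
        return json.dumps(frame).encode("utf-8") + b"\n"

sessions: dict[str, Session] = {}   # unlike a stateless server, every client is remembered

def on_event(session_id: str, event: dict) -> bytes:
    """Route an incoming event to its session, creating one on first contact."""
    session = sessions.setdefault(session_id, Session(session_id))
    return session.handle_event(event)
```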

The masters of the market were satisfied. The new standards, although they demoted some areas of the industry to the status of "light" industry (makers of user hardware, for example), on the whole allowed the rich to become even richer: where controlling user activity once required extra effort and legal tricks, now, with the client abolished as an autonomous software and hardware platform, the game was always played on their field, that is, on the servers.

Source: https://habr.com/ru/post/306772/

