
Is IT developing or slowing down?

The progress of mankind over the past three or four hundred years has been exponential. By human progress I mean the combined contribution of all sciences and technologies: mathematics and physics, biology and chemistry, engineering disciplines, architecture and industry. Many new fields have appeared, such as space exploration and microelectronics, genetic engineering and computer science. But each individual field develops unevenly, and its growth is not exponential.

In some fields development slows down or even stagnates. In this article I try to reflect on whether IT is approaching such a plateau.

Consider, for example, the automotive industry. You can go to a car show and see that, in principle, nothing in cars has changed significantly. Yes, speeds have increased; yes, cars have become more comfortable; all kinds of systems have appeared: ABS (anti-lock braking), climate control, heated seats, rear-view cameras, and so on. But most of this amounts to accessories and design. These are not revolutionary changes. Cars remain roughly the same as they were 30, 50, or 70 years ago. There are no flying cars, no walking cars, no cars built on fundamentally different principles (not counting Tesla and Google's self-driving car, of course).

The situation is similar in the space industry. Our country is a historical leader in space technology: the first to launch a satellite, the first to send a man into space; cosmonauts from other countries were frequent guests on our space station. But lately there have been no revolutionary changes here either: rockets are built on the same principles as 50 years ago. Mars is not settled; apple orchards are still nowhere to be seen even on the Moon. In the 1960s people thought we would occupy the entire solar system and go beyond it. Instead, we still have rockets with satellites falling into the ocean.

It seems logical that sciences and engineering disciplines develop in leaps: a new revolutionary theory appears, for example Newton's, the worldview is revised, and science moves to a new level, on which it develops slowly until the next revolutionary leap, for example Einstein's theory. A paradigm shift is followed by exponential growth, which inevitably reaches a saturation plateau.

Development slows down after the leap because what follows consists mainly of optimizations and improvements. Over time, once all the simple improvements have been applied, the remaining optimizations become very complex, often far more complicated than the original idea itself. In cars, for example, the basic idea has not changed in more than a hundred years, yet new models come out every year with ever more automatic systems: automatic transmissions, fuel-saving systems, and so on. And every complex subsystem can now run self-diagnostics and report errors.

The same applies to processors. To the original idea were added three levels of caching, pipelines, superscalar execution, branch prediction, and so on.
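To make the caching idea concrete, here is a minimal toy simulation of a direct-mapped cache. This is my own illustrative sketch, not a model of any real CPU (real caches are set-associative, multi-level, and far more sophisticated); it only shows why access patterns matter: sequential memory access reuses cached lines, while large strides defeat the cache entirely.

```python
# Toy direct-mapped cache simulator (illustrative only).
class ToyCache:
    def __init__(self, num_lines=64, line_size=8):
        self.num_lines = num_lines        # number of cache lines
        self.line_size = line_size        # elements per line
        self.lines = [None] * num_lines   # which memory block each line holds
        self.hits = 0
        self.misses = 0

    def access(self, address):
        block = address // self.line_size   # which memory block is requested
        index = block % self.num_lines      # which cache line it maps to
        if self.lines[index] == block:
            self.hits += 1                  # data is already in the cache
        else:
            self.misses += 1                # fetch the block from memory
            self.lines[index] = block

def hit_rate(addresses):
    cache = ToyCache()
    for a in addresses:
        cache.access(a)
    return cache.hits / (cache.hits + cache.misses)

n = 100_000
sequential = range(n)                # walk memory in order
strided = range(0, n * 512, 512)     # jump 512 elements each time

print(f"sequential access hit rate: {hit_rate(sequential):.2%}")  # 87.50%
print(f"strided access hit rate:    {hit_rate(strided):.2%}")     # 0.00%
```

Even in this toy model, the same amount of work produces an 87% hit rate or a 0% hit rate depending purely on the access pattern, which is the kind of subtlety these "mere optimizations" introduce.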

In the end, the complexity and cost of introducing new optimizations outweighs the potential benefit, all the low-hanging fruit has been picked, and the technology hits a plateau until the next leap. Such leaps are what we call scientific and technological revolutions. They can be triggered by the brilliant ideas of a specific person (for example, von Neumann or Alan Turing), by the mutual influence of systems, or by social, economic, and other factors.

In addition, the older a science becomes, the more discoveries have already been made in it, the more experimental experience and knowledge it has accumulated, and the more there is to study. It becomes harder for young scientists to invent something new, since to create something new you must first read and understand everything created before you. This explains the steady and significant increase in the age of Nobel laureates.

Information technology is obviously still in a stage of exponential growth. How long will it last? Why does IT appear to develop so quickly? What caused this growth?

The reasons, in my opinion, are as follows:

First, IT has been driven by the contribution of another engineering field, microelectronics, which kept cutting costs and increasing hardware power. Why did processors become more powerful? Semiconductor transistors were invented, then photolithography, then very-large-scale integrated circuits. Whether Moore's law will stop working in the coming years is a subject for a separate article.
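As a rough illustration of that exponential growth, here is a back-of-the-envelope sketch of my own: it simply assumes transistor counts double every two years, starting from the roughly 2,300 transistors of the Intel 4004 (1971), so the exact figures are approximations, not data from any source.

```python
# Back-of-the-envelope Moore's law projection (illustrative assumptions only).
BASELINE_YEAR = 1971          # Intel 4004
BASELINE_TRANSISTORS = 2_300  # approximate, widely cited figure
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year):
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")

# 1971: ~2,300
# 1991: ~2,355,200         (real chips of that era: a few million)
# 2011: ~2,411,724,800     (real chips: on the order of a billion)
# 2021: ~77,175,193,600    (large modern chips: tens of billions)
```

Fifty years of doubling turns thousands of transistors into tens of billions, which is why hardware alone carried so much of IT's apparent progress.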

The second reason: the infusion of money.



All this money pays for a huge army of developers who create millions of applications. I think saturation will come here soon: people replace their computers less often than before, corporations have already written their software, and as for the sunset of startups, that is a topic for a separate conversation.

An interesting and paradoxical observation: despite the tremendous growth in processor power, I do not see a comparable change in how fast the tasks I actually work on get done. For example, there is not much difference between the Word and Excel of 1996 and the modern versions. I watch movies the same way I did 10 years ago, only the resolution is a bit better and downloads are faster (several times faster, not thousands of times). There is no revolutionary difference between the Duke Nukem of then and of now, and many people still play Heroes III.

Here are some specific examples:


It seems to me this happens because it is impossible to rewrite such a huge amount of code from scratch, not to mention the issues of backward compatibility. Despite all the techniques such as agile and modularity, changing existing software is many times harder than writing it in the first place, yet throwing everything away and starting over is impossible. And why would you? So the core of these systems stays almost unchanged; frills and external features are added, the GUI is reworked, whatever is easy.

The IT industry is young in two senses, as Uncle Bob put it: the field itself is not that old, and the average age of programmers is very low, around 30. This is because the number of programmers grows rapidly through the arrival of young people, roughly doubling every five years. There is also a tendency for older programmers to move into managerial positions.

It is paradoxical that although the core software solves the same tasks as many years ago, the tools change quickly: languages improve, new frameworks, libraries, and approaches appear. When I started, I wrote in plain Visual Basic, in Pascal, in Delphi. The star of those languages has all but set. Back then there were no languages like C# or Scala, and nobody talked about functional programming; it was not a trend. A little earlier people did not use Delphi or OOP at all; they wrote in Fortran. I remember when people used DOS, there was no internet, no Google, no Facebook. That was 20 years ago.

What conclusion can be drawn from these two facts: more and more newcomers arrive, and new fashionable languages, frameworks, and technologies appear all the time? The software will contain many bugs and will be overcomplicated. Of course, there are many techniques designed to keep the complexity of code in check: continuous integration and agile development, TDD and unit testing, version control systems and code review. It is hard to do without version control when many people work on the same code.

And the code keeps growing. Decades ago, complex programs contained tens of thousands of lines; now they contain tens of millions, and new software pulls in hundreds of thousands of lines of third-party library code on top of that. Very often nobody on a project understands how a particular module works; sometimes nobody understands how the whole system works. There is a lot of code, written by different teams at different times, possibly with different goals, under assumptions that have since been forgotten. Some things were built with the future in mind and turned out not to be needed. The code has become very complicated.
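As a small illustration of one of those techniques, here is what a unit test looks like. This is a hypothetical toy example of mine, not code from any particular project: the test pins down the expected behavior of a function so that a later change that breaks it is caught immediately.

```python
import unittest

def monthly_payment(principal, annual_rate, months):
    """Annuity payment for a loan: a tiny function worth protecting with tests."""
    r = annual_rate / 12
    if r == 0:
        return principal / months
    return principal * r / (1 - (1 + r) ** -months)

class MonthlyPaymentTest(unittest.TestCase):
    def test_zero_interest_splits_principal_evenly(self):
        self.assertAlmostEqual(monthly_payment(1200, 0.0, 12), 100.0)

    def test_known_value(self):
        # 10,000 over 12 months at 12% a year is roughly 888.49 per month
        self.assertAlmostEqual(monthly_payment(10_000, 0.12, 12), 888.49, places=2)

if __name__ == "__main__":
    unittest.main()
```

Tests like these do not remove complexity, but they let a large, constantly changing team modify code it does not fully understand without silently breaking it.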

So the objective reality is this: systems keep growing, more and more people work on them, and the code becomes larger and more complicated.
In the future, therefore, most programmers will be maintaining huge monsters rather than writing anything from scratch. This work will be increasingly governed by rules, because the software somehow has to be kept operational. Everything will turn into well-defined bureaucratic procedures, and unauthorized deviations from them will not be tolerated. This is one more factor contributing to the slowdown of IT.

Summary


| Development factors | Slowdown factors |
| --- | --- |
| Increased hardware performance (temporary) | Increasing complexity of systems |
| Infusion of money (temporary) | |
| Increasing number of programmers (temporary) | Backward compatibility requirements |
| Specialization (frontend / backend) | Half of developers have less than 5 years of experience, hence many errors and bad code (temporary) |
| Spread of frameworks and libraries | A lot of third-party code has to be pulled in |
| Improved languages and IDEs, version control systems | Constant technology change (temporary) |
| Improved development techniques: TDD, DDD, Agile, patterns, refactoring | Management and organizational overhead due to the growing number of developers on a project |

Thank you for your attention!

Source: https://habr.com/ru/post/372889/

