
We don't need it

"This rattletrap might frighten pregnant cats, but what use would it be in battle?" - General Kitchener on the first tank, 1915.

This article came about in a strange way. Initially, I sat down to write a text about the specifics of using unit tests in demanding applications. I wanted to write that computers with hundreds of gigabytes of RAM and hundreds of processor cores will soon be sitting on our desks, and then describe how this may affect the development and testing of applications.

But before I got to the point, I could already feel the comments flooding in from all sides: "I don't need such a system", "programs don't need such resources", "current resources are enough for everything", "the future belongs to video cards".

So I made a digression about why all this is needed. Then the digression grew. And in the end I simply wrote a whole post about why I consider many gigabytes and many, many cores necessary and useful.

"The potential worldwide market for copiers is no more than 5,000 units," - IBM to the founders of Xerox, 1959.

Applications that consume gigabytes of memory and several cores will not be considered resource-intensive for long. In a few years they will be ordinary programs with ordinary requirements. It will simply happen, and there is no reason to worry about it or to blame anyone for inefficient use of resources.
Now 32-gigabyte memory modules are becoming available, and Sergey Vilyanov writes notes about multi-core processors in the spirit of a "last warning". And this is good! The gradual hardware evolution of recent years may make revolutions in software possible. There is an opportunity to significantly improve both the functionality and the convenience of programs.

I once worked on an Apogee BK01 computer (and no, I am not 50 years old! :-) in a text editor that occupied 2 kilobytes of memory. Now I am writing this text in Microsoft Word, which has eaten about 30 megabytes. Conceptually, nothing has changed: in both editors I typed text, deleted characters, used search, and could save the result. However, the convenience I get from Word's functionality and design is incomparable with that old editor. It is completely clear to me where those megabytes go, and I feel the memory cost is entirely justified in exchange for the convenience.

"Americans may need the telephone, but we do not. We have plenty of messenger boys." - Sir William Preece, chief engineer of the British Post Office, 1878.

The difference between Word and the text editor on the Apogee BK01 can be called a revolution. A revolution because a beginner can start using Word, whereas only an Enthusiast with a capital E could use the editor on the Apogee BK01. This ease-of-use revolution was made possible by an increase in computing power of several orders of magnitude. And there is hope that new revolutions of simplification are still ahead.

Previously, I could not imagine why a text editor should be graphical, let alone why anyone would need not just a text editor but what Word is now. Accordingly, I could not imagine how such a program could use megabytes of memory. Today I cannot imagine why a future equivalent of Word might need a terabyte of RAM and hundreds of cores. But I am sure that it will happen, and that it will be convenient. The availability of such resources will open up opportunities that we simply do not see or consider now.

Editors may acquire artificial intelligence that will help a person create, complete the thoughts he has begun, or dynamically adapt the interface to the user, guessing his wishes. Already there are software systems that demand such highly skilled operation that they are used at only a few percent of their capabilities. I know there are already attempts to introduce artificial-intelligence algorithms into the interfaces of seismic data processing packages, since working with such packages is extremely difficult.

My knowledge of seismic surveying is zero, so the same Word can serve as an example instead. The task of artificial intelligence will be to guess the user's wishes and help automate his operations. In its simplest form I imagine it like this. Suppose a novice searches the text for a series of keywords and makes each of them bold or, say, turns it into a link. Of course, he could first read a thick manual and write a macro. But tell me honestly: how many of you began your acquaintance with Photoshop, Word, or Internet Explorer by reading the manual rather than by launching the program? These days we learn a program's capabilities gradually, while already working in it. And it would be very nice if artificial intelligence, having spotted the pattern in our actions, suggested how to simplify them: performed all the remaining replacements automatically, or showed how to use a macro and how to write one.
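The kind of replacement such an assistant might offer to automate can be sketched in a few lines. This is a minimal illustration of my own, not code from the post; the function name and the Markdown-style bolding are assumptions:

```python
import re

def bold_keywords(text, keywords):
    """Wrap every whole-word occurrence of each keyword in Markdown bold markers."""
    for kw in keywords:
        # \b limits matches to whole words, so "Word" does not match "Wordy"
        text = re.sub(rf"\b{re.escape(kw)}\b", lambda m: "**" + m.group(0) + "**", text)
    return text

print(bold_keywords("Word and Photoshop are programs.", ["Word", "Photoshop"]))
```

An assistant that noticed the user bolding the first few occurrences by hand could offer to run exactly this kind of loop over the rest of the document.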

"Television will not last six months on the market. People will soon get tired of staring at a plywood box every night," - Darryl Zanuck, 20th Century Fox film producer, 1946.

At first, interfaces with built-in artificial intelligence will be imperfect, probably even terrible, and as always users will be angry with them. :) A good analogy is the graphical interface that replaced the command line and was also heavily criticized in its day. In my opinion, this direction of interface development is inevitable, since programs keep acquiring new functionality that practically no one except specialists ever uses.

How many resources such systems will need is hard to say. But algorithms from the field of artificial intelligence are very resource-intensive, and the capabilities of modern personal computers are still very modest for such tasks.

Another use for new hardware capabilities will be greater realism. Today, various physical processes in games are still "drawn" rather than computed. In other words, games can start using the numerical modeling of physical processes that is currently implemented in specialized numerical packages which calculate the motion of liquids, gases, and so on. All this will bring a new level of realism. And such realism is achieved precisely through a huge number of processors working in parallel on different regions of space. In addition, each region of space requires a significant amount of memory to store the distribution of gases, temperature, pressure, and the other characteristics needed to model the physical process. The more memory and cores there are, the more accurate the calculations will be, and the larger the volume of space in which they can be performed. Algorithms for accurate physics simulation in a virtual world are practically unlimited consumers of resources.
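To make the "more cells, more accuracy" point concrete, here is a toy one-dimensional heat-diffusion step (my own illustrative sketch, not code from the article): each new cell value depends only on its immediate neighbours, which is exactly why such grids split naturally across many cores, and why doubling the grid doubles both the memory and the work.

```python
def diffuse_step(grid, alpha=0.25):
    """One explicit finite-difference step of the 1-D heat equation.

    Each interior cell is updated independently of the others, so a real
    engine can hand different slices of the grid to different cores.
    """
    new = grid[:]  # boundary cells are kept fixed
    for i in range(1, len(grid) - 1):
        new[i] = grid[i] + alpha * (grid[i - 1] - 2 * grid[i] + grid[i + 1])
    return new

grid = [0.0, 0.0, 100.0, 0.0, 0.0]  # a single hot spot in the middle
print(diffuse_step(grid))  # the heat spreads to the neighbouring cells
```

A game or simulation package runs millions of such cells in three dimensions, many steps per second, which is where the appetite for cores and memory comes from.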

I am sure there are many other examples where the amount of memory and the number of cores remain a bottleneck, and where increasing them will allow completely new things to be done on the computers sitting on our desks.

P.S.
You will surely ask: "And what does Intel have to do with it?" :) I will answer in advance. Intel's developments (both hardware and developer tools) bring the future of software as close as modern hardware solutions allow!

"I have traveled the length and breadth of this country and talked with the best people, and I can assure you that the fad called data processing will not last out the year," - business-books editor at Prentice Hall, 1957.

Parallelism is a new era, and one should not expect that in a couple of years single-core processors will return, only at 20 gigahertz. Take Intel Parallel Studio and get going. Moreover, parallelism opens up horizons for new innovative ideas in demanding computing and for new market segments. I wish you success.

Source: https://habr.com/ru/post/90142/

