
The megahertz won't be caught and the cores won't grow. What happened to technical progress in the PC?

Hi, Geektimes! Even if we ignore the fact that mankind has traded space exploration for dog costumes and gadgets, as Ray Bradbury used to say, the impression remains that the earthly "king of computing," the personal computer, is not feeling well. How is hardware performance growing, and how long can it keep growing while the proverbial cores and gigahertz are treading water?



Pessimists have the easiest life: they are not surprised by economic troubles, the slow development of technology, or natural disasters. Positive-minded enthusiasts have it harder, because the "chronic ailments" of hardware, trivial at first glance, eventually rise to their full height. Today we will assess the pace of progress of the various PC components and try to spot the coming "revolutions" in the technology, if there are any.

Processors - performance growth by the teaspoon per year


As everyone knows, the closer the deadline, the more appealing it becomes to wash the dishes. Discussions of each new generation of the x86 CPU architecture go much the same way:
- How are you guys doing with processor performance? Has it grown?
- You know, we have such great integrated graphics now!
- Wonderful, but the performance? The processor kind?
- And power consumption, you know, has gone down. Let's improve the environment together!
- But what about the speed?!
- We dug up a five-year-old computer - compared with that, our new processor is very good!
- ...


Remember: for any "sufficient" level of processor performance, sooner or later there will be a Windows Vista of its own

It would be disingenuous not to note that CPU performance does improve from year to year. Integrated graphics, useless at the dawn of their existence, have today "sent to the grave" all entry-level discrete video cards; modern processor architectures acquire support for new instructions and thanks to them "trash" their predecessors in a number of tasks (video and audio encoding); and per-clock performance keeps rising thanks to smarter branch prediction, for example.
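To see which of these instruction-set extensions a given machine actually offers, you can inspect the CPU's feature flags. Below is a minimal sketch using the third-party py-cpuinfo package (`pip install py-cpuinfo`); the exact flag names vary by platform, so treat it as illustrative:

```python
# A minimal sketch: list which SIMD/crypto extensions the local CPU reports.
# Requires the third-party py-cpuinfo package (pip install py-cpuinfo).
# Extensions like AVX2 are exactly what lets a newer CPU "trash" its
# predecessors in video and audio encoding at similar clock speeds.
import cpuinfo

info = cpuinfo.get_cpu_info()
flags = set(info.get("flags", []))

for ext in ("sse4_2", "avx", "avx2", "avx512f", "aes"):
    print(f"{ext:>8}: {'yes' if ext in flags else 'no'}")
```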

But in the everyday tasks of a home PC (games, browser, photo processing) it just so happens that new processors most often offer a laughable +5% over their predecessors. Small wonder that CPU makers so skillfully dodge head-to-head comparisons of adjacent architectures. And for the same reason they try to pass off as an achievement the ability to more or less play new games on integrated graphics, as if computer users hadn't already had their fill of low frame rates with old integrated graphics and old games...

Is it that they don't want to, or that they can't? A difficult question. The same Intel, on the one hand, bravely tries to keep to the law of the company's co-founder Gordon Moore, by which the number of transistors on an integrated circuit should double roughly every two years. But it proves unable to follow that order, because each new process node is more painful to introduce than the last, and physical limits on transistor size constrain the chipmakers' "room for creativity." Clock frequency, meanwhile, stopped growing comparatively long ago, and not because of a "marketers' conspiracy" but because, with modern superscalar architectures, megahertz no longer scale linearly as a measure of performance the way they did in the good old days: instructions still took the same time to execute, and the extra megahertz turned out to be "corn" (hollow), as overclockers like to say. And if multi-threading and higher core counts never reach the low-end models, that is not so much a marketing ploy as the price paid for the merely symbolic competition between the two leading processor manufacturers.
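For scale, here is the arithmetic behind Moore's observation; the starting transistor count is an illustrative assumption, not a figure for any particular die:

```python
# Moore's law as arithmetic: transistor counts doubling roughly every
# two years. The starting figure is an assumption for illustration.
def moores_law(start: float, years: float, doubling_period: float = 2.0) -> float:
    """Projected transistor count after `years`."""
    return start * 2 ** (years / doubling_period)

# A hypothetical 1.4-billion-transistor chip, projected ten years out:
print(f"{moores_law(1.4e9, 10):.2e}")  # ~4.5e10 - if the law actually held
```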

Such a bleak outlook still does not prevent the gradual progress of Intel Core generations (by the way, after the relatively low-clocked Skylake, a Skylake Refresh is about to arrive with much more impressive gigahertz and genuinely increased performance), but with performance gains this symbolic, even without fierce competition from the "Reds," something will have to be done sooner or later. One way to improve the CPU could be the introduction of programmable transistors, that is, chips able to work in a non-linear fashion. The idea sounds bold, though it is closer to reality than it seems; the trouble is that such an undertaking takes a great deal of time and money to develop, and the shrinking PC industry has no such luxury, nor any in sight.

There is also the conservative option of "pouring the same soup into a taller vessel": the familiar FinFET, or vertical, transistors, and after them multilayer chips on a common substrate. Drives managed it - processors can too! Speaking of drives.

Drives - many, varied and good, but not all reliable


Things are going splendidly for you, drive industry! Before us is a rare example in the computer business of a technology of the past generation (the HDD) living side by side with the current one (the SSD), and the neighborhood has turned out peaceful thanks to the two drive types' different specializations.


From left to right: NVM Express SSDs, SATA III-era solid-state drives, and hard drives

After SSDs dropped sharply in price, hard drives were retrained as "storage closets," of which it is no longer polite to ask "how goes technical progress?" or "do you plan to improve performance and access times, sir?". The HDD's mission today resembles that of trucks: hauling huge volumes of information wherever a "passenger" SSD is a poor fit, whether for financial reasons or because of rapid memory wear.

The two drive types have been combined this way and that. Seagate, for example, still produces hybrid SSHDs, somewhat more responsive than traditional hard drives yet cheaper than solid-state drives. There are also examples of successfully merging an SSD and an HDD into a single logical volume in software - Apple's Fusion Drive, where the operating system automatically keeps frequently used files on flash memory and relegates unclaimed long-term data to the slower hard drive.
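For illustration, here is a minimal sketch of the tiering idea behind such hybrid volumes: promote frequently accessed files to the fast tier, keep the rest on the slow one. The threshold and bookkeeping are assumptions for the example, not Apple's actual algorithm:

```python
# A toy model of hot/cold file tiering: frequently accessed files live on
# the SSD tier, rarely used ones on the HDD tier. Threshold is assumed.
from collections import Counter

HOT_THRESHOLD = 5          # accesses before a file is promoted (assumed)
access_counts = Counter()  # file path -> access count
fast_tier, slow_tier = set(), set()

def record_access(path: str) -> None:
    """Count an access and move the file to the appropriate tier."""
    access_counts[path] += 1
    if access_counts[path] >= HOT_THRESHOLD:
        slow_tier.discard(path)
        fast_tier.add(path)    # hot file -> SSD
    else:
        fast_tier.discard(path)
        slow_tier.add(path)    # cold file -> HDD

for _ in range(6):
    record_access("/Users/me/project.psd")
record_access("/Users/me/archive_2009.zip")
print("SSD:", fast_tier)
print("HDD:", slow_tier)
```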

As for hard drives in the classic sense, corporate clients are already free to buy 3.5-inch models with a capacity of 10 TB. The revolutionary heat-assisted magnetic recording (HAMR) technology, alas, remains a bright future without a production incarnation.

Solid-state drives, by the standards of computer components, are developing at a genuinely brisk pace: speeds are growing, the memory is improving, controllers are getting faster, form factors are shrinking. New SSDs long ago outgrew the potential of the aging SATA III interface and now progress over the less constraining PCI Express (usually four lanes for flagship models) with the NVMe protocol.

Virtually all of today's flash memory divides into MLC and TLC. MLC is the fast mainstream option, while TLC is the economy class, with a somewhat smaller rewrite endurance in solid-state drives.
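The endurance gap follows from simple cell arithmetic: every extra bit per cell doubles the number of voltage levels the controller must tell apart, and a worn cell holds those levels less precisely. A quick illustration:

```python
# Bits per cell vs. voltage levels: the more levels a cell must hold,
# the less voltage margin remains as the cell wears out.
for name, bits in (("SLC", 1), ("MLC", 2), ("TLC", 3)):
    print(f"{name}: {bits} bit(s)/cell -> {2 ** bits} voltage levels to distinguish")
```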
Terabyte drives have ceased to be fiction, and the main worry of SSD owners is the endurance of the memory cells, whose wear - unlike that of hard drives - is displayed plainly and accessibly in diagnostic utilities. And of course not all solid-state drives are equally useful, but in Kingston's case brutal marathon tests showed that even the dated HyperX 3K endured some two petabytes (2 million GB, ladies and gentlemen) of written data. Such is the resource of a quality two-year-old drive. In modern SSDs, from the "popular" best-selling UV400 to the "supermodel" Predator, reliability is at least as good.
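Some back-of-the-envelope math shows how generous two petabytes is; the daily write volume below is an assumed figure for a heavy desktop workload:

```python
# Endurance math for the ~2 PB figure cited above. The daily write volume
# is an assumption; 30 GB/day is already a heavy desktop workload.
ENDURANCE_TB = 2000          # ~2 PB written before wear-out
daily_writes_gb = 30         # assumed heavy daily write volume

days = ENDURANCE_TB * 1000 / daily_writes_gb
print(f"{days:,.0f} days = {days / 365:.0f} years of heavy use")
# ~66,667 days = ~183 years - the controller will die of old age first
```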


Drives - old and new "schools" for different tasks

The "latest fashion" in solid-state drives is three-dimensional flash memory (it does not stand apart from the rest, but likewise divides into TLC and MLC), that is, a vertical organization of the cells. It is an expensive pleasure and a hack of sorts, giving the manufacturer a chance to use NAND on an older process node to increase drive endurance. One hesitates to call that endurance redundant, but the controller in this kind of SSD will die first anyway - so whether it is worth dreaming of a perfectly preserved body while the brain ages on schedule is an open question.
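Rough arithmetic shows why stacking is attractive: die capacity grows with the layer count without shrinking the process node. The baseline below is an assumption loosely modeled on early 3D NAND generations, purely to show the scaling:

```python
# 3D NAND scaling sketch: capacity per die grows with the layer count
# at a fixed process node. Baseline figures are assumptions.
base_layers, base_gbit = 32, 128   # assumed: 32-layer die storing 128 Gbit
for layers in (32, 48, 64):
    print(f"{layers} layers -> ~{base_gbit * layers // base_layers} Gbit per die")
```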

RAM - amazing adventures of DDR4 in laptops


If with processors and drives we had to make lyrical digressions - "you understand, it's an individual question, it depends which side you look from..." - then with memory there is the hard standard of JEDEC (the Joint Electron Device Engineering Council), which everyone follows the way athletes stick to a chosen game plan from the very start.

The current standard, DDR4 SDRAM, "went out to the people" quite recently, in 2014, and is still very far from its ceiling. For example, eight-high stacking of 16-gigabit dies in a single package will eventually allow module capacities of up to 512 GB. That is a reserve for the future, but one of the most pleasant consequences of DDR4 already is the cheapening of large amounts of RAM, and that is good news. The new DDR's performance is higher and its power consumption lower than its predecessor's (thanks, Captain Obvious), and things are much rosier with error handling too.
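The 512 GB figure is simple arithmetic; the package count per module below is an assumption for a high-capacity server DIMM:

```python
# Arithmetic behind the 512 GB module figure (a sketch; real module
# organization varies).
DIE_GBIT = 16              # 16 Gbit per DRAM die
STACK_HEIGHT = 8           # eight dies stacked per package
packages_per_module = 32   # assumed chip count on a high-end server DIMM

gb_per_package = DIE_GBIT * STACK_HEIGHT / 8   # Gbit -> GB
module_gb = gb_per_package * packages_per_module
print(f"{gb_per_package:.0f} GB per package x {packages_per_module} = {module_gb:.0f} GB module")
```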

The truly mass transition to DDR4 took place in the fall of 2015, after the announcement of Intel's Skylake processors - in desktop PCs it came along with customers moving to the new platform, especially since there was not much to choose from. The point is that for sixth-generation Core chips Intel recommends either DDR3L or DDR4 memory. Classic DDR3, the most common kind in desktops, can work with these processors, but Intel cannot guarantee that the processor (with its integrated memory controller) will not be damaged by such a neighborhood.
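The voltage difference alone explains much of DDR4's efficiency win. Per JEDEC, DDR3 runs at 1.5 V, DDR3L at 1.35 V and DDR4 at 1.2 V; as a rough first-order approximation, DRAM power scales with the square of the supply voltage:

```python
# Relative DRAM power from supply voltage alone (rough V^2 approximation;
# real power also depends on frequency, termination and workload).
for name, volts in (("DDR3", 1.5), ("DDR3L", 1.35), ("DDR4", 1.2)):
    relative_power = (volts / 1.5) ** 2
    print(f"{name}: {volts} V -> ~{relative_power:.0%} of DDR3 power")
# DDR4 lands around 64% of DDR3's power from voltage alone
```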

Another remarkable fact: Apple, famous for its innovations, still equips its computers with DDR3 variants. The newest MacBook Pro, for example, makes do with just 16 GB of LPDDR3E RAM (an improved, economical LPDDR3), which caused a certain drama among the model's potential buyers. The thing is, Apple chases maximum efficiency in every component and for that reason declines the "full-blooded" DDR4. And mobile Intel processors offer no third option: Skylake is not friendly with LPDDR4, and DDR4L is still in development. Hence a situation where the very latest computers make do with an outdated RAM standard. Then again, Apple builds its laptops "turnkey," with no possibility of upgrading the RAM, so either the buyer accepts the configuration before purchase or skips the computer; there is no third way.
And wherever there is room for upgrades and configuration choices, all that is required of RAM is high-quality memory chips and a proper manufacturing culture (and where is that not needed?).


PC RAM develops in stages, but with noticeable innovations

Kingston has kept to that order since the late 1980s, when its memory made its way from Fountain Valley first to Macs and then to PCs, replacing lower-quality branded modules. Kingston now backs its modules with a lifetime warranty, which rarely has to be exercised, because their reliability remains very high.
The next generation of RAM will arrive in computers no earlier than 2019 and will initially be used in servers. Multilayer HBM chips are on their way.

Video cards - process technology as the engine of progress


The topic is too vast for an overview article, because video cards are developing rapidly, so let us confine ourselves to the trends of the last couple of years.
First, graphics accelerators are moving, timidly but deliberately, toward "playable" frame rates at 4K resolution. More often than not it is this factor (and not the crooked ports of console games) that pushes hardware developers to raise performance, and buyers to spend fresh funds on a video card upgrade.

The second driving force is virtual reality. Games and simulators with maximum immersion demand an extremely detailed picture, and this is a question not of graphics quality but of physical comfort, because VR in a certain sense "deceives the body." Whether home VR becomes suitable for long sessions depends on response time and picture quality: poor detail is among the causes of the nausea, fatigue and headaches VR can induce.
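The numbers make the challenge concrete: a 90 Hz headset (the refresh rate of the Oculus Rift and HTC Vive) leaves about 11 ms to render each frame, and the renderer must produce two views, one per eye:

```python
# Frame-time budgets at common refresh rates: budget = 1000 ms / Hz.
for target, hz in (("60 Hz monitor", 60), ("90 Hz VR headset", 90), ("120 Hz headset", 120)):
    print(f"{target}: {1000 / hz:.1f} ms per frame")
```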

As for the design of video cards, the two main innovations of recent years are the new process nodes (16 nm FinFET at NVIDIA, 14 nm FinFET at AMD) in place of the archaic 28 nm stretched over many years, and the new HBM memory type arriving to replace GDDR5.


Video cards underwent significant changes in 2016

The useful die area shrank, and chip developers' hands were "untied" for raising performance. That is why NVIDIA's Pascal architecture, not revolutionary at its core (it is the Maxwell familiar from the GTX 900 series, reworked in several directions), gained performance so sharply: the cards learned to run at high clock frequencies, and we witnessed a crushing devaluation of the previous generations' model indices. The GeForce GTX 980, for example, turned out to be roughly equivalent to the mere GTX 1060. Just like the good old days!
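A rough sanity check of that equivalence is peak FP32 throughput, estimated as shader count x 2 operations (fused multiply-add) x boost clock; the core counts and clocks below are NVIDIA's published reference figures:

```python
# Peak FP32 throughput: FLOPS ~= CUDA cores x 2 (FMA) x boost clock.
cards = {
    "GTX 980 (Maxwell)": (2048, 1.216e9),   # CUDA cores, boost clock in Hz
    "GTX 1060 (Pascal)": (1280, 1.708e9),
}
for name, (cores, clock) in cards.items():
    print(f"{name}: ~{cores * 2 * clock / 1e12:.1f} TFLOPS")
# Fewer cores, much higher clocks - nearly the same peak throughput
```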

AMD has prepared goodies of a different kind. Back in 2015 the fundamentally new R9 Fury video cards appeared, combining the graphics processor and the memory into a single cluster. That memory differs dramatically from what we are used to seeing in any generation of GDDR: the compact, multilayer, economical chips have not yet been perfected (to cope with the considerable heat, AMD had to equip the Fury accelerators with water cooling), and the newer Radeon RX 470/480 do without them. But GDDR5 performance has nowhere left to grow, and it is HBM that will deliver the next jump in video card performance.
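The bandwidth arithmetic shows what HBM buys: a far wider bus at far lower clocks. The figures below are the published specs of the R9 Fury X (HBM1) and the 8 GB RX 480 (GDDR5):

```python
# Memory bandwidth = (bus width / 8) x effective data rate per pin.
configs = {
    "R9 Fury X, HBM1": (4096, 1.0e9),   # bus width in bits, Gbps per pin
    "RX 480, GDDR5":   (256,  8.0e9),
}
for name, (bus_bits, rate) in configs.items():
    print(f"{name}: {bus_bits / 8 * rate / 1e9:.0f} GB/s")
# 512 GB/s vs 256 GB/s: double the bandwidth at far lower clocks and power
```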

The next factor in performance growth will be the low-level APIs, the best known of which are Vulkan and DirectX 12. They let developers get "closer to the hardware," bypassing the usual layers of abstraction: it becomes feasible to implement new, more efficient rendering schemes and to manage the memory of all the accelerators in an SLI setup as a single pool (though NVIDIA itself hampers this by disabling SLI support in its mid-range cards). The spread of DirectX 12 is held back by its being present only in Windows 10, while PC users' OS preferences are distributed rather differently. Game developers are in no hurry to build the bright future either, so the main achievements of the new APIs are still ahead of us.
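Why "closer to the hardware" pays off can be shown with a toy cost model: CPU time per frame is roughly draw calls x per-call driver overhead + fixed work. The overhead figures below are illustrative assumptions, not measured driver numbers:

```python
# A toy cost model of driver overhead per frame. All figures are assumed,
# purely to illustrate why cutting per-call overhead matters.
def frame_cpu_ms(draw_calls: int, overhead_us: float, fixed_ms: float = 4.0) -> float:
    """CPU milliseconds per frame: per-call overhead plus fixed work."""
    return draw_calls * overhead_us / 1000 + fixed_ms

for api, overhead_us in (("high-level API", 40.0), ("low-level API", 5.0)):
    ms = frame_cpu_ms(draw_calls=2000, overhead_us=overhead_us)
    print(f"{api}: ~{ms:.1f} ms of CPU per frame for 2000 draw calls")
# Lower per-call overhead frees the CPU budget for more calls or higher FPS
```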

More alive than all the "post-PC devices"


Even without explosive progress on all fronts, and with a stack of problems shelved for later, the personal computer still looks like the king of the hill: a practical, modular, universal computing machine that can "outshine" any of the frivolous devices of the so-called post-PC era.

Another matter is that classic desktops are no longer used to "hammer nails" - pressed into every work scenario as they once were: workstations handle the most labor-intensive tasks, while laptops, tablets and thin clients have become an intermediate link between devices for creating content and devices for consuming it. And the ranks of gaming consoles and their variations have grown (Steam Machines, for example).
It would not be entirely correct to compare the incomparable, but as practice has shown, where the computer merely loses weight, the tablet goes to the grave. So we wish good health to our last stronghold of honest digital technology among the disposable gadgets!

Even if you already have an SSD and are not planning to buy a new one, we know how to cheer you up for the new year. There is no such thing as too much RAM, so we are giving a 12% discount on all available DDR4 Predator models in the Ulmart network. Arm yourself with the promo code GEEKPR16 and pick up high-speed memory before December 31, 2016. For more information about Kingston and HyperX products, visit the company's official website. And the HyperX page with visual aids will help you choose your kit.

Source: https://habr.com/ru/post/399653/

