
Modern analog computers: is there a future?

Most of us associate the development of information technology with the digital revolution. The advent of microprocessors certainly took electronics to a fundamentally new level. But the race for the most powerful supercomputer has already lost its scientific charm: teraflops now depend directly on the amount of money and floor space available. Buy more servers and the computing capacity grows.

An idea has haunted me since my university days, and I would like to put it up for discussion by the Habr community.

Before the digital era, analog computing was a field of active development.
An analog computer is a device that performs computational tasks by operating on continuous rather than discrete data. A bit is a discrete value: one or zero. Current, voltage, pressure, temperature, brightness, power (the list goes on) are continuous quantities: their exact values cannot be measured even in principle, since everything is limited by the accuracy of the measuring instrument.

If processing logic is the natural domain of digital technology, then processing real-world data should be the natural domain of the analog computer. Yet for some reason this area of knowledge is practically abandoned. Perhaps the answer lies in some insurmountable difficulty, perhaps in something else, but over the past ten years there has been almost no progress in this direction.

Historically, most analog computers were purely hydraulic or mechanical devices that converted an input signal into an output via a function programmed into their construction, like the Kaufmann posograph, which determines the best exposure for a shot, or the Antikythera mechanism, which predicted the positions of the planets and the Sun.

A classic example of a still-current analog computer is a car's automatic transmission: as the torque changes, so does the fluid pressure in the hydraulic actuator, and the shape of this "function" can be altered by changing the design.

But it is almost indecent to cite such examples in the 21st century. Science has moved so far ahead that implementations of the simplest functions deserved to stay in the middle of the last century. Yet for some reason nothing has come to replace them.

I would like to raise the question of electronic devices that process real-world signals automatically, without digitizing them. Or else to get a convincing answer as to why, at this stage of civilization's development, no such examples exist.

Look: on the one hand, almost all interfaces to the real world are analog: the microphone, the webcam, the mouse. On its way from a physical event (the mouse was moved, a sound was made, a light was turned on) to the computer, the signal passes through an ADC, an analog-to-digital converter, where it is digitized. In the process we "coarsen" the original signal to an acceptable level. And whatever you say, we are still not very good at serious real-time processing of high-quality video (recognizing objects in it, for example).
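The "coarsening" that an ADC introduces can be sketched in a few lines. This is a toy model of an ideal converter, with made-up values (8 bits, a ±1 V full scale, a sine input); every name and number here is illustrative, not from the article.

```python
import math

def adc_quantize(x, bits, full_scale=1.0):
    """Model an ideal ADC followed by an ideal DAC: clamp the input to
    [-full_scale, +full_scale] and round it to the nearest of 2**bits codes."""
    levels = 2 ** bits
    step = 2 * full_scale / levels                 # one quantization step (1 LSB)
    x = max(-full_scale, min(full_scale, x))
    code = min(round((x + full_scale) / step), levels - 1)
    return code * step - full_scale                # the reconstructed value

# A "continuous" signal sampled at discrete points, then digitized at 8 bits.
signal = [0.9 * math.sin(2 * math.pi * t / 100) for t in range(100)]
digitized = [adc_quantize(s, bits=8) for s in signal]

# The coarsening: the reconstruction error stays within half a step.
lsb = 2.0 / 2 ** 8
max_error = max(abs(a - b) for a, b in zip(signal, digitized))
print(max_error <= lsb / 2 + 1e-12)  # True
```

More bits shrink the step, but the information lost between codes is gone for good; an analog path never takes this rounding step at all.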

If you think about it, digital signal processing has practically no analogue in nature, unlike almost everything else humanity has invented. Every living organism is built differently: it is purely an analog computer. Chemical reactions and neurons alike operate on continuous physical parameters, not on "digits". When some pattern coincides with what we receive from the real world, the brain registers "spikes"; latching onto them, it adjusts the direction of recollection and past experience and commands our senses to listen for, or look at, certain key details.

All this would be impossible if the brain were digital in nature. But how does this translate into technology?

By analogy with bit operations, any continuous physical quantity can be added, subtracted, divided, or multiplied. More interestingly, there are circuits that can integrate and differentiate analog signals. The signals themselves might be laser beams in an optical computer, or information about the brightness of individual regions of space. A processor of this kind could superimpose a two- or three-dimensional template onto a two- or three-dimensional projection of the real world, detect a spike, a resonance, where they match, and then analyze the found configuration more precisely until the desired confidence threshold is reached.
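The canonical circuit that integrates an analog signal is the op-amp integrator, whose output is V_out(t) = -(1/RC) ∫ V_in dt, produced continuously with no sampling step. The sketch below only approximates that continuous behaviour on a fine numerical time grid; the component values and the constant 1 V input are my own illustrative assumptions.

```python
# Numerical sketch of an op-amp integrator: V_out = -(1/RC) * integral(V_in).
# R, C, dt and the input are illustrative; the real circuit needs no time grid.
R, C = 10e3, 1e-6          # 10 kOhm, 1 uF  ->  RC = 10 ms
dt = 1e-5                  # simulation step, 10 us
v_out = 0.0
for step in range(1000):   # 10 ms of a constant 1 V input
    v_in = 1.0
    v_out += -(v_in / (R * C)) * dt
print(round(v_out, 3))     # -1.0: a linear ramp reaching -1 V after one RC
```

The physical circuit computes this integral "for free", as the charge on the capacitor, which is exactly the kind of operation the paragraph above has in mind.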

As a result, a whole class of tasks involving decision-making, image and sound recognition, and any interaction with the outside world should have a very efficient implementation in analog logic, thanks to the parallelization of the computation.

Solving real-world data-processing problems digitally is like hammering nails with a microscope. To rotate a picture, we would sooner use an ordinary lens than perform the equivalent operation on a digitized copy. How much would headphones cost if their noise-cancellation system were built as an ADC-processor-DAC chain?

I think the next big step in electronics will be quantum and analog systems, systems built on the principles of neural networks rather than on a digital foundation. This should be a significantly "advanced" analog technology, specialized for a particular task. We need to move away from the model of analyzing "screenshots" toward a "live image" model, from discreteness to continuity.

New developments in this area are extremely scarce.

One very interesting family of technologies, though poorly covered on the Russian-language web, is built on the principle of Cellular Neural Networks (CNN). The architecture of such systems resembles a neural network in which each cell is an independent state element informationally connected to several of its neighbors. Commercial real-time image-analysis solutions using CNNs are offered, for example, by Anafocus and Eutecus; the latter claims on its website that its systems operate at 10^12 operations per second. Similar performance is shown by the Lenslet EnLight256, an optical processor built on a different principle, VCSEL lasers.
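To make the cell idea concrete, here is a minimal numerical sketch of the standard Chua-Yang CNN state equation, dx_i/dt = -x_i + A·f(x) + B·u + z, integrated with explicit Euler on a 1-D "image". The templates A and B are toy edge-detection values of my own choosing; they have nothing to do with the actual Anafocus or Eutecus products.

```python
def f(x):
    # Standard CNN output nonlinearity: piecewise-linear saturation to [-1, 1].
    return 0.5 * (abs(x + 1) - abs(x - 1))

def cnn_step(state, u, A, B, z, dt=0.05):
    """One explicit-Euler step of dx_i/dt = -x_i + A*f(x) + B*u + z for every
    cell, each coupled only to its two neighbors (boundary cells replicated)."""
    n = len(state)
    def at(seq, j):                       # replicate values at the edges
        return seq[min(max(j, 0), n - 1)]
    new_state = []
    for i in range(n):
        feedback = sum(A[k] * f(at(state, i + k - 1)) for k in range(3))
        control = sum(B[k] * at(u, i + k - 1) for k in range(3))
        new_state.append(state[i] + dt * (-state[i] + feedback + control + z))
    return new_state

u = [0, 0, 0, 1, 1, 1, 0, 0]             # input "image": a bright bar
A, B, z = [0, 2, 0], [-1, 2, -1], 0.0    # toy edge-detecting templates
x = [0.0] * len(u)
for _ in range(200):                      # let the network settle
    x = cnn_step(x, u, A, B, z)
y = [f(v) for v in x]
print([i for i, v in enumerate(y) if v > 0.5])  # [3, 5]: the bar's edge cells
```

Every cell updates simultaneously and communicates only with its neighbors, which is why hardware CNN chips can evaluate the whole image in one analog settling transient instead of a pixel-by-pixel loop.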

It is also clear that full-fledged decision-making and robotic control systems need more information about the world, or about the object under study, than an ordinary camera provides. Look at nature: smells, brightness, temperature, and sound all complement one another. Stereo vision and the ability to view the world from different vantage points also play a significant role in understanding what is happening around you. All this means that the amount of information to be processed by fuzzy logic will be simply enormous. The current underdevelopment of speech and image recognition systems stems precisely from the fact that they receive very limited input, riddled with losses, distortions, and noise, and there is simply nothing to process a large volume of information with.

One can hope that over the next ten to twenty years we will not mindlessly multiply processor counts and clock frequencies, or keep building systems around the awkward ADC-processor-DAC chain where only the central element is really needed, but will instead arrive at a fundamentally different solution, one better suited to the task.

So is there a future for analog computers?

Source: https://habr.com/ru/post/146680/
