
What the computing systems of the future might look like

We discuss new technologies that may appear in data centers, and not only there.



Photo: jesse orrico / Unsplash

It is believed that silicon transistors are approaching their technological limit. Last time we talked about materials that could replace silicon and discussed alternative approaches to transistor design. Today we turn to concepts that could transform the principles of traditional computing systems: quantum machines, neuromorphic chips and DNA-based computers.

DNA computers


This is a system that uses the computational capabilities of DNA molecules. DNA strands consist of four nitrogenous bases: cytosine, adenine, guanine, and thymine. By linking them in a specific sequence, you can encode information. To modify the data, special enzymes are used that extend the DNA strands through chemical reactions, as well as cut and shorten them. Such reactions can run in different parts of the molecule simultaneously, which makes parallel computation possible.
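To get a feel for how data can be mapped onto a strand, here is a minimal sketch in Python. It assumes a simple two-bits-per-base mapping, a convention often used to explain DNA storage; it is not the encoding scheme of any specific experiment.

```python
# Illustrative only: encode binary data as a strand of the four bases,
# two bits per base (the mapping itself is an arbitrary assumption).
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map every pair of bits to one base: A, C, G or T."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Reverse the mapping: every four bases back into one byte."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

print(encode(b"Hi"))           # "CAGACGGC"
print(decode(encode(b"Hi")))   # b"Hi"
```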

The first DNA-based computer was introduced in 1994. Leonard Adleman, a professor of molecular biology and computer science, used several test tubes with billions of DNA molecules to solve a directed Hamiltonian path problem (often described as a variant of the traveling salesman problem) on a seven-vertex graph. Adleman marked its vertices and edges with DNA fragments of twenty nitrogenous bases each, and then applied the polymerase chain reaction (PCR) method.
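As a rough software analogue of Adleman's generate-and-filter approach, the sketch below assembles random paths and then filters them, the way ligation produces random routes in the test tube and PCR-based steps weed out the wrong ones. The seven-vertex graph here is invented for illustration and the filtering is done with loops rather than enzymes.

```python
# Illustrative analogue of Adleman's experiment; not the real protocol.
import random

# Hypothetical 7-vertex directed graph (made up for the example).
EDGES = [(0, 1), (0, 3), (1, 2), (1, 3), (2, 4), (3, 2), (3, 4),
         (4, 5), (5, 6), (2, 6), (4, 1)]
START, END, N = 0, 6, 7

def random_path(length: int) -> list[int]:
    """Step 1: chain compatible edges at random, like ligated DNA fragments."""
    path = [START]
    for _ in range(length - 1):
        options = [v for u, v in EDGES if u == path[-1]]
        if not options:
            break
        path.append(random.choice(options))
    return path

# Steps 2-4: keep only paths that end at the right vertex, have exactly
# N vertices and visit each vertex once (the filtering stages).
candidates = (random_path(N) for _ in range(100_000))
answers = {tuple(p) for p in candidates
           if len(p) == N and p[-1] == END and len(set(p)) == N}
print(answers)   # {(0, 1, 3, 2, 4, 5, 6)} for this toy graph
```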
The disadvantage of Adleman’s computer was its narrow focus: it was built to solve a single task and could not handle others. Since then, the situation has changed. At the end of March, scientists from Maynooth University and the California Institute of Technology presented a computer into which data is loaded in the form of DNA sequences and which can be reprogrammed.

Such a system could open the way to a new type of computing, but the problem of slow data input and output still has to be solved (the sequencing process is expensive and takes a long time).

Despite these difficulties, experts say that in the future DNA computers the size of modern desktops will surpass supercomputers in performance. They could be used in data centers that process large data sets.

Neuromorphic processors


The term “neuromorphic” means that the chip’s architecture is modeled on the principles of the human brain. Such processors emulate the work of millions of neurons with their processes, axons and dendrites. The former are responsible for transmitting information, the latter for receiving it. Neurons are connected to one another by synapses, special contacts that transmit electrical signals (nerve impulses).
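To make the terminology concrete, a neuromorphic core can be thought of as a large array of simple spiking units. Below is a minimal Python sketch of one such unit, a leaky integrate-and-fire neuron; the threshold and leak values are arbitrary and do not describe any particular chip.

```python
# Illustrative leaky integrate-and-fire (LIF) neuron; constants are made up.
import numpy as np

def simulate_lif(input_current, threshold=1.0, leak=0.95, reset=0.0):
    """Integrate incoming current, leak a little each step,
    and emit a spike (1) whenever the potential crosses the threshold."""
    potential, spikes = 0.0, []
    for current in input_current:
        potential = potential * leak + current   # integration with leakage
        if potential >= threshold:               # threshold crossed: fire
            spikes.append(1)
            potential = reset                    # reset after the spike
        else:
            spikes.append(0)
    return spikes

rng = np.random.default_rng(0)
inputs = rng.uniform(0.0, 0.3, size=50)          # random "synaptic" current
print(simulate_lif(inputs))                      # spike train of 0s and 1s
```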

Neuromorphic systems were already being discussed back in the 1990s, but serious development in this area began in the 2000s. Experts from IBM Research took part in the SyNAPSE project, whose goal was to develop a computer with a non-von Neumann architecture. As part of this project, the company designed the TrueNorth chip, which emulates one million neurons and 256 million synapses.

IBM is not the only company working on neuromorphic processors. Intel has been developing the Loihi chip since 2017. It contains 130 thousand artificial neurons and 130 million synapses. A year ago, the company completed a prototype built on a 14-nm process.

Neuromorphic devices can accelerate the training of neural networks. Unlike classic processors, such chips do not need to constantly fetch data from registers or memory: all information is stored directly in the artificial neurons. This makes it possible to train neural networks locally (without connecting to external storage holding the training data), as the sketch below illustrates.
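As a very rough illustration of what "local" learning means here, the following sketch applies a simplified Hebbian-style update, in the spirit of the spike-timing-dependent plasticity (STDP) rules discussed for such chips: each weight change depends only on the two neurons a synapse connects. All numbers are invented for the example.

```python
# Illustrative local learning rule (simplified, Hebbian/STDP-like).
# Each synaptic weight is updated from the activity of its own pre- and
# post-neuron only, so no external memory or dataset is consulted.
import numpy as np

rng = np.random.default_rng(1)
weights = rng.uniform(0.0, 0.5, size=(4, 3))     # 4 pre-neurons -> 3 post-neurons

def local_step(weights, pre_spikes, post_spikes, lr=0.05):
    """Strengthen a synapse when both neurons fire together,
    weaken it slightly when only the post-neuron fires."""
    pre = pre_spikes[:, None].astype(float)      # column of 0/1 pre-spikes
    post = post_spikes[None, :].astype(float)    # row of 0/1 post-spikes
    weights += lr * pre * post                   # potentiation (co-activity)
    weights -= 0.5 * lr * (1 - pre) * post       # depression (post without pre)
    return np.clip(weights, 0.0, 1.0)

for _ in range(100):                             # purely local updates
    pre = rng.integers(0, 2, size=4)
    post = (pre @ weights > 0.7).astype(int)     # post-neurons over threshold
    weights = local_step(weights, pre, post)
print(weights.round(2))
```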

It is expected that neuromorphic processors will find application in smartphones and Internet of Things devices. But for now, there is no reason to talk about large-scale adoption of the technology in consumer devices.

Quantum machines


Quantum computers are built on qubits, whose operation relies on two principles of quantum physics: entanglement and superposition. Superposition allows a qubit to be in the zero and one states at the same time. Entanglement is a phenomenon in which the states of several qubits are interrelated. Together, these effects make it possible to operate on zero and one simultaneously.
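To make superposition and entanglement tangible, here is a minimal sketch that simulates two qubits with plain NumPy (a state vector of four amplitudes), without any quantum SDK. Preparing a Bell state shows both effects at once: the Hadamard gate creates superposition and the controlled-NOT gate entangles the qubits.

```python
# Minimal two-qubit state-vector simulation (illustration, not a real device).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],                   # controlled-NOT: entangles qubits
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=float)      # start in |00>
state = np.kron(H, np.eye(2)) @ state            # superposition on the first qubit
state = CNOT @ state                             # Bell state (|00> + |11>) / sqrt(2)

probabilities = state ** 2                       # measurement probabilities
for basis, p in zip(["00", "01", "10", "11"], probabilities):
    print(f"|{basis}>: {p:.2f}")                 # 0.50 for |00> and |11>, 0 otherwise
```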


Photo: IBM Research / CC BY-NA

As a result, quantum computers solve a number of problems much faster than traditional systems. Examples include building mathematical models in finance, chemistry and medicine, as well as cryptographic operations.

To date, quantum computing is being developed by a relatively small number of companies. Among them are IBM with its 50-qubit quantum computer, Intel with a 49-qubit chip, and IonQ, which is testing a 79-qubit device. Google, Rigetti and D-Wave also work in this area.

It’s too early to talk about mass adoption of quantum computers. Even setting aside the high cost of the devices, they have serious technological limitations.

In particular, quantum machines operate at temperatures close to absolute zero, so such devices are installed only in specialized laboratories. This is a necessary measure to protect the fragile qubits, which can maintain superposition for just a few seconds (any temperature fluctuation leads to decoherence).

However, at the beginning of the year IBM introduced a quantum computer capable of working outside a laboratory with a tightly controlled environment, for example in a company’s own data center. The device cannot be bought yet; its computing power can only be rented through a cloud platform. The company promises that in the future this computer will be available for purchase, but it is not yet known when that will happen.




Source: https://habr.com/ru/post/457156/

