In April of this year, within the framework of the project “Collective Phenomena in Quantum Matter”, led by the leading scientist K. B. Efetov, we were visited by A. M. Zagoskin, Reader in Quantum Physics at the Department of Physics of Loughborough University, UK, and one of the founders of D-Wave Systems Inc. (1999, Canada), the company that released the world's first adiabatic quantum simulator.
Our university could not let such a significant event pass unnoticed. A. M. Zagoskin gave a lecture; we recorded it and edited a video that can be viewed here. This material will undoubtedly interest its target audience! We also took the opportunity to invite the professor to take part in our now-traditional GT column “Expert Opinion”.
Alexander Markovich kindly agreed to write a popular-science note about quantum computers and quantum engineering specifically for our corporate blog on GT. We are sure that most of the audience interested in this material is concentrated on GT!
Tomorrow we will also publish this material on our portal, where, perhaps, a genuine scientific discussion will unfold between our experts, young researchers and the author. Some of our dear readers may also want to join the discussion there.

Quantum computers, quantum engineering and quantumness
In science fiction, the main thing was radio. With it, the happiness of mankind was expected. Well, radio is here, but there is no happiness.
Ilya Ilf, “Notebooks”.
If the word “revolution” or “quantum” appears in a newspaper article about science, it is usually not worth reading further. Nevertheless, the results obtained over the last twenty years may, in time, justify the loud title of a “second quantum revolution”.
The “first quantum revolution” of the middle of the last century led to the creation of nuclear weapons and nuclear energy, semiconductor electronics, masers, lasers and superconducting devices. Without these technologies, the modern level of civilization would have been impossible. Nevertheless, they rely on the simplest quantum effects, in the sense that these can be described by a small number of variables even when a macroscopic number of particles is involved. Thus, the quantum theory of solids operates mainly with one- and two-particle Green's functions. Superconductivity is a good example: despite its essentially quantum nature and the fact that a macroscopic number of electrons enters the superconducting condensate, an ordinary superconductor is well described by a single “wave function” of coordinate and time. So only a microscopic number of degrees of freedom is involved in all the “strangeness” of quantum theory.
The “second quantum revolution” is based on more fragile effects, such as quantum entanglement at macroscopic distances and quantum superposition of macroscopically distinct states. Of course, quantum theory contains no prohibition on their existence, as the famous Schrödinger cat illustrates. Until relatively recently, however, their realization remained practically impossible and was therefore considered irrelevant. The Copenhagen interpretation, with its fundamental (though nobody knows exactly where it passes) boundary between the microworld and the macroworld, coped well with the situation.
The situation began to change after Feynman emphasized that it is fundamentally impossible to efficiently model quantum systems on classical computers, simply because the dimension of the corresponding Hilbert space grows exponentially with the size of the system. This does not contradict the achievements of quantum field theory, the theory of solids, and so on: success was achieved precisely in those problems that could be reduced to the simultaneous consideration of a relatively small number of degrees of freedom. (For example, when the behavior of a macroscopic quantum system is described in terms of an almost ideal gas of quasiparticles, and correlations above second or third order are of little importance.)
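Feynman's point can be made concrete with a back-of-the-envelope estimate (a hypothetical illustration of my own, not from the original text): storing the full state vector of an n-qubit system requires 2^n complex amplitudes, so the memory cost doubles with every added qubit.

```python
# Rough estimate: memory needed to store the full state vector of an
# n-qubit quantum system, assuming one complex128 amplitude (16 bytes)
# per basis state. The count 2**n is exact; the byte size is a convention.
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(f"{n} qubits: {state_vector_bytes(n):,} bytes")
```

Around fifty qubits the required memory (about sixteen petabytes under these assumptions) already exceeds any classical machine, which is the quantitative content of the exponential wall described above.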
The operation of a digital quantum computer relies on precisely those states of a quantum system that classical methods of computation cannot handle. This sparked interest in their experimental realization, despite considerable skepticism even from people like Tony Leggett, winner of the 2003 Nobel Prize for his fundamental work on the theory of macroscopic quantum tunneling and superposition. Indeed, “cat” states of the form |0000...0⟩ + |1111...1⟩ are very fragile compared to the factorized states ∏_j (|0⟩_j + |1⟩_j). The study of the mechanisms and rate of their destruction (the “problem of the quantum-classical transition”) continues, having suddenly turned from a rather abstract, semi-philosophical branch of the foundations of quantum mechanics almost into an engineering discipline. But the main thing is that such states turned out to be far more stable than expected, and that so far no experimental indications of a fundamental prohibition on them have appeared.
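The two kinds of states contrasted above can be written out explicitly for a handful of qubits (a minimal NumPy sketch of my own; the function names are mine, not the author's):

```python
import numpy as np

def cat_state(n):
    """'Cat' (GHZ) state (|00...0> + |11...1>)/sqrt(2) as a length-2**n vector."""
    v = np.zeros(2 ** n)
    v[0] = v[-1] = 1 / np.sqrt(2)
    return v

def product_state(n):
    """Factorized state of n independent qubits, each in (|0> + |1>)/sqrt(2)."""
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    v = np.array([1.0])
    for _ in range(n):
        v = np.kron(v, plus)  # tensor product builds up the n-qubit register
    return v

for n in (2, 4, 8):
    overlap = abs(cat_state(n) @ product_state(n)) ** 2
    print(f"n={n}: |<cat|product>|^2 = {overlap:.4f}")
```

The overlap halves with every added qubit, so the cat state occupies an exponentially small corner of Hilbert space relative to any product state, which hints at why local noise degrades it so quickly.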
As a result, since the end of the last century such progress has been made in the fabrication and control of artificial essentially quantum structures (from individual superconducting qubits up to the current D-Wave processors) that theory has fallen behind. Current theoretical and computational methods do not allow us to predict, analyze, or simulate the behavior of such structures.
By “essential quantumness”, for want of a better term, I mean the existence in the system, at every moment of time, of a sufficiently large number of degrees of freedom in a state of quantum superposition. The quantum state vector of such a system “lives” in a very large Hilbert space. Such a vector generally can be neither simulated nor measured. Even a hundred qubits is almost the limit. Not without reason, coherent quantum behavior in D-Wave processors was directly demonstrated only for groups of about a dozen qubits, for which it was possible to measure the quantum state and build a quantitative model against which the measurement results could be compared. The behavior of a five-hundred- or thousand-qubit processor has to be characterized indirectly: by the statistics of successful and unsuccessful attempts to find the ground state of the system with a random choice of its parameters.
Thus, even if there is no fundamental prohibition on the existence of arbitrarily large “cats”, and it is in principle possible to create a universal quantum computer capable of efficiently modeling the behavior of large essentially quantum systems, the road to it is blocked for now. Quantum systems that can still be modeled by existing methods are too small to work as a universal quantum computer. And we cannot yet construct a system of sufficiently large size, because it is impossible to predict or characterize its behavior. Moreover, such a system will be quite complex, and the task of its design, fabrication, characterization, debugging and operation is already an engineering task. Quantum engineering needs to be created.
One definition of engineering is the creation of reliable structures from unreliable elements. In our case, entirely new levels of unreliability arise due to the notorious fragility of “cat” states, which cannot be compensated for by duplicating systems or checking individual structural elements. Standard engineering methods are insufficient. But the general engineering approach, with its focus on results, its use of estimates, phenomenological models, heuristics and intuition, and its habit of having to satisfy incompatible requirements, can be fruitful where calculation from first principles is impossible.
Engineering can be divided into the engineering of elementary units, the engineering of structures, and systems engineering. Applied to our field, the first concerns individual quantum bits, small arrays of them, and the corresponding control circuits. Here, on the whole, everything is clear: the theory is well tested in experiments for a variety of implementations of these devices. The last deals with the integration of quantum and non-quantum devices into large systems, and this is not yet relevant, because the middle link is missing. Structural engineering, by definition, is concerned with predicting the properties of a structure from the properties of its structural elements. This is precisely what we cannot yet do even for existing structures, and it is here that efforts must be concentrated.
Of course, neither “quantum engineering intuition” nor quantum engineering itself can develop otherwise than through the regular application of quantum theory to the design and testing of new essentially quantum devices. A universal quantum computer is not the only such device, and apparently not the most interesting or useful one. (Although it was precisely the frightening prospect of its creation and its use for breaking RSA encryption that played a key role in attracting attention and money to this field of research.) More realistic, for example, are quantum optimizers like D-Wave's: in essence analog devices, a sort of quantum slide rule, tuned to a fairly accurate solution of a limited but important class of problems. Quantum metamaterials are also interesting: artificial media with a sufficient degree of quantum coherence, with predicted amusing properties and possible uses as sensors or for image processing. In short, humanity will find which nuts to crack with these royal seals. The main thing is to try to make them. And success in the creation and application of essentially quantum devices will be that very social practice which serves as the criterion of truth, and which will give more to our understanding of quantum theory than any number of university lectures and, especially, popular books and TV shows.
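To make “a limited but important class of problems” concrete: an adiabatic optimizer of the D-Wave type is built to find low-energy spin configurations of an Ising model. A toy brute-force version (my own sketch with made-up couplings; a real annealer targets sizes where exhaustive search is hopeless) looks like this:

```python
import itertools

# Ising energy E(s) = sum_(i,k) J_ik * s_i * s_k + sum_i h_i * s_i,
# where each spin s_i is +1 or -1. The optimizer's job is to minimize E.
def ising_energy(spins, couplings, fields):
    e = sum(h * s for h, s in zip(fields, spins))
    e += sum(j * spins[i] * spins[k] for (i, k), j in couplings.items())
    return e

def ground_state(n, couplings, fields):
    """Exhaustive search over all 2**n spin configurations (toy sizes only)."""
    return min(itertools.product((-1, 1), repeat=n),
               key=lambda s: ising_energy(s, couplings, fields))

# Hypothetical 3-spin antiferromagnetic chain: neighbours prefer opposite spins.
J = {(0, 1): 1.0, (1, 2): 1.0}
h = [0.0, 0.0, 0.0]
best = ground_state(3, J, h)
print(best, ising_energy(best, J, h))
```

The classical search cost doubles with every added spin; the promise of an analog quantum optimizer is to reach good low-energy configurations without enumerating them.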
Now the importance of this subject is gradually gaining recognition. After the initial hype and inflated expectations, and several years of steady cooling and skepticism, a new rise has begun. Its signs are the £250 million recently allocated in the UK, the billion euros newly promised in the European Union, the investments of players such as Google and NASA in D-Wave Systems, the Canadian Prime Minister's press conference at a research institute, where he (briefly and incorrectly) explained to journalists how a quantum computer works, and so on. The interest of financiers, businessmen, politicians and the military is understandable, although, as always, both the benefits and the troubles of a fundamentally new technology will not lie where they are expected.
As for the dangers, the main threat of the “second wave” quantum technologies to, for example, the global financial sector is not at all that someone will start breaking codes en masse in order to transfer money from other people's accounts or steal economic secrets. The main threat, whose realization does not even require a universal quantum computer, is that operational control of the global economy will become possible in real time, which simply eliminates the entire financial sector as utterly useless.
upd:
As for the benefits: no Grover's algorithm, for example, will help dig through the heaps of information garbage in which the world is drowning (from Facebook to “improving the quality of scientific research and higher education”, transparent reporting, global rankings and other “sharing of best practice”), unless the generation of this garbage is reduced. Returning to the epigraph: no technology by itself solves the problems of humanity or brings it happiness. The main benefit, as always, will be the broadening and deepening of our understanding of the laws of nature.
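For scale, Grover's algorithm offers only a quadratic speedup: an unstructured search over N items needs on the order of sqrt(N) quantum queries instead of roughly N/2 classical ones. The standard estimate is about floor((pi/4) * sqrt(N)) iterations for a single marked item; the snippet below is my illustration, not part of the original text.

```python
import math

def grover_iterations(num_items):
    """Approximate optimal number of Grover iterations for one marked item."""
    return math.floor(math.pi / 4 * math.sqrt(num_items))

for exp in (10, 20, 40):
    n = 2 ** exp
    print(f"N = 2**{exp}: ~{n // 2:,} classical queries vs "
          f"~{grover_iterations(n):,} quantum iterations")
```

A quadratic speedup is real but modest: it shortens the search, yet does nothing about the volume or quality of what is being searched, which is the point made above.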