The American Physical Society has summed up the year. Among this year's highlights are unknown particles from the depths of space, building blocks for quantum computers, thermonuclear fusion, ball lightning, and a bit of statistics and philosophy.

Possible detection of dark matter

As is well known, a significant part of our Universe consists of dark matter: particles of unknown nature that are subject only to gravitational interaction. They neither emit nor absorb electromagnetic radiation, so they cannot be observed directly through a telescope.
One of the candidates for the role of dark matter is the sterile neutrino (it has already been covered on Geektimes). With a very small probability (a decay rate of about 10⁻²¹ s⁻¹) it can decay into an ordinary neutrino and a gamma quantum. Given the assumptions about its mass, that gamma quantum should have an energy of a few keV (the X-ray range).
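A quick back-of-the-envelope check of these numbers (the ~7 keV sterile-neutrino mass below is an assumption, just one commonly discussed value; in a two-body decay of a particle at rest the photon carries roughly half of the rest energy):

```python
# Rough sanity check of the sterile-neutrino numbers quoted above.
decay_rate = 1e-21             # s^-1, decay probability per particle per second
lifetime_s = 1.0 / decay_rate  # mean lifetime, seconds
age_of_universe_s = 4.35e17    # ~13.8 billion years in seconds

m_sterile_keV = 7.0                # assumed mass of the sterile neutrino (illustrative)
E_gamma_keV = m_sterile_keV / 2.0  # two-body decay: the photon gets ~half the rest energy

print(f"lifetime ~ {lifetime_s:.0e} s, about {lifetime_s / age_of_universe_s:.0e} ages of the Universe")
print(f"expected photon energy ~ {E_gamma_keV:.1f} keV (X-ray range)")
```

With such a lifetime an individual particle essentially never decays, which is why one has to look at huge accumulations of dark matter, entire galaxies, to see the line at all.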
This year, two orbiting telescopes (NASA's Chandra and ESA's XMM-Newton) found an unusual peak at an energy of 3.6 keV in the emission of two different galaxies, with the signal sitting right at the limit of the instruments' capabilities.

The intensity of the line decreases with distance from the centre of the galaxy. Of course, this hardly counts as proof, but at least it does not contradict expectations (the density of dark matter falls off toward the periphery of galaxies), and no other reasonable explanation has been found so far. To settle the question, measurements are planned with the Japanese ASTRO-H telescope, which is due to be launched into orbit next year.
Inflation model

One of the subtle points in cosmology is the expansion of the Universe in the first moments after the Big Bang. The most plausible theory is the inflationary model, which assumes an extremely rapid expansion of the Universe at a certain stage and sidesteps a number of problems that arise in other models. If the inflationary model is correct, this should leave an imprint on the subsequent evolution of the Universe and on what we see now.
In particular, we should observe a specific feature of the cosmic microwave background (the relic radiation whose origin goes back to the Big Bang): its curl, or B-mode, polarization. It is something like the circular polarization of light, except that to measure it you need far more than a polarizer from a pair of 3D glasses.
This polarization feature was in fact reported in 2014. The instrument that managed it is called BICEP2. It is a refracting radio telescope (nice, right?) that measures the relic radiation at a frequency of 150 GHz with an array of clever superconducting sensors. The polyethylene lenses are cooled with liquid helium to 4 K, and the detector array itself down to 250 mK. The instrument sits at the South Pole so that it can keep observing the same patch of sky.
Unfortunately, it later turned out that the result could also have been caused by radiation scattering on cosmic dust. The experiment will apparently keep accumulating statistics.
Progress in inertial confinement fusion

Take a small capsule of deuterium-tritium mixture and blast it from all sides with powerful lasers. If the radiation is intense enough, it compresses the capsule to the point where the fusion reaction begins. The main problem is to build super-powerful pulsed lasers and to choose pulse parameters that maximize the energy output of the reaction.
In particular, it is extremely important to choose the right time profile for the laser pulse. Typically, the laser power increases with time, in steps (to save energy).
Compression is gentle in the first stage and much stronger at maximum laser intensity (this profile is called “low-foot”). The problem is that at some point the target apparently starts to break apart, and the reaction stops.

In 2013 (the paper was published in early 2014), researchers at the National Ignition Facility (Livermore, USA) suggested that a steeper pulse profile (“high-foot”) would prevent the premature break-up of the target. Experiments showed that the idea was right, and along the way they exposed inaccuracies in the theoretical models. A nice bonus was that positive feedback appeared in the new regime: the alpha particles produced by the fusion reactions additionally heat the mixture, maintaining the high temperature the reaction needs.
As a result, the efficiency of the reaction increased (by various measures) by about 50%. The ratio of energy produced to energy spent, however, still remains below unity.
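To make the “low-foot” / “high-foot” terminology a little more tangible, here is a purely schematic sketch of two stepped power profiles; the numbers and shapes are invented for illustration and are not the real NIF pulse shapes:

```python
import matplotlib.pyplot as plt

# Invented, schematic pulse profiles: time in ns, power in arbitrary units.
t         = [0, 2, 2, 8, 8, 12, 12, 16]
low_foot  = [1, 1, 2, 2, 6, 6, 30, 30]    # long, gentle "foot", main spike comes late
high_foot = [3, 3, 9, 9, 30, 30, 30, 30]  # higher "foot", power ramps up sooner

plt.plot(t, low_foot, label='"low-foot" (schematic)')
plt.plot(t, high_foot, label='"high-foot" (schematic)')
plt.xlabel("time, ns")
plt.ylabel("laser power, arb. units")
plt.legend()
plt.show()
```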
The first observation of the spectrum of ball lightning

This has already been covered on Geektimes.
SiV centers in diamond

Sometimes lattice defects turn out to have very useful properties. For example, if one of the carbon atoms in diamond is replaced by nitrogen and a neighbouring one is removed altogether, we get an NV centre (nitrogen-vacancy). Its energy levels are very similar to those of an atom: they are narrow, they have decent lifetimes, and the transitions between them lie in the visible and IR ranges, which makes them very convenient to manipulate.
Imagine we have a small piece of diamond and we know it contains exactly one NV centre. If we shine light on it, it will emit exactly one photon. Now take two such pieces of diamond: they will emit two absolutely identical photons. And that can already be used for experiments with quantum information.
Another nice thing is that a piece of diamond is durable. A single atom could not offer that: it can fly away, oxidize, or otherwise get lost. That is why NV centres are so well loved in quantum optics, even though they are not ideal.
The essence of this year's result is that another, similar system has been found. As the name suggests, it is practically the same thing, only with the nitrogen atom replaced by silicon. Such a centre turns out to surpass the NV centre in a number of characteristics, which means that the next few years of SiV research promise to be fruitful.
Single-photon transistors

This year the single-photon source was joined by a single-photon transistor, and not just one but two of them. This is not an electronic transistor but an optical one, that is, a device in which a powerful beam of light is controlled by a weak one (ideally, by a single photon).
Take a hydrogen atom. It has a line spectrum: many electron energy levels corresponding to different orbitals. The higher the level, the larger the radius of the electron's orbit; for levels numbered around 50-70 the orbital radius reaches tenths of a micron, which is enormous by atomic standards. Such an atom is called a Rydberg atom, and it looks very curious: a tiny core with a huge electron "coat" around it. And what if we slip another atom under that "coat"?
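For a sense of scale, here is a quick hydrogen-like estimate of that orbital radius (r ≈ n² · a₀, with a₀ the Bohr radius):

```python
# Hydrogen-like estimate of the Rydberg orbit radius: r_n ≈ n^2 * a0
a0_nm = 0.0529  # Bohr radius in nanometres
for n in (50, 60, 70):
    r_nm = n ** 2 * a0_nm
    print(f"n = {n}: r ≈ {r_nm:.0f} nm ≈ {r_nm / 1000:.2f} µm")
```

That is thousands of times larger than an ordinary atom, so there is plenty of room under the "coat" for other atoms.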
Now look at the picture: a laser beam (pink) has excited several Rydberg atoms (red nuclei with grey electron clouds). The green atoms have ended up under their neighbours' "coats" and are screened, as if inside a Faraday cage: they do not feel external fields and do not interact with the light. If the Rydberg "coats" are removed, the green atoms can once again interact with the outside world, for example by absorbing photons.
A single photon is enough to switch a "coat" on. Coat on: the green atoms are screened, they do not absorb the light, and the light passes through. Coat off: the green atoms are active again, they absorb the light, and it does not get through. The idea is quite simple, but it took a long time to implement; this year two groups from Germany finally succeeded. Practical applications are still a long way off, but for science this is a very welcome step.
Entropy and diagnosis of leukemia

Statistical physics has a quantity called entropy, a measure of disorder. As a rule, closed systems tend toward the state of greatest entropy. Now let some system be described by two variables, A and B, and suppose we can measure only A. We know nothing about B, but if the system lives and evolves, B will tend toward the value at which the entropy of the system is maximal.
If we know how to compute the entropy from given A and B, then by solving the inverse problem we can find the B that maximizes the entropy. That will be the most likely value of B in the living system.
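In code, this scheme is just a one-dimensional maximization. Here is a minimal sketch with a made-up toy entropy; the model for the probabilities, the measured value of A, and the grid are all invented purely for illustration:

```python
import numpy as np

def probabilities(A, B):
    # Toy model: Boltzmann-like weights of three states whose "energies"
    # depend on the measured parameter A and the hidden parameter B.
    energies = np.array([0.0, A + B, A - B])
    w = np.exp(-energies)
    return w / w.sum()

def entropy(A, B):
    p = probabilities(A, B)
    return -np.sum(p * np.log(p))

A_measured = 0.7                      # the quantity we can actually observe
B_grid = np.linspace(-3.0, 3.0, 601)  # candidate values of the hidden parameter
S = np.array([entropy(A_measured, b) for b in B_grid])
B_star = B_grid[np.argmax(S)]         # maximum-entropy estimate of B

print(f"most likely B ≈ {B_star:.2f}, entropy {S.max():.3f} nats")
```

A grid scan is enough for one unknown; with many unknown parameters one would use a proper optimizer instead.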
Now the same thing, but in medicine. The living system is a human being, and there are far more than two parameters. We can measure some of them (as I understand it, something like protein concentrations), work out how to calculate the entropy, and recover some of the unknown parameters using the scheme above. Those unknowns turn out to be very helpful in diagnosing leukemia and, possibly, a number of other serious diseases. At least the first results are said to be quite encouraging.
The arrow of time

So, entropy grows with time. Or, turning it around: time flows in the direction in which entropy (disorder) increases. That is the definition of the thermodynamic arrow of time. There are two more arrows of time: the cosmological one (the Universe expands along it) and the psychological one (the direction in which we feel time passing). One of the fundamental physical (and philosophical) questions is why the directions of all these arrows seem to us to coincide.
Hawking discusses this accessibly in "A Brief History of Time". His explanation for the thermodynamic and psychological arrows is simple and elegant. Our brain is essentially a computer that processes incoming data. Whether it is ordering that data or erasing it, energy is expended and released as heat. A careful calculation shows that while a computer runs, the total entropy of its memory plus its surroundings increases, which means that for the computer time runs in the direction of increasing entropy.
In this year's paper the question is much the same, but the approach is slightly different. The authors propose a thought experiment: two connected vessels, one filled with gas and the other empty; between the vessels sits a counter, a kind of Maxwell's demon, which records how many molecules have passed in each direction. The thermodynamic arrow points toward the equalization of the gas concentrations. If psychological time flows the same way, the counter records how many molecules have already flown in each direction. If the psychological arrow points the opposite way, the counter effectively records how many molecules will fly: a kind of memory of the future.
Now let us nudge one of the molecules slightly at the start of psychological time. If the two arrows point the same way, essentially nothing happens: the final state of the system is about the same, and the counter initially registers passages at the same moments (perhaps slightly different ones later on). If the arrows point in opposite directions, the system can no longer return to the "all the gas in one vessel" state, because the probability of such a process is vanishingly small and extremely sensitive to the initial conditions. In other words, a minimal change in the initial conditions completely changes the counter's readings.
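The robustness of the co-directed case is easy to play with in a toy stochastic model (this is not the reversible dynamics of the paper, just an Ehrenfest-style caricature; the reversed-arrow case would need genuinely reversible microscopic dynamics and is not modelled here):

```python
import random

def run(n_left, n_total, steps, seed):
    """Toy two-vessel model: each step one random molecule hops between vessels;
    the 'demon counter' records the direction of every passage."""
    random.seed(seed)
    left, record = n_left, []
    for _ in range(steps):
        if random.random() < left / n_total:  # a molecule from the left vessel is picked
            left -= 1
            record.append(+1)                 # it flies left -> right
        else:
            left += 1
            record.append(-1)                 # it flies right -> left
    return record

N, STEPS = 1000, 20000
a = run(N, N, STEPS, seed=42)      # all the gas starts in the left vessel
b = run(N - 1, N, STEPS, seed=42)  # same randomness, but one molecule moved at t = 0
differing = sum(x != y for x, y in zip(a, b))
print(f"counter records differ in {differing} of {STEPS} steps")
```

With one molecule displaced, the counter's record changes in only a tiny fraction of the twenty thousand steps, which is the "memory changes only slightly" half of the argument.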
So, when the arrows of time are co-directed, the memory (the counter's record) changes only slightly under a small change in the past, while for opposing arrows it is rebuilt radically. In the latter case it is hard to call it a memory at all, since it no longer stores what actually happened. The authors do not stop there: they introduce quantitative characteristics of memory, generalize the theory to other kinds of memory, and generally have a great time discussing many other curious aspects. Best of all, they manage to do all this in 8 pages with only 4 formulas and a single figure.
Instead of a conclusion
It is nice to see a deeply philosophical publication in the pages of one of the leading physics journals. It is no less pleasant to realize that we are a few steps closer to quantum computing. We will be waiting for new and interesting work in the coming year.