
The “ability” of machine learning to predict the future of chaotic systems

Half a century ago, the founders of chaos theory discovered that the "butterfly effect" makes long-term prediction of a chaotic system's behavior impossible. Even a minimal disturbance to a complex system (the weather, the economy, and so on) can trigger a chain of events that renders the future unpredictable. Because we can never pin down the current state of such a system exactly, we cannot predict how it will evolve. But now machine learning comes to the rescue.


According to a series of experiments published in the journals Physical Review Letters and Chaos, scientists have used machine learning (the same approach behind recent advances in artificial intelligence) to predict the evolution of chaotic systems out to strikingly distant horizons. Outside experts describe the approach as groundbreaking and expect it to come into widespread use soon.


The findings come from chaos-theory veteran Edward Ott and four collaborators at the University of Maryland. They used a machine learning algorithm called reservoir computing to "learn" the dynamics of an archetypal chaotic system, the Kuramoto-Sivashinsky equation. The evolving solution of this equation behaves like a flame front propagating through a combustible medium. The equation also describes drift waves in plasmas and other phenomena and, according to Jaideep Pathak, a graduate student of Ott's and lead author of the papers, serves as "a test bed for studying turbulence and spatiotemporal chaos."


After training on the data, the researchers' reservoir computer could accurately predict how the flame-like system would evolve out to eight "Lyapunov times", roughly eight times further ahead than previous methods allowed. The Lyapunov time is how long it takes two almost identical states of a chaotic system to diverge exponentially; in effect, it is the time it takes for the system to descend into complete chaos.
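In symbols (a standard textbook formulation, not taken from the papers themselves): if $\delta(t)$ is the separation between two initially almost identical states, it grows roughly as

\[
  |\delta(t)| \approx |\delta(0)|\, e^{\lambda t},
  \qquad
  \tau_{\mathrm{L}} = \frac{1}{\lambda},
\]

where $\lambda$ is the system's largest Lyapunov exponent and $\tau_{\mathrm{L}}$ is the Lyapunov time.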


"This is really very good," says Holger Kantz, a chaos-theory researcher at the Max Planck Institute for the Physics of Complex Systems. Predicting a chaotic system out to eight "Lyapunov times", he says, figuratively means that machine learning is almost as good as knowing the truth.

The algorithm knows nothing about the Kuramoto-Sivashinsky equation itself; it sees only data recorded from the evolving solution of the equation. That is what makes machine learning so powerful here. In many cases the equations describing a chaotic system are unknown, which rules out modeling and prediction. Ott's results suggest that the equations are not needed at all, only data. "It is conceivable that one day we will predict the weather with machine learning algorithms rather than with sophisticated atmospheric models," Kantz says.
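To make the "only data is needed" point concrete, here is a minimal sketch of the reservoir-computing (echo state network) idea in Python with NumPy. It is an illustration, not the Maryland group's actual code: the Lorenz system is used purely as a stand-in data source, and the reservoir size, spectral radius and ridge parameter are arbitrary choices.

```python
import numpy as np

# --- Example data from a chaotic system. The reservoir never sees these
# --- equations, only the resulting samples (Lorenz-63 as a stand-in source).
def lorenz_series(n_steps, dt=0.01):
    x, y, z = 1.0, 1.0, 1.0
    out = np.empty((n_steps, 3))
    for i in range(n_steps):
        dx = 10.0 * (y - x)
        dy = x * (28.0 - z) - y
        dz = x * y - (8.0 / 3.0) * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out[i] = (x, y, z)
    return out

data = lorenz_series(6000)
train, test = data[:5000], data[5000:]

# --- Echo state network: a fixed random recurrent "reservoir"; only the
# --- linear readout W_out is trained, via ridge regression.
rng = np.random.default_rng(0)
n_res, n_in = 500, 3
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

def step(state, u):
    return np.tanh(W @ state + W_in @ u)

# Drive the reservoir with the training series and collect its states.
states = np.zeros((len(train), n_res))
s = np.zeros(n_res)
for t, u in enumerate(train):
    s = step(s, u)
    states[t] = s

# Train the readout to map the reservoir state at time t to the data at t+1.
X, Y = states[:-1], train[1:]
beta = 1e-6
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + beta * np.eye(n_res))

# Autonomous prediction: feed the reservoir its own output.
u = W_out @ s                 # first prediction, one step past the training data
preds = [u]
for _ in range(len(test) - 1):
    s = step(s, u)
    u = W_out @ s
    preds.append(u)
preds = np.array(preds)
print("mean error over first 100 predicted steps:",
      np.mean(np.abs(preds[:100] - test[:100])))
```

The essential point is that only `W_out` is trained; the recurrent reservoir itself stays random and fixed, which is what makes the method cheap to train compared with ordinary deep networks.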
Beyond weather forecasting, experts say, machine learning techniques could help monitor cardiac arrhythmias for signs of an impending heart attack, and monitor neuronal firing in the brain for signs of impending spikes. In theory, they could also help predict the rogue waves that threaten ships at sea, and possibly even earthquakes.

Ott, Pathak and their colleagues Brian Hunt, Michelle Girvan and Zhixin Lu (who is now at the University of Pennsylvania) achieved their results by combining a number of existing tools.
Ott and his colleagues exploited the locality of interactions in spatially extended chaotic systems. Locality means that variables at one location depend on variables at nearby locations but not on variables far away. "Using this," Pathak explains, "we can essentially break the problem into pieces": one reservoir of neurons learns about one patch of the system, another reservoir learns about the next patch, and so on, with slight overlaps between neighboring domains to account for their interactions.

This parallelization lets the reservoir-computing approach handle chaotic systems of almost any size, as long as proportionate computational resources are available for the task.
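A schematic of that domain decomposition might look as follows. The patch size, overlap, and the `LocalReservoir` placeholder are hypothetical illustrations, not the configuration used in the papers; a real implementation would put a trained echo state network, like the one sketched above, inside each patch.

```python
import numpy as np

def make_patches(n_grid, patch_size, overlap):
    """Split grid indices 0..n_grid-1 into patches that overlap their
    neighbours, so each local reservoir also sees a little of the adjacent
    domains (periodic boundary assumed, as for Kuramoto-Sivashinsky)."""
    patches = []
    for start in range(0, n_grid, patch_size):
        idx = np.arange(start - overlap, start + patch_size + overlap) % n_grid
        core = np.arange(start, start + patch_size) % n_grid
        patches.append((idx, core))   # (inputs it reads, outputs it owns)
    return patches

class LocalReservoir:
    """Hypothetical stand-in for one small echo state network; in a real
    implementation this would be trained on its own patch's time series."""
    def __init__(self, n_inputs, n_outputs):
        self.n_inputs, self.n_outputs = n_inputs, n_outputs
    def predict_next(self, local_state):
        return local_state[:self.n_outputs]   # placeholder dynamics

n_grid, patch_size, overlap = 64, 8, 2
patches = make_patches(n_grid, patch_size, overlap)
reservoirs = [LocalReservoir(len(idx), len(core)) for idx, core in patches]

# One parallel prediction step: every reservoir reads its (overlapping) input
# window and writes only its own core section of the global field.
u = np.random.rand(n_grid)            # current snapshot of the field
u_next = np.empty(n_grid)
for (idx, core), res in zip(patches, reservoirs):
    u_next[core] = res.predict_next(u[idx])
```

Because each patch's update depends only on its own window, the per-patch reservoirs can run on separate processors and be stitched together afterwards.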

The researchers showed that their predicted solution of the Kuramoto-Sivashinsky equation closely tracks the true solution for eight "Lyapunov times" before chaos finally wins and the actual and predicted states of the system diverge.

The usual approach to predicting a chaotic system is to measure its state at one moment in time as accurately as possible, use those data to calibrate a physical model, and then evolve the model forward. By a rough estimate, to extend the prediction horizon of a typical chaotic system by a factor of eight, you would have to measure its initial conditions 100,000,000 times more accurately.
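Under the usual assumption of exponential error growth, a back-of-the-envelope relation shows why raw measurement accuracy buys so little (the tolerance $\Delta$ and initial error $\varepsilon_0$ here are illustrative symbols, not quantities from the papers). If the initial error grows as $\varepsilon_0 e^{t/\tau_{\mathrm{L}}}$ and a forecast counts as useful while the error stays below $\Delta$, the useful horizon is

\[
  T \;\approx\; \tau_{\mathrm{L}} \ln\frac{\Delta}{\varepsilon_0},
\]

so the horizon grows only logarithmically with precision: each additional Lyapunov time of forecast costs roughly a factor of $e \approx 2.7$ in initial-condition accuracy.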
That is why machine learning is "a very useful and powerful approach," says Ulrich Parlitz of the Max Planck Institute for Dynamics and Self-Organization. "I think it not only works on the example they looked at, but is universal in a certain sense and can be applied to many processes and systems."
After the article in Physical Review Letters, Ott, Pathak, Girvan, Lu, and their collaborators moved closer to a practical implementation of their prediction technique. In new studies accepted for publication in Chaos, they showed that predictions of chaotic systems such as the Kuramoto-Sivashinsky equation improve further when the data-driven machine learning approach is hybridized with traditional model-based prediction. Ott sees this as the more likely route to better weather forecasts and similar efforts, since we rarely have complete high-resolution data or perfect physical models. "We should use the good knowledge we have where we have it," he says, "and where we have ignorance, we should use machine learning to fill in the gaps." Reservoir-based predictions can essentially recalibrate the models; in the case of the Kuramoto-Sivashinsky equation, the accurate prediction horizon stretches to twelve "Lyapunov times".
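One plausible way to wire such a hybrid, sketched here under assumptions rather than taken from the papers: feed the reservoir both the current observation and the one-step forecast of an imperfect knowledge-based model, and train the readout against the true next observation, so the reservoir learns to correct the model's errors. The `imperfect_model_step` function and the random training data below are hypothetical placeholders.

```python
import numpy as np

def imperfect_model_step(u):
    """Hypothetical placeholder for an approximate physics-based model
    (e.g. the governing equations with slightly wrong parameters)."""
    return u + 0.01 * np.roll(u, 1) - 0.01 * np.roll(u, -1)

def reservoir_states(inputs, n_res=300, seed=0):
    """Drive a fixed random reservoir with a sequence of input vectors and
    return its internal states, as in the echo-state sketch earlier."""
    rng = np.random.default_rng(seed)
    n_in = inputs.shape[1]
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
    s = np.zeros(n_res)
    states = np.empty((len(inputs), n_res))
    for t, u in enumerate(inputs):
        s = np.tanh(W @ s + W_in @ u)
        states[t] = s
    return states

# Training data: a sequence of observed fields u_t (illustrative random data).
obs = np.random.rand(2000, 32)

# Hybrid input at time t = [observation, imperfect model's one-step forecast].
model_forecast = np.array([imperfect_model_step(u) for u in obs])
hybrid_in = np.concatenate([obs, model_forecast], axis=1)

# Train a ridge readout mapping the hybrid reservoir state at t to the true
# observation at t+1, so the data-driven part corrects the model's errors.
S = reservoir_states(hybrid_in)[:-1]
Y = obs[1:]
beta = 1e-6
W_out = Y.T @ S @ np.linalg.inv(S.T @ S + beta * np.eye(S.shape[1]))
```

The design intent is the one Ott describes: let the physical model supply whatever knowledge it has, and let the trained readout fill in the gaps where that knowledge is wrong or missing.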

The duration of a "Lyapunov time" varies from system to system, from milliseconds to millions of years (it is a few days for the weather). The shorter it is, the more volatile the system, or the more prone it is to the butterfly effect.

The mathematician Amie Wilkinson and Kantz define chaos in terms of stretching and folding, much like the repeated rolling out and folding of dough when making puff pastry. Each patch of dough stretches horizontally under the rolling pin, rapidly spreading in two spatial directions. Then the dough is folded and flattened, compressing nearby patches in the vertical direction. The weather, forest fires, the turbulent surface of the sun, and all other chaotic systems act this way, Kantz says. "To have this exponential divergence of trajectories you need stretching, and to avoid running away to infinity you need some folding"; the folding comes from the nonlinear relationships between the variables of the system.
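The dough analogy can be made concrete with the classic baker's map, a textbook toy example (not one of the systems discussed above): each step stretches the unit square horizontally and folds it back, and two almost identical points separate exponentially until the folding keeps them bounded.

```python
def bakers_map(x, y):
    """One 'roll and fold' step on the unit square: stretch by 2 in x,
    squeeze by 2 in y, then cut and stack the two halves."""
    if x < 0.5:
        return 2 * x, y / 2
    return 2 * x - 1, y / 2 + 0.5

# Two almost identical initial states.
a, b = (0.3, 0.4), (0.3 + 1e-9, 0.4)
for step in range(30):
    a, b = bakers_map(*a), bakers_map(*b)
    dist = abs(a[0] - b[0]) + abs(a[1] - b[1])
    if step % 5 == 4:
        print(f"step {step + 1:2d}: separation = {dist:.3e}")
# The horizontal distance roughly doubles each step (stretching) until the
# fold keeps it bounded inside the square: the butterfly effect in miniature.
```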

Exactly why reservoir computing is so good at learning the dynamics of chaotic systems is not yet well understood, beyond the idea that the computer tunes its own formulas in response to the input data until the formulas reproduce the system's dynamics. The technique works so well that Ott and some of the other Maryland researchers now intend to use chaos theory to better understand the internal machinery of neural networks.

Source: https://habr.com/ru/post/358352/

