Hacking a server led to another dance with a tambourine around global warming

The hack of the Climatic Research Unit server led to a big scandal around global warming. Such was the gist of the latest piece of sensational news about global warming (GW), which conspiracy-theory lovers among scientists sucked on with pleasure.
I am a skeptic: I do not believe in Santa Claus, and the habit of checking information was drilled into me, along with countless lab practicals, while studying at the Faculty of Physics of RSU. I submit the results of my small excavations for your judgment.


Climatological studies are among the most difficult computational tasks, and they are often solved on supercomputers. Why?
A huge amount of data from millions of points on the globe, spanning many years, has to be processed. Since the data is patchy and computing power is never enough, one has to resort to simplifications, that is, to physical models. The real object is simplified into an abstract object whose behavior is already quite easy to predict. A physical model you should remember from school is the ideal gas.
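To make the idea of a physical model concrete, here is a minimal sketch (my own illustration, not from the article): the ideal gas replaces a real gas with an abstract object whose behavior follows from a single formula, p*V = n*R*T.

```python
# Minimal sketch: the ideal-gas model predicts pressure from p*V = n*R*T.
R = 8.314  # universal gas constant, J/(mol*K)

def ideal_gas_pressure(n_moles: float, temp_k: float, volume_m3: float) -> float:
    """Pressure in pascals predicted by the ideal-gas abstraction."""
    return n_moles * R * temp_k / volume_m3

# One mole at room temperature in a 25-litre vessel: about one atmosphere.
print(ideal_gas_pressure(1.0, 298.0, 0.025))  # ~99103 Pa
```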

Returning to GW: to get a computer model that a supercomputer can chew through, part of the data must be discarded. How do you determine which data is needed and which is not? Very simply (in fact, not so simply :)): take the data for a certain period of time and divide it in half. The first half is fed into the computer model, which then tries to produce a forecast for the second interval. The forecast is reconciled with the known data. If the source data is poorly selected, or the model is unsuccessful, the forecast will be poor. Hello, CAPTCHA recognition; familiar problems?
The data is "juggled" until the model begins to produce an acceptable result, or the model is abandoned. Then, using the same data-sampling technique, the model is run on other data.
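A hedged sketch of this half-split check, the way I read it; the synthetic temperature series and the trivial linear-trend "model" below are my own assumptions for illustration, not the CRU code:

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1900, 2000)
temps = 0.01 * (years - 1900) + rng.normal(0.0, 0.1, years.size)  # fake data

# 1. Divide the period in half; only the first half feeds the model.
half = years.size // 2
train_years, train_temps = years[:half], temps[:half]
test_years, test_temps = years[half:], temps[half:]

# 2. "Lay the first half into the model" -- here, fit a linear trend.
slope, intercept = np.polyfit(train_years, train_temps, deg=1)

# 3. Forecast the second interval and reconcile it with the known data.
forecast = slope * test_years + intercept
rmse = np.sqrt(np.mean((forecast - test_temps) ** 2))
print(f"RMSE on the held-out half: {rmse:.3f} degrees")

# 4. If the error is unacceptable, tune or abandon the model; a model that
#    survives is then re-run the same way on a different data sample.
```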

Now back to the intercepted correspondence. What was it about? Oh! One scientist wrote to another that he had juggled the data to get the desired result.

"Once Tim's got a diagram here we'll send that either later today or first thing tomorrow. I've just completed Mike's Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) and from 1961 for Keith's to hide the decline."

(For those who made it through the English: the key phrase is "from 1961 for Keith's to hide the decline".)

Does this letter mean that the data was tweaked? Yes! But whether it was a matter of honing the method or of manufacturing the GW myth, one cannot say.

In conclusion, a real story from my student past. Once in a lab, a friend and I were taking measurements, everything by the training manual. The device was antediluvian, tortured by generations of students, but it produced data. We collected the data and checked it against the curve in the manual. The error came out at about 90%; the manual allowed no more than 10%. We were horrified, but too lazy to repeat the procedure, which took about 40 minutes. So we corrected the results down to at least 20% and carried them to the teacher. He laughed and, without explaining anything, sent us for a retake. We measured again, got the same 90% error, and confessed to the teacher, asking him to explain. He said that, owing to its antiquity, this device had not given accurate data for ten years, and an error of 90% was a good result. But at the same time, he said, the device teaches you to avoid a serious scientific error: never shy away from explaining data that you do not like.
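To spell out what an "error of about 90%" means in such a check, here is a toy comparison; the numbers are invented purely for illustration:

```python
# Toy illustration with invented numbers: relative error of measurements
# against the reference curve from the training manual.
measured = [1.9, 3.8, 5.6, 7.5]   # readings from the antediluvian device
reference = [1.0, 2.0, 3.0, 4.0]  # values from the manual's curve

errors = [abs(m - r) / r * 100 for m, r in zip(measured, reference)]
print(f"max relative error: {max(errors):.0f}%")  # 90%; the manual allows 10%
```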

Source: https://habr.com/ru/post/76393/

