
Information Philosophy, Chapter 2. The Existence of Information


Before reading this text, it is recommended to read the beginning of this story. Otherwise it will not be clear why such a complex structure had to be built instead of doing things the usual, simple way.


Chapter 2. The existence of information


Signals and Contexts


We need to learn to get rid of the illusion that information is contained in books, on hard drives, in cables, in radio waves and in the other objects from which we are used to "extracting" it. If we have finally accepted that reifying the concept of "information" is unacceptable, then we simply have to admit that, for example, when reading a book we acquire information, but the object we are obliged to use for this contains none. The object must be present (it is impossible to read a book without having it), but a physical object cannot contain information.
Let's carefully analyze what happens when we read a book. Certainly there is a physical process going on, and some stages of reading are most conveniently described in physical terms. In particular, if we read a paper book with our eyes, it must exist as a material object, and an acceptable level of illumination must be provided. The optical system called the "eye" must also exist and be operational. Other ways of reading (Braille, text-to-speech software) do not change the situation much: in these cases too it makes sense to talk about a material component that has to be there.

What happens in the reader's brain after the content has been delivered in some way can also be described in physical terms, but this is not very promising. Something, of course, is happening. A material component undoubtedly takes place, but we have no way of translating into material terms even such a simple and obvious situation as "surprised by an unexpected plot twist". It cannot be ruled out that we will never have such a method, if only because the mechanism of surprise at an unexpected plot twist can be realized differently in different heads.

The specificity of information processes, in contrast to material ones, is that the same information process can be implemented "in matter" in fundamentally different ways while remaining itself. For example, the sum of two numbers can be found using an electronic calculator, a wooden abacus, counting sticks, a piece of paper and a pen, or even in one's head. The meaning and result of the action remain the same. A book can be obtained on paper by post or electronically by e-mail. The method of implementation, of course, affects many nuances, but the essence and meaning of what is happening remain unchanged. Any attempt to "ground" an information process in its material component ("surprise is nothing but the internal secretion of dopamine", "delight is nothing but the internal secretion of endorphins") is akin to saying that the addition of two numbers is nothing more than moving wooden beads along metal rods. Material reality is total, so any information process must have a material aspect, but what is happening cannot and should not be reduced to it; otherwise the addition of numbers would have to become the exclusive prerogative of the wooden abacus. Turning to the informational aspect of what is happening, we need to be able to abstract from the material aspect, naturally realizing that it certainly exists, while what exactly it is does not matter much to us.

Let us continue considering the process of reading a book, abstracting from the details of its material realization. In order for the reader to successfully read the text delivered to his receptors, a number of conditions must be met. First, he must know the language in which it is written. Second, he must be able to read. Third, he must understand why this particular occupation is now preferable to all others. It is easy to see that all these conditions concern information the reader already possesses, because "knowledge", "skill" and "understanding" are all synonyms of the concept "information". Thus, for reading a book we have two sets of conditions for the successful course of the process: the delivery of the text in some form and the reader's preliminary readiness. The condition of text delivery will be denoted as the requirement of a signal. The condition of the reader's readiness will be denoted as the requirement of a context.

What is important, these same two sets of conditions are observed in any process that we can identify as the acquisition of information. Even if we consider something as simple as a radio-controlled toy car, it can receive commands only when, first, the radio signal is delivered properly (the antenna is not broken and the car has not rolled too far from the remote control) and, second, the car's control unit "understands" the command sent by the remote. It turns out that even though everything seems to happen inside a reliably deterministic piece of hardware, the essential component that allows the receiver to successfully receive data from the transmitter is the knowledge that the designer of the receiver obtained from the designer of the transmitter. It was this knowledge that made the receiver a material object in which the atoms are arranged not haphazardly but in a very concrete and special way. The radio wave that arrives at the antenna is not all the information that entered the receiver. There was also, perhaps, an e-mail received by the developer of the car's control unit from a colleague who developed the remote.

Both components, the signal and the context, can be considered both in the material aspect and in the informational aspect. But while it is sometimes possible to abstract from the informational aspect of the signal (especially when the channel capacity is deliberately redundant), it is impossible to abstract from the informational aspect of the context, which in essence is the ability to interpret the signal. Context is information about how a signal can be interpreted, and therefore we must treat it as an intangible entity.

It may seem that transferring the mysterious immateriality into this kind of mysterious "context" involves some element of cheating. But it is not difficult to notice that the perceived information and the information constituting the context are different information. The plot of a book and knowledge of the language in which it is written are different knowledge. If the resulting recursiveness of the structure (for a second-order context to exist, a third-order context is needed, and so on into infinity) causes some anxiety, then, looking ahead a little, I note that this is not a defect of the signal-context construction but probably its most valuable property. We will return to this topic in the fifth chapter in order to prove an extremely useful theorem through the recursiveness of the signal-context construction.

For solving our metaphysical problems, the essential benefit of viewing information as that which arises from the combination of a signal with a context is that this construction turns out to be exactly the bridge between the worlds that we lacked so much. If in a particular situation we manage to abstract from the informational aspects of the signal (which is often not particularly difficult), we can reason about the participation of material objects in the information process. If at the same time we manage to consider the context in the fullness of its dual nature (in our age of information technology this is a common thing), then as a result we have, for that specific situation, a complete bridge between the material and information worlds. It should be noted at once that the existence of the bridge still does not give us the right to reify information again. The signal, if considered as a material object, can be reified (the file is recorded on a flash drive, the flash drive is in a pocket), but the context, that is, the ability to interpret the signal, cannot be reified.

When the classical data transmission situation is considered from the point of view of information theory, we have a transmitter that "puts" information into a signal and a receiver that "extracts" information from it. Hence the persistent illusion that information is something existing inside the signal. But one needs to understand that interpreting a specially prepared signal is far from the only scenario of acquiring information. Paying attention to what happens around us, we obtain a lot of information that nobody sent us. The chair does not send us information that it is soft, the table does not send information that it is hard, the black ink on a book page does not send us information by the absence of photons, the switched-off radio does not send information that it is silent. We are able to make sense of the material phenomena around us, and they become information for us because we possess in advance a context that allows us to interpret what is happening. When we wake up at night, open our eyes and see nothing, we extract the information that dawn has not yet come not from a physical phenomenon that is present, but from its absence. The absence of an expected signal is also a signal, and it too can be interpreted. But the absence of a context cannot be made up for by some special "zero" context. If there is no context, there is no place for information to arise, no matter how many signals arrive.

We all know very well what information is (for creatures living in an informational spacesuit there can be no other way), but we are accustomed to consider as information only that part of it which is designated here as the "signal". The context is a matter of course for us, taken for granted, and therefore we habitually leave it outside the brackets. And having bracketed out the context, we are forced to place all the "information" exclusively in the signal and thus to mercilessly reify it.

There is nothing difficult about getting rid of the reification of "information". You just need to learn to remember in time that besides the signal there is always the context. A signal is just raw material, acquiring meaning (value, usefulness, significance and, yes, informativeness) only when it falls into the appropriate context. And the context is a thing that must be spoken of in non-material terms (otherwise what is said will certainly make no sense).

Let us briefly recall the topic "properties of information" and evaluate how these properties fit into the two-component "signal + context" construction.

  1. Novelty. If receiving a signal adds nothing at all to the informational aspect of an already existing context, then no event of signal interpretation occurs.

  2. Credibility. Interpretation of the signal by the context should not produce false information ("true" and "false" are concepts applicable to information but not to material objects).

  3. Objectivity. The same as credibility, but with an emphasis on the fact that the signal may arise from the work of another context. If the context trying to obtain information and the mediating context have no mutual understanding (first of all about the goals pursued), the information will not be reliable.

  4. Completeness. The signal is objective and reliable, but it is not enough for the context to acquire complete information.

  5. Value (utility, significance). There is a signal, but no suitable context. All the words are clear, but the meaning is not grasped.

  6. Availability. A characteristic of the signal. If a signal cannot be obtained, even the most beautifully suited context will not help information arise. For example, anyone could easily figure out what to do with accurate data about how tomorrow's football match will end. But, unfortunately for many, this signal will appear only after the end of the match, when its usefulness and significance will be far from the same.

In my opinion, the properties listed above resemble not properties but a list of possible faults. Properties should, after all, describe what we can expect from the subject in question and what we cannot count on. Let us try to derive from the "signal + context" construction at least a few obvious consequences, which will in fact be properties not of some specific piece of information but of information in general:

  1. Information is subjective. The signal may be objective, but the context is always subjective. Therefore information by its nature can only be subjective. One can speak of the objectivity of information only if it has been possible to ensure a unity of context across different subjects.

  2. Informational inexhaustibility of the signal. The same signal, falling into different contexts, gives different information. That is why, re-reading a favorite book from time to time, one can find something new in it every time.

  3. There is no law of conservation of information. None at all. We like it when the objects we operate with strictly obey conservation laws, are not inclined to appear out of nowhere and, even more so, have no habit of disappearing into nowhere. Information, unfortunately, is not that kind of object. Conservation laws can be counted on, at best, for the signal, but there is no information inside the signal and there cannot be. One just has to get used to the idea that, in its normal mode, information simply arises out of nowhere and departs into nowhere. The only thing we can do to somehow preserve it is to take care of the preservation of the signal (which, in principle, is not a problem), of the context (which is much harder, because it is changeable), and of the reproducibility of the situation in which the signal meets the context.

  4. Information is always the complete and undivided property of the subject in whose context it arose. A book (a physical object) may be someone's property, but the thought generated by reading it is always the undivided property of the reader. Only if we legalize private property in other people's souls will it become possible to legalize private property in information. What has been said, however, does not cancel the author's right to be considered the author. Especially if it is true.

  5. Characteristics that apply only to information cannot be attributed to the signal. For example, the characteristic "true" can be applied only to information, that is, to the combination of a signal with a context. The signal itself can be neither true nor false. The same signal combined with different contexts can give true information in one case and false information in another. I have two pieces of news for the followers of the "book" religions, one good and one bad. The good news: their holy books are not lies. The bad news: they do not contain truth in themselves either.

To answer the question "where does information exist?" without using the two-component signal-context construction, one has to resort to the following popular approaches:

  1. "Information can exist in material objects . " For example, in books. When bringing this approach to logical completeness, one inevitably has to admit the existence of “inforod” - a thin substance present in books besides paper fibers and pieces of paint. But we know how books are made. We know for sure that no magical substance is poured into them. The presence of subtle substances in the objects we use to acquire information contradicts our everyday experience. The signal-contextual construction is excellent without thin substances, but it also gives an exhaustive answer to the question “why do we need the book itself to read the book”.

  2. "The world is permeated with information fields, in whose subtle structure everything we know is recorded." A beautiful and very poetic idea, but if so, it is unclear why you need a volume of "Hamlet" in order to read "Hamlet". Does it work as an antenna tuned to a specific Hamlet wave? We know how volumes of "Hamlet" are made. We know for certain that no detector circuits tuned to receive otherworldly fields are embedded in them. The signal-context construction needs no assumptions about the existence of parallel invisible worlds. It gets along very well without these extra entities.

  3. "Information can exist only in our heads . " A very popular idea. The most insidious and tenacious version of reification. Its cunning is due primarily to the fact that science has not yet developed any coherent understanding of what is happening in our heads, and in the darkness of this obscurity it is convenient to hide any shortcomings. In our large and diverse world, it happens that a person writes a work, and then, without having time to show it to anyone, dies. And then, after years, the manuscript is found in the attic, and people will learn something that none of them have known all this time. If information can exist only in heads, then how can it skip that period of time when there is not a single head that owns it? The signal-contextual construction explains this effect simply and naturally: if the signal is preserved (manuscript in the attic) and the context is not completely lost (people have not forgotten how to read), then the information is not lost.

Let's see how the idea of signals and contexts fits into what happens during the transfer of information. It would seem that something surprising has to happen: there is information on the transmitter's side, then the transmitter gives the receiver a signal in which there is no information, and then there is information again on the receiver's side. Suppose Alice intends to ask Bob to do something. Let us note right away that Alice and Bob do not have to be living people. Alice can be, for example, a business logic server, and Bob a database server. The essence of what happens does not change. So, Alice has information which, of course, exists within her as a combination of signal and context. Having this information, as well as information about what signals Bob is able to receive and interpret, she makes some changes in the material world (for example, writes a note and attaches it to the refrigerator with a magnet or, if Alice and Bob are servers, engages the network infrastructure). If Alice was not mistaken about Bob, then Bob receives the signal into his current context and obtains the information about what he should do now. The key point is the commonality of context. If we are talking about people, the commonality of context is ensured by a common language and involvement in joint activities. If we are talking about servers, the commonality of contexts is realized through the compatibility of data exchange protocols. It is the commonality of contexts that allows information, as it were, to jump over that part of the path where it cannot exist and to turn up on the receiver's side. Generally speaking, information, of course, does not jump anywhere. That Alice possesses the same information as Bob could only be said if they had indistinguishably identical signals and indistinguishably identical contexts. In people's lives this does not happen. It is impossible to see the color green exactly as another person sees it, but it is possible to agree among ourselves that we will designate such a color with the signal "green".

The signal-context construction is not exactly news for world philosophy. Some 250 years ago Immanuel Kant wrote that although our knowledge (information?) begins with experience (a signal?), it is utterly impossible without the cognizing subject possessing a priori knowledge (a context?).

Measuring information


Measuring information in bits is a favorite pastime. It is impossible to deny oneself the pleasure of speculating about it, at the same time trying the calculation method against the now familiar and, I hope, understandable signal-context construction.

If we recall classical information theory, the generalized formula for calculating the amount of information (in bits) looks like this:

I = − Σ p_i · log₂(p_i)   (1)

where the sum runs over i = 1…n, n is the number of possible events and p_i is the probability of the i-th event. Let's figure out what in this formula corresponds to what from the point of view of the receiver and the transmitter. Suppose the transmitter can report one of a hundred events, of which the first, second and third each have a probability of 20%, while the remaining 40% is spread evenly over the other ninety-seven events. It is easy to calculate that, from the transmitter's point of view, the amount of information in a report about a single event is approximately 4.56 bits:

I = − (3 × 0.2 × log₂(0.2) + 97 × (0.4 / 97) × log₂(0.4 / 97)) ≈ − (−1.393156857 − 3.168736375) ≈ 4.56

Please do not be surprised at the fractional result. In engineering, of course, one has to round up in such cases, but the exact value is often interesting too.

If the receiver knows nothing about the distribution of probabilities (and how could he?), then from his point of view the amount of information received is 6.64 bits (this is also easily obtained from the formula). Now let us imagine that for the receiver's needs only events number 1 ("execute"), number 2 ("pardon") and number 100 ("award the order") are of interest, and everything else is uninteresting "other". Suppose the receiver already has statistics from previous episodes and knows the layout of probabilities: execution 20%, pardon 20%, award of the order 0.4%, other 59.6%. Running the numbers, we get 1.41 bits.
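The three figures above are easy to reproduce. Here is a minimal sketch (the probability distributions are simply the ones assumed in the example, and the helper name is mine) that applies formula (1) to the transmitter's distribution, to a receiver that knows nothing and treats all hundred events as equally probable, and to the receiver with its own four-way classification:

```python
from math import log2

def amount_of_information(probabilities):
    """Formula (1): I = -sum(p * log2(p)) over the events with non-zero probability."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Transmitter's view: three events at 20% each, the remaining 40% spread over 97 events.
transmitter = [0.2, 0.2, 0.2] + [0.4 / 97] * 97
print(round(amount_of_information(transmitter), 2))          # ~4.56 bits

# A receiver that knows nothing about the distribution: 100 equiprobable events.
uninformed_receiver = [1 / 100] * 100
print(round(amount_of_information(uninformed_receiver), 2))  # ~6.64 bits

# A receiver with its own context: execute 20%, pardon 20%, award 0.4%, other 59.6%.
informed_receiver = [0.2, 0.2, 0.004, 0.596]
print(round(amount_of_information(informed_receiver), 2))    # ~1.41 bits
```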

The spread is substantial. Let us look for an explanation of this phenomenon. If we recall that information is not just an objectively existing signal but the combination "signal + context", then it is not at all surprising that the amount of information arising upon receiving a signal also turns out to be context-dependent. Thus we have good agreement between the signal-context concept and the mathematical theory of information.

The value I calculated by the formula is usually used to solve the following problems:

  1. Estimating the capacity of the channel or storage needed to deliver the signal. If, say, the transmitter described above has to report a million events, about 4.56 million bits, or more precisely 4,561,893 bits, will be required. This is a characteristic of the signal, and the context has nothing to do with it.

  2. Estimating how much the receiver's uncertainty decreases when the signal is interpreted. In our example this is 6.64 or 1.41 bits, depending on the receiver's context, and it does not have to coincide with the transmitter's 4.56 bits.

In the overwhelming majority of cases, when we talk about bits, bytes, megabytes or, say, gigabits per second, we have the first interpretation in mind. We all like using broadband Internet much more than a stunted dial-up connection. But sometimes it happens that we have to sit on the Internet for half a day, read a pile of texts and watch a bunch of videos only to end up with a simple yes-or-no answer to the question that interests us. In that case our uncertainty decreases not by the tens of gigabytes we had to download, but by a single bit.

The entropic interpretation of the nature of information raises more questions than it answers. Even from a purely everyday point of view we can see that minimal uncertainty is observed in those citizens who have not read a single book and whose educational contacts with the outside world are limited to watching TV shows and sports broadcasts. These respected subjects are in complete, happy certainty on every conceivable question of the universe. Uncertainty appears only with the broadening of horizons and the acquisition of the pernicious habit of thinking. A situation in which obtaining information (reading good, intelligent books) increases uncertainty is impossible from the standpoint of the entropic theory of information, but from the standpoint of the signal-context view it is quite an ordinary phenomenon. Indeed, if receiving a signal results in the formation of a new context, then to feed that context we need more and more new signals, and these, while satisfying it, may as a side effect create yet another, primordially hungry context. Or even several.

No less surprising are the arguments that information may be somehow related to orderliness (if entropy is a measure of chaos, then negentropy, that is, information, must be a measure of order). Let's look at the following sequences of zeros and ones:

  1. 0000000000000000000000000000000000000000 . Perfect order in the style of a housewife's dream. But there is no information here, just as there is none on a blank sheet of paper or a freshly formatted hard disk.
  2. 1111111111111111111111111111111111111111 . Essentially the same.
  3. 0101010101010101010101010101010101010101 . Already more interesting. The order is still perfect, but information is still in short supply.
  4. 0100101100001110011100010011100111001011 . I took the trouble to toss a coin: 0 for heads, 1 for tails. I tried to toss honestly, so we may assume that a perfect mess came out. Is there any information here? And if so, about what? The answer is "about everything", but if so, how can it be extracted in usable form?
  5. 1001100111111101000110000000111001101111 . Similar to a coin, but only through a pseudo-random number generator.
  6. 0100111101110010011001000110010101110010 . It also looks like random nonsense, but it is not. Below I will say what it is.

If we remove the text comments and pose the riddle of which of these could be the result of tossing a coin, the first three options drop out immediately. The fifth also falls under suspicion, because there are more ones than zeros. This is a flawed argument: with an honest coin, each of these outcomes has exactly the same probability of 2⁻⁴⁰. If I kept tossing a coin without sleep or rest, hoping to reproduce at least one of the six options presented, I could expect that, with luck, I would succeed after about a hundred thousand years. But which of the options would be reproduced first is impossible to predict, since they are all equally probable.

The sixth item, by the way, is the word "Order" in eight-bit ASCII code.
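Anyone who wants to check this can slice the bit string into bytes and interpret each byte as a character code; a couple of lines are enough:

```python
bits = "0100111101110010011001000110010101110010"   # sequence No. 6 from the list above

# Cut the string into 8-bit groups and interpret each group as an ASCII character code.
chars = [chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8)]
print("".join(chars))   # -> Order
```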

So it turns out that there is no information either in perfect order or in perfect disorder. Or is there? Imagine that a perfectly random sequence of zeros and ones (No. 4) was obtained by tossing a coin not by me, but by an employee of the cipher service of an enemy army, and is now used as a piece of the secret key that encrypts dispatches. In this case these zeros and ones immediately cease to be meaningless digital junk and become supremely important information for which codebreakers would be ready to sell their souls. No wonder: the signal has found its context and has thereby become highly informative.

I have no desire to assert that the entropic theory of information is simply wrong. There are a number of highly specialized applications in which it gives an adequate result. One just needs to understand clearly the limits of its applicability. It can be assumed that one of those limits should be the requirement that the received signal does not lead to the formation of new context. Most communication tasks, in particular, satisfy this criterion. About the extraction of a signal from noise it really does make sense to speak as a struggle against entropy.

Measuring information has one more aspect which it is better not to forget. The result of any single measurement is a number. In our case these are bits, bytes, gigabytes. Having obtained numbers, we usually expect to be able to operate on them in the usual way: compare them, add them, multiply them. Consider two examples of applying the operation "addition" to amounts of information:

  1. There are two flash drives. The first is 64 GB, the second 32 GB. In total we have the opportunity to write 96 GB onto them. Everything here is fair and correct.

  2. There are two files. The first is 12 MB, the second 7 MB. How much information do we have? The hand reaches out to add them up and get 19 MB. But let's not hurry. To begin with, feed these files to an archiver. The first file shrank to 4 MB, the second to 3 MB. Can we now add the numbers and get the true total amount of available data? I would suggest not hurrying, and instead looking at the contents of the source files. We look and see that the entire contents of the second file are present in the first one. It turns out that it makes no sense at all to add the size of the second file to the size of the first. If the first file were different, the addition would make sense, but in this particular case the second file adds nothing to the first.
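The effect is easy to demonstrate with any general-purpose compressor. A minimal sketch (the "files" here are made-up byte strings, not the 12 MB and 7 MB files from the example) compresses each piece separately and then both together:

```python
import os
import zlib

# Hypothetical stand-ins for the two files: file_b is entirely contained in file_a.
file_a = os.urandom(20_000)      # 20 KB of incompressible "content"
file_b = file_a[:10_000]         # the second "file" merely repeats part of the first

a_alone = len(zlib.compress(file_a, 9))
b_alone = len(zlib.compress(file_b, 9))
together = len(zlib.compress(file_a + file_b, 9))

print(a_alone, b_alone, together)
# `together` comes out close to `a_alone`: although file_b on its own looks like
# another ~10 KB, it adds almost nothing once file_a is already there.
```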

From the point of view of the amount of information, the situation with quines is very interesting: these are programs one of whose functions is to output their own source code. Besides this function, such a program may contain something else: some useful algorithm, texts, images and the like. It turns out that inside the program there is this "something else", and in addition there is the program itself, containing within itself, once again, all of itself as a whole plus that very "something else". This can be expressed by the formula A = A + B, where B is not equal to zero. For additive quantities such an equality is impossible.
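Here is a minimal sketch of such a program in Python (the payload string is an arbitrary stand-in for the "something else"). Run it, and it prints exactly its own three lines, payload included:

```python
payload = "PI_APPROX = 3.14159  # the 'something else' carried inside the quine"
s = 'payload = %r\ns = %r\nprint(s %% (payload, s))'
print(s % (payload, s))
```

The whole text of the program is reproduced in its own output, and the payload is in there too, so the "size" of what the program contains stubbornly refuses to behave additively.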

Thus a very strange situation arises with the amount of information. One could say that the amount of information is a conditionally additive quantity. That is, in some cases we have the right to add up the available numbers and in others we do not. When we are talking about the capacity of a data transmission channel (a USB flash drive, in particular, may well be regarded as a channel transmitting data from the past to the future), addition is correct; but when weighing a specific signal we get a value whose addability to other similar values is determined by external factors, of whose existence we may not even be aware. For example, one can speak of the information capacity of the human genome as about 6.2 Gbit (DNA can be regarded as a data transmission medium and, as far as I know, there are groups of researchers trying to design DNA-based storage), but any answer to the question "how much information is written specifically in my genome?" will be meaningless. The most that can be asserted is that, whatever calculation method is used, the result cannot exceed those same 6.2 Gbit. Or, if reality suddenly turns out to be such that not only the sequence of nucleotide bases has to be taken into account, then it can. And if we are talking about the total amount of information contained in a living cell, then apparently no answer can be obtained at all, because the cell itself is a living being and not a data transmission medium.

To close the topic of measuring information I would like to introduce the concept of an "informational class", which allows the amount of information to be assessed, if not quantitatively, then at least qualitatively:

  1. Finite informativity is the situation when the whole signal needed by the context can be encoded by a discrete sequence of finite length. For such situations the measurement of information in bits is applicable. Examples:

    • The text of "Hamlet."
    • All extant texts ever written by mankind.
    • Information in the genome.

    The information technologies available to us at present work precisely with finite informativity.

  2. Infinite informativity is the situation when a discrete sequence of infinite length is required to encode the signal, and any truncation ("lossy compression") to a finite length is unacceptable. Example: the data on the positions of the balls that must be maintained in an ideal simulation of billiards so that, if the process is then run in reverse, the initial position is restored. In this case the velocities and positions of the balls have to be kept with infinite precision (an infinite number of decimal places), because, due to the strong non-linearities involved, an error in any digit tends to accumulate and lead to a qualitatively different result (see the short numerical sketch after this list). A similar situation arises in the numerical solution of non-linear differential equations.

    Despite the apparent other-worldliness, there is no fundamental reason why, as technology develops, we could not acquire the means to work with infinite informativities.

  3. Intractable informativity is the situation in which the required data cannot be obtained in any way because of fundamental limitations of either a physical or a logical nature. Examples:

    • It is impossible to find out what happened yesterday on a star that is 10 light years away from us.
    • It is impossible to find out simultaneously with absolute precision the momentum and the position of the particle (quantum uncertainty).
    • Being in a decision-making situation, the subject cannot know in advance which of the available alternatives he will decide on. Otherwise (if he already knows the decision) he is not in a decision-making situation.
    • A complete deterministic description of the Universe cannot be obtained in any way. The whole set of fundamental constraints, both physical and logical, works against it at once. On top of that, effects related to the barber paradox come into play.

    If with respect to the physical limitations there is still some hope that a clearer picture of reality will allow some seemingly intractable informativity to be reclassified as finite or at least infinite, logical constraints cannot be overcome by any technological development.
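The numerical sketch promised in item 2: two copies of a very simple non-linear system (the logistic map is used here merely as a stand-in for the billiard dynamics) are started from initial values that differ only in the fifteenth decimal place, and within a few dozen steps they have nothing in common:

```python
# Two copies of a simple non-linear system (the logistic map with r = 4),
# started from initial conditions differing only in the 15th decimal place.
x, y = 0.123456789012345, 0.123456789012346

for step in range(1, 61):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}:  x = {x:.6f}   y = {y:.6f}   |x - y| = {abs(x - y):.6f}")
# By step 50-60 the two trajectories are completely unrelated: the discarded digits
# were not negligible details but information the model actually needed.
```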

"Information" in physics


Historically, the link between the topic of "information" and the topic of "entropy" arose from the discussion of Maxwell's demon. Maxwell's demon is a fantastic creature sitting by a door in the wall separating two halves of a chamber filled with gas. When a fast molecule flies up from the left, it opens the door, and when a slow one does, it keeps it closed; when a fast molecule arrives from the right, it closes the door, and for a slow one it opens it. As a result, slow molecules accumulate on the left and fast ones on the right. The entropy of the closed system decreases, and on the temperature difference created by the demon we can happily run a perpetual motion machine of the second kind.

A perpetuum mobile is impossible, and therefore, to bring the situation into line with the law of conservation of energy and, at the same time, with the law of non-decreasing entropy, the following reasoning had to be adopted:

  1. When a demon is running, the entropy of the gas decreases.
  2. But at the same time, since the molecules interact with the demon, the gas is not an isolated system.
  3. The “gas + demon” system should be considered as an isolated system.
  4. The entropy of an isolated system cannot decrease, so the entropy of the gas plus the entropy of the demon does not decrease.
  5. From this it follows that the entropy of the demon grows.

So far everything is logical. But what does "the entropy of the demon grows" mean? The demon receives information (we are still working in traditional terminology) about the approaching molecules. If information is negative entropy, then the demon's entropy should decrease, not grow. Suppose the demon makes a simple mental effort and, through the mechanism of the door, transmits information to the approaching molecule (or, alternatively, does not transmit it). The negative entropy returns to the molecule and thereby lowers the entropy of the gas. But why then does the demon's entropy grow? Why do we take into account only the information flow leaving the demon and not the incoming flow? What happens if the demon does not immediately forget the signals it received from the arriving molecules but memorizes them? Can we say in that case that the demon's entropy does not increase?

Norbert Wiener, considering Maxwell's demon (in "Cybernetics"), writes that a perpetual motion machine cannot be built on this contraption because sooner or later the demon's growing entropy will reach a critical limit and the demon will break down. In principle this is logical, but it is hardly right to explain the demon's breakdown by saying that it hands out its original wisdom to the molecules and thereby grows stupid itself. From the informational point of view the demon's work is very simple and tedious; there can be no talk of any "expenditure of mental powers". Likewise, we do not say that every file passed through an archiver program increases the archiver's entropy and thereby gradually reduces its ability to compress data. Most likely, the impossibility of a perpetual motion machine built on Maxwell's demon should be explained not by informational-technological considerations, but by the fact that the energy gain from manipulating a molecule cannot exceed the energy cost of ascertaining the parameters of the approaching molecule plus the cost of manipulating the door.

The formulas by which thermodynamic and informational entropy are calculated are quite similar. Thermodynamic entropy (compare with formula (1) above):

S = − k_B · Σ p_i · ln(p_i)

where p_i is the probability of the i-th state and k_B is the Boltzmann constant. But this formula is inevitably tied to the existence of a subject who has classified the states and singled out a finite number of groups of interest. If we try to get rid of this interested subject, there is a serious risk that the expression should properly be written like this:

S = − k_B · ∫ p(x) · ln(p(x)) dx   (2)

In this case the total probability equals 1 (the system must be in one of its states):

∫ p(x) dx = 1

An infinite number of possible states is much closer to the truth of life than a finite one. It is easy to show that if, in the system under consideration, the proportion of states x for which the probability p(x) is non-zero does not tend to zero, the integral entropy tends to infinity. In terms of formula (2):

S → ∞
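The same tendency can be seen without any integrals. Take a quantity that is continuous (here, purely for illustration, a coordinate uniformly distributed on [0, 1]) and let the observer classify it into ever finer groups: the discrete entropy then grows as the logarithm of the number of groups and approaches no finite limit. A small sketch:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

# A coordinate uniformly distributed on [0, 1], classified into n equal bins by the observer.
for n in (10, 1_000, 1_000_000, 10**12):
    p = 1.0 / n                           # probability of each of the n equiprobable states
    entropy = -n * p * log(p) * k_B       # discrete formula: S = -k_B * sum(p * ln p) = k_B * ln n
    print(f"n = {n:>13}: S = {entropy:.3e} J/K")
# S grows as k_B * ln(n): the finer the classification, the larger the entropy,
# with no finite limit as n tends to infinity.
```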

Thus, if the assumption that the operation of integration is appropriate here is correct (and for that it is enough that at least one of the physical quantities involved has the property of continuity), then the "informational" capacity of practically any material system (that is, of any except degenerate cases) turns out to be unlimited. This deprives of any meaning the equating of thermodynamic entropy with informational entropy. The similarity of the formulas can be attributed to the fact that in our world there are many fundamentally different things expressed by similar formulas. There are other arguments in favor of identifying thermodynamic and informational entropy, but, as far as I know, they have either never been subjected to experimental verification or (like, for example, the Landauer principle) are themselves deduced from the assumption that the entropies are equal.

Speaking of the connection between the topic of "information" and physics, one cannot fail to mention the concept of "quantum information". The laws of quantum mechanics are such that in some cases it really does make sense to describe what is happening in informational terms. For example, according to the Heisenberg uncertainty principle we can know precisely either the particle's momentum or its position. From this arises the illusion that by taking a measurement we can obtain no more than a certain maximum amount of information. And from this, as it were, the conclusion automatically follows that there can be information inside the particle and, moreover, that its volume is strictly limited. I can say nothing about how productive or counterproductive such use of informational concepts is, but there is a strong suspicion that stretching a bridge between the purely physical concept of "quantum information" and the information we operate with at the macro level (for example, "Hamlet") is not merely difficult, but impossible.

To transmit our macro-information we use not only physical objects and phenomena but also their absence. The text in a book is encoded not only by the substance of the ink but also by the unpainted gaps (nothing can be read from a uniformly colored sheet). It is also easy to think of plenty of situations where a very important signal is transmitted not by an energy action but by its absence. I am still prepared to imagine that inside a particle there is some mysterious substance which is information, but to imagine that inside the absence of a particle there is also information is something completely counter-logical.

At the current level of our knowledge about how the world works, it seems to me that the concept of "quantum information" should be treated in the same way as the concept of "color" applied to quarks. That is, yes, "quantum information" may well and even must be recognized as a valuable concept, but it should be clearly understood that it has only an indirect relation to the "information" we talk about in all other cases. Perhaps the conflict can be resolved by the consideration that physics can quite productively study the material basis of the transmitted signal (in particular, give an answer about the maximum possible capacity of a data transmission channel), but the presence of a signal is a necessary, not a sufficient, condition for us to have the right to say that there is information in the object under consideration.

It should be clearly understood that we lack a physics of information (an analogue of the phlogiston theory, only applied not to heat but to information) not because we do not yet know everything, but because there cannot be one in principle. One of the most essential requirements of the natural science method, applied most clearly and consistently precisely in physics, is the expulsion from the phenomenon being studied of any acting subject possessing free will. The subject (the so-called "implicit observer") must, of course, be somewhere near the phenomenon under consideration, but he has no right to interfere in anything. The mechanistic character of the phenomena studied, that is, the total absence of purposeful activity, is what makes physics physics. But as soon as we start talking about information, we cannot get away from the fact that the signals received by a subject are the raw material for decision-making. The implicit observer of physical phenomena must be as good as absent, while an actor living simultaneously in the material world and in informational reality cannot in principle be "as good as absent". From this diametrical opposition between the requirements imposed on a subject placed inside the phenomena being studied, it follows that the phenomenon of "information" cannot be reduced to any physical phenomena, including even those not yet discovered.

What is particularly surprising is that materialists and idealists have reached a wonderful consensus on the need to embed "information" deep inside physics. For materialists this is convenient because physics thereby achieves totality in describing reality (nothing remains that is not physical reality). And idealists celebrate victory because in this way their "spirit" is officially recognized as the basis of the universe. Both long-warring camps celebrate a victory, though not so much over each other as over common sense. And both materialists and idealists react very aggressively to any attempt to link the material and ideal worlds in any way other than the banal reification.

Data


As mentioned above, a signal can be considered not only as a material object but also as an intangible one. By the principle of the totality of physical reality the signal, of course, must have a physical embodiment, but quite often there are situations when the physical side of the signal does not interest us at all, only its intangible component. In such cases we completely abstract away from the physics of the signal, and as a result we get a rather strange subject for further discussion. We have discarded the physics, yet we still cannot speak of information being present inside this object, because it is just a signal, and for information to arise it needs a context. Such objects will be called data. Data is an intangible signal. It is intangible not because it has some otherworldly nature and travels through subtle astral planes, but because in this particular case it has turned out to be unimportant to us how it travels. For example, a small volume of "Hamlet" in a beautiful binding, and a rare edition at that, is a signal in which both the material and the non-material components interest us. But if we just need to refresh the "to be or not to be" monologue in our memory, we are looking for the text, and we do not care where we find it. A paper book, a file on a flash drive, or an online library service will all do. The text of "Hamlet" is data, while the gift-edition volume of "Hamlet" is no longer just data.

Of particular interest is the case of an object for which not only is the physics inessential, but a suitable context is lacking as well. Imagine an inscription in an unfamiliar language (I do not know Chinese, so let it be Chinese). I want to know what this inscription means, so I take a piece of paper and carefully copy the characters. I simply copy all the strokes and squiggles; to me, that is all they are. The meaning of the picture will appear only after I show this sheet to someone who knows Chinese and he translates the inscription into some language I can understand. Until that happens, I have on the sheet an informational object that definitely contains a signal, but a signal for a context that is not present at the moment.

While copying the Chinese characters, I could have skipped redrawing the data (and this is data) onto a sheet of paper and instead photographed it with my phone and sent it to a friend by e-mail. Along this signal's journey to my friend, the lack of a context for interpreting the inscription would be found not only in me but also in the phone's software, the mail program and all the magnificence of the Internet protocols involved in transmitting the data. One could say that such a thing as understanding is peculiar exclusively to us, super-complex creatures of flesh and blood, but that would not be entirely true. For example, when the picture with the characters is transmitted, the transport layer of the network supplements the transmitted data with its own service data, which is understandable (that is, will be correctly interpreted) by the mechanisms implementing the transport layer of the data network. If we accept that understanding is not necessarily something mysterious and lofty, with a penetrating gaze that sees into the essence of phenomena, but merely the presence of an adequate context (in the case of the network transport layer this context is formed by the fact that the developers of the network infrastructure respect the TCP protocol), then we can confidently say that our technical systems are also endowed with the ability to understand. True, this understanding is not very similar to our ability to grasp the essence of phenomena, which we observe from within ourselves, but that does not change matters.
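What "service data understandable to the transport layer" means can be shown with a deliberately simplified sketch (this is not the real TCP stack, just a made-up framing convention): the sender prepends a small header, the receiver interprets that header because both sides share the same convention, and the payload itself, the photographed characters, passes through uninterpreted:

```python
import struct

MAGIC = 0xC0DE  # a made-up protocol marker shared by both endpoints (the common context)

def wrap(payload: bytes, sequence_number: int) -> bytes:
    """Sender side: prepend the service data (marker, sequence number, payload length)."""
    header = struct.pack("!HIH", MAGIC, sequence_number, len(payload))
    return header + payload

def unwrap(frame: bytes) -> tuple[int, bytes]:
    """Receiver side: interpret the header; the payload itself is passed on untouched."""
    magic, sequence_number, length = struct.unpack("!HIH", frame[:8])
    if magic != MAGIC:
        raise ValueError("unknown framing: no shared context for this signal")
    return sequence_number, frame[8:8 + length]

photo_of_hieroglyphs = b"\x89PNG...bytes that neither endpoint tries to understand..."
seq, data = unwrap(wrap(photo_of_hieroglyphs, sequence_number=1))
print(seq, data == photo_of_hieroglyphs)   # 1 True
```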

The concept of "data", although it adds nothing fundamentally new to the metaphysics of information, is nevertheless extremely useful from a practical point of view. The two-component "signal + context" construction is complete (no third component is needed), but when you try to apply it in everyday life a lot of inconvenience arises. The source of the inconvenience is that the concept of "signal" is firmly associated with the material side of the process, and when the material side has to be ignored, the "grounding" pull of the word "signal" starts to get badly in the way. Imagine that a friend of yours is about to make a trip to Bremen and asks you how he could learn more about the city. The first thing that comes to mind is Wikipedia. Looking through the different language sections, you notice that the Russian-language article, though good, is very short, while the English-language one, though much longer, is still inferior to the article in German (which is not at all surprising). Now you need to tell your friend that there is more information in the English-language article than in the Russian-language one, but then, remembering the philosophy of information, you realize that there can be no information in any of the sections. A Wikipedia article is a signal that becomes information only when it falls into a context. A problem. "The signal recorded on the hard drives of Wikipedia's English-language servers, when it gets into the context of your perception..." Ugh, how awful. And how is your friend supposed to reach those hard drives with his context? "The signal delivered via Wi-Fi from the English-language servers..." is also somehow wrong. What does Wi-Fi have to do with it, if the friend can just as well reach Wikipedia over the mobile Internet? If we replace the concept "signal" with the synonym "data" (in this case it really is a synonym), all the inconvenience disappears: "You can look at Wikipedia, but keep in mind that the English article, and even more so the German one, has much more data about Bremen." We have taken advantage of the fact that, as we now know, there can be no information in the article, while the data is, in fact, the article itself: a signal whose physical implementation in this particular case is not important to us.

Speaking from my own practice: having experimented with switching to the correct terminology in everyday life and in professional activity (information technology), I have never once had an interlocutor notice that anything had changed. The only thing is that one now has to pay attention to what exactly is being discussed, data or information. For example, it is not information that is stored in a database but data; yet users, by passing this data through the database, do exchange information. The system still remains an information system, but it functions on the basis of accumulated data.

With the development of transmission networks we have acquired a fairly simple criterion for determining whether we have the right to abstract completely from the physics of a particular object and, as a result, to speak of it as an informational object (that is, as data). The criterion is this: if the object can be transmitted via the Internet, then we have every right to speak of it as an informational object.

Examples:


For purity of terminology it would, of course, be better to speak not of an "informational" object but of an intangible one. But the term "informational" is much more convenient, since it contains no negation.

Note that the simple empirical rule for identifying an informational object considered here has an "if-then" structure and therefore works in one direction only. That is, from the fact that we cannot transmit something via the Internet it does not follow that the object is not informational. For example, we cannot transmit the number pi in "live" form (that is, as its sequence of digits). We can transmit the recipe for cooking this "cutlet" (that is, a program that successively computes the digits after the decimal point of pi), we can transmit a picture with the symbol π, but the "cutlet" itself we cannot hand over.
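Such a recipe is genuinely short. Here, for example, is a sketch of Gibbons' unbounded spigot algorithm, which yields the decimal digits of pi one after another for as long as you care to wait; the program travels over the Internet easily, while the full "cutlet" never will:

```python
def pi_digits():
    """Yield the decimal digits of pi one by one (Gibbons' unbounded spigot algorithm)."""
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

gen = pi_digits()
print([next(gen) for _ in range(10)])   # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```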

Information in pi


Since we are talking about pi, it makes sense to examine one amusing case connected with it.

Rumor has it that among the digits making up the infinitely long tail of pi one can theoretically find any sequence of digits specified in advance. To be completely accurate, this is just a hypothesis, neither proved nor refuted. There do exist real numbers that contain every finite sequence of digits (they are called "normal"), but the hypothesis that pi is normal has not yet been proved. In particular, a number normal to base 2, containing any sequence of zeros and ones, can be obtained by successively appending to the tail after the point all bit combinations of each length in turn, gradually increasing the length. Like this:
0, (0) (1) (00) (01) (10) (11) (000) (001) (010) (011) (100) (101) (110) (111) (0000) ... and so Further.

In decimal form this number is slightly more than 0.27638711, and it is guaranteed to contain the contents of any file from your hard disk, including the one you have not recorded there yet.
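For the curious, here is a sketch (the helper name is mine) that builds the number just described, by concatenating all bit groups of increasing length, and converts its first couple of hundred binary digits to decimal; the result agrees with the figure above:

```python
from itertools import count, product

def binary_champernowne_bits(limit):
    """Concatenate all bit groups of length 1, 2, 3, ... until `limit` bits are collected."""
    bits = []
    for width in count(1):
        for combo in product("01", repeat=width):
            bits.extend(int(b) for b in combo)
            if len(bits) >= limit:
                return bits[:limit]

bits = binary_champernowne_bits(200)
value = sum(b / 2 ** (i + 1) for i, b in enumerate(bits))
print(value)   # ≈ 0.276387117..., i.e. slightly more than 0.27638711
```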

But let us close our eyes to the fact that the normality of pi has not been proved and treat it as normal in our reasoning. The number pi is wrapped in a mass of stories, riddles and superstitions, and it is therefore more interesting to talk about it than about some simple algorithmic construction. If the mathematical liberty bothers you, just assume that in what follows I am talking not about pi but about any number that is normal to base 2.

A truly majestic picture emerges. Imagine that in your declining years you sit down, write a detailed autobiography and save it to a file. It turns out that this sequence of zeros and ones is already present somewhere in pi right now. And the same sequence is there too, supplemented with the exact date and circumstances of your death. A genuine book of fate, is it not?

The beginning of our book of fate (the integer part and the first 20 digits of the endless tail) looks like this:
11.00100100001111110110 ...

Let's think about how such a book of fate could be read. Suppose I have written my biography right up to the present moment, taken a calculator of fantastic power and set it to find the beginning of my biography among the digits of pi. It would be foolish to expect the first occurrence to have a meaningful continuation. Most likely a meaningless jumble of zeros and ones follows it. After tinkering a little with the calculator's algorithm, I teach it not only to find occurrences of the known part of the biography but also to check whether the continuation is a meaningful text written in roughly the same style. And finally my calculator finds such a fragment. I do not know whether it will please me or sadden me, but I will not stop the calculator; let it continue its work. After a while it will bury me under a pile of versions of my further biography found inside pi. Some will be quite ordinary ("worked, then retired, grew old, fell ill, then died"), but the rest will be far more interesting. For example, in one version it will say that tomorrow, no sooner and no later, a global zombie apocalypse will break out and the bloodthirsty dead will devour me. And in another (this is obligatory: among all the combinations of zeros and ones there are indeed all combinations) it will be written that I will gain immortality and omnipotence and become the ruler of the universe. And there are still an infinite number of versions, an endless stream crawling out of the calculator. Which of these versions should I believe? Maybe the very first one? And why that one?

To simplify the task, let us try telling fortunes by the number pi in a slightly simpler way. Let's ask it a simple binary question. For example, will it be profitable for me to buy today the block of shares I have been watching? If the first digit in the fractional part of pi is a one, the omniscient oracle is telling me it is profitable. If it is a zero, that means I should wait. Let's look. A zero turns up right in the first position, while a one appears not even in the second position but only in the third. Ah, something tells me that with such an oracle I will never buy a single share in my life. This oracle needs another oracle attached to it, one that tells you which position to look at.

It turns out that to extract information from this book of fate we lack one tiny key: the key that tells us from which position the book should be read. And without that key, the only information the infinite tail of the digits of pi holds for us is the ratio of a circle's circumference to its diameter. Somehow that is even a little sad...

Chapter Summary


In this chapter, using the two-component signal-context construction, we not only learned to get rid of the reification of "information" but also obtained a tool that allows a bridge to be built between the material and non-material aspects of reality without resorting to mystical practices.

The main concepts and ideas considered:

  1. Information as a combination of signal and context.
  2. A signal as a certain circumstance that can be interpreted.
  3. Context as the ability to interpret a signal; unlike the signal, the context is always intangible.
  4. Properties of information that follow from the construction: information is subjective, a signal is informationally inexhaustible, there is no law of conservation of information, and information is the undivided property of the subject in whose context it arose.
  5. The amount of information as a context-dependent and only conditionally additive quantity.
  6. Informational classes: finite, infinite and intractable informativity.
  7. The absence of a physical basis of information. Thermodynamic entropy is not informational entropy, and "quantum information" has only an indirect relation to the "information" discussed here.
  8. Data as a signal whose physical implementation is unimportant to us; it is data that is stored and transmitted, while "information" arises when the data meets a suitable context.
  9. The practical test "can it be transmitted via the Internet?" for quickly determining whether the object in question is an informational object.

It will only get more interesting from here, but if you have not understood how, with the help of signals and contexts, we managed to reconcile physics with lyricism (the material with the ideal), the rest will make for sad reading.



Continued: Chapter 3. Foundations

Source: https://habr.com/ru/post/403327/

