
Rosenblatt's perceptron: what has history forgotten and invented?

There are already several articles about artificial neural networks on Habr. But they usually discuss the so-called multilayer perceptron and the backpropagation algorithm. And do you know that this variant is in no way better than Rosenblatt's elementary perceptron?

For example, in this translation, What are artificial neural networks?, we can read the following about Rosenblatt's perceptron:

Rosenblatt's demonstration of the perceptron showed that simple networks of such neurons can be trained on examples in certain well-defined areas. Later, Minsky and Papert proved that simple perceptrons can solve only a very narrow class of linearly separable problems, after which activity in ANN research declined. Nevertheless, the backpropagation method of training on errors, which makes it feasible to train complex neural networks from examples, showed that such problems may nevertheless be solvable.

And claims like this turn up, in various forms, in articles, books, and even textbooks.

But this is probably the biggest piece of misinformation in the field of AI. In science, this is called falsification.



What has been invented?

In fact, all serious scientists know that this notion of Rosenblatt's perceptron does not hold up. But they prefer to write about it rather softly. Meanwhile, history reaches us in a substantially distorted form, which is unacceptable even in such a narrow field of science. I do not want to say that the falsification was intentional. No, it is a good example of how young people do not read the originals, or skim them diagonally, and then pretend they understand what the problem is. Then they solve this "problem", become doctors of science, and gain fame. And the next generation already believes them. If you are doing anything serious, always double-check the classics against the originals. Do not trust secondhand articles.

So let us open the original description of Rosenblatt's perceptron: Rosenblatt, F., Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms, 1965. You will not find the so-called single-layer perceptron there; as of 1965, it simply did not exist in nature. It was invented much later.

Instead, you will immediately see the elementary perceptron, which has a hidden layer of A-elements.

Sometimes this is explained away by saying that the terminology has changed. Alas, however you change the words, the perceptron has a hidden layer and ALWAYS has had one. Moreover, Rosenblatt himself writes that there is no point in considering perceptrons without a hidden layer. In 1965 this was elementary, and everyone knew it.

Then people invoke the authority of Minsky. But he, too, knew perfectly well what a perceptron is, and he never claimed that the perceptron cannot solve linearly non-separable problems. He argued something entirely different, related to the problem of invariance. And that problem is still not solved by any known ANN. (But this article is not about that; if you are interested, leave a request and I will try to write about it.)

In this respect, the difference between Rosenblatt's perceptron and Rumelhart's multilayer perceptron (MLP) comes down to the following:
1. Rumelhart's perceptron is trained by the backpropagation algorithm, which learns both the weights between the input and hidden layers and the weights between the hidden and output layers.
2. Rosenblatt's perceptron is trained by an error-correction algorithm. This algorithm learns only the weights between the hidden and the output layer. The weights between the input and the hidden layer are deliberately left untrained. There is no point in training them, because this layer performs an entirely different task from the second one. The weights of the first layer, or rather the excitatory and inhibitory connections, are generated at random, imitating nature. The task of this layer is precisely to transform a non-separable problem into a separable one: the input stimuli, passing through the connections of the first layer, are mapped onto the space of A-elements, and this random matrix provides the transformation into a separable problem. (A minimal code sketch follows after this list.)
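
To make the mechanics concrete, here is a minimal sketch in Python. It is an illustration under assumptions, not Rosenblatt's exact construction: the class name ElementaryPerceptron, the choice of +1/-1 connections, and the randomly drawn A-element thresholds are mine; the original describes several concrete wiring schemes. The essential point it reproduces is that only the A-to-R weights are trained, by the error-correction rule.

import numpy as np

class ElementaryPerceptron:
    # A simplified sketch of Rosenblatt's elementary perceptron
    # (wiring details are assumptions, not the original scheme).
    def __init__(self, n_inputs, n_a_units, seed=0):
        rng = np.random.default_rng(seed)
        # First layer: random, fixed excitatory (+1) / inhibitory (-1)
        # connections from S-units to A-elements. Never trained.
        self.s_to_a = rng.choice([-1, 1], size=(n_a_units, n_inputs))
        # Random firing thresholds for the A-elements (an assumption).
        self.a_thresh = rng.integers(0, n_inputs, size=n_a_units)
        # Second layer: trainable A-to-R weights, initially zero.
        self.a_to_r = np.zeros(n_a_units)

    def _a_layer(self, x):
        # Binary A-element activity: the random projection that maps
        # the input into a space where the task is likely separable.
        return (self.s_to_a @ x > self.a_thresh).astype(float)

    def predict(self, x):
        return int(self.a_to_r @ self._a_layer(x) > 0.0)

    def fit(self, X, y, epochs=200):
        # Error-correction rule: adjust A-to-R weights only, and only
        # when the response is wrong.
        for _ in range(epochs):
            mistakes = 0
            for x, target in zip(X, y):
                a = self._a_layer(x)
                out = int(self.a_to_r @ a > 0.0)
                if out != target:
                    self.a_to_r += (target - out) * a
                    mistakes += 1
            if mistakes == 0:
                break

Notice that fit never touches s_to_a: all the separating power comes from the fixed random projection, exactly as described in point 2 above.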

The second layer, both in Rosenblatt's perceptron and in the MLP, then separates the linearly separable problem obtained after this transformation.

Now I hope it is clear what the first layer is for: it provides the transformation from a non-separable (linearly non-separable) representation of the task to a separable one. The MLP does the same thing; the choice of the backpropagation algorithm does not change this.

But where Rosenblatt's perceptron relies on chance, the MLP creates that "chance" through its training. That is the whole difference.
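
A quick check of the claim above, reusing the sketch: XOR is linearly non-separable in the input space, but after the random mapping onto enough A-elements it becomes separable with high probability (for an unlucky seed, increase n_a_units or reseed).

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR: not separable in the input space

p = ElementaryPerceptron(n_inputs=2, n_a_units=32, seed=0)
p.fit(X, y)
print([p.predict(x) for x in X])  # [0, 1, 1, 0] once the projection separates XOR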

There are a number of other differences and consequences, but this article is not for complete beginners; it is for those who are at least somewhat familiar with the subject. Here I only wanted to note that Rosenblatt's perceptron must be studied from the originals.

And what has been forgotten?

Undeservedly, Rosenblatt's name is now remembered mostly in historical overviews. It should be noted that Rosenblatt did not develop just one particular kind of artificial neural network. He developed a complete classification of all kinds of neural networks under the common name "perceptron", covering essentially any ANN that exists today. Rosenblatt's work also includes multilayer perceptrons, which in his terminology begin with two inner layers, recurrent perceptrons, and many other subtypes. Moreover, unlike many modern developments, their characteristics were calculated by Rosenblatt far more thoroughly. That is why a newly developed ANN should first be compared with the corresponding Rosenblatt perceptron from his classification; without such a comparison, the effectiveness of the new ANN remains entirely unclear. Yet many ANN developers do not bother to do this, and as a result, many are called, but not one is chosen.

P.S. I have often met with skepticism when telling this story. But if you do not believe me, read the article: Kussul E., Baidyk T., Kasatkina L., Lukovich V., Rosenblatt Perceptrons for Handwritten Digit Recognition, 2001.

In conclusion, here is a link that will help you study the Rosenblatt perceptron not from the myths, but from the originals: Here we study the capabilities of the Rosenblatt perceptron

upd. Dedicated to those who are still captive to delusions: the solution of the XOR problem by the Rosenblatt perceptron



upd2. Thank you all. Teaching those who do not know and have not read the originals was not my task here. If you want to have a normal conversation, write me a private message. I no longer respond to provocative comments; I recommend picking up and reading the basics.

Source: https://habr.com/ru/post/140301/

