In this article we will look at how, in about 30 lines of code, you can create and train a neural network using the Synaptic.js library, which lets you do deep learning both in the browser and in Node.js. We will build the simplest possible neural network: one that solves an XOR equation. You can also explore a specially written interactive tutorial.

But before moving on to the code, let's talk about the basics of neural networks.
Neurons and synapses
The main "building" element of a neural network is, of course, a
neuron . Like a function, it takes several input values and produces some kind of result. There are different types of neurons. Our network will use
sigmoids that take any numbers and reduce them to values in the range from 0 to 1. The principle of operation of such a neuron is illustrated below. At the entrance it has the number 5, and at the output 1. The arrows denote synapses connecting the neuron with other layers of the neural network.

But why is the input 5? It is the sum over the three synapses feeding into the neuron. Let's break it down.
On the left we see the values 1 and 0, as well as a bias of -2. First the two input values are multiplied by their weights, 7 and 3 respectively; the results are summed and the bias is added, giving 1*7 + 0*3 + (-2) = 5. This is the input value for our artificial neuron.
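To make that arithmetic concrete, here is a minimal JavaScript sketch of the same neuron; the sigmoid helper and the variable names are ours, not part of Synaptic:

// Sigmoid squashes any number into the range (0, 1)
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// Inputs, weights and bias from the example above
var inputs = [1, 0];
var weights = [7, 3];
var bias = -2;

// Weighted sum: 1*7 + 0*3 + (-2) = 5
var sum = inputs[0] * weights[0] + inputs[1] * weights[1] + bias;

console.log(sum);          // 5
console.log(sigmoid(sum)); // ~0.993, which rounds to 1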

And since this is a sigmoid neuron, which squashes any value into the range from 0 to 1, the output gets squashed to (approximately) 1. If you connect such neurons to each other with synapses, you get a neural network through which values flow from input to output, being transformed along the way. Like this:

A neural network is trained to generalize in order to solve various problems, such as handwriting recognition or detecting email spam. How well it generalizes depends on choosing the right weights and biases across the whole network.
To train it, you simply give it a set of samples and make the network process them over and over until it produces the correct answers. After each iteration, the prediction accuracy is measured and the weights and biases are adjusted so that the next answer is slightly more accurate. This learning process is called backpropagation. After thousands of iterations, your network eventually learns to generalize well.
We will not go into how backpropagation works; that is beyond the scope of this tutorial, but there are good articles on the subject if you are interested in the details.
Code
First we create the layers. We do this with Synaptic's new Layer() function. The number passed to it is the number of neurons the layer should have. If you are not sure what a layer is, take a look at the tutorial mentioned above.
const { Layer, Network } = window.synaptic;

var inputLayer = new Layer(2);
var hiddenLayer = new Layer(3);
var outputLayer = new Layer(1);
Next we connect these layers to one another and instantiate a new network.
inputLayer.project(hiddenLayer);
hiddenLayer.project(outputLayer);

var myNetwork = new Network({
  input: inputLayer,
  hidden: [hiddenLayer],
  output: outputLayer
});
We have a network of 2–3–1, which looks like this:

Now let's train it:
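Here is a rough sketch of what the training code looks like; the loop structure follows the description below, while the concrete value 0.3 for learningRate is our assumption:

// Training loop sketch: 20,000 iterations over the four XOR input combinations
// (learningRate = 0.3 is an assumed value, not taken from the article)
var learningRate = 0.3;

for (var i = 0; i < 20000; i++) {
  // 0,0 => 0
  myNetwork.activate([0, 0]);
  myNetwork.propagate(learningRate, [0]);

  // 0,1 => 1
  myNetwork.activate([0, 1]);
  myNetwork.propagate(learningRate, [1]);

  // 1,0 => 1
  myNetwork.activate([1, 0]);
  myNetwork.propagate(learningRate, [1]);

  // 1,1 => 0
  myNetwork.activate([1, 1]);
  myNetwork.propagate(learningRate, [0]);
}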
Here we run 20,000 training iterations. In each iteration the data is propagated forward and backward four times, that is, one pass for each of the four possible input combinations: [0,0], [0,1], [1,0] and [1,1].
Let's start by running
myNetwork.activate([0,0])
, where
[0,0]
is the input. This is called forward propagation, or
activation of the network. After each forward propagation we need to do a backpropagation, during which the network updates its weights and biases. Backpropagation is performed by calling
myNetwork.propagate(learningRate, [0])
, where
learningRate
is a constant that tells the network how much its weights should be adjusted each time. The second parameter
0
is the correct output value for the input
[0,0]
.
The network then compares the resulting output with the correct value; this is how it measures the accuracy of its own prediction. Based on the comparison, it adjusts the weights and biases so that its answer will be a little more accurate next time. After 20,000 such cycles, we can check how well our network has learned by activating it with all four possible input values:
console.log(myNetwork.activate([0,0])); // -> [0.015020775950893527]
console.log(myNetwork.activate([0,1])); // -> [0.9815816381088985]
console.log(myNetwork.activate([1,0])); // -> [0.9871822457132193]
console.log(myNetwork.activate([1,1])); // -> [0.012950087641929467]
If we round these results to the nearest integer, we get the correct answers to the XOR equation. It works!
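If you want to read the answers as plain 0/1 values, a small convenience snippet (not part of the library) does the rounding:

// Round each output to the nearest integer to read it as a boolean XOR answer
[[0, 0], [0, 1], [1, 0], [1, 1]].forEach(function (input) {
  console.log(input, '=>', Math.round(myNetwork.activate(input)[0]));
});
// [0,0] => 0, [0,1] => 1, [1,0] => 1, [1,1] => 0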
That's all. Even though we have only scratched the surface of neural networks, you can already experiment with Synaptic and continue learning on your own. There are many more good tutorials in the library's wiki.