
An information and analytical system for analyzing and forecasting the development of the agro-industrial complex

The process of developing software for working with neural networks is considered. The optimal training parameters and type of neural network have been selected to ensure the most accurate prediction of the indicators characterizing the agricultural regions of the Russian Federation.

The agro-industrial complex (AIC) as a system can be described by controlled and observed parameters. To trace the dynamics of changes in the observed parameters when the controlled factors are manipulated, we used neural network technology, which is well suited to this problem.
It is necessary to find the optimal type and parameters of the neural network, and the network type and parameters must be saved for future use.

Software development



It was decided to develop a software product that allows working with neural networks, flexibly adjusting the training parameters, and receiving information about the training process while training is running. The product should also be able to save trained neural network files for further use, normalize the input parameters, adjust the ratio of training and test data, select the type of neural network and the functions of its layers, and display the connection map of a trained network. A neural network can be used to predict values in several modes:
  1. prediction of a single value from the input parameters;
  2. cyclic forecast of a set of values, in which each predicted value is fed back to the network input and prediction is restarted, with a configurable number of iterations;
  3. comparison of real values with the values produced by the neural network;
  4. plotting graphs for several indicators at once.

The software product was developed in the Borland C++ Builder 6 environment. Neural network support is provided by the open-source FANN library. Fast Artificial Neural Network (FANN) is a library that supports the C, C++, PHP, Python, Delphi and Mathematica programming languages and is a powerful tool for software developers (1). The library can be used to develop artificial intelligence in computer games, recognize objects and images, help stock brokers identify market trends and, of course, to solve our problem.
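
The article does not include the wrapper code itself, but the calls behind such a tool follow FANN's plain C API. Below is a minimal sketch, not the author's code, of loading a training sample from a file in FANN's own data format and normalizing it; the file name is a placeholder.

```c
/* Minimal sketch: load a training sample in FANN's format and normalize it.
   "apk_sample.data" is a hypothetical file name. */
#include <stdio.h>
#include "fann.h"

int main(void)
{
    struct fann_train_data *data = fann_read_train_from_file("apk_sample.data");
    if (data == NULL)
        return 1;

    /* Scale inputs and desired outputs into [-1, 1], the range expected by
       the symmetric sigmoid activation used later in the article. */
    fann_scale_input_train_data(data, -1.0f, 1.0f);
    fann_scale_output_train_data(data, -1.0f, 1.0f);

    printf("samples: %u, inputs: %u, outputs: %u\n",
           fann_length_train_data(data),
           fann_num_input_train_data(data),
           fann_num_output_train_data(data));

    fann_destroy_train(data);
    return 0;
}
```
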
The software product has several screen forms. Each form provides a separate mode of operation of the application.
The main form, “Creating a network” (Figure 1), contains grouped controls for data normalization, automatic selection of training parameters, configuration of the neural network parameters, and the operation log. Switching between operating modes (screen forms) is done through the “Tools” menu.
image
Figure 1. The main form of the application. "Creating a network"

The next form and mode of operation is called “Map of Connections” (Figure 2). In this mode, after selecting a previously created neural network file, you can view information about the structure of the network and the strength of the connections within it.

Most of the screen form is occupied by a field that displays a table of the structure and strength of the connections between the neurons of the trained network. The form also contains a control for the font size used in this field, for the case when the network contains so many neurons and layers that the information about its structure and connection strengths does not fit into the space reserved for it.

image
Figure 2. “Map of Connections” form

Information about the strength and structure of the connections between neurons is presented as a compact matrix for convenient viewing of the internal structure of the network.
This particular network has 15 neurons and 3 layers (input, hidden and output): 9 neurons in the input layer, 5 in the hidden layer and 1 in the output layer. The total number of connections in the network is 34. The connections between neurons are represented in the matrix: a “.” (full stop) means there is no connection, and the strength of a connection is encoded by the letters a to z. Three real neurons of the hidden layer (neurons 10, 11 and 12 in layer 1) have connections to the 10 neurons of the previous layer, as can be seen in the first three rows of the matrix. Neuron 14 in the output layer has connections to four neurons of the hidden layer (neurons 11 to 14), as can be seen in the fifth row of the matrix.
To simplify the matrix, the neurons of the input layer, the bias neurons and their connections are not displayed.
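
The matrix described above closely resembles the output of FANN's own fann_print_connections routine, which the application presumably uses or reimplements. A minimal sketch of printing the same kind of map for a previously saved network (the file name is a placeholder):

```c
/* Sketch: print the connection map of a saved network to stdout.
   "apk.net" stands for a network file produced earlier by fann_save(). */
#include "fann.h"

int main(void)
{
    struct fann *ann = fann_create_from_file("apk.net");
    if (ann == NULL)
        return 1;

    /* One row per neuron with incoming connections; "." means no connection,
       letters encode the magnitude of the weight. */
    fann_print_connections(ann);
    fann_print_parameters(ann);   /* topology, activation functions, training algorithm */

    fann_destroy(ann);
    return 0;
}
```
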
Figure 2 shows the structure of a network built on a sample of data from the “AIC Database” covering the following indicators for the period from 1995 to 2005:
  1. Total population, thousand people;
  2. Average annual number of people employed in agriculture, million people;
  3. Average annual number of people employed in the economy as a whole, thousand people;
  4. Money incomes of the population (per capita per month, on average), rubles;
  5. Cash expenditures of the population (per capita per month, on average), thousand rubles;
  6. Number of agricultural enterprises, units;
  7. Total area of agricultural land, thousand hectares;
  8. Total planted area;
  9. Gross agricultural output in all categories of farms, in actual prices, million rubles (after 1998);
  10. Number of peasant (farmer) farms, units;
  11. Area of plots allocated to peasant (farmer) farms, thousand hectares;
  12. Sown areas of agricultural crops, thousand hectares;
  13. Investments in fixed assets, million rubles;
  14. Capital investments from the federal budget, million rubles;
  15. Capital investments from the budgets of the subjects of the federation, million rubles;
  16. Share of unprofitable enterprises in their total number, %;
  17. Profit (before tax) from all activities of agricultural enterprises, million rubles;
  18. Budget subsidies attributable to the results of economic activity of agricultural enterprises of the Ministry of Agriculture, million rubles;
  19. Profitability level for all activities of agricultural enterprises (with subsidies and compensations), %;
  20. Authorized capital, thousand rubles;
  21. Profit before tax, thousand rubles;
  22. Loss before tax, thousand rubles;
  23. Gross income, thousand rubles.


The sample included the regions of the Russian Federation selected in previous studies (2), (3) by cluster analysis with Kohonen self-organizing maps based on the results of the 2006 All-Russian Agricultural Census. The leading cluster includes the following regions, whose indicators were used to train the neural network:

  1. Arkhangelsk region;
  2. Astrakhan region;
  3. Belgorod region;
  4. Bryansk region;
  5. Vladimir region;
  6. Volgograd region;
  7. Vologda region;
  8. Voronezh region;
  9. Ivanovo region;
  10. Kabardino-Balkaria;
  11. Kaliningrad region;
  12. Kaluga region;
  13. Karachay-Cherkess Republic;
  14. Kirov region;
  15. Kostroma region;
  16. Krasnodar region;
  17. Kursk region;
  18. Leningrad region;
  19. Lipetsk region;
  20. Moscow region;
  21. Murmansk region;
  22. Nizhny Novgorod region;
  23. Novgorod region;
  24. Orenburg region;
  25. Oryol region;
  26. Penza region;
  27. Perm region;
  28. Pskov region;
  29. Republic of Adygea;
  30. Republic of Bashkortostan;
  31. Republic of Dagestan;
  32. Republic of Ingushetia;
  33. Republic of Kalmykia;
  34. Republic of Karelia;
  35. Komi Republic;
  36. Mari El Republic;
  37. Republic of Mordovia;
  38. Republic of North Ossetia-Alania;
  39. Republic of Tatarstan;
  40. Rostov region;
  41. Ryazan region;
  42. Samara region;
  43. Saratov region;
  44. Smolensk region;
  45. Stavropol region;
  46. Tambov region;
  47. Tver region;
  48. Tula region;
  49. Udmurtia;
  50. Ulyanovsk region;
  51. Chechen Republic;
  52. Chuvash Republic;
  53. Yaroslavl region.


The next mode of operation (screen form) is called “Network Usage”. In this mode the neural network is used for:
  1. single value forecast, Figure 3;
  2. cyclic prediction of a set of values ​​a given number of times, Figure 4;
  3. comparison of real values ​​and those obtained using the neural network, Figure 5.


image
Figure 3. Network Usage Form. Single value forecast.

Using a trained neural network in this mode is very simple: select the saved neural network file, enter the normalized values into the table in the “Value Prediction” block (the “input” column must contain as many values as the network takes at its input), and click the “Check” button. The predicted value appears in the “output” column.
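
Behind this mode is essentially one pair of FANN calls: loading the saved network and running it on an input vector. A hedged sketch, with a placeholder file name and illustrative input values:

```c
/* Sketch of the single value forecast: load a saved network and run it once.
   The file name and the nine normalized inputs are illustrative. */
#include <stdio.h>
#include "fann.h"

int main(void)
{
    struct fann *ann = fann_create_from_file("apk.net");
    if (ann == NULL)
        return 1;

    /* Provide as many values as the network has inputs (9 in the example of Figure 2). */
    fann_type input[9] = { 0.12f, -0.35f, 0.78f, 0.05f, -0.11f, 0.43f, 0.67f, -0.29f, 0.31f };
    fann_type *output = fann_run(ann, input);

    printf("predicted (normalized) value: %f\n", (double)output[0]);

    fann_destroy(ann);
    return 0;
}
```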

image
Figure 4. Network Usage Form. Cyclic forecast of a set of values ​​with plotting.

The prediction of a set of values lets you import unnormalized values for several indicators at once from a CSV file, normalize them, set the required number of prediction iterations, and plot the resulting values.
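
The article does not specify exactly how the predicted value is fed back into the network. One common scheme, shown below purely as an assumption, is a sliding window in which each new prediction replaces the oldest input before the network is run again:

```c
/* Sketch of a cyclic forecast under an assumed sliding-window feedback scheme.
   The network file, window contents and iteration count are illustrative. */
#include <stdio.h>
#include <string.h>
#include "fann.h"

int main(void)
{
    struct fann *ann = fann_create_from_file("apk.net");
    unsigned int i;
    if (ann == NULL)
        return 1;

    fann_type window[9] = { 0.12f, -0.35f, 0.78f, 0.05f, -0.11f, 0.43f, 0.67f, -0.29f, 0.31f };
    const unsigned int iterations = 5;   /* the "required number of prediction iterations" */

    for (i = 0; i < iterations; i++) {
        fann_type *out = fann_run(ann, window);
        printf("step %u: %f\n", i + 1, (double)out[0]);

        /* Shift the window left and append the new prediction as the latest input. */
        memmove(window, window + 1, 8 * sizeof(fann_type));
        window[8] = out[0];
    }

    fann_destroy(ann);
    return 0;
}
```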

image
Figure 5. Network Usage Form. Comparison of the original values with those obtained using the neural network. Evaluation of the quality of neural network training and forecasting.

Select the file with the training data whose values will be fed to the neural network as inputs, and click the “Compare” button. A graph is plotted for convenient comparison of the values.
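
A hedged sketch of what this comparison amounts to in FANN terms: run every sample of a training file through the network and list the real value next to the predicted one (file names are placeholders):

```c
/* Sketch of the comparison mode: per-sample real vs. predicted values plus the MSE. */
#include <stdio.h>
#include "fann.h"

int main(void)
{
    struct fann *ann = fann_create_from_file("apk.net");
    struct fann_train_data *data = fann_read_train_from_file("apk_sample.data");
    unsigned int i;

    if (ann == NULL || data == NULL)
        return 1;

    /* fann_test_data() runs the whole set through the network and returns the MSE. */
    printf("MSE on the sample: %f\n", fann_test_data(ann, data));

    /* Per-sample comparison of the real value and the network's prediction. */
    for (i = 0; i < fann_length_train_data(data); i++) {
        fann_type *out = fann_run(ann, data->input[i]);
        printf("%u: real %f, predicted %f\n",
               i, (double)data->output[i][0], (double)out[0]);
    }

    fann_destroy_train(data);
    fann_destroy(ann);
    return 0;
}
```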

Selection of the optimal type and parameters of the neural network


Two-layer neural network with linear activation function


The first type of neural network that was tested is a linear two-layer neural network. The network parameters are shown in Figure 6.
The training algorithm is FANN_TRAIN_RPROP, a more advanced algorithm that shows good results on most tasks. RPROP is adaptive: some of its training parameters can be changed, but this is recommended only for users who understand how the RPROP algorithm works. To be precise, FANN implements the iRPROP variant of the algorithm.
The activation function of the hidden and output layers is FANN_LINEAR: the output signal of a neuron is linearly related to the weighted sum of the signals at its input,
f(x) = t·x,
where t is a parameter of the function. In layered artificial neural networks, neurons with transfer functions of this type usually make up the input layer (4). The range of values is (−∞, ∞).
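
As a sketch (not the author's code), configuring such a two-layer linear network with RPROP in FANN might look like this; the topology, file names and training limits are illustrative:

```c
/* Sketch: a two-layer network with linear activations, trained with RPROP. */
#include "fann.h"

int main(void)
{
    /* Two layers only: inputs connected directly to one output neuron. */
    struct fann *ann = fann_create_standard(2, 4, 1);

    fann_set_training_algorithm(ann, FANN_TRAIN_RPROP);    /* iRPROP implementation */
    fann_set_activation_function_hidden(ann, FANN_LINEAR); /* no effect here: no hidden layer */
    fann_set_activation_function_output(ann, FANN_LINEAR);

    /* RPROP is adaptive; these setters exist for users who understand the
       algorithm (the values shown are FANN's defaults). */
    fann_set_rprop_increase_factor(ann, 1.2f);
    fann_set_rprop_decrease_factor(ann, 0.5f);

    fann_train_on_file(ann, "apk_sample.data", 5000, 100, 0.001f);
    fann_save(ann, "apk_linear.net");
    fann_destroy(ann);
    return 0;
}
```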

image
Figure 6. Training a two-layer neural network with a linear activation function for the hidden and output layers

The network has 5 neurons, 6 connections and 2 layers. The training error of 0.167253659566243 is quite large, and the number of fail bits is 123. The number of fail bits is the number of times the difference between the predicted and actual values exceeded the maximum allowable difference during training. The result shown by this type of neural network with a linear activation function does not allow it to be used to adequately predict the values required by the task.
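
For reference, a small sketch of how the figures quoted above (neuron, connection and layer counts, the MSE and the fail-bit count) can be read back through FANN, with placeholder file names; the fail-bit threshold itself is adjustable:

```c
/* Sketch: report topology, MSE and fail bits for a network after a test pass. */
#include <stdio.h>
#include "fann.h"

int main(void)
{
    struct fann *ann = fann_create_from_file("apk_linear.net");
    struct fann_train_data *data = fann_read_train_from_file("apk_sample.data");
    float mse;

    if (ann == NULL || data == NULL)
        return 1;

    printf("neurons: %u, connections: %u, layers: %u\n",
           fann_get_total_neurons(ann),
           fann_get_total_connections(ann),
           fann_get_num_layers(ann));

    /* A fail bit is counted whenever |predicted - actual| exceeds the bit fail
       limit (0.35 by default); MSE and the counter refer to the last pass. */
    fann_set_bit_fail_limit(ann, 0.35f);
    mse = fann_test_data(ann, data);
    printf("MSE: %f, fail bits: %u\n", mse, fann_get_bit_fail(ann));

    fann_destroy_train(data);
    fann_destroy(ann);
    return 0;
}
```
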
Neural network with one hidden layer (three-layer)

The next type of neural network that was tested is a neural network with one hidden layer (three-layer). The network parameters and training results are shown in Figure 7. The training algorithm is FANN_TRAIN_RPROP. The activation function of the hidden and output layers is FANN_SIGMOID_SYMMETRIC. The sigmoid activation function is currently one of the most frequently used activation functions. Sigmoid-type functions were introduced because of the limitations of neural networks with a threshold activation function: with such a function every network output is either zero or one, which limits the use of such networks in tasks other than classification. The use of sigmoid functions made it possible to move from binary neuron outputs to analog ones. Transfer functions of this type are, as a rule, used for neurons in the inner layers of a neural network (4).

image
Figure 7. Neural network with one hidden layer (three-layer)

The network has 15 neurons, 34 connections and 3 layers. The training error is 0.000978241524388713 and the number of fail bits is 0.

image
Figure 8. Map of connections of a neural network with one hidden layer (three-layer)

Information about the strength and structure of the connections between the neurons of the neural network with one hidden layer (three-layer) is presented as a compact matrix in Figure 8.
As in Figure 2, the network has 15 neurons in 3 layers (9 input, 5 hidden, 1 output) and 34 connections, with “.” marking a missing connection and the letters a to z encoding connection strength. All of the connections are fairly strong.
The quality of training of the neural network is illustrated in Figure 9 by a comparison of the original values with those produced by the network.

image
Figure 9. Comparison of the original values with those obtained using the neural network with one hidden layer (three-layer)

Findings



The tests carried out showed that a neural network of the following type and parameters is suitable for further use (prediction of indicator values): a three-layer network with 9 input neurons, 5 hidden neurons and 1 output neuron, trained with the FANN_TRAIN_RPROP algorithm and using the FANN_SIGMOID_SYMMETRIC activation function in the hidden and output layers.
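
A hedged sketch of the selected configuration in FANN terms follows; the training file, epoch limits and target error are illustrative, while the topology, training algorithm and activation functions are those named above:

```c
/* Sketch: the selected three-layer 9-5-1 network with RPROP training and
   symmetric sigmoid activations, trained and saved for later forecasting. */
#include "fann.h"

int main(void)
{
    struct fann *ann = fann_create_standard(3, 9, 5, 1);

    fann_set_training_algorithm(ann, FANN_TRAIN_RPROP);
    fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
    fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);

    /* Inputs and desired outputs are expected to be normalized to [-1, 1]
       to match the symmetric sigmoid's output range. */
    fann_train_on_file(ann, "apk_sample.data", 10000, 100, 0.001f);

    fann_save(ann, "apk_selected.net");   /* reused later by the forecasting modes */
    fann_destroy(ann);
    return 0;
}
```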


Bibliography



  1. Nissen, Steffen. Neural Networks Made Simple. Software 2.0, 2005, 2, p. 14.
  2. Voronin, Evgeny Alekseevich and Zakharov, Dmitry Nikanorovich. Cluster analysis of agricultural census results. Bulletin of MSAU, 2010, 41, p. 90.
  3. Voronin, Evgeny Alekseevich and Zakharov, Dmitry Nikanorovich. Analysis of the agricultural census results of the agro-industrial complex. International Journal of Science. Economic Issues, 2010, 3, p. 31.
  4. Wikipedia. Artificial neuron. Wikipedia, the free encyclopedia. [Online] 2011. w.wikipedia.org/wiki/Artificial_Neuron.

Source: https://habr.com/ru/post/130809/

