Two and a half years ago there arose a need for a system to manage the operations of an enterprise, in this case a small chain of stores. They had started with some minimal boxed program, though not the well-known two-letter standard. I offered to apply my own efforts: it was interesting to write a complete ERP of my own. I understood roughly what it would cost me; I have worked on contracts since 1995, mostly in telecoms, banks and insurance companies, occasionally in manufacturing. I had seen enough projects to grasp the scale of the plan, but decided that, as the saying goes, God won't give you away and the pig won't eat you, and in short, I took on the venture.
First of all, the existing equipment had to be replaced with proper hardware: the firewall was bought in Montreal, and two servers were ordered in California and picked up in Ontario (you carry them across the border yourself, taxes at your own risk); the full route was LA, Toronto, Frankfurt, St. Petersburg. The hardware was chosen on a few parameters: the database server with room for a second processor, memory expandable to 96 GB, and up to sixteen 3.5" disks; the 1U application server already with two processors and 32 GB. The application server has two hard disks in RAID 1. The DB server has two hard disks in RAID 1 and two SSDs in their own RAID 1, all in a 2U case. The main database lives on the SSDs; the OS sits on the hard disks, which is also where backups go. The firewall has six ports in a 1/2U case, a hard drive inside plus an optional flash card, a serial port, VGA, and an external hardware health indicator. Then came installing Red Hat, configuring the servers, and putting OpenBSD on the firewall. There was a funny story involving socks stuffed into the metal cases and all the electronics in hand luggage, but that is beside the point.
To start, all of this had to be assembled and installed, which was done with the help of some 'metal specialists' (another fun story). After that came a tour of the stores, where the actual development began. The first task was to gather data from the existing systems into one place. For this, a system was written to transmit data over SSL (OpenBSD is very convenient for this, and I also like Theo). Data was collected on a predetermined schedule; fortunately, there was no need to think much about what exactly was being transferred: everything that went from the cash registers to the store server, and everything that went from the server to the cash registers, was captured, written to a simple local file 'base', and waited for the next transfer window. The system would bring up the modem and then transmit. All data was collected under a dedicated OS user. Each store received instructions not to turn off the program (it was impossible to run the program as a service on a non-professional edition of MS Windows).
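To give a rough idea of the store side, here is a minimal sketch in Java: a spool directory where the captured register exchange lands, and a scheduled push to the central firewall over SSL. The directory, host, and port are my hypothetical placeholders, and the real system also had to dial up the modem first, which is omitted here.

    import javax.net.ssl.SSLSocket;
    import javax.net.ssl.SSLSocketFactory;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.file.*;

    // Store-side collector sketch: files caught from the register exchange are
    // queued in a spool directory and pushed over SSL on a schedule.
    // SPOOL, CENTRAL_HOST and CENTRAL_PORT are hypothetical placeholders.
    public class StoreCollector {
        static final Path SPOOL = Paths.get("/var/spool/pos-exchange");
        static final String CENTRAL_HOST = "hq.example.net";
        static final int CENTRAL_PORT = 4433;

        public static void main(String[] args) throws Exception {
            while (true) {
                try (DirectoryStream<Path> batch = Files.newDirectoryStream(SPOOL)) {
                    for (Path file : batch) {
                        send(file);
                        Files.delete(file); // removed only after a successful send
                    }
                }
                Thread.sleep(20 * 60 * 1000L); // wait for the next transfer window
            }
        }

        static void send(Path file) throws IOException {
            SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();
            try (SSLSocket socket = (SSLSocket) factory.createSocket(CENTRAL_HOST, CENTRAL_PORT);
                 OutputStream out = socket.getOutputStream()) {
                out.write((file.getFileName() + "\n").getBytes("UTF-8"));
                Files.copy(file, out); // stream the queued file to the central side
                out.flush();
            }
        }
    }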
The next step, of course, was to make sense of the accumulated incoming files, process them, and write them into a new database, which also had to be modeled carefully. This is a fairly long and not very pleasant process: some of the file formats were decently documented, but everything relating to the server side of the program that was then managing the chain was pure wild west, with no documentation at all; make of that abnormal data model what you will. At some point the data crawled from the firewall into the new database. Naturally, it took several passes: wipe the database, fix the newly found problems, reload again. Eventually it settled down; the data began arriving and loading every twenty minutes, which made life easier. Now came the question of putting all this wealth to use.
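The loading step itself is conceptually simple. Here is a hedged sketch of what a per-file loader could look like, assuming a ';'-separated line format and a raw_sales staging table; both are my illustrative inventions, not the actual formats.

    import java.io.BufferedReader;
    import java.math.BigDecimal;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.Timestamp;

    // Loader sketch: parse one accumulated exchange file (assumed here to be
    // lines of barcode;qty;price;timestamp, timestamp as "yyyy-mm-dd hh:mm:ss")
    // and insert it into the central PostgreSQL base. One transaction per
    // file, so a reload after fixing a problem is simply a re-run.
    public class ExchangeLoader {
        public static void load(Path file, int storeId, Connection db) throws Exception {
            String sql = "INSERT INTO raw_sales(store_id, barcode, qty, price, sold_at) "
                       + "VALUES (?, ?, ?, ?, ?)"; // hypothetical staging table
            db.setAutoCommit(false);
            try (BufferedReader in = Files.newBufferedReader(file);
                 PreparedStatement ps = db.prepareStatement(sql)) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split(";");
                    ps.setInt(1, storeId);
                    ps.setString(2, f[0]);
                    ps.setBigDecimal(3, new BigDecimal(f[1]));
                    ps.setBigDecimal(4, new BigDecimal(f[2]));
                    ps.setTimestamp(5, Timestamp.valueOf(f[3]));
                    ps.addBatch();
                }
                ps.executeBatch();
                db.commit();
            } catch (Exception e) {
                db.rollback(); // leave the base clean for the next attempt
                throw e;
            }
        }
    }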
A digression for a minute: why write your own ERP? What is the point, when there are plenty of boxed solutions, and besides, it is well known that in Russia most people are used to one particular well-known program? There is no simple answer, there are many complicated ones, but you could put it like this: it had to be done because it is hard for a small new chain to compete. You need some way to set yourself apart from everyone else, a new approach to competition. In truth it was a big risk from several angles, both from my point of view (two years on, after all) and from the chain's: where is the guarantee that everything will go fine, and wouldn't it be simpler to install what everyone else uses? But the thing is, what everyone uses keeps everyone at the same level, and when you are at the same level as everyone else, your guarantees of survival are the same as for all the others who are exactly like you.
The system began as a standard Java business system (JVM, Tomcat, Struts, Apache HTTP Server, PostgreSQL, plus some shell and expect), but a large number of code generators were created to speed up development. Standard code is generated from standard configuration files: the first drafts of the database access layer, the entire presentation facade, and everything connecting them (the shell of the business logic). It was the right decision and saved an incredible amount of time: literally, what takes 3-4 days by hand is now done in a couple of hours. It soon became clear that treating this program as 'standard' would not work; several architectural decisions were needed that would either work and raise execution speed, or the whole thing would collapse, because I am used to users who expect a response from the system within 200-800 milliseconds, no more. Here, though, reports covered some 15 stores and time spans of at least a month, and soon the spans in the reports grew to a year. That is, 40 directors and managers were interested in sales of product categories in one store or all of them, and they wanted to see 1-12 months of data at once. The way the object model was used had to be reworked: nobody wants to wait 30 minutes for a report, and it is indecent to offer people that. After a month of work a solution was found; moreover, every user request for a report was now logged in the database, so through the system itself you could see who does what, how long it takes, and which parameters they use. This proved useful both for monitoring employees' work and for training them, because the system lets users write simplified regular expressions to filter products, and people have difficulties with even the simplest semblance of a regex. But overall the result was decent: ninety percent of requests completed in under a second, another 9 percent in under 10 seconds, and the remaining percent took 1-2 minutes, which is already easier to live with.
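To give an idea of what a 'simplified regex' might look like: the article does not spell out the actual syntax, so the rules below ('*' for any run of characters, '?' for exactly one, everything else literal) are my assumption about one plausible translation layer.

    import java.util.regex.Pattern;

    // One plausible "simplified regex" for product filters: users type only
    // '*' (any run of characters) and '?' (exactly one character); everything
    // else is matched literally and case-insensitively. The real system's
    // syntax may differ; treat this as an illustration.
    public class SimpleFilter {
        public static Pattern compile(String userPattern) {
            StringBuilder rx = new StringBuilder();
            for (char c : userPattern.toCharArray()) {
                switch (c) {
                    case '*': rx.append(".*"); break;
                    case '?': rx.append('.'); break;
                    default:  rx.append(Pattern.quote(String.valueOf(c)));
                }
            }
            return Pattern.compile(rx.toString(),
                    Pattern.CASE_INSENSITIVE | Pattern.UNICODE_CASE);
        }

        public static void main(String[] args) {
            Pattern p = compile("milk*3.2%*");
            System.out.println(p.matcher("Milk pasteurized 3.2% 1L").matches()); // true
        }
    }

Even a layer this thin trips people up, which is exactly why logging every report request with its parameters turned out to be so useful for training.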
Of course, an ERP is not only reports on the collected data; it is a lot of other things, for example data entry control, and that has quirks of its own. How do you merge data from 15 stores into one database when every store enters goods its own way? Their own names, sometimes their own barcodes, their own 'folders', even their own prices. For this, the system had to learn to automatically group products into lists of possibly identical goods, and then let an operator adjust the list, edit parts of the data, and record these associations as a single product. Naturally, this is a tricky process not only from the system's point of view: the operator must be able to manage it properly, and the system must allow the inevitable errors to be caught and then corrected (a special control panel was created for this, where various functions run different checks over all the data in the system at once: you set a few parameters, press a key, and wait a few minutes until it spits out a file of possible problems). At the same time you need to be able to create products and classify them correctly. I don't want to go deep into it, but there are several solutions here that do not exist in other systems of this kind, especially concerning the classification method and its hierarchy.
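The grouping step can be sketched roughly as follows. The real matching logic is not described in detail, so take the key choice here (barcode when present, otherwise a normalized, token-sorted name) as an assumption:

    import java.util.*;

    // Candidate-merge sketch: group store-level items into lists of possibly
    // identical products for an operator to review. The Item record and the
    // exact normalization are illustrative assumptions, not the real schema.
    public class ProductMatcher {
        record Item(int storeId, String barcode, String name) {}

        static String key(Item it) {
            if (it.barcode() != null && !it.barcode().isBlank()) return "bc:" + it.barcode();
            // Lowercase, strip punctuation (keeping '%' for fat content etc.),
            // sort tokens so word order does not matter.
            String[] tokens = it.name().toLowerCase()
                    .replaceAll("[^\\p{L}\\p{Nd}%]+", " ").trim().split("\\s+");
            Arrays.sort(tokens);
            return "nm:" + String.join(" ", tokens);
        }

        // Returns only groups spanning more than one store: these are the
        // candidates the operator confirms as one network-wide product.
        static Collection<List<Item>> candidates(List<Item> all) {
            Map<String, List<Item>> groups = new HashMap<>();
            for (Item it : all) groups.computeIfAbsent(key(it), k -> new ArrayList<>()).add(it);
            groups.values().removeIf(g -> g.stream().mapToInt(Item::storeId).distinct().count() < 2);
            return groups.values();
        }
    }

The point of keeping the operator in the loop is that any automatic key will both over- and under-merge; the control panel mentioned above is what catches the fallout.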
With an ERP, of course, the more features the better. The collected data was enough to add a component for sending out offers and advertising, and then a component for analyzing customer purchase data before and after a mailing. Customers began to receive birthday greetings (and some even reply to the auto-greetings, which is very gratifying). Another big component was developed to integrate suppliers into the chain. At first, I should say, the suppliers resisted the very idea of such an approach, but once some were persuaded and switched to the new delivery method, graphical reports appeared in the system that can compare sales by any characteristics (for example, you can compare sales of product groups by brand and by supplier on one chart, for any period of time and any set of stores). Within a year most suppliers had switched to the system, because they simply saw the data, and you cannot argue with data, unlike with assertions. With the suppliers the number of users grew fourfold, yet no change in speed was observed; moreover, what is described further on helped with speed as well.
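The before/after analysis is essentially one aggregate per recipient. A possible shape for it, with invented table names (mailing, mailing_recipient, purchase) standing in for whatever the system actually uses:

    import java.sql.*;

    // Before/after sketch for a mailing: compare each recipient's purchase
    // total in the N days before the send date with the N days after.
    // Table and column names are assumptions, not the system's real schema.
    public class MailingEffect {
        public static void report(Connection db, int mailingId, int days) throws SQLException {
            String sql =
                "SELECT r.customer_id, " +
                "  SUM(CASE WHEN p.bought_at <  m.sent_at THEN p.total ELSE 0 END) AS before_sum, " +
                "  SUM(CASE WHEN p.bought_at >= m.sent_at THEN p.total ELSE 0 END) AS after_sum " +
                "FROM mailing m JOIN mailing_recipient r ON r.mailing_id = m.id " +
                "LEFT JOIN purchase p ON p.customer_id = r.customer_id " +
                " AND p.bought_at BETWEEN m.sent_at - (? * INTERVAL '1 day') " +
                "                     AND m.sent_at + (? * INTERVAL '1 day') " +
                "WHERE m.id = ? GROUP BY r.customer_id";
            try (PreparedStatement ps = db.prepareStatement(sql)) {
                ps.setInt(1, days);
                ps.setInt(2, days);
                ps.setInt(3, mailingId);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next())
                        System.out.printf("%d: before=%s after=%s%n",
                            rs.getLong(1), rs.getBigDecimal(2), rs.getBigDecimal(3));
                }
            }
        }
    }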
From collecting and managing the incoming data, and associating the assortment of store-level products into more general ones applicable across all stores at once, the next question followed: it was time to remove the existing management system in the stores and install our own, an integrated one allowing full work with each store and complete records of all documents and goods. It took about a third of a year from scratch. The system is the usual two-tier server/client affair; the server is integrated via the API of the main platform; documents now flowed in all directions instead of files, and the time came when the original file exchange could be turned off. In all, the stores were switched to the new management system within 40 hours, which also included the move to GNU/Linux (Ubuntu): it saves some money on licenses, but more importantly, there is less opportunity to catch some virus. The system communicates with the central platform and with the cash registers and price checkers automatically; no more manual postings are needed, and the so-called Z-reports, which used to take almost two hours with the previous solution, are a thing of the past. All of this had a good effect on the speed of the system, because it became possible to switch the internal logic from the first version (built on the principle that there are many stores, each with its own products at its own prices, and although it was all associated together in the central system, those associations never propagated back into the stores) to the second version: the entire chain works on one common base, which lowers the demands across the board, on memory, on query complexity, and on the number of database requests.
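The difference between the two versions of the logic can be shown with a pair of report queries; the table names are invented, but the shape of the change is the point:

    // v1 vs v2 sketch: under the first scheme each store kept its own item
    // records, linked to a network-wide product only through an association
    // table, so every report joined through it. Once all stores ran on the
    // common base, sales rows reference the shared product id directly.
    // Table names are illustrative assumptions.
    public class ReportQueries {
        // First version: per-store items, resolved through the association table.
        static final String V1 =
            "SELECT a.product_id, SUM(s.qty) FROM sale s " +
            "JOIN store_item i ON i.store_id = s.store_id AND i.code = s.item_code " +
            "JOIN item_assoc a ON a.store_item_id = i.id " +
            "GROUP BY a.product_id";

        // Second version: the whole chain works off one common base, so the
        // sale row already carries the network-wide product id.
        static final String V2 =
            "SELECT s.product_id, SUM(s.qty) FROM sale s GROUP BY s.product_id";
    }

Dropping a join (and the association table's working set) from nearly every query is exactly the kind of change that lowers memory pressure and request counts at once.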
With the entire document flow now in the platform, it became possible to appease the accountants. Until then they had worked in manual mode, and one could only pity them. Of course they had the necessary accounting programs, and programs for making payments through the bank over the network, but for almost half a year they were tormented loading data into them. They were genuinely glad to get the ability to automatically export the data they need in the format they need, and later a report on payments to suppliers lifted their mood further.
By the way, all of this was still not enough to show a report of stock remainders for any goods over any period of time; that is not something you especially think about at first. Only later do you see that some information has to be accumulated specifically for a single type of report, because otherwise you will suffer badly trying to calculate it on the fly.
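One common way to accumulate exactly that information (the article does not say which approach the system took, so this is only a sketch) is a per-day snapshot table rolled forward from the previous day plus the day's movements:

    import java.sql.*;
    import java.time.LocalDate;

    // Remainder-report sketch: rather than replaying every document on the
    // fly, keep a per-day stock snapshot and roll it forward once a day.
    // Any period's remainders then become simple lookups. The stock_snapshot
    // and stock_move tables are illustrative assumptions; seeding rows for
    // products that first appear on a given day is omitted for brevity.
    public class StockSnapshot {
        public static void rollForward(Connection db, LocalDate day) throws SQLException {
            String sql =
                "INSERT INTO stock_snapshot(store_id, product_id, day, qty) " +
                "SELECT s.store_id, s.product_id, ?, s.qty + COALESCE(m.delta, 0) " +
                "FROM stock_snapshot s " +
                "LEFT JOIN (SELECT store_id, product_id, SUM(qty_change) AS delta " +
                "           FROM stock_move WHERE move_date = ? " +
                "           GROUP BY store_id, product_id) m " +
                "  ON m.store_id = s.store_id AND m.product_id = s.product_id " +
                "WHERE s.day = ?"; // yesterday's snapshot plus today's movements
            try (PreparedStatement ps = db.prepareStatement(sql)) {
                ps.setDate(1, Date.valueOf(day));
                ps.setDate(2, Date.valueOf(day));
                ps.setDate(3, Date.valueOf(day.minusDays(1)));
                ps.executeUpdate();
            }
        }
    }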
There is still a lot left to do, and in general programs of this kind can never be finished; you cannot put down the final period and say: that's it, this is the maximum, there is nothing more to do here. But after two years on this project, if someone asks me whether it is worth writing something of your own, something big, from scratch and with a heap of unknowns, I think the answer is: yes, write it.
If you have questions, write in the comments and I will answer.