
Building the enterprise solution of tomorrow

We are grateful to Pavel Kovtun, CEO of Mycroft Assistant, for preparing this article. Pavel has specialized in business automation and supply chain management for more than 12 years. He is the author of unique analysis and decision-making techniques in SCM and the creator of the first innovative expert-level inventory management system.



The task of any business is the effective use of the resources available to it. For a business that works with goods, one of the main problems is investing money in stock effectively: there must always be exactly as much stock as is needed. If there is more than needed (overstock), the money is invested inefficiently; if there is less (out-of-stock), sales are likely to be missed, service quality suffers and customers are lost. Either way, a wrong decision means lost money.

Solving this problem is the job of supply chain management (SCM). The mechanisms involved rely on processing large data arrays, mathematical modeling, statistical methods, optimization problems and machine learning.

Most companies handle supply planning simply with Excel spreadsheets: manually collected data, trivial formulas and a built-in error that is insignificant while the company's turnover is small. Up to a certain size and turnover this approach works, and the calculations can be scaled by hiring more managers. But as the company grows, the accumulated error and the growing volume of data make it impossible to do all the calculations "by hand": efficiency drops so much that, as a result of a constant stream of wrong decisions, up to 30% of working capital may end up being used inefficiently.

What should a business do?


To solve these problems there is a special class of systems called Expert Supply Chain Management (expert-SCM). Such solutions independently monitor current sales data and the available balance of every item at every outlet (warehouse), and they do this not only for you but also for the other participants in your supply chain. With this data, taking into account the individual sales characteristics of each item and processing the entire enormous data array online, the system decides which products the money should be invested in and to which outlet they should be delivered now, so that there are no problems later.
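To illustrate the kind of decision such a system automates, here is a minimal C# sketch of a reorder calculation. The class and parameter names (SalesRecord, ReorderAdvisor, leadTimeDays, safetyStockDays) are our own illustration, not Mycroft Assistant's actual model; a real expert-SCM system also accounts for seasonality, promotions, service levels and much more.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sales record: one item at one outlet on one day.
public record SalesRecord(string Sku, string Outlet, DateTime Day, int UnitsSold);

public static class ReorderAdvisor
{
    // Decide how many units of a SKU to ship to an outlet, given recent
    // sales history, the stock currently on hand there, the supplier lead
    // time and a safety margin. This only shows the basic idea.
    public static int UnitsToOrder(
        IEnumerable<SalesRecord> history,
        string sku, string outlet,
        int stockOnHand, int leadTimeDays, int safetyStockDays)
    {
        // Average daily demand for this item at this outlet.
        double dailyDemand = history
            .Where(r => r.Sku == sku && r.Outlet == outlet)
            .GroupBy(r => r.Day)
            .Select(g => g.Sum(r => r.UnitsSold))
            .DefaultIfEmpty(0)
            .Average();

        // Demand expected until the next delivery arrives, plus safety stock.
        double required = dailyDemand * (leadTimeDays + safetyStockDays);
        return Math.Max(0, (int)Math.Ceiling(required) - stockOnHand);
    }
}
```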



Our solution, Mycroft Assistant (mycroftbs.ru), is one of the leading cloud Expert-SCM solutions for SMB. We have been working on SCM solutions for more than 5 years, and our customers confirm that using them in their business pays off.

Two years ago we decided to bring together our achievements and the experience accumulated in this area and build a product that solves our clients' problems in the best possible way. We set two goals for SCM Mycroft Assistant:

  1. deliver maximum business value (usability, speed of integration and deployment, quality of results, the ability to unite all participants in the supply chain),
  2. have an optimal, modern technical implementation (architecture, calculation speed, the methods and approaches used).

An important requirement was to make the solution usable by a large number of clients and to let them interact with each other within a single supply chain. The result had to be affordable and accessible to most SMB customers, which also meant designing for a high load on the system. As a result, we got the enterprise application of tomorrow.

How did we design Mycroft Assistant?


We decided from the start that this would be a cloud solution built on the Microsoft technology stack (Azure, ASP.NET, MS SQL), which is familiar to our clients and allows us to work as flexibly as possible.



Although we started development on our own server in collocation in a data center, we planned from the beginning to use the Azure cloud in production, for both business and technical reasons:

  1. Azure offers a ready-made pool of solutions and modules whose logic our architecture follows, so we did not have to build a complex of replicated VMs ourselves. At the same time we got solid reliability and scalability, which lets us scale the business easily without worrying about technical problems and without inventing "crutches" for problems that Azure has already solved.
  2. Despite somewhat lower performance compared with our collocated server, the economic effect of the high level of fault tolerance covers these "estimated" costs. We no longer worry about the stability of our own servers and can provide a high level of service to customers.
  3. Azure lets us place access points closer to the clients. This matters to us, since our clients are located in Russia, America and Europe. Being able to reduce access latency by moving the server closer to the client is a real advantage, and one worth mentioning to the client.
  4. The cost of using and maintaining Azure is quite low, and the ability to scale resources easily according to actual need helps us predict the cost per customer.
  5. The implementation allows customer objects to be combined within a single database, which is convenient when there is a Supplier - Buyer link (a rough sketch of such a schema is given after this list).
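The article does not describe the data model, so the sketch below is only our assumption of what such a shared, multi-tenant schema could look like: every record carries its owning organization, and a separate link entity ties a supplier organization to a buyer organization inside the same database. All entity and property names are illustrative.

```csharp
// Illustrative multi-tenant entities (assumed names, not Mycroft Assistant's schema).
public class Organization
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class StockItem
{
    public int Id { get; set; }
    public int OrganizationId { get; set; }   // tenant key: which customer owns this row
    public string Sku { get; set; }
    public string Outlet { get; set; }
    public int OnHand { get; set; }
}

// Links two organizations in a Supplier - Buyer relationship, so orders and
// balances can be exchanged between them without leaving the database.
public class SupplyLink
{
    public int Id { get; set; }
    public int SupplierOrganizationId { get; set; }
    public int BuyerOrganizationId { get; set; }
}
```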




The cloud architecture was built in from the very beginning, when Mycroft Assistant was designed as a SaaS web solution.

Which stack did we use for the implementation?


On the server side we use, among other things, ASP.NET MVC, Entity Framework, MS SQL and Azure web and worker roles.

On the client side we use:
jQuery, jQuery UI, knockoutjs, flot, Globalize, DataTables, Bootstrap, fancytree

We chose our architecture long and painfully at the start of the implementation, but in the end we arrived at the most suitable one, which can be represented by the following scheme:



SCM Mycroft Assistant consists of three parts: Core, WebAccess, Workers.

Core - a set of core libraries for the overall functioning of the system: common classes, basic interfaces, kernel components, etc.

WebAccess - web access based on ASP.NET MVC: viewing and managing data, configuration, administration. It consists of several subsystems, and any additional subsystem can be installed without rebuilding the WebAccess module; subsystems are registered automatically when the application starts.

Workers - Azure Worker Roles, inside which all calculations are performed.
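For context, an Azure Worker Role is implemented as a class derived from RoleEntryPoint whose Run method loops for the lifetime of the instance. Below is a minimal sketch of what such a calculation worker could look like; the job-polling details and names are our assumption, not the actual Mycroft Assistant code.

```csharp
using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

// Minimal Azure Worker Role skeleton: the runtime calls Run() after the
// instance starts, and the method is expected to loop for the lifetime
// of the role instance.
public class CalculationWorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        while (true)
        {
            // A real worker would pick up a pending calculation job here,
            // e.g. from a queue or from the database.
            ProcessNextCalculationJob();
            Thread.Sleep(TimeSpan.FromSeconds(5));
        }
    }

    private void ProcessNextCalculationJob()
    {
        // Placeholder for the actual replenishment / forecasting calculations.
    }
}
```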

The system has a plugin mechanism. For example, a plugin can be written to load data into the system: it retrieves data from an external system via an API, converts it into internal business-logic objects and stores them in the database. Plugins can also extend notifications, reports and data calculation modules; a minimal sketch of such a plugin contract is shown below.
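The article does not show the plugin interface itself, so the following is only a rough sketch of what a data-import plugin of the kind described could look like; the interface and type names are invented for the example.

```csharp
using System.Collections.Generic;

// Hypothetical import result object used by the sketch below.
public record ImportedSale(string Sku, string Outlet, System.DateTime Day, int Units);

// Hypothetical plugin contract: a data-import plugin pulls raw data from an
// external system via its API and converts it into internal business-logic
// objects, which the host then stores in the database.
public interface IDataImportPlugin
{
    string Name { get; }
    IEnumerable<ImportedSale> Load(string connectionSettings);
}

public class RestApiImportPlugin : IDataImportPlugin
{
    public string Name => "External REST API import";

    public IEnumerable<ImportedSale> Load(string connectionSettings)
    {
        // 1. Call the external system's API using the supplied settings.
        // 2. Map the response into ImportedSale objects.
        // 3. Return them; the host converts and persists them.
        yield break; // placeholder for the real mapping
    }
}
```

With a contract like this, subsystems and plugins can be discovered at application start, for example by scanning the loaded assemblies for types that implement the interface.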

After several iterations and refinements of the architecture, we arrived at the following order of interaction between the parts of the system and the features of each:

Not bad, but what about the load?


After deploying to Azure, we were able to run comprehensive load testing using Visual Studio Online. It revealed a bottleneck that had not shown itself in work with existing customers. It looked like this:



For example, if a client operates 50,000 different items across 60 outlets, then for just one day, counting all movements between the different points, the number of records handled is 50,000 × 60 × 60 = 180,000,000, and calculating a year of history (as is needed in real operation) gives more than 6.5 × 10^10 records. A cache cannot cope with such a volume of Big Data.

The problem was in the data caching architecture: the entire cache was in-memory. This greatly reduced response times on the client side, but at the same time drove up RAM costs. Caching only the information that is accessed most frequently is not a big problem; caching all the raw data is.

Solving the problems identified


Fortunately, we had planned from the start to move to a distributed cache, but to do that we needed to move the filtering and aggregation of data from LINQ to SQL.

In fact, not even to LINQ to SQL but to manually constructed queries, because we slice the data in many different ways (by goods, warehouses, managers, etc., and also by year, by month and, most "interestingly", by week), and unfortunately the Entity Framework cannot call functions from C# code (Expression trees did not suit us).

Aggregated data, firstly, takes up less memory; secondly, we abandoned caching altogether and shifted the load onto the database. At the cost of an insignificant reduction in the speed of loading information, we defeated the memory leaks.
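As an illustration of what "manual construction of queries" can mean here, below is a minimal sketch of a weekly aggregation pushed down to MS SQL instead of being computed over cached objects in memory. The table and column names (SalesFacts, Sku, OutletId, SaleDate, Units) are assumptions made up for the example.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Aggregating sales by week on the SQL Server side instead of caching raw
// rows in memory. Table and column names are illustrative assumptions.
public static class WeeklySales
{
    public static IReadOnlyList<(int Year, int Week, int Units)> Load(
        string connectionString, string sku, int outletId)
    {
        const string sql = @"
            SELECT YEAR(SaleDate)                AS Yr,
                   DATEPART(iso_week, SaleDate)  AS Wk,
                   SUM(Units)                    AS Units
            FROM SalesFacts
            WHERE Sku = @sku AND OutletId = @outletId
            GROUP BY YEAR(SaleDate), DATEPART(iso_week, SaleDate)
            ORDER BY Yr, Wk;";

        var result = new List<(int, int, int)>();
        using var connection = new SqlConnection(connectionString);
        using var command = new SqlCommand(sql, connection);
        command.Parameters.AddWithValue("@sku", sku);
        command.Parameters.AddWithValue("@outletId", outletId);
        connection.Open();
        using var reader = command.ExecuteReader();
        while (reader.Read())
            result.Add((reader.GetInt32(0), reader.GetInt32(1), reader.GetInt32(2)));
        return result;
    }
}
```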

What was the result?


Based on the results of deploying the solution in Azure and the ability to use all the modern "goodies" this platform provides, we concluded that our bet on the technical benefits of the Microsoft technology stack and the Azure cloud fully justified itself.

We also see that customers using cloud technologies, such as our SCM Mycroft Assistant solution, get better service today than they would from solutions built on on-premise technology:
  1. Deployment speed. The client can start working in just one day, with no dependence on the accounting system used. The solution is available from different locations and devices, and there is no need for a lengthy client-side installation or for IT staff to be involved in the integration.
  2. High speed, both on the client and on the server. A very large amount of data can be recalculated in a matter of minutes. There is no need to spend money on an expensive server, and no dependence on the speed of client machines.
  3. No problems with scalability and reliability. As the company grows, it does not need to think about the rising cost of maintaining an ever-increasing amount of data. The company can also be confident in a high level of availability, since the solution runs on Microsoft's proven facilities, whose technical capabilities exceed those of any private company.
  4. The cost of using and maintaining a turnkey solution is much lower, and the costs move from CAPEX to OPEX, which pleases the finance department.
  5. All participants in the supply chain can be integrated into a single chain with automatic data exchange in the cloud, something no other solution provides.

You can see all of this for yourself in our demo version, which is available to anyone interested. And if SCM Mycroft Assistant interests you as a client, you can easily connect to the solution and, with a free trial period, verify its effectiveness yourself.




About the author


Pavel Kovtun - CEO of Mycroft Assistant

More than 12 years in business automation, more than 8 years of entrepreneurial experience, and 5 years in supply chain management.

Author of several unique methods of analysis and decision making in SCM, with publications in electronic and print media.

The creator of the first innovative expert-level inventory management system, designed for PMS.

Source: https://habr.com/ru/post/270757/

