
SPA architecture for CRM systems: part 2

In the first part of this article, I put forward the thesis that the SPA architecture was to blame for the low speed of the CRM applications we build. To some, such a claim may seem unexpected, to put it mildly, and even offensive, given the rapidly growing popularity of this approach to web application development. Like many modern developers, we adopt new technologies quite successfully, but on this project we managed to find, empirically, the point at which one should think twice before relying on something new. Those details are the subject of this second part.


In choosing an SPA architecture, we expected the following benefits.

1. A high level of interactivity for the user: the page is almost a full-fledged client application, sparing the user the wait for pages to load and smoothing out the flicker that is unpleasant to the eye during navigation.
2. Lower data-exchange costs: instead of full pages, only business data is loaded from the server, which matters on weak or congested channels, for example on mobile Internet.
3. The server is relieved of the extra work of preparing the presentation for the client.
4. High speed of development and maintenance thanks to the declarative description of templates.
5. No problems keeping client and server state in sync, since the application state is fully stored and managed on the client while it runs.
6. A clear division of responsibilities during development: the server implements only the business logic and a REST API for data access, and the client implementation depends little on the specifics of the server-side business logic.
7. Data can be cached independently of the presentation, and on the client the whole model can even be kept in Local Storage, which makes an offline mode possible.
8. One REST API can serve different client applications.
When developing, we used the following technology stack:
1) Microsoft ASP.NET MVC + WebApi on a server running IIS 8.
2) on the client: AngularJS version 1.2.0 (the latest at the time) and RequireJS.

Further narration will be based on the specifics of the selected technologies.

As a result, we arrived at the following conceptual scheme.



ASP.NET MVC was required to generate a single “header” page: one route, one controller, one action, and one view that does not require a master template. The client interacts with the server through the REST API, which was implemented using the excellent WebApi version 2 library.

The client implementation was a set of static files: CSS styles, HTML templates, and JavaScript modules assembled using the AMD approach. For development we used Microsoft Visual Studio (although another tool could have been chosen here; WebStorm, for example, would be a definite plus, since Visual Studio's AngularJS support is frankly weak).
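To make this concrete, here is a minimal sketch of how such an AMD-assembled client might be bootstrapped from the single "header" page. The file and module names (`main.js`, `app`, `libs/angular.min`) are assumptions for illustration, not our actual layout.

```javascript
// main.js - RequireJS entry point referenced from the single "header" page
require.config({
    paths: {
        angular: 'libs/angular.min' // AngularJS 1.2.x, loaded via an AMD shim
    },
    shim: {
        angular: { exports: 'angular' }
    }
});

require(['angular', 'app'], function (angular) {
    // Manual bootstrap instead of ng-app, because the scripts arrive asynchronously
    angular.element(document).ready(function () {
        angular.bootstrap(document, ['app']);
    });
});
```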

The general architecture of our SPA-application is shown in the figure.



AngularJS applications work well with a server within the CRUD concept. In our system, however, although the functions for viewing, creating and editing entities look like CRUD, they in fact carry much more complex logic than simply creating, modifying or deleting objects in the database. Creating a task in the CRM, for example, is a fairly large business process that includes a number of checks and branches, and the resulting changes to the data model inevitably affect other objects in the system, which ultimately has to be reflected in the client logic. As a result, after a create operation we have to re-download most of the task's data from the server to bring the overall state of the system up to date.

So, having created a task, it is not enough simply to construct the corresponding object somewhere inside the client and add it to a collection, and this violates the principle that a thin client should stay thin and that server and client logic should not be smeared across both sides: client logic should be nothing more than UI logic, that is, user interaction. Thus, the expected advantage of the SPA architecture, that the application state is fully stored and managed on the client while it runs, did not materialize and only created additional implementation difficulties.
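A hedged sketch of what this looked like in practice. The service, module variable (`app`) and endpoint names are illustrative, not our real API; the point is that one "create" turns into several reads.

```javascript
// tasks.service.js - creating a task is not a plain CRUD insert:
// the server runs checks and branches and touches related entities,
// so the client has to re-read the affected state afterwards.
app.factory('taskService', ['$http', '$q', function ($http, $q) {
    return {
        createTask: function (task) {
            return $http.post('/api/tasks', task).then(function (response) {
                // It is not enough to push the new task into a local collection:
                // counters, linked clients and the current user's worklist
                // may all have changed on the server.
                return $q.all({
                    task: $q.when(response.data),
                    related: $http.get('/api/tasks/' + response.data.id + '/related'),
                    dashboard: $http.get('/api/dashboard/summary')
                });
            });
        }
    };
}]);
```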

What, then, is the client application state that actually persists over the application's life cycle? Essentially it is just the UI state (display options chosen by the user, recent bookmarks, filters, the last element layout and so on) and data that changes rarely over long periods but is actively used for the UI to work correctly. At the most basic level this means the ACL, whose parameters determine which controls are shown, hidden or disabled for the user, the navigation history, and information about the current user.
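As an illustration, here is a sketch of the kind of long-lived client state meant here. The service name, the `/api/session` endpoint and the shape of the `acl` object are assumptions.

```javascript
// sessionState.service.js - the only state that really lives on the client:
// ACL flags, the current user and a few UI preferences.
app.factory('sessionState', ['$http', function ($http) {
    var state = { user: null, acl: {}, ui: { filters: {}, lastTab: null } };
    return {
        load: function () {
            return $http.get('/api/session').then(function (response) {
                state.user = response.data.user;
                state.acl = response.data.acl; // e.g. { canEditTasks: true, canDeleteClients: false }
                return state;
            });
        },
        can: function (permission) {
            return !!state.acl[permission]; // drives ng-if / ng-disabled in templates
        },
        ui: state.ui
    };
}]);
```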

The second point is that (as those who have worked with AngularJS know) within the application you can obtain a URL of any depth, for example to save it as a bookmark and navigate directly to the desired section. In reality, each such link forces the browser to download the entire application, which then has to initialize, fetch all the data it needs from the server to run, and only then follow the link "inwards". This often requires additional requests to the server, so following a link can be painfully slow, especially the first time. In our CRM, sharing links between employees and opening content in extra tabs are very frequent operations. Thus, the time gained on the small volume of transferred data is in practice offset by the large number of data-exchange operations with the server that must complete before the client application becomes fully functional.
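A simplified sketch of the deep-link problem, assuming the ngRoute module and hypothetical route, template and API names: before the target view can render, the whole application has to download and initialize, and only then does every `resolve` below run.

```javascript
// routes.js - a deep link like /#/clients/42/tasks forces the browser to
// download and initialise the whole SPA first, and only then run these resolves.
app.config(['$routeProvider', function ($routeProvider) {
    $routeProvider.when('/clients/:clientId/tasks', {
        templateUrl: 'templates/client-tasks.html',
        controller: 'ClientTasksCtrl',
        resolve: {
            session: ['sessionState', function (sessionState) {
                return sessionState.load(); // ACL + current user
            }],
            client: ['$http', '$route', function ($http, $route) {
                return $http.get('/api/clients/' + $route.current.params.clientId);
            }],
            tasks: ['$http', '$route', function ($http, $route) {
                return $http.get('/api/clients/' + $route.current.params.clientId + '/tasks');
            }]
        }
    });
}]);
```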

Now it is time to give more detail on the technical problems we encountered while implementing the CRM with the SPA architecture and the AngularJS framework.

Memory leaks and performance drops

During testing, users began to notice a marked drop in performance after some time of continuous use. We ran a series of synthetic tests based on the average usage pattern of the application and obtained the following profile, recorded from application start while performing each of the functions in turn, forward and then back, over 15 minutes. The chart shows the average memory usage over this period.



Nothing terrible is observed here. The main consumer of memory is the data collections, and more detailed profiling over a short period showed that memory is reclaimed normally: the GC copes with its job.

Over longer intervals, however, memory that the garbage collector never reclaims begins to accumulate; in other words, a leak. Within two to three hours of work in normal mode the amount of memory consumed grows from 200 to 600 MB, and over a working day to 1 GB or more, which is unacceptable. During debugging we found obvious flaws, mostly related to resources captured in closures, which is a well-known problem. The lion's share of the leaks came from third-party libraries which, in combination with our own code, led to this behaviour.
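A typical pattern of this kind, given here as a simplified sketch rather than our actual code: a DOM event handler registered by a directive closes over scope data and is never detached, so the closure and everything it captures survive navigation.

```javascript
// A directive handler closes over scope data and attaches to window;
// without the $destroy cleanup below, the closure (and the rows it
// captures) outlives the view and accumulates as a leak.
app.directive('gridResize', function () {
    return function (scope, element) {
        var rows = scope.rows; // potentially thousands of records
        var onResize = function () {
            element.css('height', (rows.length * 24) + 'px');
        };
        angular.element(window).on('resize', onResize);

        scope.$on('$destroy', function () {
            angular.element(window).off('resize', onResize); // release the closure
        });
    };
});
```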

The time spent debugging showed that the leaks can be dealt with, but for a system that is already written and running, this kind of optimization is very time-consuming. So more attention has to be paid to it during development, profiling after each small stage of work to eliminate accumulated leaks, and this reduces the attractiveness of the chosen architecture for a developer looking for an effective way to solve the problem quickly.

So did we manage to meet the expectations listed at the beginning of the article? Only partly. This was far from our first experience of using AngularJS for web applications, but before developing the CRM we had never had to face these problems.

As a result, we wondered whether there was a way to take the best of the SPA architecture and combine it with the classical approach to building web applications. The answer we settled on was a hybrid approach.

Hybrid approach

We decided to split the heavy, independent parts of the application into separate real pages, each of which is its own application. At first glance this looks like a complication, but in the end it is not.

Each page, being an application, contains only the functionality used in that part (in a sense, a return to how things used to be). Each page is not empty: it contains minimal HTML that is shown to the user immediately. If some content opens inside the application in a pop-up window, then by direct link the same content opens as a separate page, and opens quickly. This is very similar to the special crawler-friendly pages generated for search-engine optimization of SPA applications.

We did not want to give up high interactivity as the user navigates inside the application, but as it turned out, our concerns were unfounded. If the pages with HTML content are cached correctly (both on the server and on the client), the difference in navigation speed is invisible to the naked eye, and the average user does not notice it at all. Beyond that, everything works the same way: first the cached page opens quickly, then the application loads the business data via the REST API and renders the templates just as an SPA application would.
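As a rough sketch under assumed names (`tasks-page.js`, the `tasks-page` element id, the `/api/tasks` endpoint), each such page becomes a small self-contained AngularJS application that starts from already-rendered minimal HTML and then fills in the data.

```javascript
// tasks-page.js - one of the separate "real" pages of the hybrid approach.
// The server has already rendered minimal HTML, so the user sees content
// immediately; this small app only fetches the business data and renders it.
var tasksApp = angular.module('tasksPage', []);

tasksApp.controller('TasksCtrl', ['$scope', '$http', function ($scope, $http) {
    $scope.loading = true;
    $http.get('/api/tasks?state=open').then(function (response) {
        $scope.tasks = response.data;
        $scope.loading = false;
    });
}]);

angular.bootstrap(document.getElementById('tasks-page'), ['tasksPage']);
```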

The overall application and user state is stored in Session Storage, which our technical requirements for the browser allow. For caching, almost nothing changed, since from the client's point of view all the pages are static.
The only thing that cannot be won back this way is a full-fledged offline mode; that remains the exclusive prerogative of the SPA.
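To illustrate, a minimal sketch of how the separate page-applications could share state through Session Storage; the wrapper and key names are assumptions, not our production code.

```javascript
// A thin wrapper around Session Storage shared by the page-applications,
// so UI state survives navigation between pages within one browser session.
var pageState = {
    save: function (key, value) {
        sessionStorage.setItem('crm.' + key, JSON.stringify(value));
    },
    load: function (key, fallback) {
        var raw = sessionStorage.getItem('crm.' + key);
        return raw !== null ? JSON.parse(raw) : fallback;
    }
};

// Usage: remember the last selected filter across page transitions.
pageState.save('tasks.filter', { state: 'open', assignee: 'me' });
var filter = pageState.load('tasks.filter', { state: 'all' });
```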

Summarizing

All we did was slightly violate the "Single" principle by decomposing the monolithic client application into several separate applications, losing almost nothing in the process. What did we gain from this approach?

1. The memory-leak problem is solved automatically, because in the course of their work users move between pages, which completely clears memory at no cost to us.

2. Additional benefits, such as fast opening of content by direct link (without downloading the entire application, or at least the part needed to get inside), as well as different minimal HTML for each page instead of one universal "Loading...".

3. We had somehow quickly forgotten that we have Razor and what wonderful possibilities it offers compared to static HTML, even though to the client these pages still look static, with a long cache lifetime.

4. Separate pages ultimately allowed us to develop these parts of the application independently: for example, one page may break while the rest keep working as if nothing had happened.

I will add that all of the above is rather narrowly specific and was considered solely within the framework of the technologies and the particular software we used. This approach will not always be justified, and it will not always give any advantage over the strengths of the SPA.

We thank you for your attention!

Source: https://habr.com/ru/post/245643/

