
JavaScript Compression and Combination Issues

After the publication of a series of notes on compressing and combining JavaScript files, it is worth highlighting the most typical problems that come up with this very compression and combination.

Let's start with the simple part: how JavaScript compression can spoil our mood, and how to lift it back up :)

UPD: a site acceleration contest has been launched. Prizes: a monitor, a webcam, a mouse. Everything is hyper-fast.

JavaScript compression


First of all, it is worth saying up front that minifying JavaScript files gives us only an extra 5-7% size reduction on top of regular gzip, which can be applied everywhere (no, really, everywhere: from the Apache configuration via .htaccess, to static compression via mod_rewrite + mod_mime, to the nginx (or LiteSpeed) configuration). But back to the topic: we want to minify our JavaScript files; what is the best way to do it?
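By the way, a minimal sketch of that static compression setup for nginx (it assumes nginx was built with ngx_http_gzip_static_module; the location pattern is illustrative):

  location ~* \.js$ {
      gzip_static on;  # serve a pre-built "file.js.gz" when the client accepts gzip
      gzip_vary   on;  # add "Vary: Accept-Encoding" so proxies cache both variants
  }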
Two years ago a review of the current tools was published, and since then the situation has not changed much (except for the arrival of the Google Closure Compiler). But first things first.


To summarize: check your JavaScript not only on the server where it is developed, but also after optimization, ideally with the same unit tests. You will learn a lot of new things about the tools described :) If minification is not critical for you, just use gzip to serve JavaScript (preferably static compression with the maximum compression level).
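A minimal sketch of that idea, assuming Node.js and hypothetical module paths (the add() function is invented for illustration): run the same assertions against the original and the minified build.

  // Run identical unit checks against both builds of the same module.
  var assert = require('assert');

  ['./lib/app.js', './dist/app.min.js'].forEach(function (path) {
    var app = require(path);
    assert.strictEqual(app.add(2, 3), 5, 'add() is broken in ' + path);
  });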

Problems merging JavaScript files


Now that we have dealt with compressing JavaScript files, it is worth touching on combining them. The average site has 5-10 JavaScript files plus several inline code snippets that call the included libraries in various ways. The result is 10-15 pieces of code that could be combined into one (the benefits are numerous, from faster loading on the user's side to server resilience under DDoS, where every connection counts, even a static one).
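The merge step itself can be as simple as this hypothetical Node.js build sketch (file names are illustrative):

  // Concatenate several scripts into a single file.
  var fs = require('fs');

  var files = ['jquery.js', 'plugin.js', 'app.js']; // illustrative names
  var merged = files.map(function (name) {
    return fs.readFileSync(name, 'utf8');
  }).join(';\n'); // the ';' guards against files ending without a semicolon

  fs.writeFileSync('combined.js', merged);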

But back to the point. Here we will talk about automating the combination of "third-party" scripts. If you have full access to them (and are good at web development), fixing the problems (or excluding a few problematic scripts from the merge) is not a big deal. Otherwise (when a set of scripts refuses to merge without errors), the following approach is for you.

So, we have 10-15 pieces of code (some of them inline snippets, some external libraries, which we can also "merge" together), and we need to guarantee that they work independently of one another. What does that mean?

If a file contains a JavaScript error, the browser stops executing that file at the point of the error (some truly ancient browsers would even stop executing all JavaScript on the page in this case, but let's not dwell on that). Accordingly, if the very first library we merge into the combined file throws an error, then after merging our client-side logic will fall apart in every browser. It's sad.
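Merging can even introduce such an error on its own. An illustrative pair of files, each perfectly fine in isolation:

  // file-a.js -- note the missing semicolon after the function expression
  var logOne = function () { console.log('one'); }

  // file-b.js -- an ordinary immediately-invoked function
  (function () { console.log('two'); })();

  // Concatenated, the parser reads file-b's opening parenthesis as a call
  // to file-a's function: 'one' is logged, 'two' never runs, and invoking
  // the undefined result throws a TypeError that halts everything below.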

Additionally, it is worth noting that inline code is hard enough to debug. It can either be excluded from the merge (for example, by placing the call to the merged file before or after the inline code), or, if the inline code makes merging infeasible, the merging of files can be abandoned altogether.

Backward compatibility


What can we do about it? The easiest way is to exclude the problem files from the merge logic (in this case errors occur only at the merge stage, while the individual files work perfectly well on their own). To do this, you will need to track where the error occurs and assemble a separate merge configuration for each such case.

But we can do it a little more simply. In JavaScript we can use the try-catch construct. Aha, got the idea? Not yet? We can wrap the entire contents of each merged file in try {}, and in catch(e) {} load the original external file, like this:
  try {
      // ... the contents of the JavaScript library ...
  } catch (e) {
      // Fall back to loading the original standalone file; the URL is a
      // placeholder. The tag is split so the snippet also survives inlining.
      document.write('<scr' + 'ipt src="/js/original-library.js"></scr' + 'ipt>');
      // or: console.log('exclude this JavaScript file from the merge');
  }

In this case, if no problems occur, the user downloads a single file. If there were errors, all the problematic external files are loaded separately, in their original order. This ensures backward compatibility.
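A hedged sketch of how this wrapping could be automated (a hypothetical Node.js build step; the file names and the /js/ URL prefix are assumptions). One caveat: try-catch intercepts only runtime errors, so a file with a syntax error still breaks parsing of the whole bundle and has to be excluded from the merge entirely.

  // Wrap each source file in try/catch; on a runtime error the file
  // is re-requested from its original standalone URL.
  var fs = require('fs');

  var files = ['lib1.js', 'lib2.js', 'app.js']; // illustrative names
  var merged = files.map(function (name) {
    var source = fs.readFileSync(name, 'utf8');
    return 'try {\n' + source + '\n} catch (e) {\n' +
           '  document.write(\'<scr\' + \'ipt src="/js/' + name +
           '"></scr\' + \'ipt>\');\n' +
           '}';
  }).join('\n;\n');

  fs.writeFileSync('combined.js', merged);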

Performance issues


Obviously, this approach is not the "most correct" one. The most logical thing would be to identify the JavaScript errors, fix them, and serve a single file to all users, but that is not always possible. Also keep in mind that the try-catch construct is rather expensive for browsers to execute (it adds 10-25% to initialization time), so it should be used with care.
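The exact overhead varies a lot between engines, so if it matters, measure it in the target browser. A rough micro-benchmark sketch (the loop body is arbitrary):

  // Compare a bare loop against the same loop wrapped in try/catch.
  function plain() {
    var s = 0;
    for (var i = 0; i < 1e6; i++) { s += i; }
    return s;
  }
  function guarded() {
    try {
      var s = 0;
      for (var i = 0; i < 1e6; i++) { s += i; }
      return s;
    } catch (e) { return 0; }
  }

  var t0 = new Date().getTime(); plain();
  var t1 = new Date().getTime(); guarded();
  var t2 = new Date().getTime();
  console.log('plain: ' + (t1 - t0) + ' ms, guarded: ' + (t2 - t1) + ' ms');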

But the described approach works wonderfully for debugging the merging of JavaScript files itself: it lets you determine exactly which files "go missing" (with several dozen files, this is very, very useful).

Small summary


After additional minification of JavaScript files, be sure to verify that they still work. And debugging the correctness of the merge can easily be automated, or backward compatibility can even be provided when specific scripts cannot be debugged.

Source: https://habr.com/ru/post/86443/

