
After publishing a series of notes on compressing and combining JavaScript files, it is worth highlighting the most typical problems of that very compression and combination.
Let's start with a simple one: how JS compression can spoil our mood, and how to lift it back up :)
UPD: a site acceleration contest has been launched. Prizes: a monitor, a webcam, a mouse. Everything is hyper-fast.
JavaScript compression
It is worth saying right away that minifying JavaScript files gives us only a 5-7% reduction in size relative to plain gzip, which can be used everywhere (no, really, everywhere: from the Apache configuration via .htaccess, to static compression via mod_rewrite + mod_mime, to the nginx (or LiteSpeed) configuration). But back to the topic: we want to minify our JavaScript files, so what is the best way to do it?
Two years ago a review of the current tools was made; since then the situation has not changed much (except for the appearance of the Google Closure Compiler). But let's take things in order.
- Let's start with the simple one: JSMin (or its clone, JSMin+). It works quite universally (it is built around a finite state machine), and the minified file almost always still executes. The additional gain (here and below, relative to plain gzip) is up to 7% with the advanced version, i.e. very little. CPU usage is moderate (the advanced version, JSMin+, is heavier and needs a fair amount of memory), but it does not analyze variable scope and therefore cannot shorten variable names. In principle it can be used on almost any script, but nuances are possible: for example, conditional comments get deleted (this can be worked around), or some constructions are recognized incorrectly (for example, "+ +" gets converted to "++", which breaks the logic; see the sketch after this list). That can also be worked around, but with more effort.
- YUI Compressor. The most famous (and until recently also the most powerful) tool for compressing scripts. It is based on the Rhino engine (as far as I know, its roots go back to the Dojo framework, i.e. a very, very long time ago). It compresses scripts very well and analyzes variable scope, so it can shorten variable names. The gain is up to 8% over gzip, but it is seriously CPU-hungry (due to the Java virtual machine), so be careful about running it on the fly. Also, because variable names get shortened, various problems are possible (potentially even more of them than with JSMin).
- Google Closure Compiler appeared only recently but has already won the public's trust. It is based on the same Rhino engine (yes, there is nothing new under the sun), but it uses more advanced algorithms for reducing the size of the source code (there is an excellent, detailed overview of them), giving up to 12% over gzip. Here, though, you should be triply cautious: a very substantial part of your logic can be cut out, especially with the aggressive transformations enabled. Nevertheless, jQuery already uses this tool. In terms of CPU overhead it seems to be even heavier than YUI Compressor (I have not verified this).
- And Packer. This tool is already a thing of the past, thanks to faster connections and the relative lag of processor power: decompression (an algorithm similar to gzip) is performed by the browser's JavaScript engine. This gives a very significant reduction in code size (up to 55% without gzip), but adds 500-1000 ms to unpack the archive. Naturally, this stops making sense when processor power is limited (hello, IE) and connection speed is high (plus gzip is supported practically everywhere anyway). On top of that, this optimization method is the one most prone to bugs after minification.
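To make the "+ +" pitfall from the JSMin item above concrete, here is a minimal sketch (the variable names are invented for illustration). A binary plus followed by a unary plus is valid JavaScript, but if a careless minifier collapses the whitespace, the same characters turn into an increment and the code either fails to parse or changes its meaning:

// Original code: binary plus followed by a unary plus (string-to-number coercion).
var count = "2";
var total = 40 + +count;   // 40 + (+"2") === 42
console.log(total);

// After a careless whitespace pass the same characters could become:
// var total = 40 ++count;
// which no longer parses as "add the numeric value of count" - exactly the kind
// of breakage you only notice when you actually run the minified file.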
A quick summary: check your JavaScript not only in the environment where it is developed, but also after optimization. Best of all, with the same unit tests. You will learn a lot of new things about the tools described above :) If minification is not critical for you, just use gzip (preferably static compression with the maximum compression level) to serve JavaScript.
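As a rough illustration of that advice, here is a hedged Node.js sketch of such a check (the file paths and the global function "sum" are assumptions made up for the example): it loads the original file and the minified build into separate sandboxes and asserts that they behave the same.

// smoke-test.js: run the same check against the original and the minified build.
// The paths and the global function "sum" are illustrative assumptions.
const fs = require('fs');
const vm = require('vm');
const assert = require('assert');

function loadIntoSandbox(path) {
    const sandbox = {};
    vm.createContext(sandbox);
    vm.runInContext(fs.readFileSync(path, 'utf8'), sandbox);
    return sandbox;
}

const original = loadIntoSandbox('./js/library.js');
const minified = loadIntoSandbox('./js/library.min.js');

// The same inputs must give the same outputs after minification.
assert.strictEqual(minified.sum(2, 2), original.sum(2, 2));
console.log('the minified build behaves like the original');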
Problems with merging JavaScript files
Now that we have dealt with compressing JavaScript files, it would be good to touch on combining them. An average site has 5-10 JavaScript files plus several inline pieces of code that call the included libraries in some way. The result is 10-15 pieces of code that could be combined into one (the benefits range from faster loading on the user's side to better server resilience under DDoS, where every connection counts, even a static one).
But back to the point. What follows is about partially automating the combination of "third-party" scripts. If you have full access to them (and are comfortable with web development), it is not a big deal to fix the problems yourself (or to exclude a few problematic scripts from the merge). Otherwise (when a set of scripts refuses to be merged without errors), the following approach is for you.
So, we have 10-15 pieces of code (some of them inline, some in the form of external libraries, which we can also "merge" together). We need to guarantee that they keep working independently of one another. What does that mean?
If a JavaScript file contains an error, the browser stops executing that file at the point of the error (some truly ancient browsers even stop executing all JavaScript on the page in that case, but let's not dwell on them). Accordingly, if the very first library that we merge into the combined file throws an error, then after merging our client-side logic will fall apart in every browser. Sad.
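A tiny sketch of why this matters (the object and file names are made up): when two scripts are simply concatenated into one file and the first throws at the top level, the second one's initialization never runs.

// merged.js: a naive concatenation of two scripts into one file.

// --- script 1: throws at the top level (e.g. touches an object that does not exist)
missingObject.init();          // ReferenceError stops execution of the whole file

// --- script 2: its top-level initialization never runs
var menu = {
    show: function () { console.log('menu is ready'); }
};

// Later on the page:
// menu.show();                // TypeError: menu is still undefined, because script 2 never ran.
// Loaded as two separate <script> tags, the error in the first file
// would not have prevented the second one from working.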
It is also worth noting that inline code is hard enough to debug. It can either be excluded from the merge (for example, by placing the call to the merged file before or after the inline code), or, if it has to stay where it is, the merging of files can be abandoned altogether.
Backward compatibility
What can we do about it? The easiest way is to exclude problem files from the merge logic (errors may appear only at the merge stage, while the individual files work perfectly well on their own). To do this, you need to track where errors occur and assemble a separate merge configuration for each such case.
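One way to organize this, as a hedged sketch (the file names, paths and the exclude list are invented for illustration): keep a plain configuration of which files are safe to merge and which are known troublemakers, and build the bundle from it.

// build-bundle.js: assemble the merged file, skipping known problem scripts.
// File names, paths and the exclude list are illustrative; maintain them per project.
const fs = require('fs');

const mergeConfig = {
    include: ['jquery.js', 'menu.js', 'forms.js', 'stats.js'],
    exclude: ['legacy-carousel.js']   // breaks when merged, so it is loaded separately
};

const bundle = mergeConfig.include
    .map(function (name) { return fs.readFileSync('./js/' + name, 'utf8'); })
    .join(';\n');                     // the ';' guards against files lacking a trailing semicolon

fs.writeFileSync('./js/bundle.js', bundle);
console.log('bundle built; still loaded separately:', mergeConfig.exclude.join(', '));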
But we can do it a little more simply. In JavaScript we can use the try-catch construct. Got the idea yet? Not yet? We can wrap the entire contents of each file that we merge in try {}, and in catch(e) {} load the original external file (the file name below is just a placeholder), like this:
try {
    // ... the contents of the JavaScript library ...
} catch (e) {
    // fall back to the original, unmerged file
    document.write('<script src="original-library.js"><\/script>');
    // or: console.log('this JavaScript file needs to be excluded from the merge');
}
In this case the user downloads a single file if no problems occur. If errors do occur, all the problematic external files are loaded individually, in the same order. This ensures backward compatibility.
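Wrapping every file by hand quickly gets tedious, so the same idea can be applied automatically at build time. A hedged sketch (the file names and output path are assumptions), producing exactly the kind of try/catch wrapper shown above for each source file:

// wrap-and-merge.js: wrap each source file in try/catch with a per-file fallback.
// File names and paths are illustrative.
const fs = require('fs');

const files = ['jquery.js', 'menu.js', 'forms.js'];

const wrapped = files.map(function (name) {
    const source = fs.readFileSync('./js/' + name, 'utf8');
    // "<\/script>" in the generated string keeps it from closing a surrounding <script> tag
    return 'try {\n' + source + '\n} catch (e) {\n' +
           '    document.write(\'<script src="/js/' + name + '"><\\/script>\');\n' +
           '    // console.log("exclude ' + name + ' from the merge");\n' +
           '}\n';
});

fs.writeFileSync('./js/bundle.js', wrapped.join('\n'));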
Performance issues
Obviously, this approach is not "the most correct" one. The most logical thing would be to identify the JavaScript errors, fix them, and serve one file to all users. But that is not always possible. It is also worth remembering that the try-catch construct is rather expensive for browsers to execute (it can add 10-25% to initialization time), so be careful with it.
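If you want to see that overhead for yourself, a very rough micro-benchmark sketch follows (the loop size is arbitrary, and the numbers vary a lot between browsers and engine versions, so treat the result only as an order-of-magnitude hint):

// A crude micro-benchmark: the same trivial loop with and without try/catch.
function plain(n)   { var s = 0; for (var i = 0; i < n; i++) { s += i; } return s; }
function guarded(n) { var s = 0; for (var i = 0; i < n; i++) { try { s += i; } catch (e) {} } return s; }

var N = 1000000;

var t0 = Date.now();
plain(N);
var t1 = Date.now();
guarded(N);
var t2 = Date.now();

console.log('without try/catch: ' + (t1 - t0) + ' ms');
console.log('with try/catch:    ' + (t2 - t1) + ' ms');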
Still, the described approach is wonderful for debugging the merging of JavaScript files specifically: it lets you determine exactly which files are "falling over" (with several dozen files this is very, very useful).
Small summary
After additionally minifying JavaScript files, be sure to check that they still work. Debugging the correctness of merged JavaScript files is easy to automate, and you can even set up backward compatibility when specific scripts cannot be debugged.