
Static compression of css and js files (process automation)

Due to the absence of mod_gzip on the hosting, I had to implement css and js compression as static (pre-generated) compression. On the other hand, this may even be the better approach... But one thing comes up right away: doing such an operation by hand, while possible, is extremely unproductive, so it makes sense to automate it. Shown here is one of the simplest variants of such automation, implemented in php.

First, let's define the task. There is a local copy of the site (Apache, php) that is actively being modified and tweaked. As we work, we need to have compressed versions of the css and js files at hand (the finished result should be visible immediately during testing).

So we need to:
  1. Find all css and js files (including those in subfolders)
  2. Create a compressed version of each (not every time, but only when the file has changed)
  3. Make the server automatically serve the current version of the file (compressed, if the browser supports it)

Of course, it would not be hard to add file merging as well if needed, but that is not the topic here...

So, let's create a php page (let's call it, say, ready.php) that will contain all the code we need. Opening this page in the browser starts the compression process. If the site itself is built dynamically in php, you can add include('ready.php') and compression will be performed automatically as needed. Of course, the path in include() must be a real one. There is one nuance: to prevent this file from being executed on the hosting (it may not even exist there, but you could forget to remove the line), you can wrap the include in a condition that is only true on the local server, for example:

  if (mb_eregi("local root address", $_SERVER['DOCUMENT_ROOT'])) {
    include('ready.php');
  }

Your local root folder path is unlikely to match the one on the server, but you can come up with something else as well...
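
As one such option, here is a minimal sketch of an alternative guard; the dev.local host name and the loopback address are just placeholders for whatever your local setup actually uses:

  // Possible alternative guard: run the compression script only on the
  // local development host. "dev.local" and 127.0.0.1 are placeholders.
  if ($_SERVER['SERVER_NAME'] == 'dev.local' || $_SERVER['REMOTE_ADDR'] == '127.0.0.1') {
    include('ready.php');
  }
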
In ready.php we write the following php code:

  function ready($dir) {
    $dir = $_SERVER['DOCUMENT_ROOT'] . $dir;
    $ext = array("js", "css");
    for ($i = 0; $i < count($ext); $i++) {
      search($dir, $ext[$i]);
    }
  }

  function search($dir, $ext) {
    $dirH = opendir($dir);
    while (($file = readdir($dirH)) !== false) {
      if ($file != "." && $file != ".." && !mb_eregi(".gzip", $file)) {
        if (filetype($dir . $file) == "dir") {
          search($dir . $file . "/", $ext);
        } else {
          if (fnmatch("*." . $ext, $file)) {
            if (!mb_eregi("gzip", $file)) {
              // the next line will show all found files
              // print $dir . $file . "<br>";
              $adr = substr($dir . $file, 0, strrpos($dir . $file, "."));
              $timeF = filemtime($dir . $file);
              $timeG = 0; // default when no compressed copy exists yet
              if (is_file($adr . ".gzip." . $ext)) {
                $timeG = filemtime($adr . ".gzip." . $ext);
              }
              if ($timeF > $timeG) {
                // the next line will show the files to be compressed
                // print $dir . $file . " - GZIP<br>";
                // minify (needs yuicompressor and its real path)
                exec("java -jar yuicompressor.jar " . $adr . "." . $ext . " -o " . $adr . ".gzipY." . $ext);
                // compress
                if (is_file($adr . ".gzipY." . $ext)) {
                  shell_exec("gzip -9 -n -f -c " . $adr . ".gzipY." . $ext . " > " . $adr . ".gzip." . $ext);
                  unlink($adr . ".gzipY." . $ext);
                } else {
                  shell_exec("gzip -9 -n -f -c " . $adr . "." . $ext . " > " . $adr . ".gzip." . $ext);
                }
              }
            }
          }
        }
      }
    }
    closedir($dirH);
  }

  // Here we set the address where the files are located
  ready("address");

As a result, we get compressed copies of all js and css files with names like name.gzip.js and name.gzip.css (if not, the first things to check are the correctness of the path and the access rights).
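
If the compressed copies do not appear, a rough self-check along these lines can help (the /css/ path here is only an example and the whole snippet is just a sketch):

  // Rough sanity checks before running ready(); the "/css/" path is an example
  $dir = $_SERVER['DOCUMENT_ROOT'] . "/css/";
  if (!is_dir($dir)) {
    print "Directory not found: " . $dir . "<br>";
  } elseif (!is_writable($dir)) {
    print "No write access to: " . $dir . "<br>";
  }
  // gzip must be callable from php for the compression step to work
  print "gzip: " . (shell_exec("gzip --version") ? "found" : "not found or shell_exec disabled") . "<br>";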

Next, we need to make sure the server returns the current version of the file (bypassing the cache). This is done by adding the filemtime value to the file name, which is implemented in standard php, for example:

  <link href="/css/css.v=<?=filemtime($_SERVER['DOCUMENT_ROOT']."/css/css.css");?>.css" rel="stylesheet" type="text/css">

The finished link will look like this:

  <link href="/css/css.v=1263208288.css" rel="stylesheet" type="text/css">
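
To avoid repeating the filemtime() call in every template, the version tag can be generated by a small helper; here is a sketch of such a function (the name ver_url() is arbitrary):

  // Possible helper for building versioned asset URLs; ver_url() is an arbitrary name.
  // "/css/css.css" becomes "/css/css.v=1263208288.css"
  function ver_url($path) {
    $file = $_SERVER['DOCUMENT_ROOT'] . $path;
    $dot = strrpos($path, ".");
    // insert ".v=<mtime>" before the extension
    return substr($path, 0, $dot) . ".v=" . filemtime($file) . substr($path, $dot);
  }

It would then be used as <link href="<?=ver_url("/css/css.css");?>" rel="stylesheet" type="text/css">.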

This has been discussed more than once, and the details are easy to find by searching for, for example, "speed up your website, practical css/js"...

Add the rewrite rules to the .htaccess file (taking into account right away whether a compressed version exists):

  RewriteEngine on
  RewriteCond %{HTTP:Accept-Encoding} gzip
  RewriteRule ^(.*\.)v=[0-9.]+\.(js|css)$ /$1gzip.$2 [QSA,L]
  RewriteCond %{HTTP:Accept-Encoding} !gzip
  RewriteRule ^(.*\.)v=[0-9.]+\.(js|css)$ /$1$2 [QSA,L]
  <FilesMatch .*\.gzip\.(js|css)$>
    Header set Content-Encoding gzip
    Header set Cache-Control private
  </FilesMatch>

That's all.
As a result, we keep working comfortably with the js and css files and, at the same time, we always have compressed versions of them that are served to the browser.

Source: https://habr.com/ru/post/80848/

