Our company is developing its own web application, that is, without external funding :) This has both advantages and disadvantages, but what I have always liked about it is the opportunity to try something new: technologies, approaches and solutions. For certain reasons I cannot name the project's site, but I can share the experience I gained while working on it.
Since I am responsible for the part of the project that is directly visible to the user and that the user works with most closely, my story will be about that part.
To give the reader an idea of what we are dealing with, let me first describe what lives on the "dark" side: Java, MySQL, Neo4J, Jetty, RabbitMQ, and at the end of this long chain sits nginx.
GCL
At the end of 2010, we, together with our "gallant" web-js department, decided to abandon the old template engine for a number of reasons, which we will discuss below, and switch to something new and truly adequate to the realities of our crazy project. By that time the concept of widgets and places had already been implemented. Widgets, in our understanding, are independent visual pieces that communicate through channels: a channel can send messages, and widgets can subscribe to certain types of messages (see the sketch after the list below). A widget does not know where it is located in the DOM; that is the responsibility of its place. The big problem was that a widget also defined the templates it used to visualize data. We can use the same widget in different places but display the data differently, so the user could interact with the data differently as well. But back to our ancient template engine. At that time all templates were loaded and cached on the client in web storage, which introduced some asynchrony into the js code: after a widget was created, it took some time before data could be rendered. We wanted a new solution that would remove many of the problems of our template engine, for example:
- there were no loops
- localization was painful (it was impossible to insert variables into the texts)
- there were no conditionals or branching
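To make the widget/channel idea above more concrete, here is a minimal sketch in plain js; the names and the API are invented for illustration and are not our real code:

```js
// Widgets never reference each other or the DOM layout directly;
// they only publish and subscribe through a channel.
var channel = {
  listeners: {},
  subscribe: function(type, fn) {
    (this.listeners[type] = this.listeners[type] || []).push(fn);
  },
  send: function(type, payload) {
    (this.listeners[type] || []).forEach(function(fn) { fn(payload); });
  }
};

// One widget announces a selection, another reacts to it; where each of
// them sits in the DOM is decided by the "place", not by the widgets.
channel.subscribe('user:selected', function(user) {
  console.log('show profile for', user.id);
});
channel.send('user:selected', {id: 42});
```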
We analyzed the solutions that existed at the time and chose the Google Closure Library (GCL)... Back then I did not even realize that Google provides the technology but does not provide the tools for using it :)
By that time, the project consisted of:
- ~500 js files
- ~30 css files
- ~300 templates
Why GCL? The answer lies in the integrated approach that Closure offers. We wanted:
- js code compressed and optimized in advanced mode
- dead js code removed
- css compressed and validated
- templates checked, compressed and stored on the client side
- easy translation of resources into different languages
All the solutions available on the net at that time gave you only one of these things; Google gave three related solutions: Closure Compiler, Closure Templates and Closure Stylesheets, which work both separately and together. And when they all work together, the result is simply amazing!
Changing the js code
The first thing we started with was adding js dependencies everywhere... goog.require... It was a long job that took about a month. As a result, we simplified hooking up new js files and logic: it is enough to declare the dependency and the system automatically loads the necessary code.
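A minimal sketch of what the declarations look like after such a migration (the namespaces here are made up for illustration):

```js
// widgets/profile.js
goog.provide('app.widgets.Profile');
goog.require('app.core.Channel');  // in uncompiled mode the loader pulls
                                   // this file in via deps.js automatically

/** @constructor */
app.widgets.Profile = function(channel) {
  /** @private */
  this.channel_ = channel;
};
```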
Google does not provide tools for using its technologies, but inside Google they do exist: the developers told us directly (via G+) that they write in Eclipse and have full Closure support there.
We wrote our own script for Eclipse in the form of a Build Event which, every time a js file was saved, updated the Closure dependency file deps.js. At that time the practice was that the entire project (Tomcat, mysql, mq broker, etc.) ran on each developer's machine, which consumed 6GB of memory and needed about one and a half to two minutes to start, so we quietly migrated to local proxying of js, css and img files via nginx, which significantly sped up development. Before that it was very tiring to wait until Eclipse restarted Tomcat so that it would pick up the changed files.
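For reference, deps.js is just a list of goog.addDependency calls; this is roughly what our build event regenerated on every save (the paths and namespaces are illustrative):

```js
// deps.js: tells base.js which file provides which namespaces and what it requires.
goog.addDependency('../../app/widgets/profile.js',
    ['app.widgets.Profile'], ['app.core.Channel']);
goog.addDependency('../../app/core/channel.js',
    ['app.core.Channel'], []);
```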
Transition from CSS to GSS
GSS is somewhat similar to LESS, with its own characteristics.
In parallel, we switched from css to gss. Most of the problems came from all sorts of non-standard attributes; in principle, it is enough just to rename .css to .gss. The one thing I recommend is to go straight through your css and introduce mixins. There was also the problem of tracking which gss files had changed and calling the gss-to-css compiler only for them.
SOY Migration
What is SOY? These are templates written in an HTML-like syntax and compiled into js code, which solves the important problem of client-side caching of all templates.
Simultaneously with all these innovations, we translated the old templates into SOY (Closure Templates). SOY turned out to be a fairy tale for programmers: we were able to completely separate the visual part from the logic and easily integrate it into the js code, because the compiler inserts the dependencies (goog.require) itself. Since SOY has namespaces, we decided right away that our namespaces would be mirrored in the file system as folders, just like in java.
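From the js side, a compiled template is just a function in its namespace that takes a parameter object; a rough sketch (the template name and parameters are invented):

```js
goog.require('app.templates.profile');  // emitted by the soy compiler alongside the template

function renderProfile(container, user) {
  // Compiled Closure Templates of that era simply return an html string.
  container.innerHTML = app.templates.profile.card({name: user.name});
}
```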
The big problem is the compile time for all the files: it is far too long; on a Core i7 3770K, compiling all gss and soy takes 20 seconds. So we made a script that keeps track of the modified gss and soy files and compiles only them; for some reason, Google does not provide such a tool publicly.
Update: while this article was being written, an optimization was introduced and the compile time (in debug mode) is now 8-9 seconds.
Putting it all together
After solving these problems, we faced the last task: to make all three technologies work together in order to speed up the site and get the thing all of this was meant for.
But here some nuances surfaced: for css selector compression to work, everywhere in js you have to use the goog.getCssName('display1') construct instead of the plain string 'display1'. That is, we needed to replace $element.addClass('display1') with $element.addClass(goog.getCssName('display1')). In addition, inside goog.getCssName(...) you cannot use variables or several selectors at once: neither goog.getCssName('display' + value) nor goog.getCssName('display1 clearfix') works. This caused a lot of inconvenience, and we had to rewrite the compilation scripts so that they supported non-compressible css selectors, since all the old code could not be immediately converted from "display-" + value into something sane. In SOY itself you also have to mark explicitly the classes and identifiers that will be compressed, with {css display1} and so on. At the first stage it was complete hell for the markup developer... We were looking for a solution with syntax highlighting and eventually found an Eclipse plug-in that solved a bunch of problems (http://www.normalesup.org/~simonet/soft/ow/eclipse-closure-templates.en.html); a short sketch of the renaming rules follows the plugin's feature list below.
What the plugin can do:
- SOY syntax highlighting and checking
- checks that nested templates are called correctly (missing and extra parameters)
- quick navigation through templates using the Ctrl key
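The renaming-rules sketch promised above; the class names are examples only, and $element is assumed to be a jQuery-wrapped element as in the example earlier:

```js
var $element = $('#some-widget');  // jQuery assumed to be loaded

// Renamable: the argument is a single string literal.
$element.addClass(goog.getCssName('display1'));

// Not allowed: no variables and no compound selectors inside getCssName.
// $element.addClass(goog.getCssName('display' + value));
// $element.addClass(goog.getCssName('display1 clearfix'));

// The compound case has to be split into two renamable calls:
$element.addClass(goog.getCssName('display1'));
$element.addClass(goog.getCssName('clearfix'));
```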
Returning to the plugin: overall it became manna from heaven for the markup developer. SOY gives you a free hand, but it also increases responsibility. A little later we wrote our own plugins for the soy compiler to add the methods we needed, like converting a string to a number and rounding. And that was only the beginning of our misadventures with SOY. Then we moved the server-side templates to the new template engine; for that I had to write my own classes again to support themes and translations. To automatically convert the old templates to the new format, we wrote a converter...
Translating SOY into different languages is a separate song. Google says "everything is fine there". Everything really is great there, if you have the tools :) You can generate xlf files from the soy files. But it turned out that you cannot take the old xlf files that have already been translated and just add the phrases that are not yet translated... It is simply a nightmare! There is a terrifying set of utilities for working with this format, but not the one that is actually needed: each phrase has its own id, but it is generated so cunningly that even the generator class in Google Closure is called Fingerprint... Once again we wrote our own tools that solved this problem.
We also had to move the code from individual jsp pages into separate modules, since we had to prepare for compression...
Last bastion
So, seven more months passed from the start of our journey; we had all the necessary tools and all the necessary connections between the three technologies, but compression in advanced mode still did not work :) Again, problems arose because jQuery and many plug-ins do not compile correctly in advanced mode; we had to write and hook up externs. Once we had sorted out jQuery and the plug-ins, it turned out that js calls in SOY also have to be replaced with calls that will not be compressed. I understand that GC does not recommend using direct calls in onclick, and that is easy to follow when you write a project from scratch on GC, but when you have a ton of old code it is not so simple, so we created the file export.js in which we registered proxies for external calls in this form:
global_export["showLoginDialog"] = function(event, elem) { /* ... */ };
We set a standard for all such exported calls: function(event, this, ...), that is, the first two parameters are always these two, and then whatever you need.
Having solved the export problem (there turned out to be no more than 20-30 such calls), another sad fact about GCC (Google Closure Compiler) surfaced. In advanced mode GCC compresses everything that is not "pinned" with quotation marks (' or "), and therefore calls to external plug-ins had to be pinned as well. But the biggest disappointment was that the client-server interaction, which ran on a well-documented API, fell apart after compression. That threw us back indefinitely...
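A small illustration of the quoting rule; the objects are declared here only to keep the sketch self-contained, and all names are invented:

```js
// Assume these come from elsewhere; declared here only so the sketch runs.
var widget = {};
var jsonData = { 'items': [] };
var thirdPartyPlugin = { 'refresh': function() {} };

// Our own code: dotted access, GCC is free to rename the property.
widget.internalState = 1;          // may become widget.a after compression

// Anything crossing the boundary to non-compiled code must be "pinned" with quotes:
var items = jsonData['items'];     // quoted, so 'items' survives renaming
thirdPartyPlugin['refresh']();     // a call into an external plug-in, also pinned
```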
Here a digression is needed. Google itself transfers not JSON objects but arrays. At first we thought it was ProtoBuf; we tried it, and it turned out that no, they simply associate each index of the array with a specific field. Apparently, when data arrives from the server, it is fed to some MessageFactory which, based on meta-data (possibly ProtoBuf meta-data for a specific message type), associates the array elements with object fields. If you do as Google does, then of course there is no problem after compression and optimization, and speed even increases, since working with arrays is faster.
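The idea, as we understood it, boils down to something like this (all names are invented; this is not Google's actual code):

```js
// Per message type there is a field list; the server sends plain arrays.
var USER_FIELDS = ['name', 'age', 'active'];

function fromArray(fields, row) {
  var obj = {};
  for (var i = 0; i < fields.length; i++) {
    obj[fields[i]] = row[i];
  }
  return obj;
}

var user = fromArray(USER_FIELDS, ['john', 42, true]);
// user.age === 42; since the wire format never contains property names,
// advanced renaming cannot break it.
```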
Why didn't we do what Google does? Because we have a lot of old code that we had to keep supporting. But we will definitely take this on at some point, since it is the right way.
The search for a solution led us to the fact that GCC can output a renaming map of the form "old field name": "new field name". We started reworking the server code to support this feature; for that, a class was introduced into the library shared between 5 services.
It looks like this:
public interface Constants {
    public static final String typeId = "typeId";
    public static final String user_id = "user_id";
    public static final String items = "items";
    ....
}
Before each build, a special utility took the map generated by GCC and applied the rules from this class. But when we already thought everything was ready, it turned out that, for some reason, part of the historical data needed on the client side is stored as json in the database, and there was no humane way to deal with that... Even after renaming the fields, changing everything in the database is unrealistic, and since a new map is generated every time the js code changes, there is no chance to convert it. It was a complete fiasco... And then the idea came to do the opposite: after all, GCC can not only output a map, it can also accept a renaming map as input. We took the Constants class, converted it into a map, fed it to GCC, and it compressed all the code but did not touch the field names involved in the client-server API. Everything was fine until strange errors related to certain fields were discovered. For example, the "items" field should have remained "items" in the output file, but it was renamed to "items1". The hard part was pinning down the dependency, since in simple examples everything worked like clockwork. I had to take the GCC source code and run it under a debugger; it turned out that if somewhere in the code the property name is mentioned in quotes (' or ") as "<property_name>", then the compiler renames your property even if the map said "items:items". Having filed a bug in the GCC tracker and attached a one-line patch in the comments, we rebuilt our GCC and successfully compressed the whole project.
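Roughly the pattern that triggered the renaming bug described above (simplified; the names are illustrative):

```js
// property.in.map contained the line:  items:items
var data = { 'items': [1, 2, 3] };
var response = {};

response.items = [];       // expected to stay 'items' thanks to the input map
var raw = data['items'];   // the quoted use of the same name elsewhere made the
                           // unpatched GCC rename the dotted access to 'items1'
```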
Source map
Next, we wired up source maps in order to make sense of this pile of compressed, optimized, unreadable abF(...) identifiers. For this I also had to write my own utility, because for some reason GCC cannot append the source map reference to the end of the compressed module; or maybe we were already so exhausted on the way to the goal that we missed this item in the (scanty) documentation.
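If the missing piece was indeed the trailing source map reference, it is just a one-line comment appended to the compiled module (the //@ form was current back then, today it is //#; the file name here is illustrative):

```js
//@ sourceMappingURL=p_index.js.map
```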
Results
What we got as a result:
Uncompressed: 1.6 MB of js code + 1.4 MB of templates, ~3.0 MB in total
38 modules in normal compression mode: 2830 KB (445 KB zipped)
38 modules in advanced compression mode: 1175 KB (266 KB zipped)
The site really did start working faster, even if it took us 12 months. In parallel we were solving our day-to-day tasks and slowly moving toward the goal...
This whole story is written so that you can judge whether all this effort is worth the result. If we were starting the project on the GC Library now, we would have far fewer problems; but if you already have a ton of old code, the process can drag on.
And make your markup developer write the documentation on SOY himself, with examples and typical solutions; it will help him adapt and understand things faster. (In the words of our markup developer :))
P.S.: If anyone is interested, we keep all the documentation in Google Docs and the bugs in JIRA.
Tools
Why are we open-sourcing a full stack of GCL tools?
Well, the technology itself is open source, but without these crutch tools we could never really make it work. And I know there are many great sites where these solutions would help. In general, I just want to make the Internet a little bit better :)
So, here is what you need to deploy Closure Platform. It is a test project and a starting point for development and for demonstrating the capabilities of GCL.
OS: Linux, or at worst OS X (BSD). The entire Windows family is out (I just feel sorry for you :)) due to the absence of a normal shell.
Java 1.6 or higher, ant, bash/sh and python.
Most of the scripts are written in bash, some in java.
Why not python? Because I don't like it :)
So, let's begin.
Quick start
git clone github.com/DisDis/ClosurePlatform.git
cd ClosurePlatform
ant
Open WebUI/index.html in the browser
Project structure
Now in more detail.
CP project structure:
- src - java sources live here; in the example there is only Constants.java
- themes - themes; gss, soy and locales are stored here
  - gss - styles
    - 0-definitions.gss - definitions
    - *.gss - styles
    - allowed.cfg - allowed parameters
    - allowed_prop.cfg - allowed properties
    - fixed.cfg - names that must not be compressed
    - .timestamp - a temporary file storing the time of the last successful gss compilation
  - locales - xlf translations
    - *.xlf - translations
    - empty.xlf.template - template for an empty localization
  - templates
    - (namespace) - path to the templates
    - global.properties - global template constants
    - .timestamp - a temporary file storing the time of the last successful soy compilation
- tools - the toolkit
- WebUI - what goes to the web server as root (if you are a java developer, you are familiar with this)
  - js - js code
  - themes - compiled theme data
    - css
    - js - compiled templates
    - img, data - data for the theme: images and everything else
  - *.html - pages
- build.soy.xml - ant tasks to make it easier to run the toolkit
Configuration
In the tools folder there is a file config.properties
What is it for:
TIMESTAMP_FNAME=".timestamp"
DEFINITION_GSS="0-definitions.gss"
THEMES_PATH=$DIR/../themes
THEME_LOCALES="en,ru"
LOCALE_SOURCE="en"
WEB_ROOT_PATH=$DIR/../WebUI
WEB_THEMES_PATH=$WEB_ROOT_PATH/themes
TOOL_LOCALE_PATH=$DIR/cl-templates/extractor
TOOL_MERGE_PATH=$DIR/merge-xlf
$DIR is the folder of the script that uses these settings.
The DEFINITION_GSS parameter points to the gss file that will hold the definitions.
THEMES_PATH - path to the folder with the themes (uncompiled gss and soy)
THEME_LOCALES - list of supported locales
LOCALE_SOURCE - the locale in which the texts inside soy are written
WEB_THEMES_PATH - folder where the compiled gss and soy will be stored
SOURCE_MAP_ROOT - the root path to the sources; it is then easy to proxy it through nginx where needed
SOURCE_MAP_FULLPATH - the full path to the specific uncompressed files
MODULE_PATH - path to modules
All other parameters are not so important.
Eclipse or another IDE
Install a plugin for SOY; we use the Eclipse plugin mentioned above.
ZenCoding works fine in SOY files.
Add a build event on file changes that calls ant soy_update
Localization
Set up localization first.
There are two ways. At the initial stage you can simply take the empty template empty.xlf.template, copy it and rename it to the appropriate locale, for example en.xlf; inside, you only need to change the target-language parameter to the one you need.
But when you're ready to translate the texts in soy, run create.translate.sh
What this tool does: it scans all the themes, takes all the soy files of each theme and builds an xlf file out of them; then it takes the old xlf file and transfers the translations whose desc matches into the new file. Items that could not be matched by desc are written into the lost-translations file (.lost.xlf). They must either be moved by hand to the right place, or deleted if the translations are no longer needed.
Yes, it is a crutch. If you can suggest a simpler method, we will happily simplify this step. It is needed quite rarely, so there is room for optimization.
Under Mac OS, however, this step will not work :)
Compiling GSS and SOY
The compile.templates.sh script is responsible for finding the changed gss and soy files and compiling them. You will run it very often, either by hand or automatically through the IDE. The script works in two modes: debug and release.
Debug
What does it do? It scans all the themes for files that have changed since the modification time of the .timestamp file and adds them to the list for compilation.
For each gss file, a corresponding css file is created; the names are not compressed. The same goes for soy.
Release
To run in release mode you just need to specify the RELEASE parameter when starting the script: compile.templates.sh RELEASE
In this case all gss files (regardless of whether they changed or not) are compiled into a single compact.css with all names compressed. All soy files are compiled into separate files with compressed selector names.
Constants
As already mentioned, there are cases where certain properties of an object must not be compressed, for example in client-server interaction. You can do as Google does, but I have not seen anyone else's solution that does, because only Google has the full stack for using GCL.
In the example project, I generate the map of non-compressible properties from a java file, src/com/example/utils/Constants.java.
The constantsToMap.sh script is responsible for generating it: it takes Constants.java and creates the property.in.map file from it.
Along the way it checks that each constant's name matches its value (items = "items").
property.in.map is a file with one entry per line, of the form:
<old name>:<new name>
In our case the old and the new value are the same. Out of the box, GCC handles such constants incorrectly; there is a bug in the tracker and a patch.
The test project contains a patched version of GCC. I don't know when the patch will be merged into the main branch, but the community can speed that up ;)
You can generate this file from anywhere; the java solution is given just as an example, and the data format is simple.
Extern
To interact with external code, for example jQuery or plugins that will not be compressed, you need to write externs, which are hooked up in the "Module assembly" section.
All extern files are stored in the tools/cl-externs folder.
The example contains tools/cl-externs/example.js for the project; you can learn more from the official GCC documentation.
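For a feel of the format, a heavily simplified sketch of an extern file (this is not the real jQuery externs, just the shape): declarations with type annotations and empty bodies that the compiler reads but never ships to the browser.

```js
/**
 * @param {...*} var_args
 * @return {!jQueryObject}
 */
function jQuery(var_args) {}

/** @type {function(...*):!jQueryObject} */
var $ = jQuery;

/** @constructor */
function jQueryObject() {}

/**
 * @param {string} className
 * @return {!jQueryObject}
 */
jQueryObject.prototype.addClass = function(className) {};
```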
Module assembly
The tools/gcmodule folder and the gcmodule.jar application are responsible for this; it is easiest to run it through ant soy.create.modules
Before starting, you need to compile all the gss and soy in release mode. This can be done through ant soy.compile-RELEASE
To simplify things, both actions can be done with one command:
ant check.modules
Modules can combine several files or even folders. A module may depend on other modules, and so on. It is better to put the common parts of your site into a separate module and make a separate module for each page; why it should be done this way is described below.
Modules are configured in the config.cfg file.
So, why isn't this a script? At first I wanted to write it in bash, but sorting the arrays turned out to be too painful, hence java.
The essence of the program: it takes the list of folders and files from config.cfg, generates a dependency tree using a Google tool, then sorts it into modules and feeds it to the compiler. This is needed for the normal compression mode; in advanced mode the compiler can build the dependencies itself, but for a project that did not start on GCL from day one that is unrealistic. So during the transition phase of your project you will use normal compression, and there the files must be fed in the correct order. How to switch to the normal compression mode is described below.
One thing should be noted here: in a single run you can generate only one locale for one theme; unfortunately, this is not easy to overcome, but you can run the tool with different parameters and solve the problem that way.
So, in the test project, after running the module assembly, our modules roll out into the WebUI/js/module/ru/* folder, together with a generated and processed (if run through ant) source map for each file.
The output also contains the property.out.map file; this is the file with the field renaming map, in the form:
<old name>:<new name>
config.cfg
So, the configuration file is a regular JSON object.
{
    options: < settings >,
    modules: [ < modules > ]
}
What are the settings:
{
    defines: {
        "LANG": "ru",
        "THEME": "default",
        "OPTIMIZATIONS": "ADVANCED_OPTIMIZATIONS"
    },
    deps: {
        params: " -o list",
        workPath: "../../tools/closure/bin",
        exec: "python ./calcdeps.py"
    },
    compiler: {
        params: " ",
        workPath: "../../tools/closure/",
        exec: "java -jar compiler.jar"
    },
    moduleManager: {
        path: "../../WebUI/js/module/%THEME%/%LANG%/",
        tree: "moduleinfo.js"
    },
    exclude: [
        "/.svn/",
        "/closure/base.js"
    ],
    workPath: "../../WebUI/js"
}
Module
{
    name: < module name >,
    required: [ < required modules > ],
    files: [ < files > ],
    path: [ < path; %THEME% and %LANG% may be used > ],
    exclude: [ < excluded files > ]
}
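A hypothetical modules section for a site with one shared module and one module per page; every name and path here is invented, only to show the shape:

```js
modules: [
  {
    name: "core",
    files: ["app/bootstrap.js"],
    path: ["../../WebUI/js/app/core"]
  },
  {
    name: "p_index",
    required: ["core"],
    path: ["../../WebUI/js/app/pages/index"]
  }
]
```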
You can override the defines using the command line:
java -jar gcmodule -D<key>=<value> -D<key2>=<value2> -C"<Config file>"
The configuration file can be omitted.
Example:
java -jar gcmodule -DLANG=en -DTHEME=other
Sourcemap
If you started building the modules via ant soy.create.modules, then the processing of the compressed-file maps was started automatically as well.
The script parseMap.sh is responsible for this.
It takes all the module maps it finds in the folder and replaces the paths inside them with the URL specified in the config file. Why a URL? Because it is easy to proxy through nginx, and then you can comfortably debug your application.
Nginx configuration
There are two options, depending on whether the sources are located locally or on a remote server (192.168.1.88); pick the one you need.
/etc/nginx/sites-available/sourcemap.cp.com
#-------------------------------
server {
    listen 80;
    server_name sourcemap.cp.com;
    proxy_buffering off;
    expires 0m;

    # Local map
    root /home/dis/workspace/CP/;

    # Remote map to source.cp.com
    #location / {
    #    proxy_set_header Host "source.cp.com";
    #    proxy_pass "http://192.168.1.88:80";
    #    proxy_set_header X-Real-IP $remote_addr;
    #    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    #}
}
#-------------------------------
sudo ln -s /etc/nginx/sites-available/sourcemap.cp.com /etc/nginx/sites-enabled/sourcemap.cp.com
In /etc/hosts:
127.0.0.1 sourcemap.cp.com
Restart nginx:
sudo service nginx restart
Example pages
The example contains two versions of the pages: regular (debug) and release.
The regular pages are used for debugging; the important parts here are the css links and the script tags.
<link rel="stylesheet" href="themes/default/css/0-reset.css" type="text/css" media="screen" title="no title"> <link rel="stylesheet" href="themes/default/css/1-main.css" type="text/css" media="screen" title="no title"> <script type="text/javascript" charset="utf-8"> Handlers={}; Handlers.rootPath="/"; </script> <script type="text/javascript" src="themes/default/js/renaming_map.js"></script> <script type="text/javascript" src="js/jquery/core/1.8.1/dev/jquery.js"></script> <script type="text/javascript" src="js/third_party/moment/moment.js"></script> <script type="text/javascript" src="js/closure/goog/base.js"></script> <script type="text/javascript" charset="utf-8"> goog.require('example.page.index'); </script>
The only thing to note here is that all the css files have to be listed by hand.
In the release version, everything is easier.
<link rel="stylesheet" href="themes/default/css/compact.css" type="text/css" media="screen" title="compact"> <script type="text/javascript" charset="utf-8"> Handlers={}; Handlers.rootPath="/"; </script> <script type="text/javascript" src="js/jquery/core/1.8.1/jquery.js"></script> <script type="text/javascript" src="js/third_party/moment/moment.js"></script> <script type="text/javascript" src="js/module/default/ru/closure.js"></script> <script type="text/javascript" src="js/module/default/ru/template_theme.js"></script> <script type="text/javascript" src="js/module/default/ru/game.js"></script> <script type="text/javascript" src="js/module/default/ru/p_index.js"></script>
You only need to list the modules, in the order you need.
Summary
You now have a starting point for building fast and compact applications. There is no java binding for supporting themes and locales, but it is easy enough to write one using the official GCL documentation. I hope these tools help someone speed up the adoption of GCL, because it really is a very powerful technology. Why Google will not release its toolkit as open source remains a mystery to me. The scripts are licensed under GPLv3.
P.S. Bug reports and patches on github.com are welcome.
The plugin for GCC and gcmodule will be opened up a bit later.
Official GCC documentation