In the last article, we talked about what Cloud Commander is: the main reasons it came into being, its disadvantages and merits, the development process, and the philosophy of the file manager. Today we will talk about what has changed since then, and how these changes affect the further development of the application. The article covers a lot of node.js modules, some of which the reader may be hearing about for the first time.
Changes
There are many changes: some are external and immediately catch the eye, others are barely noticeable. Support for disks in Windows finally arrived, as well as the ability to choose between editors: Dword and Edward. But this is not the main thing.
The main change is a change in the course of development itself. Previously, Cloud Commander was developed as a solid monolithic application that did not depend on third-party modules: it could use them if they were available, but it could also do without them, losing some functionality while still working stably.
This approach has several drawbacks:
- After one part of the application gained new functionality or bug fixes, none of it became available until the entire application was updated, and in the meantime a new portion of bugs appeared, and so on without end. Since the components depend on each other slightly, a user expecting the new release to fix the menu in the editor could find that the menu now works fine, but the console is broken. And that would only be fixed in the next version, a few months later, with something else broken along the way.
- A monolithic application is difficult to document, not only because it is intertwined with itself and something in it changes constantly, but also because it is large enough that describing the functionality of such a giant means either freezing development or maintaining outdated documentation. In any case, documenting everything in a row may not pay off; sometimes it is better to simply refactor the code so that it is understandable on its own.
- Monolithic code has a high barrier to entry. There is a big difference between looking at a function of 10 lines and one of 10 pages, and the point is not only the experience of the developer who wants to make changes to the program: short code is psychologically much easier to grasp, understand, and keep in mind. It is easier to find an error, whether logical or a typo. And it is much easier to make changes, not only for the main developers, but also for newcomers.
The previous article said that Cloud Commander uses modules. And this is true: from the first days of development, the application was written in such a way that the code could be reused. Functionality that was reused was moved into separate functions and files. But it was all implemented within the framework of a single project, using features unique to it. Accordingly, the code was not fully portable; it could not simply be taken and used in other applications. It was tied to itself.
With this realization, it would have been very difficult to continue developing in the same manner. It was decided that functionality that can be reused should be moved into separate npm modules, where it can be reused by other people, who may want to improve the code or find an error in it. In the end, everyone benefits.
Composition
It should be said right away that version 2.0 does not mean a loss of backward compatibility, but a shift in the project's philosophy. Everything that could be extracted gained its own repository, a place on npm, and the ability to be used in other applications.
There were quite a few such modules, and we'll talk about them.
Minify
minify is a hodgepodge of modules that process js, css, and html, and convert images to base64. The result is also written to the tmp folder so that it can be obtained faster on the next call. This is done using the tomas module.
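In the simplest case, usage comes down to a file path and a callback. Here is a minimal sketch, assuming exactly that form of the minify API:

var minify = require('minify');

// process a file; the result is cached in the tmp folder
minify('client.js', function(error, data) {
    if (error)
        console.error(error.message);
    else
        console.log(data);
});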
Join
There are different ways to speed up the loading of a web page: merging files, minification, compilation with browserify. join works differently, but its purpose is the same. It glues files together on the fly into a single http request, minifying them with Minify if necessary. Connecting it is quite simple, since it is middleware compatible with express. And this code is quite enough:
app.use(join({ dir: __dirname }));
Joining, in turn, is requested as follows:
<link rel="stylesheet" href="/join:/css/normalize.css:/css/style.css">
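Putting both sides together, a minimal express server with join might look like this (a sketch, assuming the package is published on npm as join-io):

var http = require('http');
var express = require('express');
var join = require('join-io');

var app = express();

// requests starting with /join: are glued into a single response
app.use(join({
    dir: __dirname
}));

// everything else is served as static files
app.use(express.static(__dirname));

http.createServer(app).listen(1337);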
Restafary
Restafary is a REST implementation of CRUD for the file system. It is designed as middleware for express and can be used wherever files need to be handled over the network.
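Connecting it might look like this (a sketch; the prefix and root options are my assumptions, not a documented contract):

var express = require('express');
var restafary = require('restafary');

var app = express();

// file system CRUD over REST: GET reads, PUT writes,
// POST creates, DELETE removes (paths are illustrative)
app.use(restafary({
    prefix: '/api/v1/fs',
    root: '/'
}));

app.listen(1337);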
Console
Console is another express-compatible middleware. It connects in a couple of lines, on the server and on the client. Here is the server part:
Console({
    server: server,
    socket: socket,
    online: true,
    minify: true,
    prefix: '/console'
});
And on the client:
<div class="console"></div> <script src="/console/console.js"></script> <script> Console('.console', function() { console.log('console ready') }); </script>
As a result, we get a console that can receive commands on the client and execute them on the server.
Spawnify
While we are on the subject of executing commands, spawnify deserves a mention. It is a wrapper over exec and spawn which, among other things, emits a cd event, letting you know that the current folder has changed and react to it in time.
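A usage sketch (the cd event is the one mentioned above; the data, error, and exit event names are assumptions):

var spawnify = require('spawnify');

var spawn = spawnify('cd /tmp && ls -la');

spawn.on('data', function(data) {
    process.stdout.write(data);
});

// fires when the executed command changes the current directory
spawn.on('cd', function(dir) {
    console.log('directory is now:', dir);
});

spawn.on('error', function(error) {
    console.error(error.message);
});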
Edward

The Edward editor, based on ace, was also extracted into a separate module that can be used not only by Cloud Commander but by any other application, since it is also an express middleware. It comes with hotkeys out of the box, as well as minify and beautify. To get started, this code on the server is enough:
app.use(edward({
    minify: true,
    online: false,
    diff: true,
    zip: true
}));
And this on the client:
<div class="edit" data-name="js-edit"></div> <script src="/edward/edward.js"></script>
html, body, .edit { height: 100%; margin: 0; }
edward('[data-name="js-edit"]', function(el) { console.log('edward is ready'); });
Dword
The Dword editor is fully compatible with Edward in terms of code. The main difference is that Dword is based on CodeMirror. Now users of Cloud Commander can choose an editor to their liking. Both Ace and CodeMirror are quite mature projects, though both have flaws:
- with Ace, it is not possible to edit code comfortably on mobile, since there is no way to scroll the code.
- CodeMirror does not use Web Workers, so with JSHint enabled the editor works slower than Ace.
While adapting CodeMirror to become a full-fledged replacement for Ace, several plugins were written:
- CodeMirror Searchbox - a search-and-replace tool similar to the one in Ace.
- CodeMirror Show Invisibles - adds a mode for displaying invisible characters similar to the one in Ace and Chrome Developer Tools, but unlike the latter, it can also show line endings.
By the way, this article was partially written in the Dword editor, which is quite suitable for working with text of any kind.
Mollify
The last middleware for today is mollify. It minifies js, css, and html on the fly, using the Minify module described above.
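Connecting it follows the same middleware pattern as the modules above (a sketch; the dir option is an assumption):

var express = require('express');
var mollify = require('mollify');

var app = express();

// js, css and html responses are minified on the fly
app.use(mollify({
    dir: __dirname
}));

app.use(express.static(__dirname));
app.listen(1337);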
Rendy
When you need a really simple template engine, Rendy can come in handy. It does one thing and does it as simply as possible. It works in all environments that support ES5. You can use it this way:
var rendy = require('rendy');
var tmpl = 'hello {{ where }}';
var result = rendy(tmpl, { where: 'in browser' });
// result === 'hello in browser'
Ponse
In the comments to the previous article, I was asked about the web server being used. So here it is: Ponse, a simple web server. It is no match for express, but it copes well with some things.
Pipe-io
When talking about data streams, there are a few nuances to take into account. One of them: when using pipe, an error handler must be attached to each stream, otherwise an error will crash the process. Pipe-io helps solve this problem by reducing the syntax to a minimum.
It was like this:
var fs = require('fs'),
    read = fs.createReadStream('README.md'),
    write = fs.createWriteStream('README2.md'),
    open = function(msg) {
        read.pipe(write);
    },
    finish = function() {
        console.log('done');
        done();
    },
    error = function(e) {
        e && console.error(e.message);
        done();
    },
    done = function() {
        read.removeListener('error', error);
        write.removeListener('error', error);
        write.removeListener('open', open);
        write.removeListener('finish', finish);
    };

read.on('error', error);
write.on('error', error);
write.on('open', open);
write.on('finish', finish);
It became so:
var fs = require('fs'),
    pipe = require('pipe-io'),
    read = fs.createReadStream('README.md'),
    write = fs.createWriteStream('README2.md'),
    error = function(e) {
        e && console.error(e.message);
        return e;
    };

pipe([read, write], function(e) {
    error(e) || console.log('done');
});
There can be any number of streams in the array; an error handler is attached to each of them. If something goes wrong, the error arrives in the callback. When the pipe function finishes, all handlers are removed from the streams.
Win32
You can get a list of disks in Windows using the dedicated win32 module. Systems from XP to 8 are supported.
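A sketch of how that might look (the getVolumes function name is my assumption):

var win = require('win32');

// ask Windows for the list of logical drives
win.getVolumes(function(error, volumes) {
    if (error)
        console.error(error.message);
    else
        console.log(volumes); // e.g. ['C:', 'D:']
});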
Flop
You can delete, copy, read, and move folders using the flop module; this is its main purpose.
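For example (a sketch; the exact signatures are assumptions based on the operations listed above):

var flop = require('flop');

// read the contents of a directory
flop.read('/tmp', function(error, dir) {
    console.log(error || dir);
});

// copy a directory recursively
flop.copy('/from/dir', '/to/dir', function(error) {
    console.log(error || 'copied');
});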
Copymitter
When you need to monitor copying, for example to display progress, the Copymitter module will do. It emits an event for each copied file and whenever the progress changes, from 1% to 100%. In case of an error, the process can be aborted or continued.
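A usage sketch (the event names and the abort/continue methods follow the description above, but their exact spelling is an assumption):

var copymitter = require('copymitter');

var cp = copymitter('/from', '/to', ['file.txt', 'dir']);

cp.on('file', function(name) {
    console.log('copied:', name);
});

cp.on('progress', function(percent) {
    console.log(percent + '%');
});

cp.on('error', function(error) {
    console.error(error.message);
    cp.abort(); // or cp.continue() to move on to the next file
});

cp.on('end', function() {
    console.log('done');
});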
Development
The development process has not changed much; the main difference is that development now happens not only in one global repository, but also in the repository of the module where a bug is found or a new feature is needed. Two tools were added to the main repository's arsenal of code verification and documentation: jscs and yaspeller.
While generating .jscsrc with --auto-configure, it turned out that my code, with a few exceptions, best satisfies the Douglas Crockford conventions, which I was, of course, familiar with before. But for some reason it had seemed to me that he and I had far more differences.
I would like to say a few words about the last module for today: wisdom. It consists of several small modules, each of which simplifies and automates the routine work of a JavaScript programmer, such as:
- ChangeLog generation
- version update in package.json and bower.json (if it exists)
- tag adding
- publishing to npm
- release creation on GitHub
All these tasks could easily be implemented with the help of Gulp. But copying gulpfile.js into yet another new project, I caught myself thinking that I was doing this too often, and that dragging it along all the time was not very convenient.
Wisdom does not require a configuration file. It simply does everything that needs to be done before a module is published. It is not a replacement for Gulp or Grunt, just a look from a different angle. This tool automated my workflow, simplified it, and made it much more convenient.
Installation
Unfortunately, I have to admit that the installation process is a bit complicated. Since the modules can be used not only on the backend but also on the frontend, and it makes no sense to duplicate the code, it has to be installed in different ways. Code installed via npm ends up in the node_modules folder, but no one guarantees how high up, relative to the current directory, it will be located.
This is fine for applications written in node.js or using browserify. For the rest, bower works. It takes the files marked with the desired tag from the repository and places them in the modules folder (according to .bowerrc). Thus, an updated module becomes available to both the server and the client. That is the positive side. But there is a negative one too: to install Cloud Commander properly, Git and Bower must be installed. After that, just run this command in the terminal:
npm install cloudcmd -g
In addition, the reader can always take a prebuilt archive with all dependencies (except Node) from the releases page, or take the latest version from the repository and run:
npm install -g
Afterword
Even though an application broken into modules is much easier to maintain, from time to time it happens that one application uses different versions of the same module, and an error fixed in one version continues to manifest itself in another. This happens rarely, and mostly in the dev version of the project, where some node modules are symlinks to the corresponding repositories. Otherwise, it is much easier to work, reuse, and develop with smaller pieces.
I often have to explain the benefits of modular development to developers, and sometimes it comes across with great difficulty. Apparently, some things become clear only when a person arrives at them on his own. In any case, however the reader arrives at it, I really hope that the tendency toward small applications will prevail over frameworks that do a lot of work at once. And the latest observations in this direction fully justify my expectations, which is good news.
If you find any typos, please report them in a private message or in the repository.