
Development of client-server infrastructure in javascript (part 2 - server and deployment)

Hello, in this article I will talk about the server part and describe the process of deploying the application to the OpenShift cloud hosting.

Development of client-server infrastructure in javascript (part 1 - client)

The server is written in node.js using swagger-node-express, which gives the advantage of automatic API documentation. I want to note that the server was created mostly in support of the client side, so some checks and optimizations were intentionally omitted and left for the future.

The server consists of a set of modules; all of them are initialized in the main file.
var express = require("express"),
    swagger = require("./swagger"),
    orm = require('./orm'),
    auth = require('./auth'),
    config = require('./config'),
    static_files = require('./static');

var app = express();
app.use(express.bodyParser());

auth.init(app);
orm.init(app);
swagger(app);
static_files(app);

app.listen(config.listen, config.ipaddr);

Now the core modules. For authentication, an HTTP header is used; its name can be set in the configuration. The session is stored in memcached and is simply a mapping of api_key -> user_id. The user verification code:

var client = new memcache.Client(cfg.memcache.port, cfg.memcache.host);
...
app.use(function(req, res, next){
    client.get(
        req.headers[cfg.header.toLowerCase()],
        function(error, result){
            if(result){
                req.user = result;
            }
            next();
        }
    );
    req.memcache = client;
});
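The login side is not shown above; purely as an illustration, here is a minimal sketch of how the api_key -> user_id pair could be written to memcached when a user signs in. The helper name, key generation and lifetime are my assumptions, not code from the project.

var crypto = require('crypto');

// hypothetical helper, called from the login handler once the credentials are verified
function createSession(client, userId, callback) {
    var apiKey = crypto.randomBytes(20).toString('hex'); // value the client then sends in the auth header
    // memcache module signature: set(key, value, callback, lifetime in seconds)
    client.set(apiKey, String(userId), function(error, result) {
        callback(error, apiKey);
    }, 86400);
}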

node-orm2 is used to work with the database. Note that in package.json you should add only the driver for the database you are actually going to use.
Connecting to the database and an example of the model definitions:

app.use(orm.express(config.db, {
    define: function (db, models) {
        db.define("users", {
            id       : Number,
            email    : String,
            password : String,
            twitter  : Number,
            facebook : Number,
            google   : Number,
            linkedin : String
        }, {
            validations: {
                email: [
                    orm.enforce.unique("Email already taken!"),
                    orm.enforce.unique({ ignoreCase: true }),
                    orm.enforce.notEmptyString()
                ],
                password: orm.enforce.notEmptyString()
            },
            id: "id",
            autoFetch: false
        });

        var Conferences = db.define("conferences", {
            id          : Number,
            title       : String,
            description : String,
            datetime    : Date,
            place       : String,
            location    : String,
            site        : String,
            logo        : String,
            facebook    : String,
            twitter     : String,
            telephone   : String,
            cost        : String,
            file        : String
        }, {
            id: "id",
            autoFetch: false
        });

        var Decisions = db.define("decisions", {
            id            : Number,
            decision      : ['go', 'not go', 'favorite'],
            user          : Number,
            conference_id : Number
        }, {
            id: "id",
            cache: false,
            autoFetch: false
        });

        Decisions.hasOne('conference', Conferences, {reverse: 'decision'});
    }
}));
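The config.db value passed to orm.express above can simply be a node-orm2 connection string. A minimal sketch, assuming PostgreSQL; the credentials and database name here are placeholders, not the project's real config:

// config.js (fragment) -- hypothetical example of the connection string
exports.db = "postgres://user:password@localhost/conferences";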

Locally and on my VPS the static files are served by nginx, but on a PaaS node.js has to serve them itself, so I created a separate handler for that. It needs to serve both the documentation and the client itself.

var static_handler = express.static(__dirname + '/../static/');
app.get(/^\/static(\/.*)?$/, function(req, res, next) {
    if (req.url === '/static') {
        // express static barfs on root url w/o trailing slash
        res.writeHead(302, { 'Location' : req.url + '/' });
        res.end();
        return;
    }
    req.url = req.url.substr('/static'.length);
    return static_handler(req, res, next);
});

var main_handler = express.static(__dirname + '/../client/www/');
app.get(/^\/(.*)$/, function(req, res, next) {
    if(req.url == '/cordova.js'){
        return res.send('');
    }
    if(!req.url){
        req.url = 'index.html';
    }
    return main_handler(req, res, next);
});

The cordova.js file is generated by PhoneGap and contains the functions for talking to the hardware and other platform-specific features. The browser is simply given an empty stub, so the client knows it cannot use the OS functions.
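Purely as an illustration (this is not code from part 1), the client side might distinguish the stub from the real cordova.js roughly like this:

// hypothetical client-side check: under PhoneGap the real cordova.js defines window.cordova
// and fires 'deviceready'; in the browser the empty stub leaves it undefined
if (window.cordova) {
    document.addEventListener('deviceready', function () {
        // safe to use the device plugins here
    }, false);
} else {
    // plain browser: skip the platform-specific features
}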
Now you need to initialize the server itself.

app.use(function(req, res, next) {
    res.header('Access-Control-Allow-Origin', '*');
    res.header('Access-Control-Allow-Methods', 'GET,PUT,POST,DELETE');
    res.header('Access-Control-Allow-Headers', cfg.header + ', Content-Type');
    res.header('Access-Control-Expose-Headers', cfg.header + ', Content-Type');

    if (req.method == 'OPTIONS') {
        res.send(200);
    } else {
        next();
    }
});

swagger.setAppHandler(app);
swagger.addModels(models);
controllers.init(swagger);
swagger.configure(cfg.basePath, "0.1");

// Serve up swagger ui at /docs via static route
var docs_handler = express.static(__dirname + '/../documentation/swagger/');
app.get(/^\/docs(\/.*)?$/, function(req, res, next) {
    if (req.url === '/docs') {
        // express static barfs on root url w/o trailing slash
        res.writeHead(302, { 'Location' : req.url + '/' });
        res.end();
        return;
    }
    // take off leading /docs so that connect locates file correctly
    req.url = req.url.substr('/docs'.length);
    return docs_handler(req, res, next);
});

Cross-origin resource sharing (CORS) headers are sent so that the API can be used freely from the browser. Then swagger is initialized. Pay attention here to

swagger.addModels(models);
controllers.init(swagger);

The models passed here are, so to speak, an analogue of a ViewModel and have nothing to do with the models described in orm.js (a sketch of what such a model might look like follows the next snippet). The action handlers live in controllers.js. An example:

swagger.addGET(conferences.get);
swagger.addGET(conferences.list);
swagger.addPOST(conferences.decision);
swagger.addDELETE(conferences.reject);
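For reference, a sketch of what one of the models handed to swagger.addModels might look like, modelled on the swagger-node-express petstore sample; the file name, property list and types are my assumptions, not the project's actual definitions:

// models.js -- hypothetical sketch of a swagger "view model"
exports.models = {
    "Conference": {
        "id": "Conference",
        "properties": {
            "id":       { "type": "long" },
            "title":    { "type": "string" },
            "datetime": { "type": "string" },
            "place":    { "type": "string" },
            "site":     { "type": "string" }
        }
    }
};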

Here is how an action is described; below is an example of the handler that fetches a conference by id.

var get = {
    'spec': {
        "description" : "Get conference by id",
        "path" : "/conferences.{format}/{id}",
        "notes" : "Get conference",
        "summary" : "Get conference",
        "method": "GET",
        "responseClass" : "Conference",
        "nickname" : "conference"
    },
    'action': function (req, res) {
        if (!req.params.id) {
            throw swagger.errors.invalid('id');
        }
        var id = parseInt(req.params.id);
        req.db.models.conferences.get(id, function(err, conference){
            if(err){
                res.send(500, JSON.stringify({code: 500, header: 'Internal Server Error', message: JSON.stringify(err)}));
            }else{
                if(conference){
                    if(conference.file){
                        conference.file = '/static/' + conference.file;
                    }
                    if(req.user){
                        conference.getDecision({user: req.user}, function(err, decision){
                            if(err){
                                res.send(500, JSON.stringify(err));
                            }else{
                                conference.decision = decision.pop();
                                res.send(200, JSON.stringify(conference));
                            }
                        });
                    }else{
                        res.send(200, JSON.stringify(conference));
                    }
                }else{
                    throw swagger.errors.notFound('conference');
                }
            }
        });
    }
};

The “spec” part is mostly documentation information, but it also contains the “path”, i.e. the URL on which express will call the handler. The main part is the “action”: it checks that the parameters are present and queries the database. For those who dislike callbacks growing in breadth, I'll say right away that node-orm2 lets you build chains of both queries and handlers. If files are attached to the conference, a path is generated for them. If the user is logged in, their decision regarding the conference is also looked up. This is done to save HTTP requests, though I'll honestly say I don't know which is better and more correct from a REST point of view: attaching the has-one model to the parent, or returning just the id and letting the client send another request if it needs it.
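To illustrate the chained style node-orm2 offers, here is a minimal sketch; the filter and ordering values are made up, not taken from the project:

// hypothetical example of a chained node-orm2 query instead of nested callbacks
req.db.models.conferences
    .find({ place: "Moscow" })   // filter value is just for illustration
    .order("-datetime")          // newest first
    .limit(10)
    .run(function (err, conferences) {
        if (err) {
            return res.send(500, JSON.stringify(err));
        }
        res.send(200, JSON.stringify(conferences));
    });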

With that, I think, the description of working with swagger-node-express is finished. As soon as the project started to take on a finished look, I wondered where to host it. At first it lived on my VPS, but I was in there all the time anyway, so it was decided to put it on a PaaS, since clouds are fashionable and more fun. By the way, many cloud hosts offer free accounts, many of them without time limits, so you can freely host a dozen or so projects.

Although I describe OpenShift in this article, I am not affiliated with them in any way, and the procedure for hosting on other providers will be very similar (it all comes down to git push). I chose this particular host completely by accident.

So, first you need to register if you haven't done so yet, then create a project; their documentation describes this in detail. I particularly want to draw your attention to a few things. First of all, if you are thinking of hosting a project in the cloud, it is better to pick the exact provider right away and tailor the project to that particular hosting. That way you avoid reworking the project right before deployment: for example, I had to rename the main file to server.js, add support for serving the client application's static files, and use an external memcached server.
I moved client/www to server/www and added a directory with the static files attached to conferences, plus swagger, in the same place. Here's what the new static.js looks like.

var static_handler = express.static(__dirname + '/www/static/');
app.get(/^\/static(\/.*)?$/, function(req, res, next) {
    if (req.url === '/static') {
        // express static barfs on root url w/o trailing slash
        res.writeHead(302, { 'Location' : req.url + '/' });
        res.end();
        return;
    }
    // take off leading /static so that connect locates the file correctly
    req.url = req.url.substr('/static'.length);
    return static_handler(req, res, next);
});

var main_handler = express.static(__dirname + '/www/');
app.get(/^\/(.*)$/, function(req, res, next) {
    if(req.url == '/cordova.js'){
        return res.send('');
    }
    if(!req.url){
        req.url = 'index.html';
    }
    return main_handler(req, res, next);
});

Secondly, find out how SQL / NoSQL databases are handled: with some providers they come separately, while with others, OpenShift for example, they occupy a whole slot, and there are few of those. I moved memcached out to another service, garantiadata.com, where I also created a trial account. That left me two free slots; by the way, yesterday a second node instance was launched as well, which pleased me, since I was not sure the autoscaling would work.
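Since memcached now lives on an external service, its address ends up in the config. A hypothetical sketch; the environment variable names are my assumptions, substitute whatever your provider gives you:

// config.js (fragment) -- hypothetical: address of the external memcached
exports.memcache = {
    host: process.env.MEMCACHE_HOST || '127.0.0.1',
    port: process.env.MEMCACHE_PORT || 11211
};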
And now, the deployment itself.

git add --all
git commit -m 'commit'
git push origin master

Do not forget to add the origin remote, which can be obtained on the application page in OpenShift. Also point your application at the correct host/port to listen on.

exports.listen = process.env.OPENSHIFT_NODEJS_PORT || 8080;
exports.ipaddr = process.env.OPENSHIFT_NODEJS_IP || "127.0.0.1";

The state of the application can be monitored: git push gives you a build log, and you can also log in over ssh and look at tail_all.
Over ssh you can also connect to the database, after first looking up the connection string:

 echo $OPENSHIFT_POSTGRESQL_DB_URL 
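That same variable can also feed the node-orm2 connection string in the application config, as in the config.db sketch earlier. A hypothetical fragment; the fallback value is a placeholder, and depending on the exact URL format you may need to adjust the scheme or append the database name:

// config.js (fragment) -- hypothetical
exports.db = process.env.OPENSHIFT_POSTGRESQL_DB_URL || "postgres://user:password@127.0.0.1/conferences";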

That's all, the project is in the cloud, everyone is happy. As always, I welcome criticism and suggestions.

Source: https://habr.com/ru/post/200356/

