
SockJS server performance study

Hello, everyone!

It so happens that I build all sorts of push technologies on top of Tornado. A while ago I described TornadIO2, a server-side implementation of the socket.io protocol on top of Tornado.

Now I want to present a similar project: sockjs-tornado.
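To give a quick feel for the server side, here is a minimal echo server sketch following the usage shown in the sockjs-tornado README; the class name, the /echo prefix, and the port are just illustrative:

```python
# Minimal sockjs-tornado echo server (illustrative sketch).
import tornado.ioloop
import tornado.web
from sockjs.tornado import SockJSConnection, SockJSRouter

class EchoConnection(SockJSConnection):
    def on_message(self, message):
        # Send whatever the client sent straight back to it.
        self.send(message)

if __name__ == '__main__':
    router = SockJSRouter(EchoConnection, '/echo')
    app = tornado.web.Application(router.urls)
    app.listen(8080)
    tornado.ioloop.IOLoop.instance().start()
```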
For those less interested in the project itself, there is other useful information: comparative load testing of PyPy 1.7 versus CPython 2.6.6, sockjs-node, and socket.io (both on node.js 0.6.5). Everything is under the cut :-)

First, what is SockJS? It is a client library written in JavaScript that mimics the WebSocket API but supports all browsers by falling back to various substitute transports such as AJAX long-polling, JSONP polling, and the like. On the whole it is very similar to socket.io, but with some key differences.

A small digression: I have nothing to do with the client part of the library, other than filing bug reports and pestering its developers in every possible way.

So why would you need it when socket.io already exists?
Here are a couple of reasons that led to the development of SockJS:
1. The socket.io developers, starting with version 0.7, went off in the wrong direction. Instead of fixing bugs and adding support for more browsers, they decided to make the API more high-level. I do not dispute that the new features are very convenient, but the number of bugs has not gone down. For example, a fairly serious race condition has remained open for more than 3 months, and reconnecting after a disconnect still does not work. And so on.
2. Sometimes you do not want to tie yourself to a specific library. If you use socket.io, it will be difficult to drop it later, since you will have to rework every place that is bound to the socket.io API.

So what does SockJS offer?
1. As noted earlier, it is a drop-in replacement for the browser WebSocket API. Accordingly, porting an existing application that uses WebSockets is fairly painless (the server side notwithstanding).
2. SockJS even works in Opera, which socket.io does not handle well. In addition, SockJS behaves correctly with various antivirus products by falling back to another transport, whereas socket.io cannot establish a connection at all.
3. SockJS supports streaming transports: a single persistent connection from the server for outgoing data. socket.io dropped streaming transports as of version 0.7.
4. The protocol is much simpler and very well documented. For developers there is even a test suite that a server implementation must pass.
5. Scalability is built into the protocol. For example, the load balancer does not have to inspect cookies to implement sticky sessions: all the necessary information is already in the URL.
6. The library is very well tested under different conditions; there are even QUnit tests that exercise both the client and the server directly from the browser. For example, here is the test page for sockjs-node: http://sockjs.popcnt.org/

In general, this thing just works.

Now, on to the second part of the article: performance.

After writing sockjs-tornado, I decided to check how it compares with the "native" server written in node.js. node.js is very fashionable these days, and its performance in various push technologies comes up often. I will say up front: the test results surprised me greatly.

The testing methodology is very simple: a chat server with a single room. Each incoming message is simply broadcast by the server to all chat participants. If you are interested, here is the server code.
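Roughly, such a one-room broadcast server looks like this with sockjs-tornado (a simplified sketch, not the exact benchmark code; the class name and the /chat prefix are illustrative):

```python
# One-room chat: every incoming message is broadcast to all participants.
from sockjs.tornado import SockJSConnection, SockJSRouter

class ChatConnection(SockJSConnection):
    participants = set()

    def on_open(self, info):
        self.participants.add(self)

    def on_message(self, message):
        # Fan the message out to everyone currently connected.
        self.broadcast(self.participants, message)

    def on_close(self):
        self.participants.remove(self)

ChatRouter = SockJSRouter(ChatConnection, '/chat')
```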

There is a WebSocket client that sends a ping and waits for its "own" message to come back. After receiving the response, it records the time elapsed between sending and receiving. Results for different levels of concurrency and different message volumes are collected, and a graph is plotted.
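The measurement loop boils down to something like the following sketch. It assumes the third-party websocket-client package and SockJS's raw WebSocket endpoint; the URL and message counts are illustrative, and the real benchmark client is more involved:

```python
# Round-trip timing sketch: tag each ping so we can recognise our own
# message among the broadcasts coming from other clients.
import time
import uuid
import websocket  # pip install websocket-client

def measure_round_trip(url, count=100):
    ws = websocket.create_connection(url)
    timings = []
    for _ in range(count):
        payload = uuid.uuid4().hex
        started = time.time()
        ws.send(payload)
        while ws.recv() != payload:   # skip other clients' messages
            pass
        timings.append(time.time() - started)
    ws.close()
    return sum(timings) / len(timings)

print(measure_round_trip('ws://localhost:8080/chat/websocket'))
```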

Some may ask: what exactly is being measured here? This is what:
- The speed of each server's WebSocket protocol implementation
- The maximum message rate at which the server starts to choke
- The overhead of supporting a large number of connections
- Response time at different load levels

Another question: why a "dumb" chat with no real logic? If you have come across a project like the Humble Indie Bundle, the amount of money raised is shown on their page in real time. They use a "broker" that holds a large number of web clients, plus a data source (producer) that periodically tells the broker how much money has been raised. The broker must push this information out to all of its connected clients. The faster the broker works, the more clients it can serve in the shortest time.
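As a hypothetical sketch of that broker pattern on top of sockjs-tornado: browsers subscribe over SockJS, and the producer posts updates to a plain HTTP endpoint that fans them out. All names, URLs, and parameters here are made up for illustration:

```python
# Hypothetical broker: clients connect over SockJS at /ticker, a producer
# POSTs updates to /update, and the broker pushes them to every client.
import tornado.web
from sockjs.tornado import SockJSConnection, SockJSRouter

class TickerConnection(SockJSConnection):
    clients = set()

    def on_open(self, info):
        self.clients.add(self)

    def on_close(self):
        self.clients.discard(self)

class ProducerHandler(tornado.web.RequestHandler):
    def post(self):
        total = self.get_argument('total')
        for client in TickerConnection.clients:
            client.send(total)

router = SockJSRouter(TickerConnection, '/ticker')
app = tornado.web.Application(router.urls + [('/update', ProducerHandler)])
```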

The study was written up in English, since the SockJS developers asked me to do comparative testing against sockjs-node; it can be viewed right here. If anyone is interested, I can translate it into Russian.

In short, the picture is as follows:
- sockjs-node can push up to 45,000 messages per second with an average response time of 200 ms.
- sockjs-tornado on CPython 2.6.6 can push up to 55,000 messages per second with a 200 ms response time.
- sockjs-tornado on PyPy 1.7 simply blows them away with 150,000+ messages per second.

Of course, the servers can push even more messages per second, but the response time grows and the application stops being real-time :-)

A comparative chart can be found here. The X axis is the number of messages sent by the server per second; the Y axis is the response time. Each line is a combination of server (node = sockjs-node, socket.io = socket.io on node.js, cpython = sockjs-tornado on CPython, pypy = sockjs-tornado on PyPy 1.7) and number of simultaneous connections. socket.io is included as an example of another node.js project's performance.

Even setting aside the node.js versus CPython comparison, PyPy's performance was a complete surprise to me.

Well, in conclusion.

I recommend taking a look at SockJS if you are planning any real-time functionality, even if you have already settled on socket.io. And I hope sockjs-tornado will be useful to someone as well.

Source: https://habr.com/ru/post/134822/

