
Fast pool for php + websocket without nodejs, based on lua + nginx

nginx + lua

Briefly: nginx by itself does not serve websockets, and php works strictly per request. We need an interlayer that keeps the websockets open and, when data arrives, connects to php (via the same fastcgi) and sends the answer back.

Update: this is not a solution written in php itself; pure php implementations are much slower, even compared with nodejs.
The topic, as it turns out, is not new: the source code I found dates back to 2014. Nevertheless, there is very little information about the trick discussed here; you can google "websockets php" yourself. The situation is further aggravated by the fact that the implementation examples I did find (two, to be precise) do not work, including the one in the documentation :)

Somewhere deep inside I felt this had to exist. For a long time I wanted to have this middleware inside nginx, so as not to use the rather slow php libraries (links one and two) and to skip single-threaded nodejs. And I want a lot of websockets (as many as possible), with as little extra cost for the interlayer as possible. So, instead of spawning a bunch of machines with nodejs (which is what people usually end up doing under high load), we will use the fact that nginx can be extended with lua + resty. Nginx + lua can be installed from the nginx-extras package, or you can build it yourself. From Resty we only need the websockets library. Download it and put the contents of its lib directory somewhere on the path (I have it in /home/username/lib/lua/lib, though properly it should go to /usr/local/share/lua).
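A quick way to check that the library is actually reachable from the path you configured in lua_package_path is a throwaway handler like the one below. This is only a sanity check, not part of the article's setup; the file name check.lua and the idea of hooking it to a temporary location with content_by_lua_file are my own choices:

-- check.lua: verify that lua-resty-websocket is on the configured package path
-- (hook it temporarily to any location via content_by_lua_file and open that URL)
local ok, mod = pcall(require, "resty.websocket.server")
if ok then
    ngx.say("resty.websocket.server loaded, version ", tostring(mod._VERSION))
else
    ngx.say("failed to load resty.websocket.server: ", tostring(mod))
end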

Standard nginx + websockets works like this:

  1. The client connects to nginx.
  2. Nginx proxies the connection upstream, i.e. opens a proxied stream to another server (a middle server based on nodejs + socket.io, for example) that serves the websockets.
  3. The middle server hands the socket connection to some epoll event listener and waits for data.
  4. When data arrives, the middle server in turn opens a FastCGI connection to php, waits for and picks up the response, sends it into the socket, and returns the socket to waiting for data.
  5. And so on in a loop, until a special websocket close frame arrives.

Everything is simple, except for the resource overhead and the single-threaded nature of this solution.

In the proposed scheme, the middle server turns into middleware inside nginx itself. In addition, there is no blocking wait for FastCGI: all the waiting is done by the same epoll, to which nginx entrusts the open socket, and in the meantime the nginx worker can do other things. The scheme allows you to work with a large number of websockets simultaneously, spread across the worker processes.
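To make the idea concrete before the full configuration below, here is a minimal sketch of such a handler: it accepts the websocket upgrade and simply echoes every text frame back. It deliberately skips ping/pong/close handling and the php call; the complete version follows later.

-- minimal echo skeleton (no ping/pong/close handling, no php call yet)
local server = require "resty.websocket.server"

local wb, err = server:new{ max_payload_len = 65535 }
if not wb then
    ngx.log(ngx.ERR, "failed to create websocket: ", err)
    return ngx.exit(444)
end

while true do
    local data, typ, err = wb:recv_frame()  -- yields to the event loop, does not block the worker
    if not data or typ == "close" then
        -- error, timeout or close frame: just stop (the full handler below does better)
        ngx.log(ngx.INFO, "stopping: ", err or typ)
        break
    elseif typ == "text" then
        wb:send_text(data)                  -- echo the frame back to the client
    end
end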

Here I will give only the simplified code that relates to the task, without the rest of the hosting settings. I did not try to make all the headers correct, since that is not essential here.

lua_package_path "/home/username/lib/lua/lib/?.lua;;";

server {
    # the location the client's websocket connects to; handled by lua inside nginx
    location ~ ^/ws/?(.*)$ {
        default_type 'text/plain';
        # the lua script below keeps the websocket open
        content_by_lua_file /home/username/www/wsexample.local/ws.lua;
    }

    # internal location that hands the data over to php;
    # it only receives POST requests with the json payload in the body
    location ~ ^/lua_fastcgi_connection(/?.*)$ {
        internal; # reachable only from inside nginx
        fastcgi_pass_request_body on;
        fastcgi_pass_request_headers off;
        # never, never use it for the lua handler
        #include snippets/fastcgi-php.conf;
        fastcgi_param QUERY_STRING $query_string;
        fastcgi_param REQUEST_METHOD "POST"; # $request_method;
        fastcgi_param CONTENT_TYPE "application/x-www-form-urlencoded"; # $content_type;
        fastcgi_param CONTENT_LENGTH $content_length;
        fastcgi_param DOCUMENT_URI "$1"; # instead of $document_uri
        fastcgi_param DOCUMENT_ROOT $document_root;
        fastcgi_param SERVER_PROTOCOL $server_protocol;
        fastcgi_param REQUEST_SCHEME $scheme;
        fastcgi_param HTTPS $https if_not_empty;
        fastcgi_param GATEWAY_INTERFACE CGI/1.1;
        fastcgi_param SERVER_SOFTWARE nginx/$nginx_version;
        fastcgi_param REMOTE_ADDR $remote_addr;
        fastcgi_param REMOTE_PORT $remote_port;
        fastcgi_param SERVER_ADDR $server_addr;
        fastcgi_param SERVER_PORT $server_port;
        fastcgi_param SERVER_NAME $server_name;
        fastcgi_param SCRIPT_FILENAME "$document_root/mywebsockethandler.php";
        fastcgi_param SCRIPT_NAME "/mywebsockethandler.php";
        fastcgi_param REQUEST_URI "$1";
        # requests from lua land here and go straight to php-fpm
        fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
        fastcgi_keep_conn on;
    }
}
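Because the location is marked internal, you cannot curl it directly. For debugging it can help to expose a plain HTTP wrapper that performs the same subrequest the websocket handler does. A minimal sketch; the ws_debug.lua file name, the /lua_fastcgi_connection/debug suffix and the fallback payload are all made up for the example, and you would wire it to some non-websocket location with content_by_lua_file:

-- ws_debug.lua: hypothetical helper, not part of the article's setup
-- e.g.  location /ws_debug { content_by_lua_file /home/username/www/wsexample.local/ws_debug.lua; }
ngx.req.read_body()                                  -- make the request body available
local body = ngx.req.get_body_data() or "ping=1"     -- fall back to a dummy payload for simple tests

local res = ngx.location.capture("/lua_fastcgi_connection/debug", {
    method = ngx.HTTP_POST,
    body   = body,
})

ngx.status = res.status
ngx.say(res.body)

With that in place, something like curl -d 'hello=1' http://try6.local/ws_debug lets you see exactly what php returns, without involving a websocket at all.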

And the ws.lua code:

local server = require "resty.websocket.server"

local wb, err = server:new{
    -- timeout = 5000,          -- in milliseconds
    max_payload_len = 65535,    -- maximum size of a single frame
}
if not wb then
    ngx.log(ngx.ERR, "failed to new websocket: ", err)
    return ngx.exit(444)
end

while true do
    local data, typ, err = wb:recv_frame()
    if wb.fatal then
        return
    elseif not data then
        ngx.log(ngx.DEBUG, "Sending Websocket ping")
        wb:send_ping()
    elseif typ == "close" then
        -- send a close frame back:
        local bytes, err = wb:send_close(1000, "enough, enough!")
        if not bytes then
            ngx.log(ngx.ERR, "failed to send the close frame: ", err)
            return
        end
        local code = err
        ngx.log(ngx.INFO, "closing with status code ", code, " and message ", data)
        break
    elseif typ == "ping" then
        -- send a pong frame back:
        local bytes, err = wb:send_pong(data)
        if not bytes then
            ngx.log(ngx.ERR, "failed to send frame: ", err)
            return
        end
    elseif typ == "pong" then
        -- just discard the incoming pong frame
    elseif data then
        -- forward the frame to the internal location: the json payload goes in the POST body
        local res = ngx.location.capture(
            "/lua_fastcgi_connection" .. ngx.var.request_uri,
            { method = ngx.HTTP_POST, body = data }
        )
        if wb == nil then
            ngx.log(ngx.ERR, "WebSocket instance is NIL")
            return ngx.exit(444)
        end
        wb:send_text(res.body)
    else
        ngx.log(ngx.INFO, "received a frame of type ", typ, " and payload ", data)
    end
end
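One practical detail: the ping branch above only fires when recv_frame() returns without data, which in practice means a read timeout; if no timeout is set explicitly, the generic cosocket read timeout applies (typically 60 seconds). If you want more frequent keepalive pings, set the timeout yourself. A sketch; the 30-second interval is an arbitrary choice of mine:

-- keepalive sketch: ping the client after 30 seconds of silence
wb:set_timeout(30000)            -- milliseconds; applies to the following recv_frame()/send_*() calls

while true do
    local data, typ, err = wb:recv_frame()
    if wb.fatal then
        return                   -- real socket error, give up
    elseif not data then
        -- no frame within 30 s (err contains "timeout"): send a keepalive ping
        wb:send_ping()
    else
        -- ... handle close/ping/pong/text exactly as in ws.lua above ...
    end
end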

What else can be done with this? Measure the speed and compare with nodejs :) And inside lua you can make requests to Redis, MySQL, Postgres..., check cookies and other authorization tokens, handle sessions, cache answers in memcached and then serve them instantly to other clients making the same requests over the websocket.
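To illustrate the caching idea, here is a hedged sketch that memoizes php answers in nginx's shared memory rather than memcached, just to keep the example self-contained. It assumes a lua_shared_dict ws_cache 10m; directive in the http block; the dictionary name, the key scheme and the 5-second TTL are all made up for the example:

-- cache php responses for identical payloads in a shared dict
-- requires:  lua_shared_dict ws_cache 10m;  in the http {} block (the name is an assumption)
local cache = ngx.shared.ws_cache

local function handle_payload(data)
    local key = ngx.md5(ngx.var.request_uri .. "|" .. data)

    local cached = cache:get(key)
    if cached then
        return cached                    -- served from memory, php is not touched
    end

    local res = ngx.location.capture(
        "/lua_fastcgi_connection" .. ngx.var.request_uri,
        { method = ngx.HTTP_POST, body = data }
    )
    if res.status == 200 then
        cache:set(key, res.body, 5)      -- keep the answer for 5 seconds
    end
    return res.body
end

-- inside the frame loop of ws.lua:  wb:send_text(handle_payload(data))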

Flaws known to me: the maximum size of a data packet on the websocket is 64 KB (the 65535-byte max_payload_len above). If desired, you can add splitting into frames; the protocol is not complicated.
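For the outgoing direction, a hedged sketch of such splitting, using the send_frame(fin, opcode, payload) call from the same lua-resty-websocket library (opcode 0x1 is a text frame, 0x0 a continuation frame); the chunk size and the function name are my own choices:

-- split an outgoing text message into fragments of at most chunk_size bytes
-- (incoming reassembly would be the mirror image: collect fragments until fin is true)
local function send_fragmented(wb, msg, chunk_size)
    chunk_size = chunk_size or 65535
    if #msg <= chunk_size then
        return wb:send_text(msg)
    end
    local pos, opcode = 1, 0x1                    -- the first fragment is a text frame
    while pos <= #msg do
        local chunk = msg:sub(pos, pos + chunk_size - 1)
        pos = pos + chunk_size
        local fin = pos > #msg                    -- only the last fragment carries fin = true
        local bytes, err = wb:send_frame(fin, opcode, chunk)
        if not bytes then
            return nil, err
        end
        opcode = 0x0                              -- the rest are continuation frames
    end
    return true
end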

Test html (ws.html):

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<script type="text/javascript">
"use strict";
let socket;

function tryWebSocket() {
    socket = new WebSocket("ws://try6.local/ws/");

    socket.onopen = function() {
        console.log("Connection established.");
    };

    socket.onclose = function(event) {
        if (event.wasClean) {
            console.log('Connection closed cleanly');
        } else {
            console.log('Connection lost'); // e.g. the server process was killed
        }
        console.log('Code: ' + event.code + ' reason: ' + event.reason);
    };

    socket.onmessage = function(event) {
        console.log("Data received: " + event.data);
    };

    socket.onerror = function(error) {
        console.log("Error: " + error.message);
    };
}

function tryWSSend(event) {
    let msg = document.getElementById('msg');
    socket.send(msg.value);
    event.stopPropagation();
    event.preventDefault();
    return false;
}

function closeWebSocket(event) {
    socket.close();
}
</script>
</head>
<body onLoad="tryWebSocket(event);return false;">
<form onsubmit="tryWSSend(event); return false;">
    <button onclick="tryWebSocket(event); return false;">Try WebSocket</button>
    <fieldset>
        Message: <input value="Test message 4444" type="text" size="10" id="msg"/><input type="submit"/>
    </fieldset>
    <fieldset>
        <button onclick="closeWebSocket(event); return false;">Close Websocket</button><br/>
    </fieldset>
</form>
</body>
</html>


Test php (mywebsockethandler.php):

<?php
header("Content-Type: application/json; charset=utf-8");
echo json_encode([
    "status"   => "ok",
    "response" => "php websocket json @ " . time(),
    "payload"  => [$_REQUEST, $_SERVER],
]);
exit;


To talk FastCGI directly from Lua, you would need to install yet another Resty extension.

Source: https://habr.com/ru/post/338614/
