
Sometimes you need to write scripts whose work takes a long time: creating or restoring backups, installing a demo version of an application, aggregating large amounts of data, importing or exporting data, and so on. To keep such scripts from stopping at an unexpected moment, there are a few things you need to know and remember.
External timeout
First of all, set an appropriate value for the max_execution_time parameter in the PHP config.
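Besides editing php.ini, the limit can also be raised or removed from the script itself; a minimal sketch:

```php
<?php
// Remove the execution time limit for this script only
// (0 means "no limit"; php-cli already defaults to 0).
set_time_limit(0);

// Equivalent via ini_set():
ini_set('max_execution_time', '0');
```
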
If the script is launched by the web server (i.e., in response to a user's HTTP request), you should also configure the timeout parameters in the web server config correctly. For Apache these are TimeOut and FastCgiServer ... -idle-timeout ... (if PHP runs through FastCGI); for nginx, send_timeout and fastcgi_read_timeout (if PHP runs through FastCGI).
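For instance, the nginx side might look like this (600s is an arbitrary example value, not a recommendation; tune it to your workload):

```nginx
# Illustrative nginx timeouts for a PHP-FPM / FastCGI backend
server {
    # how long nginx may wait between writes of the response to the client
    send_timeout         600s;
    # how long nginx waits for the FastCGI backend to produce a response
    fastcgi_read_timeout 600s;
}
```
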
A web server can also proxy requests to another web server that actually runs the PHP script (a common setup: nginx as frontend, Apache as backend). In that case you also need to configure the proxy timeout on the proxying server: ProxyTimeout for Apache, proxy_read_timeout for nginx.
User interrupt
If the script is launched in response to an HTTP request, the user can cancel the request in the browser, and the PHP script will stop as well. If you want the script to continue running even after the request is aborted, set the ignore_user_abort parameter to TRUE in the PHP config.
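The same setting can also be applied from the script itself:

```php
<?php
// Keep running even if the client aborts the HTTP request.
ignore_user_abort(true);

// Later, the script can still find out whether the client is gone:
if (connection_aborted()) {
    // the user is no longer waiting, but the work continues
}
```
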
Loss of open connections
If the script opens a connection to some service (a database, mail server, FTP server, ...), and the connection sits idle for a while during execution, that service may close it. For example, if you do not run MySQL queries for some time while the script is working, MySQL will close the connection after the time specified in the wait_timeout parameter. As a result, the next query will fail with an error.
In such cases, you should first try to increase the connection timeout. For MySQL, for example, you can run the query (thanks to Snowly):

SET SESSION wait_timeout = 9999
If this is not possible, or this option does not suit you for some reason, you can check whether the connection is still alive in those parts of the code where it may sit idle, and reconnect if necessary. The MySQLi module has a useful method, mysqli::ping, for checking connection liveness, as well as the configuration parameter mysqli.reconnect for reconnecting automatically when the connection drops. For other types of connections that lack such functions, you can write the check yourself: perform a trivial request to the service and reconnect on error (caught with try ... catch ...). For example:
class FtpConnection
{
    private $ftp;

    public function connect()
    {
        $this->ftp = ftp_connect('ftp.server');
        ...
    }

    public function reconnect()
    {
        try {
            if (!ftp_pwd($this->ftp)) {
                $this->connect();
            }
        } catch (\Exception $e) {
            $this->connect();
        }
    }

    ...
}
or
class MssqlConnection
{
    private $db;

    public function connect()
    {
        $this->db = mssql_connect('mssql.server');
        ...
    }

    public function reconnect()
    {
        try {
            if (!mssql_query('SELECT 1', $this->db)) {
                $this->connect();
            }
        } catch (\Exception $e) {
            $this->connect();
        }
    }

    ...
}
Parallel launch
Long scripts often run on a schedule (via cron), and usually only one copy of the script is expected to run at a time. But the next launch may start before the previous one has finished, and as a rule this is undesirable (the same data gets imported twice, data still in use by the first script is modified, ...).
In such cases you can lock the resources being used, but that problem is solved individually every time. Alternatively, you can simply check whether another copy of this script is already running, and either wait for it to finish or terminate the current launch. To do that, you can inspect the list of running processes, or lock the script itself, something like:
if (lockStart('script.php')) {
    // ... the script's work ...
}
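One possible implementation of such a lockStart() helper (lockStart is just the hypothetical name from the snippet above) uses an exclusive, non-blocking flock() on a lock file:

```php
<?php
// A sketch of lockStart(): acquire an exclusive, non-blocking
// lock on a file in the temp directory; fail if it is already held.
function lockStart(string $name): bool
{
    static $handles = [];
    $path = sys_get_temp_dir() . '/' . basename($name) . '.lock';
    $fp = fopen($path, 'c'); // create the file if it does not exist
    if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
        return false; // another copy already holds the lock
    }
    // Keep the handle referenced so the lock lives until the script exits.
    $handles[$name] = $fp;
    return true;
}

if (lockStart('script.php')) {
    // ... long work; only one copy runs at a time ...
} else {
    exit("Another instance is already running.\n");
}
```

The operating system releases the lock automatically when the process ends, so a crashed script does not leave a stale lock behind.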
Web server load
When long scripts are run through a web server, the client's connection to that server stays open until the script finishes. This is bad, because the web server's job is to process a request as quickly as possible and deliver the result. If the connection hangs, one of the web server's workers (processes) stays busy for a long time. And if enough such scripts run at the same time, they can occupy all (or almost all) free workers (for Apache, see MaxClients), and the web server simply cannot handle other requests.
Therefore, when handling a user's request, run the long script in the background via php-cli so as not to load the web server, and respond to the user that the request is being processed. If necessary, the processing status can then be polled periodically with AJAX requests.
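A sketch of handing the work off to php-cli on a Unix-like system (the worker path and the response format here are illustrative, not from the original article):

```php
<?php
// Launch a worker script in the background (Unix shells) so the
// web request can return immediately. The path is illustrative.
function runInBackground(string $script): void
{
    $cmd = 'php ' . escapeshellarg($script) . ' > /dev/null 2>&1 &';
    exec($cmd); // "&" detaches the job; exec() returns right away
}

runInBackground('/path/to/long-task.php');
echo json_encode(['status' => 'processing']);
```
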
That is about all I can tell on this topic. I hope it will be useful to someone.