Using an OpenCart plugin, I am getting these errors in my Apache error log:
Read POST information timed out
(22)Invalid argument: client stopped connection before rvputs completed
I just wanted to verify that this is due to a client-side HTTP connection timeout while reading a large XML file.
Is there any configuration I can change server-side to prevent this? Since it's related to a client-side timeout, I am doubtful.
My idea is to remove the HTTP requirement altogether. I'm thinking of just running a cron script, or starting a process and not waiting for it to finish, as in "Don't wait for the process to exit".
Put set_time_limit(0) at the top of the request script. A value of 0 means no time limit at all.
Outputting data to the browser periodically fixed this. Apache was processing the request for about 10 minutes without sending any output, which caused the browser to close the connection on the client side. I just sent periodic status updates using echo and it worked.
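In case it helps anyone, a minimal sketch of that pattern; process_item() and $items are placeholders for whatever work the script actually does:

ignore_user_abort(true); // keep going even if the client gives up anyway
set_time_limit(0);       // remove PHP's execution time limit

foreach ($items as $i => $item) {
    process_item($item); // placeholder for the real work

    // A small status update keeps the browser (and any proxy in
    // between) from deciding the connection is dead.
    echo "Processed item $i\n";
    flush(); // if output buffering is on, ob_flush() is needed first
}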
Related
I wrote a snippet of code that could update up to 10,000 rows and might take a few seconds to complete. If the file is accessed via an AJAX request and the POST query is sent to the PHP file, and then the browser is closed, does the file get fully executed? Assume it takes about 25 seconds to complete the request; the user might not wait for 25 seconds. Is it good enough to "ping" this file and let the user browse along, or close the browser window, while the MySQL queries are taking place?
The request has 3 parts:
A browser connected to the web server
PHP script that is executed by the server
A query running in the DB server
When you close the browser, the connection with the server is closed. The server may or may not kill the running PHP script (if PHP is running as an Apache module, it will be killed, unless ignore_user_abort is called). Also, the web server may have a time limit for the request and either kill the script, or just send the client a connection timeout message without killing the script, but without giving it the chance to send anything to the browser.
Here is the tricky part: the update is running in the database server, and it won't be killed by the web server, nor by PHP.
So what you want to achieve is pinging a PHP script that executes a query, while the client does not wait for the result. You may or may not want the query itself to be asynchronous (i.e. the PHP script does not wait for the query), but you do have to tell the client that the request is fulfilled, for example by sending a Content-Length of 0 and flushing the output (the HTTP headers, really), and run PHP with ignore_user_abort so it continues the execution.
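A minimal sketch of that "answer first, work later" pattern; run_long_update() is a placeholder for the actual query, and under PHP-FPM, fastcgi_finish_request() does the same job more reliably:

ignore_user_abort(true);   // keep running after the client disconnects
set_time_limit(0);         // the work may take a while

// Discard any output buffering so the headers really go out now.
while (ob_get_level() > 0) {
    ob_end_clean();
}

// Tell the client the request is fulfilled so it stops waiting.
header('Content-Length: 0');
header('Connection: close');
flush();

// The client already has its (empty) response; do the slow part.
run_long_update();         // placeholder for the actual query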
Use ignore_user_abort() to continue running the script even after the client has disconnected:
ignore_user_abort(true);
set_time_limit(0);
You can use connection_status() to check whether the connection has been dropped:
if (connection_status() != CONNECTION_NORMAL) {
    // the client disconnected (ABORTED) or the script hit a TIMEOUT
}
Here's the answer to your question:
http://www.php.net/manual/en/features.connection-handling.php
Normally no, but your script passes into the ABORTED status.
More details in the manual page about Connection handling:
http://www.php.net/manual/en/features.connection-handling.php
Internally in PHP a connection status is maintained. There are 3 possible states:
0 - NORMAL
1 - ABORTED
2 - TIMEOUT
When a PHP script is running normally the NORMAL state is active. If the remote client disconnects, the ABORTED state flag is turned on. A remote client disconnect is usually caused by the user hitting his STOP button.
As soon as you close the browser, it disconnects from the server before getting the reply. I do not know exactly how different servers behave in this situation, but I assume most of them will abort the thread that is working on the reply to the request.
Further, things can differ between operations, i.e. file I/O versus database operations. If it is an atomic database operation, my assumption is that it will complete anyhow.
My web page uses an AJAX call to return data from a very long PHP script, so if I leave the page early and reload it, that PHP script is still being carried out, which causes me problems.
Is there a way I could tell the server to abort the execution of the previous AJAX request, if one is still running?
thanks
Not directly. You will need to set up a scheme where the work is offloaded to a process external to the web server, and that process periodically checks, over some communication channel with the web server, whether it should drop what it's doing (a simple but not ideal scheme: check the last-modified time of a "lock file"; if it's more than X seconds in the past, abort the task).
Your web page would then make a call to a script that would then "keep alive" the background task appropriately (e.g. by touching the lock file of the previous example).
This way, when the task is initiated through an AJAX request, the client begins making "keep-alive" requests to the server and the server forwards the "keep-alive" message to the external process. If the user reloads the page the "keep-alive" requests stop and the worker process will abort when the keep-alive threshold elapses. If all goes well and the work completes, your server would detect this through the communication channel it has with the worker process and report this back to the client on their next keep-alive "ping".
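A minimal sketch of the lock-file variant; the file path, the 30-second threshold, and process() are all illustrative choices, not fixed names:

// worker.php - runs outside the web server (cron, exec, etc.)
$lock = '/tmp/task.keepalive'; // hypothetical lock file
touch($lock);

foreach ($workItems as $item) {
    // Abort if the client stopped pinging more than 30 seconds ago.
    clearstatcache(true, $lock);
    if (time() - filemtime($lock) > 30) {
        exit; // client went away; drop the task
    }
    process($item); // placeholder for the real work
}

// keepalive.php - hit by the page's periodic AJAX "ping"
touch('/tmp/task.keepalive');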
Maybe try using the set_time_limit() function for this script.
Or create a few PHP scripts and randomly generate a URL for each.
Did you try setting the XMLHttpRequest object to null when the page reloads?
I was wondering about the lifespan of a PHP script when called via AJAX. Assume there is a long-running (i.e. 30-second) PHP script on a server and that page is loaded via AJAX. Before the script completes, the user closes the browser. Does the script continue running to completion, is it terminated, or is this a function of the server itself? (I'm running Apache, fwiw.) Thanks for any help.
This may be of interest: ignore_user_abort()
ignore_user_abort — Set whether a client disconnect should abort script execution
However note
PHP will not detect that the user has aborted the connection until an attempt is made to send information to the client.
The script will continue running. Closing the browser on the client does not notify the server to stop processing the request.
If you have a large, time-consuming script, I would suggest splitting it up into chunks. Much better that way.
I've written a script in PHP that takes a long time to execute [image processing for thousands of pictures]. It's a matter of hours - maybe 5.
After 15 minutes of processing, I get the error:
ERROR
The requested URL could not be retrieved
The following error was encountered while trying to retrieve the URL: The URL which I clicked
Read Timeout
The system returned: [No Error]
A Timeout occurred while waiting to read data from the network. The network or server may be down or congested. Please retry your request.
Your cache administrator is webmaster.
What I need is to enable that script to run for much longer.
Now, here are all the technical info:
I'm writing in PHP and using the Zend Framework. I'm using Firefox. The long-running script is triggered by clicking a link; obviously, since the script is not finished, I still see the web page the link was on, and the browser says "waiting for ...".
After 15 minutes the error occurs.
I tried to make changes in Firefox through about:config, but without any success. I don't know - the changes might be needed somewhere else.
So, any ideas?
Thanks ahead.
set_time_limit(0) will only affect the server-side running of the script. The error you're receiving is purely browser-side. You have to send SOMETHING to keep the browser from deciding the connection is dead; even a single character of output (followed by a flush() to make sure it actually gets sent out over the wire) will do. Maybe once for every image that's processed, or on a fixed time interval (if the last character was sent more than 5 minutes ago, output another one).
If you don't want any intermediate output, you could do ignore_user_abort(TRUE), which will allow the script to keep running even if the connection gets shut down from the client side.
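A sketch of the time-interval variant suggested above; process_image() and the 5-minute threshold are just for illustration:

$lastOutput = time();

foreach ($images as $image) {
    process_image($image); // placeholder for the real work

    // Emit one character if nothing has been sent for 5 minutes,
    // so no read timeout fires between the browser and the server.
    if (time() - $lastOutput > 300) {
        echo ' ';
        flush();
        $lastOutput = time();
    }
}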
If the process runs for hours, then you should probably look into batch processing. Store a request for image processing (in a file, database, or whatever works for you) instead of starting the processing directly. That request is then picked up by a scheduled (cron) process running on the server, which does the actual processing (this can be a PHP script that calls set_time_limit(0)). When processing is finished, you can notify the user (by mail or any other way that works for you).
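A rough sketch of that split; the jobs table, its columns, the connection details, and process_job() are all made up for illustration:

// enqueue.php - the web request only records the job and returns
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare('INSERT INTO jobs (type, status) VALUES (?, ?)')
    ->execute(['resize_images', 'pending']);
echo 'Queued; you will get an email when processing finishes.';

// worker.php - started by cron, e.g. once a minute
set_time_limit(0);
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query("SELECT id FROM jobs WHERE status = 'pending'") as $job) {
    process_job($job['id']); // placeholder for the actual image work
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}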
Use set_time_limit(); the documentation is here:
http://nl.php.net/manual/en/function.set-time-limit.php
If you can split your work into batches, then after processing X images, display a page with some JavaScript (or a META redirect) on it that opens the link http://server/controller/action/nextbatch/next_batch_id.
Rinse and repeat.
Batching the entire process also has the added benefit that if something goes wrong, you don't have to start the entire thing anew.
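A sketch of one such batch page; the batch size, URL scheme, and the helper functions are assumptions:

// nextbatch.php?batch=N - process one slice, then bounce onward
$batch  = isset($_GET['batch']) ? (int)$_GET['batch'] : 0;
$size   = 50; // images per request; keep well under any timeout
$images = get_images($batch * $size, $size); // placeholder loader

foreach ($images as $image) {
    process_image($image); // placeholder for the real work
}

if (count($images) === $size) {
    // More to do: send the browser straight to the next batch.
    $next = $batch + 1;
    echo "<meta http-equiv=\"refresh\" content=\"0;url=nextbatch.php?batch=$next\">";
    echo "Finished batch $batch, continuing...";
} else {
    echo 'All done.';
}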
If you're running on a server of your own and can get out of safe_mode, then you could also fork background processes to do the actual heavy lifting, independent of your browser's view of things. If you're in a multicore or multiprocessor environment, you can even schedule more than one running process at any time.
We've done something like that for large computation scripts; synchronization of the processes happened over a shared database. Luckily, the processes were so independent that the only thing we needed to watch was their completion or termination.
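For the forking part, one common approach on a Unix host is not a true fork() but launching a detached CLI process; worker.php and $jobId here are hypothetical:

// Fire off a CLI worker and return immediately; the trailing "&"
// plus the redirects detach it from the web request.
$id = escapeshellarg($jobId);
exec("php /path/to/worker.php $id > /dev/null 2>&1 &");
echo 'Job started in the background.';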
I want to run a relatively time-consuming script based on some form input, but I'd rather not resort to cron, so I'm wondering whether a PHP page requested through AJAX will continue to execute until completion, or whether it will halt if the user leaves the page.
It doesn't actually output anything to the browser until a json_encode() at the end of the file, so would everything before that still execute?
It depends.
From http://us3.php.net/manual/en/features.connection-handling.php:
When a PHP script is running normally the NORMAL state is active. If the remote client disconnects the ABORTED state flag is turned on. A remote client disconnect is usually caused by the user hitting his STOP button.
You can decide whether or not you want a client disconnect to cause your script to be aborted. Sometimes it is handy to always have your scripts run to completion even if there is no remote browser receiving the output. The default behaviour is however for your script to be aborted when the remote client disconnects. This behaviour can be set via the ignore_user_abort php.ini directive as well as through the corresponding php_value ignore_user_abort Apache httpd.conf directive or with the ignore_user_abort() function.
That would seem to say the answer to your question is "Yes, the script will terminate if the user leaves the page".
However, realize that depending on the backend SAPI being used (e.g. mod_php), PHP cannot detect that the client has aborted the connection until an attempt is made to send information to the client. If your long-running script does not issue a flush(), it may keep on running even though the user has closed the connection.
Complicating things further, if output buffering is on, those flush() calls get trapped in the buffer and nothing is sent to the client until the script completes anyway!
Further complicating things, if you have installed Apache handlers that buffer the response (for example mod_gzip), then once again PHP will not detect that the connection is closed, and the script will keep on trucking.
Phew.
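One way to give flush() a fighting chance is to unwind PHP's own output buffers first; a small sketch:

// Drop any PHP-level output buffering so flush() reaches the client.
while (ob_get_level() > 0) {
    ob_end_flush();
}

// Now a disconnect can actually be detected on the next write.
echo ' ';
flush();
if (connection_aborted()) {
    exit; // or carry on, if ignore_user_abort(true) was set
}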
It depends on your settings - usually it will stop but you can use ignore_user_abort() to make it carry on.
Depending on the configuration of the web server and/or PHP, the PHP process may or may not kill the thread when the user terminates the HTTP connection. If an AJAX request is pending when the user walks away from the page, aborting it depends on the browser killing the request (not guaranteed) on top of your server config (not guaranteed). Not the answer you want to hear!
I would recommend creating a work queue in a flat file or database that a constantly-running PHP daemon can poll for jobs. It doesn't suffer from cron delay but keeps CPU/memory usage to a usable level. Once the job is complete, place the results in the flat file/database for AJAX fetch. Or promise to e-mail the user once the job is finished (my preferred method).
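The "AJAX fetch" half might look like this hypothetical status endpoint; the jobs table layout and connection details are assumptions:

// status.php - polled by the page until the job reports done
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT status, result FROM jobs WHERE id = ?');
$stmt->execute([(int)$_GET['job']]);
header('Content-Type: application/json');
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));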
Hope that helps
If the client/user/downloader/viewer aborts or disconnects, the script will keep running until something tries to flush new data to the client. Unless you have used ignore_user_abort(), the script will die there.
By the same token, PHP is unable to determine whether the client is still there without trying to flush some data to the httpd.
I found the actual solution for my case of the connection not terminating: the session on my Apache/PHP server needed to close before the next request could start.
The browser waits for the AJAX call to complete after an abort.