PHP script being interrupted at a random time

I have a php script that does a lot of stuff, produces a lot of output and can take a few minutes to complete.
I have already placed a set_time_limit(999999) at the beginning.
Before that, it obviously exceeded the time limit of 30 seconds and got interrupted. When that happened, a fatal error was logged indicating just that.
Now, when I call that script from the browser, I can see the browser receiving the output almost in real time, but at a random point the script is simply interrupted. It does not exceed the time limit (obviously), and NO ERROR at all is logged. In fact, if I try it a second time, it gets interrupted at a different point.
The question is: if no error is encountered and the time limit is not reached, what else can cause the script to be interrupted? Perhaps a disconnection from the client (browser) due to a timeout? (Not a timeout waiting for data, though, as data is constantly being output.) Or perhaps the browser has a maximum size for an HTML page?
But if it's one of these two cases, how come this is not logged as an error?

When the script reaches the time limit, a fatal error is thrown, so if you are not seeing any error, this shouldn't be the cause (make sure that error reporting is enabled, either in the script or in the php.ini file).
If you want to deactivate the time limit, use this instead:
set_time_limit(0);
Without more information, we can't be of much help, unfortunately.
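For reference, a minimal sketch of a script header that enables error reporting and removes the time limit at once (to be dialed back again once the cause is found) could look like this:

// Surface every error and lift PHP's own execution time limit.
error_reporting(E_ALL);
ini_set('display_errors', '1'); // or rely on log_errors / the error log in production
ini_set('log_errors', '1');
set_time_limit(0); // 0 means no time limit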

Related

Running long process on Php/Apache/Ubuntu

I'm trying to run a long process on PHP/Apache/Ubuntu (AWS).
This is a simple process that builds a cache during the night.
The process can run for a few hours, and is initiated by crontab accessing a special URL with curl.
Sometimes the process stops at a random point with no error. I suspect that it is killed by Apache, although I set
#set_time_limit(0);
#ini_set('max_execution_time', -1);
Is this a known issue with PHP/Apache/Ubuntu?
Is there a way to solve it?
Currently, my solution is to run the process every 5 minutes, store the state on disk, and continue from where it stopped.
But I would like to know more about this issue and whether there is a better way to tackle it.
NOTE:
The process stops randomly or doesn't stop at all; the longer the process runs (i.e. the bigger the cache), the higher the chance it will stop.
One possible reason is that the client disconnects (e.g. after a timeout): PHP stops the request processing by default in this case. To prevent this, you can use ignore_user_abort:
ignore_user_abort(true);
Also note that in PHP a leading # starts a comment, so as written those two calls are never executed at all; remove the # prefixes. And since set_time_limit can fail (e.g. in a restricted environment), it makes sense to check explicitly whether set_time_limit(0) returned true.
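A minimal sketch of such a script header, combining both points (nothing here is specific to the asker's code), might be:

ignore_user_abort(true); // keep running even if curl / the client disconnects
if (!set_time_limit(0)) {
    // set_time_limit() returns false when it cannot change the limit
    error_log('set_time_limit(0) failed; an execution time limit is still in effect');
}
// ... build the cache ...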

php function re-executed after about 15 minutes

I have a PHP script whose runtime should be about 20-25 minutes, and approximately 15 minutes into the execution it simply re-executes for no apparent reason.
I gave it a token to make sure it kills the second execution before it runs again, but it's not the solution I prefer to use in the long run.
Some details about the script:
It starts after a socket connection (the socket does not time out or restart at any point of the session).
I gave the PHP script ignore_user_abort(true) in order to prevent it from ending when the user goes somewhere else.
I don't get any runtime errors, or any errors at all... On the client side I get a GET error message for the script after 15 minutes, which is actually the second execution attempt. The script is still running and the socket is still delivering data, even though it appears the script has ended.
There are not two separate requests.
I'm not sure what the problem could be; hopefully one of the masters here can offer a clue?
Many thanks!

PHP script runs much longer than max_execution_time?

PHP is running as an Apache module.
The script starts with: ini_set('max_execution_time', 300);
What it does is basically connect to a database, run a big SELECT query, loop through the results, write them to a file, and echo a "result ok" after each write with an explicit flush();
There is no sleep() call.
This is a "test" script made by a co-worker of mine for backup purposes and is intended to run for up to a few hours! I thought I was aware of the script execution time limit and expected his script to time out after 300 seconds...
But it didn't!
It's invoked from a web browser. The page is left open and we can see the results being echoed in real-time.
Why doesn't it time out?
Even stranger, one of the tests issued a "Maximum execution time of 300 seconds exceeded", but this appeared only after at least 2 hours of execution!
What's going on here? Is there some interaction between max_execution_time and flush(), or a browser window being kept open?
As you can see on the manual page for the set_time_limit function, the execution time you are setting only affects the script itself. Time spent on database queries or any other external calls is not counted (unless the OS is Windows).
The only thing I can think of that may cause this is if PHP is running in safe mode. ini_get('max_execution_time') should tell you if it's actually being set.
Edit: Spotted your comment above.
echo ini_get('max_execution_time'); // says 300
If 300 is reported, and you're not on Windows, #mishu is likely right: your SELECT query is probably taking hours.
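One hedged way to confirm that is to time the query and the result loop separately; the names below ($mysqli, $bigSelect, $fh) are placeholders for whatever the script actually uses:

$start = microtime(true);
$result = $mysqli->query($bigSelect); // the big SELECT
$queryTime = microtime(true) - $start;

$start = microtime(true);
while ($row = $result->fetch_assoc()) {
    fwrite($fh, implode(',', $row) . "\n"); // writing results to the file
    echo "result ok";
    flush();
}
$loopTime = microtime(true) - $start;

error_log("query: {$queryTime}s, loop: {$loopTime}s");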

PHP set_time_limit no effect

I have a very painfully slow script that gets lots of data from MySQL, creates a large report out of it, and serves it to the user at the end as application/force-download.
Long story short, on the production server it keeps terminating after about 30 seconds (quite consistently) and spitting out an empty file instead. On the development server it works fine, though it does take significantly longer to execute - about 90 seconds. Just to be 'safe', I set max_execution_time = 2000 in my php.ini file and also call set_time_limit(4000) at the beginning of my script (numbers way over the expected completion time, but just to be sure ;)).
What could be causing my web server to ignore the time limits I set and quit on me after only 30 seconds?
Edit: one thing I know for sure is that the MySQL portion of the code takes 8-9 seconds to complete, and the script successfully gets past that point every time.
Maybe it's PHP safe_mode.
Try doing a
die(ini_get('max_execution_time'));
after you have called set_time_limit(0); to read the value and see whether it actually gets overwritten.
If it does get overwritten to 0 and your script still dies, then the cause could be somewhere else in your code.
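A minimal check along those lines (a sketch; logging instead of die() is an assumption here, so the script can keep running) would be:

set_time_limit(4000); // or set_time_limit(0)
error_log('max_execution_time is now: ' . ini_get('max_execution_time'));
// If this logs the new value but the script still stops after ~30 seconds,
// the limit set here is not what is cutting the script off.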

How to increase the time until a read timeout error occurs?

I've written a PHP script that takes a long time to execute [image processing for thousands of pictures]. It's a matter of hours - maybe 5.
After 15 minutes of processing, I get the error:
ERROR
The requested URL could not be retrieved
The following error was encountered while trying to retrieve the URL: The URL which I clicked
Read Timeout
The system returned: [No Error]
A Timeout occurred while waiting to read data from the network. The network or server may be down or congested. Please retry your request.
Your cache administrator is webmaster.
What I need is to enable that script to run for much longer.
Now, here are all the technical info:
I'm writing in PHP and using the Zend Framework. I'm using Firefox. The long script runs after clicking a link. Obviously, since the script is not finished, I still see the web page the link was on, and the browser shows "waiting for ...".
After 15 minutes the error occurs.
I tried making changes to Firefox through about:config, but without any success. I don't know; maybe the changes are needed somewhere else.
So, any ideas?
Thanks in advance.
set_time_limit(0) will only affect the server-side running of the script. The error you're receiving is purely browser-side. You have to send SOMETHING to keep the browser from deciding the connection is dead - even a single character of output (followed by a flush() to make sure it actually gets sent out over the wire) will do. Maybe once for every image that's processed, or on a fixed time interval (if the last character was sent more than 5 minutes ago, output another one).
If you don't want any intermediate output, you could do ignore_user_abort(TRUE), which will allow the script to keep running even if the connection gets shut down from the client side.
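A rough sketch of the keep-alive idea (processImage() and the 5-minute interval are only illustrative):

ignore_user_abort(true);
set_time_limit(0);

$lastPing = time();
foreach ($images as $image) {
    processImage($image); // placeholder for the real per-image work
    if (time() - $lastPing >= 300) { // at most once every 5 minutes
        echo ' '; // a single character keeps the connection alive
        flush(); // with output buffering you may also need ob_flush()
        $lastPing = time();
    }
}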
If the process runs for hours, then you should probably look into batch processing. So you just store a request for image processing (in a file, database or whatever works for you) instead of starting the image processing directly. This request is then picked up by a scheduled (cron) process running on the server, which does the actual processing (this can be a PHP script that calls set_time_limit(0)). And when processing is finished, you could notify the user (by mail or any other way that works for you) that it is done.
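A very small sketch of that pattern, using a file-based queue (the paths and the job format are made up for illustration):

// Web request: only record what needs to be processed, then return immediately.
file_put_contents('/var/spool/imagejobs/' . uniqid('job_', true) . '.json',
    json_encode(array('image_dir' => $imageDir)));

// Cron script, run every few minutes: pick up and process pending jobs.
set_time_limit(0);
foreach (glob('/var/spool/imagejobs/*.json') as $jobFile) {
    $job = json_decode(file_get_contents($jobFile), true);
    // ... do the actual image processing for $job['image_dir'] ...
    unlink($jobFile); // remove the job once it is done
    // mail($user, 'Image processing finished', '...'); // optional notification
}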
Use set_time_limit.
Documentation is here:
http://nl.php.net/manual/en/function.set-time-limit.php
If you can split your work into batches, then after processing X images display a page with some JavaScript (or a META redirect) on it that opens the link http://server/controller/action/nextbatch/next_batch_id.
Rinse and repeat.
Batching the entire process also has the added benefit that if something goes wrong, you don't have to start the entire thing anew.
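A sketch of the batching idea with a META redirect; getImagesForBatch() and hasMoreBatches() are hypothetical helpers, and the batch size of 50 is arbitrary:

$batchId = isset($_GET['batch']) ? (int) $_GET['batch'] : 0;
$images  = getImagesForBatch($batchId, 50); // hypothetical: 50 images per batch

foreach ($images as $image) {
    processImage($image); // placeholder for the real work
}

if (hasMoreBatches($batchId)) { // hypothetical helper
    $next = $batchId + 1;
    echo '<meta http-equiv="refresh" content="1;url=/controller/action/nextbatch/' . $next . '">';
} else {
    echo 'All batches processed.';
}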
If you're running on a server of your own and can get out of safe_mode, then you could also fork background processes to do the actual heavy lifting, independent of your browser view of things. If you're in a multicore or multiprocessor environment, you can even schedule more than one running process at any time.
We've done something like that for large computation scripts; synchronization of the processes happened over a shared database, but luckily the processes were so independent that the only thing we needed to track was their completion or termination.
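One hedged way to start such a background worker from a web request (assuming exec()/shell_exec() are allowed and safe_mode is off; the paths are illustrative) is to detach it with nohup:

// Launch a long-running worker detached from the Apache request.
// worker.php would itself call set_time_limit(0) and do the heavy lifting.
$cmd = 'nohup php /path/to/worker.php > /var/log/worker.log 2>&1 & echo $!';
$pid = (int) shell_exec($cmd);
// Store $pid (e.g. in the shared database) to check later whether the
// worker is still running or has finished.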
