Does closing the browser window stop a PHP script?

I have searched for this for ages, but unfortunately I could not find the answer.
I am calculating a large number of Pearson correlations on huge matrices on my server. I do this by opening example.org/testscript.php.
The script terminates about 2 days after it has started and performs many INSERT queries into the database for recommendation purposes.
I was wondering whether the PHP script would stop when I closed my browser window. I am assuming not; however, I am not 100% sure.
P.S. I have noticed that in some browsers on some computers I receive an internal server error (500) about 10 minutes after starting the script. The script itself, however, was still running, as it was still inserting rows into my database.
On this computer I have not received such an error, so I was wondering what would happen when I closed the tab.

The PHP script will terminate after reaching the timeout. You can change the timeout with set_time_limit(): http://php.net/manual/en/function.set-time-limit.php
The browser, however, will time out much sooner if you are not producing any output. The browser timeout is, of course, browser-specific.
As Dagon suggested, the correct way is to execute the PHP script on the server in the background.
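For example, a minimal sketch of kicking the job off in the background from a web request (the paths and filenames here are assumptions):
// Launch the long-running script via the CLI, detached from this web request.
// nohup + & keep it alive after the request ends; output goes to a log file.
exec('nohup php /var/www/testscript.php > /var/log/testscript.log 2>&1 &');
echo 'Job started in the background; watch the log for progress.';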

Related

php function re-executed after about 15 minutes

I have a PHP script whose runtime should be about 20-25 minutes, but after approximately 15 minutes of execution it simply re-executes for no apparent reason.
I gave it a token to make sure the second execution is killed before it runs again, but that is not the solution I prefer in the long run.
Some details about the script:
It starts after a socket connection (the socket does not time out or restart at any point in the session).
I call ignore_user_abort(true) in the script to prevent it from ending when the user goes somewhere else.
I don't get any runtime errors, or any errors at all. On the client side I get a GET error for the script after 15 minutes, which is actually the second execution attempt. The first is still running and the socket is still delivering data, though it appears the script has ended.
There are no two separate requests.
I'm not sure what the problem could be; hopefully one of the masters here could hint with a clue?
Much thanks!
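As an aside, that "token" is usually implemented with a lock file; a minimal sketch, assuming a writable /tmp and a made-up file name:
// Take a non-blocking exclusive lock; a second invocation exits immediately.
$lock = fopen('/tmp/myscript.lock', 'c');   // the file name is an assumption
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit("Already running\n");
}
// ... the long-running work goes here ...
flock($lock, LOCK_UN);                      // release the lock when done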

Possible causes for connection interrupted, LAMP stack

MySQL 5.1.73
Apache/2.2.15
PHP 5.6.13
CentOS release 6.5
CakePHP 3.1
After about 4 minutes (3 min, 57 seconds) the import process I'm running stops. There are no errors or warnings in any log that I can find. The import process consists of a lot of SQL calls and data processing, nothing too crazy, but it can take about 10 minutes to get through 5500 records if it's doing a full compare for updates.
Firefox: Secure Connection Failed - The connection to the server was reset while the page was loading.
Chrome: ERR_NO RESPONSE
The PHP time limit is set to 900 via set_time_limit, and it works: if I set it to 5 seconds I get an error, so the limit is not being reached.
I can sleep another controller for 10 minutes and this error does not happen, indicating that something in the actual program is causing it to fail, not the hosting service killing the request because it takes too long (I read about VPS hosts doing this to prevent spam).
PHP errors are turned all the way up in php.ini and, just to be sure, in the controller itself.
The import process completes if I reduce the size of the file being imported. If it's just long enough, it will complete AND show the browser message. This indicates to me it's not failing at the same point of execution each time.
I have deleted all the cache and restarted the server.
I do not see any output in the Apache logs other than that the request was made.
I do not see any errors in the MySQL log; however, I don't know if that's because it's not turned on.
The exact same code works on my localhost without any issue. It's not a perfect match to the server, but it's close: Ubuntu Desktop vs CentOS, PHP 5.5 vs PHP 5.6.
I have kept an eye on the memory usage and don't see any issues there.
At this point I'm looking for any good suggestions on what else to look at or insights into what could be causing the failure. There are a lot of possible places to look, and without an error, it's really difficult to narrow down where the issue might be. Thanks in advance for any advice!
UPDATE
After taking a closer look at the memory usage during the request, I noticed it was getting much higher than it ideally should.
The httpd (Apache) process gets killed and a new thread is spawned. Once the new thread runs out of memory, the error shows up on the screen. When I looked at it previously, it was only at 30%, probably because the old process had just been killed. Watching it the whole way through, I saw it get as high as 80%, which together with the other processes was enough to run out of memory, and a killed process can't log anything, hence no errors or warnings. It is interesting to me that the process just starts right back up.
I found a command to show which processes had been killed due to memory which proved very useful:
dmesg | egrep -i 'killed process'
I did have similar problems with DebugKit.
I had a bug in my code that hit during the memory peak, and the context was written to HTML in the error "log".
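If you want to watch the memory climb from inside PHP rather than via dmesg, here is a minimal sketch (processRecord() and the logging interval are placeholders):
// Log memory usage every 500 records so the peak is visible before the OOM killer hits.
foreach ($records as $i => $record) {
    processRecord($record);   // placeholder for the real import step
    if ($i % 500 === 0) {
        error_log(sprintf('row %d: %.1f MB in use, %.1f MB peak',
            $i,
            memory_get_usage(true) / 1048576,
            memory_get_peak_usage(true) / 1048576));
    }
}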

php script stops executing and shows no errors

I have a PHP script that scrapes the web and inserts the scraped data into a database.
This PHP script runs for a very long time (a couple of hours).
Sometimes, after running for a long time, the script just stops executing and shows no error.
The problem isn't caused by the script's execution time, because I gave the script an unlimited execution time:
ini_set('max_execution_time', 0);
I also set the PHP script to show all errors:
ini_set('display_errors',1);
error_reporting(E_ALL);
But I get no error after the script stops executing.
I also ran the script on several other computers and I still encounter the same problem, so it isn't due to server restrictions either.
I researched the issue and apparently it's a networking problem: the PHP script stops communicating with the remote server and disconnects from it (probably because it sent an HTTP request and never received a response).
My question is this:
Is there any way I can detect network disconnections from within the PHP script, and resume the script and try to reconnect when one occurs?
First "set_time_limit" is a better way to use that function and I would check your server's rules and regulations. If it is your private server, then great. If you are renting a VPS or other type of server, it is highly possible that they have restrictions to prevent misuse, like SPAMing and such. Make sure that you are within their rules and go from there.

How do I view the status of my MySQL query?

I recently executed a MySQL query via Chrome and closed the tab. How exactly does a browser stop a PHP script using the stop button? I thought PHP was a server-side language and could not be controlled from a client.
*UPDATE* I'm now aware of SHOW PROCESSLIST, but this only shows which threads are running. Is there a SQL command I can use to view an executed query in greater detail?
A client (Chrome) has nothing to do with the execution of scripts (PHP) on the server, which in turn have no control over database processes (the MySQL query).
Look at your server's process list to see what's going on in general (Apache processes).
Or even better: use SHOW PROCESSLIST; on the MySQL console to find the long-running query. You may stop it with KILL ###ID_OF_QUERY###;.
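For more detail than the default listing, something like this on the MySQL console (the id is an example value, not your real query id):
SHOW FULL PROCESSLIST;   -- FULL prints the complete query text instead of a truncated Info column
KILL 12345;              -- 12345 stands in for the Id column of the runaway query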
No, you don't need to keep it open. If you exit a running car, does the car turn off? No.
Sorry, that came off a little snotty, but it wasn't intended to.
The browser, in your case Chrome, is not actually running the code; the server is. Thus, once the instruction is executed, closing the browser no longer matters, as the request has already been handed to the server.
Two functions are essential for executing time-consuming PHP scripts, and it has nothing to do with the browser (as other users already pointed out): look up ignore_user_abort and set_time_limit.
The script will continue to execute regardless of browser closure. You can free up your browser by sending the response and letting the PHP process continue on:
ignore_user_abort(true);                        // keep running after the client disconnects
$response = "Processing!";
header("Connection: close");                    // tell the client the connection will close
header("Content-Length: " . strlen($response)); // byte count, so strlen() rather than mb_strlen()
echo $response;
flush();                                        // push the response out so the browser is released
// Insert your lengthy query here
The answer is: it depends. As others mentioned, you can check what is running on the MySQL server with SHOW PROCESSLIST;.
If it is a single query that takes a long time, then it will most likely carry on running after the browser has closed. PHP will have sent the request to the database and will, in effect, be sitting waiting for it to complete; in turn, the browser will be waiting for the web server to finish building the page/resource at that URL.
So the request chain is: browser <-> web server (<-> PHP) <-> MySQL. In an ideal world, if the user cancels the request, everything would tidy itself up nicely, but in my experience that is sadly not the case: if one link in the chain decides not to wait, the process it is waiting for doesn't necessarily know until it tries to send the response back and fails.
Come on guys, this is PHP 101. Quoted from the manual:
You can decide whether or not you want a client disconnect to cause your script to be aborted. Sometimes it is handy to always have your scripts run to completion even if there is no remote browser receiving the output. The default behaviour is however for your script to be aborted when the remote client disconnects.
Execution will stop at the next tickable event after the connection flag is set to ABORTED, which will be detected when PHP attempts to flush output to the client.
The current MySQL query will finish executing (the next event PHP has control over doesn't occur until after the query has completed), but your script would not make it past that point unless you explicitly set ignore_user_abort. It's always important to account for this when writing code.
There are two ways around this (see the sketch below):
Set ignore_user_abort to true for the invocation of your script.
Do not print anything back to the client until after all of your processing is complete, since a closed connection won't be detected until output is flushed.
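A minimal sketch of the first option (insertRow() is a placeholder for the real work):
// Keep running to completion even if the browser goes away mid-loop.
ignore_user_abort(true);   // don't abort when the client disconnects
set_time_limit(0);         // and don't die on the execution time limit either

foreach ($rows as $row) {
    insertRow($row);               // placeholder for the real INSERT logic
    echo '.';                      // the abort flag is only updated when output is flushed...
    flush();
    if (connection_aborted()) {    // ...so check it after the flush
        error_log('client disconnected; finishing the job anyway');
    }
}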

How to increase the time until a read timeout error occurs?

I've written a PHP script that takes a long time to execute [image processing for thousands of pictures]. It's a matter of hours, maybe 5.
After 15 minutes of processing, I get the error:
ERROR
The requested URL could not be retrieved
The following error was encountered while trying to retrieve the URL: The URL which I clicked
Read Timeout
The system returned: [No Error]
A Timeout occurred while waiting to read data from the network. The network or server may be down or congested. Please retry your request.
Your cache administrator is webmaster.
What I need is to enable that script to run for much longer.
Now, here are all the technical info:
I'm writing in PHP and using the Zend Framework, and my browser is Firefox. The long-running script is triggered by clicking a link. Obviously, since the script is not finished, I still see the web page the link was on, and the browser shows "waiting for ...".
After 15 minutes the error occurs.
I tried making changes in Firefox through about:config, but without any success. I don't know; the changes might be needed somewhere else.
So, any ideas?
Thanks ahead.
set_time_limit(0) will only affect the server-side running of the script. The error you're receiving is purely browser-side. You have to send SOMETHING to keep the browser from deciding the connection is dead; even a single character of output (followed by a flush() to make sure it actually gets sent over the wire) will do. Maybe once per processed image, or on a fixed time interval (if the last character was sent more than 5 minutes ago, output another one).
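A minimal sketch of that keep-alive trickle (processImage() is a placeholder; if output buffering is enabled you may also need ob_flush()):
set_time_limit(0);          // no server-side time limit
foreach ($images as $image) {
    processImage($image);   // the slow part
    echo ' ';               // one byte is enough to reset most read timeouts
    flush();                // push it over the wire immediately
}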
If you don't want any intermediate output, you could do ignore_user_abort(TRUE), which will allow the script to keep running even if the connection gets shut down from the client side.
If the process runs for hours, then you should probably look into batch processing. Store a request for image processing (in a file, a database, or whatever works for you) instead of starting the processing right away. That request is then picked up by a scheduled (cron) process running on the server, which does the actual work (this can be a PHP script that calls set_time_limit(0)). And when processing is finished, you can notify the user (by mail or any other way that works for you).
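A rough sketch of that split, assuming an existing PDO connection in $pdo; the jobs table, $uploadPath, and the mail address are all made up:
// queue.php -- hit from the browser; it only records the request and returns at once
$pdo->prepare("INSERT INTO jobs (path, status) VALUES (?, 'pending')")
    ->execute([$uploadPath]);
echo 'Queued; you will be notified when processing is finished.';

// worker.php -- run from cron, e.g.: * * * * * php /var/www/worker.php
set_time_limit(0);
$job = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' LIMIT 1")->fetch();
if ($job) {
    processImages($job['path']);   // placeholder for the real image processing
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
    mail('user@example.org', 'Images ready', 'Your images have been processed.'); // placeholder address
}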
Use set_time_limit; documentation here: http://nl.php.net/manual/en/function.set-time-limit.php
If you can split your work into batches, then after processing X images display a page with some JavaScript (or META redirects) on it that opens the link http://server/controller/action/nextbatch/next_batch_id.
Rinse and repeat.
Batching the entire process also has the added benefit that when something goes wrong, you don't have to start the entire thing anew.
If you're running on a server of your own and can get out of safe_mode, then you could also fork background processes to do the actual heavy lifting, independent of your browser's view of things. If you're in a multicore or multiprocessor environment, you can even schedule more than one process at any time.
We've done something like that for large computation scripts; synchronization of the processes happened over a shared database, but luckily the processes were so independent that the only thing we needed to see was their completion or termination.
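A hedged sketch of the forking idea (requires the pcntl extension and the CLI; processImage() and the worker count of 4 are assumptions):
// Split the work across 4 child processes and wait for them all to finish.
$chunks = array_chunk($images, (int) ceil(count($images) / 4));
foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid === 0) {               // child: process its slice and exit
        foreach ($chunk as $image) {
            processImage($image);
        }
        exit(0);
    }
}
while (pcntl_wait($status) > 0) {   // parent: reap children until none remain
}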
