I have a PHP script which runs many database queries and copies several database tables, so it takes quite a long time to complete. The problem I am getting is that it times out. However, it appears to have completed, which is what is confusing.
The script is supposed to redirect to a view once completed. However, even after extending the time limit to 5 minutes, it gives me the timeout error page. Yet when I check the database, all of the tables have been copied completely, which indicates that the script did complete.
Am I missing something easy here? Is there a general reason it would time out rather than redirect to the view? I would post some of the code, but the entire script is approximately 1000 lines, so it seems a bit extensive to show here.
Also, I am using CodeIgniter.
Thanks in advance for your help!
It's possible that the PHP script is not timing out, but that the browser you're using has given up waiting for a result. If that's the case you'll need to handle the whole thing differently, for example by running the script in the background and reporting periodic updates via AJAX or something similar.
Think of it this way:
1. Your browser asks your server for a web page and waits for the result.
2. Your server runs your PHP script, which then asks MySQL to run a query, and waits for the result.
3. MySQL runs the query and returns a result to PHP.
4. PHP does some more processing and returns a result to the browser.
At step 3, PHP may have timed out and no longer be there. MySQL didn't know that while it was working; it just did its job and then handed a result back to nothing.
At step 4, the browser may have timed out and dropped the connection. PHP wouldn't know that, so it did its job and then returned a result to nothing.
They are two separate timeouts in this example, but either way your query was completed.
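To illustrate how decoupled the two sides are, here is a minimal sketch, not your actual code, of a script that keeps running even after the client has given up; the tables still get copied, the user just never sees the redirect unless the browser is still connected. The redirect path is made up for illustration.

ignore_user_abort(true);   // keep running even if the browser drops the connection
set_time_limit(300);       // or 0 for no PHP-side execution limit at all

// ... the long table-copying work goes here ...

// Only reaches the user if the browser is still waiting; in CodeIgniter you
// would call redirect() here instead of header().
header('Location: /report/view');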
I am struggling with something.
I have a PHP page that does an AJAX call to another page using jQuery's $.ajax. It sends the request asynchronously to the processing page, which then returns a response.
This works fine now, but we are making some changes to the backend, and the processing (a SQL stored procedure) now takes a lot longer, well over 5 minutes. The wait is fine, because we are dealing with close to 200MM records in SQL.
The thing is, I need to be able to send the request to the processing page and not have to wait for a response. The processing page fires off the stored procedure in PHP like this:
$query = $dbh2->prepare('exec sp_name :countID');
$query->bindParam(':countID', $countID);
$query->execute();
Now, again, that stored procedure takes a while to run, and we do not need its results to be presented back to the user. There is some additional PHP code that needs to run after the stored procedure, but again, nothing needs to be sent back to the browser.
I am trying to figure out a way to make a call to the processing page so that it runs the stored procedure and the other code, but the user's browser does not have to wait for the response. Right now, if they try to click off the page too soon, it basically locks up the browser for a while and the processing does not finish.
Any insight into this would be great.
Thanks in advance for any help.
Sequenzia, if I understand correctly, then I've been here and found a way through this quagmire after a lot of research.
I provided an answer to a similar question a few months ago. Unfortunately, neither the OP nor anyone else has ever accepted/commented/upvoted/downvoted it - nada.
Run a batch file from my website
And here are some useful references:
Running a background script (unix command)
Ref: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/
Ref: http://www.mathinfo.u-picardie.fr/asch/f/MeCS/courseware/users/help/general/unix/redirection.html
How to compose PHP $shortopts and $longopts
This is the way to interpret parameters passed to a PHP script when it is run from the command line, or from another PHP script with shell_exec().
Ref: http://www.php.net/manual/en/function.getopt.php
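To make the pattern concrete, here is a rough sketch under assumptions: the worker script name (process_counts.php), its path, and the --countID option are all made up for illustration; only exec(), escapeshellarg() and getopt() are standard PHP.

// Caller (the page the AJAX request hits): fire the worker and return at once.
// The output redirection plus the trailing '&' are what stop PHP waiting for it.
exec('php /path/to/process_counts.php --countID=' . escapeshellarg($countID)
    . ' > /dev/null 2>&1 &');

// Worker (process_counts.php, run from the command line):
$options = getopt('', array('countID:'));   // long option, value required
$countID = $options['countID'];
// ... run the stored procedure and the follow-up PHP code here, no browser involved ...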
You might look at setting the timeout option for the $.ajax() method. By setting a timeout of maybe half a second or whatever, the AJAX call will just time out and go into the error handler (if any).
Simple question, but I can't seem to find the answer.
My PHP code takes a really long time to run because I'm generating a report from a large database. I coded an HTML table to display the results in a web page, but the page loads (and gets sent to clients) before my PHP code finishes, so all the table values are empty. I ran the query in phpMyAdmin and it works, it just takes a long time. Ideas? Are there any other ways I can display the report in a table format besides showing it in a web page? Can I make the web page wait until the code finishes?
There are several approaches.
One is using output buffering and explicitly flushing:
ob_start();   // start output buffering
// ... processing, echoing partial results as you go ...
ob_flush();   // flush PHP's output buffer
flush();      // flush the web server's buffer out to the browser
The next is adding pagination, i.e. limiting the result size:
SELECT * FROM table LIMIT 0,10
SELECT * FROM table LIMIT 10,10
SELECT * FROM table LIMIT 20,10
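A rough sketch combining both ideas, fetching the report in LIMIT-sized chunks and flushing each chunk of table rows to the browser as it becomes ready; the PDO connection $pdo and the table name report_table are placeholders, not from the question.

$offset = 0;
$chunk  = 10;

ob_start();
do {
    // Grab the next slice of the result set.
    $stmt = $pdo->query(sprintf('SELECT * FROM report_table LIMIT %d, %d', $offset, $chunk));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        echo '<tr><td>' . implode('</td><td>', array_map('htmlspecialchars', $row)) . '</td></tr>';
    }

    ob_flush();   // push this chunk of rows out to the browser
    flush();

    $offset += $chunk;
} while (count($rows) === $chunk);   // a short last chunk means we're done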
Of course, it all depends on your code; without seeing it, there's only guessing at what the reason might be.
Can I make the webpage wait until the code finishes?
It's really, really difficult to write PHP code which makes asynchronous database calls, which is what it would have to do for the PHP script to complete before the MySQL query does. Just strip out all the asynchronous handlers in the PHP code and make the MySQL calls blocking, and it will not exit before the queries complete - but I very much doubt that is what your code is really doing.
but the page loads (and gets sent to clients)
This is confused too - if you're generating HTML, then the page is loaded after the HTML is sent to the client, not before.
Simple question
No it's not - it's very confused!
The prudent way
One of the correct approaches, at least one I'd recommend, is to add the job to a queue upon the user's request and have the queue handled by a background process, for example a PHP command-line script run from a cron job. While that is going on, you can periodically request the job status from the server via an AJAX call from your web page, display progress if you can, and present the user with the result once the job is finished. Since command-line PHP scripts have no time limit, you don't have to worry about timeouts.
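As a bare-bones sketch of that queue approach, assuming a made-up jobs table with id, status and created_at columns, a PDO connection in $pdo, and a worker.php run from cron (none of these names come from the question):

// Web request: just enqueue the job and return immediately.
$pdo->prepare("INSERT INTO jobs (status, created_at) VALUES ('pending', NOW())")->execute();

// worker.php (run from cron, so no web-server or PHP time limit applies):
$job = $pdo->query("SELECT id FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")
           ->fetch(PDO::FETCH_ASSOC);
if ($job) {
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute(array($job['id']));
    // ... generate the report here ...
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute(array($job['id']));
}

// A small status endpoint, polled by AJAX from the page, can then just echo the job's current status.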
Another way is what is implemented, for example, in 37signals' Highrise: they take the request and add the job, but display a page saying "It will be ready when it's ready," and when it is ready, they send the user an email saying "Here's your file, come here and download it."
The quick fix
To answer the question "Can I make the webpage wait until the code finishes?": there is the set_time_limit() function, which does exactly what you want.
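For example (this only lifts PHP's own execution limit; the web server or the browser can still give up on their own):

set_time_limit(0);   // 0 = no PHP execution time limit for this request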
tl;dr: Is it possible to feed some "dummy spaces" back to the browser periodically while waiting for a SQL query to execute? This is to keep the browser from hanging up on me while nothing is being returned.
Longer story:
I've made a small "web tool" against a database (MS SQL, using their PDO driver).
Sometimes, the queries that I run take a long time.
After about 100 seconds, the browser just stops "rotating". I don't yet know what causes this, but it is the same with Firefox and Chrome. The stack is PHP 5.3, IIS 6, FastCGI. It is neither a PHP nor a DB/SQLSRV timeout, as I've increased both of those, and other queries I have take a longer time to feed back the whole result. (I can reproduce the problem by writing some header, sleeping for 110 seconds, and then writing the footer. Only the header part is then shown.)
The problem with the present query is that it doesn't feed anything back for about 200 seconds, and then the whole thing comes at once. But this doesn't help when something along that stack has stopped listening/receiving/transmitting after about 100 seconds.
Thus, the question: is it possible to trickle-feed the browser some dummy spaces while the script is waiting for the SQL to return? In my native tongue of Java this would be trivial, but PHP is, AFAIK, utterly single-threaded (actually, "single-process'd"). I know that this trickling would work, as I have other scripts that take much longer in total but continually send small pieces of the result back to the browser - these render just fine.
Not if you only intend to run one query. However, depending on the nature of your query, you can probably just split it up into multiple smaller queries, and then loop through those.
Contrary to your other answers and comments, you CAN "trickle-feed" data to the browser if you split your calls up. You're looking for the flush() function.
DEMO
<?php
// Emit a space every second for 200 seconds, flushing each one straight out.
for ($i = 0; $i <= 200; $i++) {
    sleep(1);
    echo ' ';
    flush();   // push the space to the browser instead of letting it sit in a buffer
}
echo 'It worked!';
?>
Try running this. It should take 200 seconds. However, because flush() is there, it'll send data to the browser after each iteration of the loop, and hopefully not time out! My boss's web host times out after 30 seconds of inactivity (Rackspace, grrrr!) so I've had to use this very same trick countless times.
PHP does not send output to the browser as you echo it. It writes the contents to a buffer, and sends the entire contents to the browser at once. So, no, you cannot trickle output to the browser.
flush() doesn't work very well; there is a better way. You can pipe the output to a file and then use an independent PHP script that only ever reads the last line, then use AJAX on the client to poll that independent script for the last line every 200 ms. This gives the effect you want. I am working on it now and will get the code here ready when I do.
EDIT1: Haha, Endre, it's you! <3 from England, you little atheist you x
EDIT2: The timeout comes from a plethora of subtle parameters in and out of php.ini, mostly undocumented in the sense that they seem unrelated, but also do NOT underestimate the browser being lame. In my experience I was only ever able to eradicate timeouts utterly once, and only then on Firefox, but I was able to run/poll a 3-hour-long script this way (sadly it was an AD migration script at my old job and I don't have the code).
EDIT3: You can "thread" in PHP by using the PHP CLI and then the process ID, or by using curl. Both are fairly ewww, but you CAN do it.
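A rough sketch of the log-file idea described above; the file names (/tmp/progress.log, progress.php) are made up for illustration:

// In the long-running script: append a progress line after each step.
file_put_contents('/tmp/progress.log', date('H:i:s') . " finished step 3 of 10\n", FILE_APPEND);

// progress.php: return only the last line of the log for the AJAX poller.
$lines = file('/tmp/progress.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
echo $lines ? end($lines) : 'no progress yet';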
I'm using PHP/MySQL, although I think this question is essentially language/DB agnostic. I have a PHP script that connects to one API, gets the response data, parses it, and then sends it to a different API for storage in its database. Sometimes this process fails because of an error with one of the APIs. I would therefore like an easy way to track its success or failure.
I should clarify that "success" in this case is defined as the script getting the data it needs from the first API and successfully having it processed by the second API. Therefore, "failure" could result from 3 possible things:
First API throws an error
Second API throws an error
My script times out.
This script will run once a day. I'd like to store the success or failure result in a database so that I can easily visit a webpage and see the result. I'm currently thinking of doing the following:
Store the current time in a variable at the start of the script.
Insert that timestamp into the database right away.
Once the script has finished, insert that same timestamp into the database again.
If the script fails, log the reason for failure in the DB.
I'd then gauge success or failure based on whether a single timestamp has two entries in the database, as opposed to just one.
Is this the best way to do it, or would something else work better? I don't see any reason why this wouldn't work, but I feel like some recognized standard way of accomplishing this must exist.
A user-defined shutdown function might be an alternative: using register_shutdown_function() you can declare a callback to be executed when the script terminates, whether successfully, user-aborted, or timed out.
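A minimal sketch of that, assuming a PDO connection in $pdo and a made-up script_runs logging table (neither comes from the question):

// Runs whenever the script ends: normal completion, fatal error, or timeout.
register_shutdown_function(function () use ($pdo) {
    $error  = error_get_last();   // non-null if a fatal error or timeout killed the script
    $status = ($error === null) ? 'success' : 'failure: ' . $error['message'];
    $pdo->prepare('INSERT INTO script_runs (ran_at, status) VALUES (NOW(), ?)')
        ->execute(array($status));
});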
You could use a lock file:
at the very beginning of your script, you create a lock file somewhere on the filesystem
at the very end of your script, if everything went well, you delete it from the filesystem
Then you just have to monitor the directory where you've placed these files. From a lock file's creation date you can tell which day didn't work.
You can combine this with a monitoring script that sends alerts if lock files are present and have a creation date older than a given interval (say 1 hour, for example, if your script usually runs in a few minutes).
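A quick sketch of the lock-file approach; the directory, file naming and alert address are just illustrative choices:

// Daily script: create the lock up front, remove it only on full success.
$lockFile = '/var/locks/api_sync_' . date('Y-m-d') . '.lock';
touch($lockFile);
// ... fetch from the first API, push the parsed data to the second API ...
unlink($lockFile);

// Monitoring script: alert on lock files older than an hour.
foreach (glob('/var/locks/*.lock') as $file) {
    if (time() - filemtime($file) > 3600) {
        mail('admin@example.com', 'API sync job appears stuck or failed', $file);
    }
}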
I have a PHP script that takes about 10 minutes to complete.
I want to give the user some feedback as to the completion percent and need some ideas on how to do so.
My idea is to call the php page with jquery and the $.post command.
Is there a way to return information from the PHP script without ending the script?
For example, from my knowledge of this now, if I return the variable, the PHP script will stop running.
My idea is to split the script into multiple PHP files and have the .post run each one after the previous one returns.
But this still will not give an accurate estimate of the time left, because each script will be a different size.
Any ideas on a way to do this?
Thanks!
You can echo and flush() output, but that's a suboptimal and rather fragile solution.
For long operations it might be a good idea to launch the script in the background and store/update the script's status in a shared location.
e.g. you could launch the script using an fopen('http://…') call, a proc_open() PHP CLI process, or even just by opening the long-running script in an <iframe>.
You could store the status in the database or in shared memory (using apc_store()).
This lets the user check the status of the script at any time (by refreshing the page, or using AJAX), and the user won't lose track of the script if the browser's connection times out.
It also lets you avoid starting the same long script twice.
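A small sketch of the shared-memory status idea using apc_store()/apc_fetch(); the key name, the $steps array and the status endpoint are placeholders for whatever your script actually does:

// Long-running script: record its own progress as a percentage after each chunk of work.
foreach ($steps as $i => $step) {
    // ... do one chunk of the 10-minute job ...
    apc_store('long_job_progress', (int) round(($i + 1) / count($steps) * 100));
}

// status.php, polled with $.post from the page: report the stored percentage.
echo apc_fetch('long_job_progress') ?: 0;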