PHP: bypass max_execution_time by having a script repeatedly call itself via header("Location")

I've got a script that does heavy database manipulation.
It worked fine until I reached a higher number of databases to run the manipulations against; now I hit the max_execution_time timeout.
Is it possible to handle the databases one by one by redirecting to the same script via header("Location") with a parameter for the next database?
Or would this not affect max_execution_time and the timeout anyway?

Try this:
Add this line at the top of your PHP script:
ini_set('max_execution_time', 0);
But remember one thing: it is not a good trick to rely on. You should review your script to minimize its execution time.
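If raising the limit is not an option (many shared hosts ignore it), here is a rough sketch of the self-redirecting approach from the question. It assumes a hypothetical process_database() helper and a known list of database names; each redirect starts a fresh request with a fresh execution-time budget:

<?php
// Sketch only: process one database per request, then redirect to the next.
$databases = ['db1', 'db2', 'db3'];                // example names
$index = isset($_GET['db']) ? (int)$_GET['db'] : 0;

if ($index < count($databases)) {
    process_database($databases[$index]);          // hypothetical heavy work for one database

    // Redirect to this same script for the next database.
    header('Location: ' . $_SERVER['PHP_SELF'] . '?db=' . ($index + 1));
    exit;
}

echo 'All databases processed.';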

Related

Creating a long TCPDF document without timeout (a long-running PHP process)

I'm building a feature of a site that will generate a PDF (using TCPDF) as a booklet of 500+ pages. The layout is very simple, but just due to the number of records I think it qualifies as a "long-running PHP process". This only needs to be done a handful of times per year, and if I could just have it run in the background and email the admin when it is done, that would be perfect. I considered cron, but it is a user-triggered kind of feature.
What can I do to keep my PDF rendering for as long as it takes? I am "good" with PHP but not so much with *nix. Even a tutorial link would be helpful.
Honestly, you should avoid doing this in the web request entirely, from a scalability perspective. I'd use a database table to "schedule" the job with its parameters, and have a script that continuously checks this table. Then use JavaScript to poll your application until the file is "ready"; when it is, let the JavaScript pull the file down to the client.
It will be incredibly hard to maintain and troubleshoot this process when you're wondering why your web server is suddenly so slow. Apache doesn't make it easy to determine which process is eating up which CPU.
Also, by using a database you can do things like limit the number of concurrent workers, or even speed up rendering by letting multiple processes render individual PDF pages and then reassemble them with yet another process, etc.
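A very rough sketch of that idea, assuming a hypothetical pdf_jobs table (id, status, params) and a render_booklet() function; the worker would be started from the command line or a service wrapper, not from Apache:

<?php
// Sketch only: poll the job table and render queued booklets one at a time.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // example credentials

while (true) {
    $job = $pdo->query("SELECT * FROM pdf_jobs WHERE status = 'pending' LIMIT 1")
               ->fetch(PDO::FETCH_ASSOC);

    if ($job) {
        $pdo->exec("UPDATE pdf_jobs SET status = 'running' WHERE id = " . (int)$job['id']);
        render_booklet($job['params']);   // hypothetical long-running TCPDF work
        $pdo->exec("UPDATE pdf_jobs SET status = 'done' WHERE id = " . (int)$job['id']);
    } else {
        sleep(5);                         // nothing queued, wait before polling again
    }
}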
Good luck!
What you need is to change the maximum execution time allowed for PHP scripts. You can do that in several ways: from the script itself (prefer this if it works for you) or by changing php.ini.
BEWARE: changing the execution time might seriously lower the performance of your server. A script is allowed to run only a certain time (30 seconds by default) before it is terminated by the parser; this helps prevent poorly written scripts from tying up the server. You should know exactly what you are doing before you change this.
You can find some more info about:
setting max-execution-time in php.ini here http://www.php.net/manual/en/info.configuration.php#ini.max-execution-time
limiting the maximum execution time by set_time_limit() here http://php.net/manual/en/function.set-time-limit.php
PS: This will work if you use PHP itself to generate the PDF. It will not work if you call something outside the script (via exec(), system() and the like).
This question is already answered, but prompted by other questions and answers here, this is what I did and it worked great (I did the same thing using pdftk, but on a smaller scale):
I put the following code in an iframe:
set_time_limit(0);                      // ignore the PHP timeout
// ignore_user_abort(true);             // optional: keep going even if the user pulls the plug
while (ob_get_level()) ob_end_clean();  // remove all output buffers
ob_implicit_flush(true);                // flush output as soon as it is produced
This avoided the page load timeout. You might want to put a countdown or progress bar on the parent page. I originally had the iframe issuing progress updates back to the parent, but browser updates broke that.
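For reference, a minimal sketch of emitting progress from the iframe once the buffers are cleared as above; render_page() is a hypothetical stand-in for the per-page TCPDF work:

$total = 500;
for ($i = 1; $i <= $total; $i++) {
    render_page($i);                          // hypothetical per-page rendering
    echo "Rendered page $i of $total<br>\n";
    flush();                                  // push the progress line to the browser immediately
}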

Expanding PHP execution time limit

I have a question about PHP's execution time limit. I need to run a script for many hours sending HTTP requests. These requests have to be a certain time apart, which is why the whole thing is supposed to take hours. Does anyone have experience setting this kind of time limit for PHP using the line below? For example:
ini_set('max_execution_time', 28800); // 8 hours
Strange question, I know, but let me know whether this would work or not. TIA!
Update: I was going to run it from the browser. I'm not familiar with running PHP scripts from the command line, so I should look into that. I did find an alternate way to get the information that the HTTP requests would have retrieved; it turns out we have a database where some of it has already accumulated locally over a long period of time.
Are you running this from the browser or from the CLI? If from the CLI (as you should with such a script), there is no execution time limit (i.e. max_execution_time is hardcoded to 0).
set_time_limit(28800);
Some (shared) hosts do not allow this.
What I would suggest is to keep a log of when your last attempt was (a Unix timestamp), and use cron to run a script that checks whether it is time to make the next HTTP request; if it is, make the request and update the timestamp in the file to the current time.
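For example, a rough sketch of that cron-driven approach (the interval, file path, and target URL are placeholders):

<?php
// Run every minute from cron; only fire the HTTP request when enough time has passed.
$logFile  = __DIR__ . '/last_request.txt';
$interval = 3600;                                      // seconds between requests (example)

$last = file_exists($logFile) ? (int)file_get_contents($logFile) : 0;

if (time() - $last >= $interval) {
    file_get_contents('https://example.com/endpoint'); // the HTTP request to send
    file_put_contents($logFile, time());               // remember when we sent it
}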
Hope that helps

Run a PHP script every 100ms

Is it possible to run a PHP script every 100 ms? The script will check a database for changes, and the changes will then be reflected into another database. I am already doing this using triggers, but I want to know if there is any other way to do it without using cron or triggers. I will be using Linux for this.
Thanks
Running something every 100 ms practically means it runs all the time, so you might as well create a daemon that continuously loops and executes,
or use triggers. Essentially, on every database change it will copy the change to another table/db.
http://codespatter.com/2008/05/06/how-to-use-triggers-to-track-changes-in-mysql/
It is not possible to do this with cron (it has a maximum frequency of one minute), and it is a really bad idea anyway. You would be starting a whole new PHP interpreter ten times per second, not to mention making a database connection each time.
Far better would be to run one program that re-uses its connection and checks every second or so.
Sounds a little like you are trying to build your own database replication or sync between two databases.
You could write a daemon to do it: essentially a script that runs continually in memory and executes whatever code you want.
That daemon would then do the database processing for you, and you wouldn't have to call a script over and over again.
Use your favorite programming language and set up a permanent loop to run it every 100 ms, then put the script into inittab with 'respawn' (see man inittab for the complete syntax). Finally, run init q to reload init.
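As a sketch of that idea on a SysV-init system: a tiny PHP loop script (sync_changes() is a hypothetical stand-in for the copying logic), plus the kind of inittab line meant above:

<?php
// loop.php - runs forever; inittab's "respawn" restarts it if it ever dies.
while (true) {
    usleep(100000);   // 100 ms
    sync_changes();   // hypothetical: copy changed rows to the other database
}

and in /etc/inittab something like ms:2345:respawn:/usr/bin/php /path/to/loop.php, then init q to make init re-read the file.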
It's best if you write a little daemon for that. Use the pcntl functions to do so. In your case you might get away with:
<?php
while (1) {
    usleep(100000);                        // wait 100 ms between iterations
    if (pcntl_fork() == 0) {
        // child process: do the actual work, then exit
        include("/lib/100ms-script.php");
        exit;
    }
    // else pcntl_wait(); eventually, to reap the finished children
}
I'm assuming this is in reference to some type of web page being created. If so, this sounds like a job for Ajax, not PHP. As you may already know, PHP processing is done on the server side; once processing is complete, the page is served to the client.
With Ajax/JavaScript, processing can continue in the browser. You can set up a timer that is then used to communicate with the server, and depending on the response the page may be updated to reflect the necessary changes.

PHP session handling when the same client requests the same script multiple times at once

So here's my test setup:
session_start();
if (!isset($_SESSION['bahhhh'])) {
    $_SESSION['bahhhh'] = 0;
}
$_SESSION['bahhhh']++;
sleep(5);
die('a' . $_SESSION['bahhhh']);
What I expect to happen is that each time I hit the page, it returns a different number.
But if I use multiple tabs, and refresh them each within 5 seconds of the first, they all return the same number. (This isn't client side caching, as the 5 second delay is still evident.)
Why is it doing this, and how do I fix this?
It seems to have the same strange caching issue with file and database data as well, and is preventing me from building a working mutex to prevent running the same code more than once at a time.
Here's another, simpler example:
echo microtime();
sleep(10);
Run this 3 times, each 2 seconds apart, and all three return the same microsecond. WTF?
Session data, by default, is not saved until the request terminates, so your increment is not saved while the script is sleeping. If you want to save the session prematurely, check out session_write_close().
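For example, something along these lines saves the counter and releases the session before the slow part (a sketch of the idea, not a drop-in fix):

session_start();
if (!isset($_SESSION['bahhhh'])) {
    $_SESSION['bahhhh'] = 0;
}
$_SESSION['bahhhh']++;
session_write_close();  // write the session now instead of at the end of the request
sleep(5);               // the slow work happens after the session data is persisted
die('a' . $_SESSION['bahhhh']);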
I would have the script itself append something to a log file to verify the script is actually getting executed as many times as you think. Maybe you have some software somewhere that is intercepting the request and returning a cached response.
If it weren't for your comment that this also happens with microtime(), I would have given an explanation of how PHP manages concurrency with sessions, and when it might not.
Apparently this is some bug in my browser itself. Opera behaves this way while Internet Explorer does not.
I did initial testing in IE with the same results but with more complex code. Apparently that complex code had an error that triggered the misbehavior in IE, and this simplified code does not.
Sorry to bother everyone.

How to execute a PHP spider/scraper without it timing out

Basically I need to get around the max execution time.
I need to scrape pages for info at varying intervals, which means calling the bot at those intervals to load a link from the database and scrape the page the link points to.
The problem is loading the bot. If I load it with JavaScript (like an Ajax call), the browser will throw up an error saying that the page is taking too long to respond, yadda yadda yadda, plus I would have to keep the page open.
If I do it from within PHP I could probably extend the execution time to however long is needed, but then if it does throw an error I don't have access to kill the process, and nothing is displayed in the browser until the PHP execution is completed, right?
I was wondering if anyone had any tricks to get around this: the scraper executing by itself at various intervals without me needing to watch it the whole time.
Cheers :)
Use set_time_limit() as such:
set_time_limit(0);
// Do Time Consuming Operations Here
"nothing is displayed in the browser until the PHP execute is completed"
You can use flush() to work around this:
flush()
(PHP 4, PHP 5)
Flushes the output buffers of PHP and whatever backend PHP is using (CGI, a web server, etc). This effectively tries to push all the output so far to the user's browser.
Take a look at how Sphider (a PHP search engine) does this.
Basically you just process some part of the sites you need, do your thing, and go on to the next request if a continue=true parameter is set.
Run it via cron and split the spider into chunks, so each run only does a few chunks; call it from cron with different parameters to process only a few chunks at a time.
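A rough sketch of that chunked, cron-driven approach; the chunk size, links table, and fetch_and_scrape() helper are assumptions:

<?php
// Called from cron (e.g. every 5 minutes); each run handles only a small batch,
// so no single execution gets anywhere near max_execution_time.
$chunkSize = 10;
$pdo = new PDO('mysql:host=localhost;dbname=spider', 'user', 'pass'); // example credentials

$links = $pdo->query("SELECT id, url FROM links WHERE scraped = 0 LIMIT $chunkSize")
             ->fetchAll(PDO::FETCH_ASSOC);

foreach ($links as $link) {
    fetch_and_scrape($link['url']);                                   // hypothetical scraping routine
    $pdo->exec("UPDATE links SET scraped = 1 WHERE id = " . (int)$link['id']);
}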
