Execution time handling for functions in a loop - PHP

I'm facing some issues with a long-running loop.
It goes through around 40,000 URLs that it scrapes; however, some of the external pages can be unreachable at that exact moment, etc.
If the script hits the maximum execution time, the whole function stops, so I'm at a loss for how to work around that.
I've been scouring the net for answers and found how to catch the timeout, so I wrote a function like this:
ini_set('max_execution_time', 15);

function shutdown()
{
    $a = error_get_last();
    if ($a == null)
        DoMyStuff();
    else
        DoMyStuff();
}

register_shutdown_function('shutdown');

foreach ($urls as $url) { // "foreach blabla" in the original: the list of URLs
    shutdown();
}
My thinking is that this will catch a timeout and give the URL another attempt if one occurs.
But that's not really good enough.
Ideally I'd want a script that tries my function up to 3 times, and if it times out all 3 times, moves on to the next item in the loop.
But I cannot for the life of me figure out how to do that.
Any help is greatly appreciated, SO! =)
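One way to get that "three attempts, then move on" behavior (a hedged sketch, not from the thread: the helper name, the $urls array, and the stream-context timeout are my own assumptions) is to give each fetch its own timeout, so a dead URL fails fast instead of exhausting the script-wide limit, and wrap the fetch in a bounded retry loop:

```php
<?php
// Per-request timeout: a slow or dead URL gives up after 15 seconds
// instead of running into PHP's script-wide max_execution_time.
$context = stream_context_create(['http' => ['timeout' => 15]]);

// Hypothetical helper: try the fetch up to $maxTries times and
// return false if every attempt fails, so the caller can skip the URL.
function fetchWithRetries($url, $context, $maxTries = 3)
{
    for ($try = 1; $try <= $maxTries; $try++) {
        $html = @file_get_contents($url, false, $context);
        if ($html !== false) {
            return $html;   // success, no further attempts needed
        }
    }
    return false;           // failed/timed out 3 times
}

$urls = []; // ...the ~40,000 URLs to scrape...
foreach ($urls as $url) {
    set_time_limit(60);     // restart the script timer for each URL
    $html = fetchWithRetries($url, $context);
    if ($html === false) {
        continue;           // give up on this URL, move to the next
    }
    // ...scrape $html...
}
```

This avoids the shutdown-handler approach entirely: a timeout becomes an ordinary `false` return instead of a fatal error.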

Related

How to avoid php timeout while running this loop?

I'm writing a function that is supposed to update the price of all WooCommerce products. The price data will come from Amazon via the Amazon API, which has a certain x-queries-per-second limit; that's why I have to sleep on each loop iteration. I'm planning to run the function as a cron job. The function below is just an example of what I'm planning to do, so please ignore the missing variable declarations etc.
I understand that I can increase the PHP timeout limit, but imagine I have hundreds or thousands of products: on each query I have to sleep for a while to avoid query throttling, so updating all products could take hours. I'm wondering what the best and easiest solution is to keep that function looping for hours and stop after reaching the last id in the $products_ids array.
function text() {
    foreach ($products_ids as $products_id) {
        // apm_single_lookup does an API call with a max-queries/sec limit; that's why the sleep is needed
        $lookup_data = apm_single_lookup($eu_asin, $amazon_domain);
        update_post_meta($products_id, $field_name, esc_attr($lookup_data['price']));
        sleep(1);
    }
}
You can change the max_execution_time setting of the server.
Or use set_time_limit():
http://php.net/manual/fr/function.set-time-limit.php
Like this:
set_time_limit(3600);
function text() {
    ...
}
Or another solution: split your loop into multiple Ajax calls (or cron jobs), so you can stop and resume wherever you want.
Like this:
function text() {
    foreach ($products_ids as $products_id) {
        set_time_limit(60); // time limit per loop iteration
        // apm_single_lookup does an API call with a max-queries/sec limit; that's why the sleep is needed
        $lookup_data = apm_single_lookup($eu_asin, $amazon_domain);
        update_post_meta($products_id, $field_name, esc_attr($lookup_data['price']));
        sleep(1);
    }
    //set_time_limit(5*60); // set it back to something longer after the loop if you want
}
In the case of a loop, I think it's better to reset the timer on each iteration.
When called, set_time_limit() restarts the timeout counter from zero.
This way you can keep the timeout small per iteration and not worry about how many iterations you have.
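The cron-job split suggested above could be sketched like this (a generic version of my own; the state file, function name, and batch size are assumptions, not from the thread): each run processes the next slice of IDs and records where it stopped, so no single run has to stay alive for hours.

```php
<?php
// Return the next slice of $ids and persist the new offset in a
// small state file, wrapping back to the start once the end is reached.
function nextBatch(array $ids, $stateFile, $batchSize = 50)
{
    $offset = file_exists($stateFile) ? (int) file_get_contents($stateFile) : 0;
    $batch  = array_slice($ids, $offset, $batchSize);
    $next   = $offset + count($batch);
    file_put_contents($stateFile, $next >= count($ids) ? 0 : $next);
    return $batch;
}

// Each cron run then only handles one batch:
// foreach (nextBatch($products_ids, '/tmp/price_sync.offset') as $products_id) {
//     $lookup_data = apm_single_lookup($eu_asin, $amazon_domain);
//     update_post_meta($products_id, $field_name, esc_attr($lookup_data['price']));
//     sleep(1);
// }
```

With 50 products and a 1-second sleep, each run finishes in about a minute regardless of how large the catalog grows.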
It's even better to do this from the CLI (command line). Even when you set PHP's max execution time, Apache has its own time limits too (likely in mod_fcgi etc.).
Put this code before the loop:
ini_set('max_execution_time', 0);
or
set_time_limit(0);

Repeat function

I created a function that, if it doesn't get a value, repeats and rechecks again and again. That part is OK, but the problem is that every time, after checking 2 or 3 times, it takes my connection down.
My function :
<?php
function recursive() {
    if (file_exists("pol.txt")) {
        print "ok";
    } else {
        print "bad";
        usleep(1);
        recursive();
    }
}
recursive();
?>
The idea is that the function keeps testing and stops when another function creates the file called pol.txt. The problem is the function itself: within about 2 seconds it takes the connection down in every browser.
I tested both on localhost and on a server, with the same results.
My question is whether it's possible to run this kind of function at all, or whether there is another way to do this without refreshing the page each time, because I can't use a refresh in this case.
Thanks, regards
The problem is that you are using usleep(). That delays execution in microseconds, so within two seconds you reach a very deep level of recursion, causing your script to error out.
For this you should probably just use a loop and make sure it ends after some number of seconds or iterations.
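A sketch of that loop (the function name and timings are mine, not from the answer): poll for the file with a real pause between checks and a hard cap on the total wait, so there is no recursion to blow up.

```php
<?php
// Check for the file in a plain loop instead of recursing;
// give up after $timeoutSeconds so the script can't spin forever.
function waitForFile($path, $timeoutSeconds = 30)
{
    $deadline = time() + $timeoutSeconds;
    while (time() < $deadline) {
        clearstatcache();       // don't let PHP cache the stat result
        if (file_exists($path)) {
            return true;        // another process created the file
        }
        usleep(250000);         // 250 ms between checks, not 1 microsecond
    }
    return false;               // file never appeared in time
}

// Usage, replacing the recursive version:
// if (waitForFile('pol.txt')) { print "ok"; } else { print "bad"; }
```

At 250 ms per check this is at most 120 stat calls over 30 seconds, instead of hundreds of thousands of recursive calls in 2 seconds.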

How to run a php function after certain time without using cron

I have a block of code that I want to run at certain time intervals, say every 3 seconds. If I get the required result it should stop; otherwise another request should be sent after 3 seconds. Any ideas on how to do this with PHP?
Thanks in advance for any help.
You can sleep in a loop:
while (true) {
    $result = doSomething();        // placeholder for your request
    if (resultIsGood($result)) {    // placeholder for your success check
        break;
    }
    sleep(3);
}
If you don't want to keep the browser waiting, you can look up ignore_user_abort() solutions on Google.
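A minimal sketch of that combination (doSomething() here is a stub of my own, standing in for the real request): let the script keep looping even if the visitor closes the tab.

```php
<?php
// Keep running after the browser disconnects, with no script timeout.
ignore_user_abort(true);
set_time_limit(0);

// Stub standing in for the real request; this one "succeeds" on the 2nd try.
function doSomething()
{
    static $calls = 0;
    return ++$calls >= 2;
}

$attempts = 0;
while (true) {
    $attempts++;
    if (doSomething()) {
        break;          // got the required result, stop polling
    }
    sleep(3);           // wait 3 seconds before the next try
}
```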
If what you want is to execute MySQL queries (and nothing else), the MySQL event scheduler might fit your need:
http://dev.mysql.md/doc/refman/5.5/en/events-overview.html

The bwshare module and PHP scraping

I wrote a script that downloads a list of pages from a website. From time to time I receive the following error (the number of seconds varies):
The bwshare module will refuse your requests for the next 7 seconds.
You have downloaded data too rapidly.
I found that when using sleep(2) in the loop it works much better; however, that much delay is too expensive.
What's the best way to deal with this module? Should I scrape without any delay and, if the response looks like the message above, simply sleep for the requested number of seconds?
It all depends on how many pages you can get before the error message appears.
Try to measure how many pages you can get on average; 4 pages before the bwshare message is the break-even point.
If you are getting the error message before reaching 4 page downloads, it would be faster to sleep(2) after each download.
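The "sleep for the requested number of seconds" idea from the question could be sketched like this (the function name is mine, and the regex assumes the exact wording quoted above; adjust the pattern to the real bwshare message):

```php
<?php
// If the body is a bwshare throttle message, return how many seconds
// it asks us to wait; return 0 for a normal page.
function parseBwshareDelay($body)
{
    if (preg_match('/refuse your requests for the next (\d+) seconds/', $body, $m)) {
        return (int) $m[1];
    }
    return 0;
}

// Usage sketch: fetch with no delay, back off only when told to.
// $body = file_get_contents($url);
// if ($wait = parseBwshareDelay($body)) {
//     sleep($wait + 1);   // wait it out, then retry the same page
// }
```

This way the scraper only pays the delay when it is actually throttled, rather than on every request.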
Try this approach; it might help.
$requestTime = 0.1; // seconds per connection
foreach ($urls as $url) {
    $start = microtime(true);
    // Do your work here: get_file_content($url) and other processing
    if (($timeTaken = microtime(true) - $start) < $requestTime) {
        usleep(($requestTime - $timeTaken) * 1000000);
    }
}
If this solves your problem, please post your answer so that other people can benefit as well.

PHP Retry Script until Success or Error

I'm trying to create a PHP script which retries another PHP script up to 3 times until an error is displayed. I'm thinking perhaps this could be done with a loop? If the code works the first time, there is no need to retry 3 times; however, if it doesn't work the first time, it should retry up to 3 times before an error message is displayed.
Using PHP, I've managed to write a script which fetches content from another location using file_get_contents and then assigns each word to a PHP variable. All this was done with help from other members on Stack Overflow (which I greatly appreciate). The code below does it all:
$searchdata = file_get_contents('http://www.example.com');
list($no1, $no2, $no3, $no4, $no5,
     $no6, $no7, $no8, $no9) = explode(" ", $searchdata);
So I'd like to add some sort of loop which retries this script up to 3 times if it doesn't work the first time. To determine whether a try worked, the text "#endofscript" or "failure" should be found in the content returned by file_get_contents. If anything else is returned, that should count as an error and the fetch should be retried. If the text still isn't found after the third try, could an error message be displayed, such as "Error - Please try again"?
Thank you for all your assistance; I appreciate every single reply. If you need more details, please feel free to ask. :)
$maxTries = 3;
for ($try = 1; $try <= $maxTries; $try++) {
    // your code here; set $success depending on whether it worked
    if ($success) {
        break;
    }
}
// if $try > $maxTries here, the script failed all 3 attempts
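Putting that together with the question's file_get_contents code (the helper name is mine; success is defined, as in the question, by finding "#endofscript" or "failure" in the response):

```php
<?php
// Fetch $url up to $maxTries times; a try only counts as successful
// when the response contains "#endofscript" or "failure".
function fetchSearchData($url, $maxTries = 3)
{
    for ($try = 1; $try <= $maxTries; $try++) {
        $data = @file_get_contents($url);
        if ($data !== false &&
            (strpos($data, '#endofscript') !== false ||
             strpos($data, 'failure') !== false)) {
            return $data;   // marker found, stop retrying
        }
    }
    return false;           // all tries failed
}
```

The original call then becomes `$searchdata = fetchSearchData('http://www.example.com');`, followed by `if ($searchdata === false) die('Error - Please try again.');` before the explode().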
