I need help with the correct delay code for running a job. I'm using Laravel 6, and I'm scraping the number of views of YouTube videos.
Let's say I have a table filled with YouTube links. When scraping, I need to add a delay before moving on to the next link; if I don't, YouTube will block my access. I want to add a delay of about 30 seconds for each loop iteration.
Here is what I have already tried in my app/Console/Commands/GetYoutube.php file:
use App\Models\Youtube;
use App\Jobs\GetYoutubeView;
use Carbon\Carbon;
...
public function handle()
{
    $youtubes = Youtube::get();
    foreach ($youtubes as $youtube) {
        GetYoutubeView::dispatch($youtube->link)->delay(Carbon::now()->addSeconds(30));
    }
}
I tried this code, but it still doesn't add a 30-second delay for each iteration.
The other thing I tried was adding sleep(30); in the job itself. Here is what I did in app/Jobs/GetYoutubeView.php:
public function handle()
{
    sleep(30);
    // Scrape the YouTube total view count here
}
But this gives the same result: it doesn't delay 30 seconds per iteration. What is the correct way to add the delay in my case?
Please help.
When you call dispatch(), the job is dispatched, and you are calling delay() after that. You should create the job and set its delay before dispatching it.
$youtubes = Youtube::get();
$start = Carbon::now();
foreach ($youtubes as $youtube) {
    $job = new GetYoutubeView($youtube->link);
    $job->delay($start->addSeconds(30));
    dispatch($job);
}
EDIT
You were creating a new "now" date on every iteration before. You should create only one date and keep adding 30 seconds to it, so each job is delayed 30 seconds after the previous one.
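To see why the single shared date matters, here is a plain-PHP illustration. It uses DateTime instead of Carbon, but mutable Carbon's addSeconds() modifies its instance in place the same way DateTime::modify() does, so the delays stack up:

```php
<?php
// Illustration of the answer above: mutating ONE date in the loop spaces the
// jobs 30 seconds apart. Calling "now" fresh on each iteration would instead
// give every job the same 30-second delay.
$start  = new DateTime('2024-01-01 00:00:00'); // stand-in for Carbon::now()
$delays = [];
foreach (range(1, 3) as $i) {
    $start->modify('+30 seconds'); // mutates in place, like Carbon's addSeconds()
    $delays[] = $start->format('H:i:s');
}
print_r($delays); // 00:00:30, 00:01:00, 00:01:30 — each job 30 s later
```

With Carbon in Laravel 6 you get the same effect from `$start->addSeconds(30)`, as in the answer's code; just be aware that CarbonImmutable would NOT work here, because it returns a new instance instead of mutating.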
Related
I'm writing a function that is supposed to update the price of all WooCommerce products. I get the price data from Amazon using the Amazon API, which has a certain x-queries-per-second limit, which is why I have to sleep on each loop iteration. I'm planning to run that function as a cron job. The function below is just an example of what I'm planning to do, so please ignore the missing variable declarations, etc.
I understand that I can increase the PHP timeout limit, but imagine I have hundreds or thousands of products: since I have to sleep for a while on each query to avoid throttling, updating all products could take hours. So I'm wondering what is the best and easiest solution to keep that function looping for hours and stop after reaching the last id in the $products_ids array?
function text() {
    foreach ($products_ids as $products_id) {
        // apm_single_lookup() does an API call with a max query/sec limit, hence the sleep below
        $lookup_data = apm_single_lookup($eu_asin, $amazon_domain);
        update_post_meta($products_id, $field_name, esc_attr($lookup_data['price']));
        sleep(1);
    }
}
You can change the max_execution_time setting of the server.
Or use set_time_limit():
http://php.net/manual/fr/function.set-time-limit.php
like this:
set_time_limit(3600);
function text(){
...
}
Or, as another solution:
Split your loop across multiple Ajax calls (or cron jobs), so you can stop and resume as you want.
Like this:
function text() {
    foreach ($products_ids as $products_id) {
        set_time_limit(60); // time limit per loop iteration
        // apm_single_lookup() does an API call with a max query/sec limit, hence the sleep below
        $lookup_data = apm_single_lookup($eu_asin, $amazon_domain);
        update_post_meta($products_id, $field_name, esc_attr($lookup_data['price']));
        sleep(1);
    }
    //set_time_limit(5 * 60); // optionally set it back to a longer limit after the loop
}
In the case of a loop, I think it's better to reset the timer on each iteration.
When called, set_time_limit() restarts the timeout counter from zero.
This way you can keep the timeout small per iteration and not worry about how many iterations you have.
It's even better to do it from the CLI (command line). Even when you set PHP's max execution time, Apache has its own time limits too (likely in mod_fcgi etc.).
Put this code before the loop:
ini_set('max_execution_time', 0);
or
set_time_limit(0);
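The "split your loop across multiple cron runs" suggestion above can be sketched by persisting an offset between runs, so no single run needs a long time limit. This is only an illustration, not code from the question: the file path, chunk size, and product list are made up, and the real lookup/update calls are elided.

```php
<?php
// Hypothetical sketch: resume a long product loop across cron runs by storing
// the current offset in a file. Each run processes one chunk, then exits.
$offsetFile = sys_get_temp_dir() . '/price_sync_offset.txt';
$chunkSize  = 100;
$productIds = range(1, 250); // stand-in for the real $products_ids array

$offset = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;
$chunk  = array_slice($productIds, $offset, $chunkSize);

foreach ($chunk as $id) {
    // apm_single_lookup() + update_post_meta() + sleep(1) would go here
}

if ($offset + $chunkSize >= count($productIds)) {
    @unlink($offsetFile); // finished the whole list: start over on the next run
} else {
    file_put_contents($offsetFile, $offset + $chunkSize); // resume here next time
}
```

With a cron entry every few minutes, 250 products finish in three runs here; the same idea scales to thousands of products without ever hitting a per-request time limit.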
I have a function which adds a new blog post to my database.
After running the add_new_post() function, the post should be approved by another function of mine, approve_post().
But I don't want to approve the post immediately. I want to run the approve_post() function at a random time between 1 and 5 minutes later.
Now I am using this code:
function add_new_post() {
    // ... my code that adds the post ...
    $time = rand(60, 300);
    echo $time;
    sleep($time);
    approve_post();
}
But the browser keeps showing a loading state until the sleep ends. Also, I found that my Windows machine shows an execution-time error.
Is there a way to run the approve_post() function in the background without the browser showing the page as loading?
I would be grateful for a code example.
First, you need to increase the maximum execution time of the PHP script; see https://www.plus2net.com/php_tutorial/max-ex.php
Then use the sleep() function to delay code execution, as in:
<?php
echo date('H:i:s');
sleep(15);
flush();
echo "<br>";
echo date('H:i:s');
?>
Or you can use a cron job.
There are multiple options.
You can use Ajax, which requires a page to stay open, or you can use a cron job. I would prefer a cron job.
The concept is the same either way. For example, the post table must have created_date, approve_date and approve_status columns, so when you create the post, it saves the approve_date too.
For example, you create a post at 18:00, so the approve_date will be 18:05. Then the background process (Ajax or cron job) checks the approve_status first, and executes the approval once the current time reaches the approve_date.
Reference: cron jobs & Ajax.
Sorry for my English.
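The approve_date idea above can be sketched in plain PHP. This is only an illustration: the column names and helper functions are made up, and the arrays stand in for database rows; in the real thing, schedule_approval() would run inside add_new_post() and approve_due_posts() inside the cron script.

```php
<?php
// Hypothetical sketch of the approve_date approach: store a future timestamp
// when the post is created, then let a cron job approve anything whose time
// has passed. No sleep(), so the browser never waits.
function schedule_approval(array &$post): void
{
    $post['approve_status'] = 'pending';
    $post['approve_at']     = time() + rand(60, 300); // 1–5 minutes from now
}

function approve_due_posts(array &$posts, int $now): int
{
    $approved = 0;
    foreach ($posts as &$post) {
        if ($post['approve_status'] === 'pending' && $post['approve_at'] <= $now) {
            $post['approve_status'] = 'approved'; // UPDATE ... in the real thing
            $approved++;
        }
    }
    return $approved;
}
```

The cron entry then just runs a script calling approve_due_posts() every minute; rows whose approve_at has not arrived yet are simply left for a later run.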
I have a function where I want to delay a certain call in the middle, but I don't want to stop the whole function. Even after the last line has run, I want the part I delayed to execute after a designated time. Here's an example of what I mean:
function delayInMiddle()
{
    function call 1
    if (some condition met in 30 seconds) {
        function call 2
    }
    function call 3
}
I want function 1 called, the check for function 2 made, execution to continue to function call 3, and then function call 2 to happen 30 seconds later.
Is this possible in PHP?
How about this as a possible solution.
Instead of calling function 2 in that script, write the relevant data to a table operated as a queue, plus a time before which that queue item should not be processed; that creates your x-second delay.
Then make function 2 a cron job that runs every 30 seconds. It looks at the queue to see what, if anything, there is for it to do.
foreach (row in queue) {
    send twitter;
    delete row from table;
}
exit;
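The queue consumer pseudocode above can be sketched concretely. This is an illustration only: the rows here are plain arrays standing in for database records, and $send stands in for the real action (e.g. posting the tweet):

```php
<?php
// Hypothetical sketch of the queue-table consumer: each row carries a
// "not_before" timestamp. The cron run processes only rows whose time has
// arrived; the rest stay queued for a later run.
function process_queue(array $queue, int $now, callable $send): array
{
    $remaining = [];
    foreach ($queue as $row) {
        if ($row['not_before'] <= $now) {
            $send($row);         // do the delayed work ("send twitter")
        } else {                 // in SQL terms: DELETE the processed row;
            $remaining[] = $row; // rows left here match WHERE not_before > NOW()
        }
    }
    return $remaining;
}
```

In the real version the filtering would happen in the SELECT (`WHERE not_before <= NOW()`) and processed rows would be deleted, but the control flow is the same.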
You can use sleep() for the delay: http://php.net/manual/en/function.sleep.php
I have a products database that synchronizes with product data every morning.
The process is very clear:
Get all products from the database with a query
Loop through all products, fetching an XML from the other server by product_id
Update the data from the XML
Log the changes to a file.
If I query a low number of items, limiting it to 500 random products for example, everything goes fine. But when I query all products, my script SOMETIMES goes on the fritz and starts looping multiple times. Hours later, I still see my log file growing and products being added.
I checked everything I could think of, for example:
Are variables used twice and overwriting each other? No.
Does the function call itself? No.
Does it happen with a low number of products too? No.
The script is called by a cronjob; are the settings OK? Yes.
What makes it especially weird is that it sometimes goes right and sometimes doesn't. Could this be some memory problem?
EDIT
wget -q -O /dev/null http://example.eu/xxxxx/cron.php?operation=sync — it's set up in Webmin, called at a specific hour and minute.
Code is hundreds of lines long...
Thanks
You have:
max_execution_time disabled: your script won't end until the process is complete, for as long as that takes.
memory_limit disabled: there is no limit to how much data can be stored in memory.
500 records completed without issues: this indicates the script finishes its process before the next cronjob iteration. For example, if your cron runs every hour, the 500 records are processed in less than an hour.
If you have a cronjob that is going to process a large number of records, consider adding a lock mechanism to the process: only allow the script to run once, and start again when the previous run is complete.
You can create the lock as part of a shell script before executing your PHP script. Or, if you don't have access to your server, you can use a database lock within the PHP script, something like this.
class ProductCronJob
{
    protected $lockValue;

    public function run()
    {
        // Obtain a lock
        if ($this->obtainLock()) {
            // Run your script if you have a valid lock
            $this->syncProducts();
            // Release the lock on completion
            $this->releaseLock();
        }
    }

    protected function syncProducts()
    {
        // your long-running script
    }

    protected function obtainLock()
    {
        $time = new \DateTime;
        $timestamp = $time->getTimestamp();
        $this->lockValue = $timestamp . '_syncProducts';
        $db = JFactory::getDbo();
        $lock = [
            'lock' => $this->lockValue,
            'timemodified' => $timestamp
        ];
        // lock = '0' indicates that the cronjob is not active.
        // UPDATE #__cronlock SET lock = '', timemodified = '' WHERE name = 'syncProducts' AND lock = '0'
        // $result = $db->updateObject('#__cronlock', $lock, 'id');
        // $lock = SELECT * FROM #__cronlock WHERE name = 'syncProducts';
        if ($lock !== false && (string)$lock !== (string)$this->lockValue) {
            // Currently there is an active process - can't start a new one
            return false;
            // You can return false as above, or add extra logic as below:
            // Check the current lock age - how long it has been running for
            // $diff = $timestamp - $lock['timemodified'];
            // if ($diff >= 25200) {
            //     // The current script has been active for 7 hours.
            //     // You can change 25200 to any number of seconds you want.
            //     // Here you can send a notification email to the site administrator.
            //     // ...
            // }
        }
        return true;
    }

    protected function releaseLock()
    {
        // UPDATE #__cronlock SET lock = '0' WHERE name = 'syncProducts'
    }
}
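If you do control the server, a simpler alternative to a database lock table is a file lock via flock(). This is a hedged sketch, not code from the answer above: the lock path is arbitrary, and run_exclusively() is a made-up helper name.

```php
<?php
// Alternative sketch: single-instance cron via an flock()-based file lock.
// If a previous run still holds the lock, this run skips instead of piling up.
function run_exclusively(string $lockPath, callable $job): bool
{
    $fp = fopen($lockPath, 'c'); // create if missing, don't truncate
    if ($fp === false) {
        return false;
    }
    // LOCK_NB makes this non-blocking: fail fast if another run holds the lock
    if (!flock($fp, LOCK_EX | LOCK_NB)) {
        fclose($fp);
        return false; // previous cron run still active
    }
    try {
        $job(); // e.g. the long-running product sync
    } finally {
        flock($fp, LOCK_UN); // released automatically on crash too, when the
        fclose($fp);         // process exits and the descriptor is closed
    }
    return true;
}
```

A nice property of flock() over a database flag is that the OS releases the lock if the process dies, so a crashed run can't leave the job permanently "locked".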
Your script runs for quite some time (~45 min) and wget thinks it's "timing out" since you don't return any data. By default, wget has a 900 s timeout value and a retry count of 20. So first you should probably change your wget command to prevent this:
wget --tries=0 --timeout=0 -q -O /dev/null http://example.eu/xxxxx/cron.php?operation=sync
Now, removing the timeout could lead to other issues, so instead you could send data from your script (and flush, to force the web server to send it) so that wget doesn't think the script timed out — something every 1000 loops, say. Think of this as a progress bar...
Just keep in mind that you will hit an issue when the run time gets close to your cron period, as two crons will then run in parallel. You should optimize your process and/or add a lock mechanism, maybe?
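The "progress bar" idea above can be sketched as follows. This is an illustration only: the totals are arbitrary and the real per-product sync is elided.

```php
<?php
// Hypothetical sketch: emit and flush a little output every N iterations so
// the HTTP client (wget here) sees traffic and never hits its read timeout.
$total    = 5000;
$progress = '';
for ($i = 1; $i <= $total; $i++) {
    // ... sync one product here ...
    if ($i % 1000 === 0) {
        $progress .= '.';
        echo '.';     // any output keeps the connection alive
        @ob_flush();  // flush PHP's own output buffer, if one is active
        flush();      // ask the web server to push it to the client
    }
}
echo " done\n";
```

Note that some setups (output buffering, gzip, proxy buffering in front of the web server) can still hold the output back, so this is a best-effort technique rather than a guarantee.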
I see two possibilities:
- cron calls the script much more often than intended;
- the script somehow takes too long.
You can try to estimate the time a single iteration of the loop takes.
This can be done with time(). Perhaps the result will be surprising, perhaps not. You can probably get the number of results too; multiply the two, and you will have an estimate of how long the process should take.
$productsToSync = $db->loadObjectList();
and
foreach ($productsToSync AS $product) {
It seems you load every result into an array. This won't work for huge databases, because obviously a million rows won't fit in memory. You should fetch one result at a time; with MySQL there are methods that fetch one row at a time from the resource, and I hope your driver allows the same.
I also see you execute another query on each iteration of the loop. This is something I try to avoid. Perhaps you can move it to after the first query has finished and do all of those in one big query? On the other hand, that may conflict with my first suggestion.
Also, if something goes wrong, be paranoid when debugging: measure as much as you can, and time as much as you can when it's a performance issue. Put the timings in your log file; you will usually find the bottleneck.
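The "fetch one result at a time" advice can be sketched with PDO. This is a hedged illustration, not the asker's code: streamRows() is a made-up helper, and the commented-out DSN and query are placeholders.

```php
<?php
// Hypothetical sketch: stream rows one at a time instead of loadObjectList().
// With PDO + MySQL, turning off buffered queries makes fetch() pull rows from
// the server lazily; the generator keeps only one row in memory at a time.
function streamRows(Traversable $stmt): Generator
{
    foreach ($stmt as $row) {
        yield $row; // hand back one row, keep memory flat
    }
}

// Real usage would look something like (credentials are placeholders):
// $db = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
// $db->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
// foreach (streamRows($db->query('SELECT * FROM products')) as $product) {
//     // sync one product, then let it be garbage-collected
// }
```

PDOStatement implements Traversable, so the same helper works over a live statement or, as in a test, over any other iterator.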
I solved the problem myself. Thanks for all the replies!
My MySQL connection timed out — that was the problem. As soon as I added:
ini_set('mysql.connect_timeout', 14400);
ini_set('default_socket_timeout', 14400);
to my script, the problem stopped. I really hope this helps someone. I'll upvote all the locking answers, because those were very helpful!
I have a block of code that I want to run at certain time intervals, say every 3 seconds. If I get the required result, it should stop; otherwise, another request should be sent after 3 seconds. Any idea how to do this with PHP?
Thanks in advance for any help.
You can sleep in a loop:
while (true) {
    $result = doSomething();
    if (resultIsGood) {
        break;
    }
    sleep(3);
}
If you don't want to keep the browser waiting, you can look up ignore_user_abort() solutions on Google.
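A small variation on the loop above, worth considering so the loop can't run forever if the result never becomes good: cap the number of attempts. This is a sketch; pollUntil() and $checkOnce are made-up names standing in for the real request.

```php
<?php
// Hypothetical variation: retry with a fixed delay, but give up after a
// maximum number of attempts instead of looping forever.
function pollUntil(callable $checkOnce, int $maxAttempts, int $delaySeconds): bool
{
    for ($i = 0; $i < $maxAttempts; $i++) {
        if ($checkOnce()) {
            return true; // got the required result
        }
        if ($i < $maxAttempts - 1) {
            sleep($delaySeconds); // wait before the next attempt
        }
    }
    return false; // gave up after $maxAttempts tries
}

// Usage: pollUntil(fn () => doSomething(), 20, 3); // up to 20 tries, 3 s apart
```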
If what you want is to execute MySQL queries (and nothing else), maybe the MySQL Event Scheduler could fit your need:
http://dev.mysql.md/doc/refman/5.5/en/events-overview.html