Call a function multiple times without waiting for it to complete - php

I have a cron job that calls a script which iterates through some items and submits them as posts to the Facebook Graph API every minute. The issue is that each call takes a few seconds. If there are more than 10 posts to send to the API in a given minute, the script runs longer than a minute and starts causing issues when it runs again at the next minute.
The general process is like this:
1. Each Facebook profile is posted to every hour
2. Each of these profiles has a 'posting minute', which is the minute of the hour at which it is posted to
3. A cron job runs every minute to see which profiles should be posted to in any given minute, and then posts to them
My question: Is it possible to continue the script immediately after calling the $facebook->api(...) method below, rather than waiting for it to complete before continuing? That way the script could post to all profiles within the given minute, rather than risk having too many profiles to post to and exceeding 60 seconds.
$profilesThatNeedToBePostedTo = getProfilesToPostTo(date('i')); // 'i' = current minute; the format string must be quoted
foreach ($profilesThatNeedToBePostedTo as $profile)
{
    $facebook->api($endPoint, 'POST', $postData); // $postData and $endPoint omitted for brevity
}

What you are asking is whether you can do asynchronous calls in PHP. There are some solutions here: Asynchronous PHP calls
Just to point out, those are not the most reliable options.
Wouldn't limiting the number of posts to 9 each run reduce how often you exceed the 60 seconds? But what happens if the Facebook Graph is slow? What do you do then?
Or run your code for 60 seconds and, once the 60-second mark is reached, stop the function and let it run again the next time.
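For illustration, here is a minimal sketch of one of those approaches: firing the Graph API posts in parallel with curl_multi instead of the SDK's synchronous api() call, so the total wall time is roughly the slowest single request rather than the sum of all of them. The endpoint URL, the shape of $posts, and the access-token handling are all assumptions, not the SDK's actual internals:

    $mh = curl_multi_init();
    $handles = array();

    // $posts is assumed to hold one entry per profile due this minute,
    // each with its Graph endpoint and POST fields (including the
    // access token) already prepared.
    foreach ($posts as $i => $post) {
        $ch = curl_init('https://graph.facebook.com/' . $post['endpoint']);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($post['data']));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Run all requests concurrently.
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // block briefly instead of busy-waiting
    } while ($running > 0);

    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);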

I am not sure I understand your question 100% clearly. You mean that you want to run the cron job every minute, but it takes longer than 1 minute to call the Facebook API. Do you mean that the foreach loop may delay your cron job?
If so, I wonder whether you wrote the cron job script correctly; it would be better to show your cron job script, because the PHP script should not affect your sh script (the cron job).

Related

Run PHP Script on Button Click in Background (Regardless of User Page Change, or Browser Exit)

I have a script that parses a large amount of historical data through MySQL queries and other PHP manipulation.
Essentially it is an "Order Report" generator that outputs a lot of data, forms it into an array, and then finally encodes it into JSON and saves it on my server.
I would like to optimize these more, but generating a 30-day report takes about 10-15 seconds, 90 days takes 45 seconds to a minute, 180 days anywhere from 2-3 minutes, and an entire year typically over 5 minutes.
Like eBay and Amazon, I am thinking that due to loading times a 'file queue' system is needed, as the user isn't going to want to sit through a 5-minute loading scenario waiting for a yearly report.
I understand ajax requests and how this can be done, and have even read about hidden iframes that perform the request. My concern is: what if the user changes the page (with a normal ajax request) or exits the browser? The script execution will not finish.
My only other thought would be a cron job along with a MySQL table that the user's request is inserted into, but thinking forward, the cron would have to run about once every minute, all day, every day, non-stop, to check whether a report request was issued.
This seems plausible, but perhaps not ideal, as having a cron run every minute 24/7/365 seems like it would produce a bit of baggage in its continuity (especially since reports would not be generated very often).
Can anyone recommend/suggest a route of action that won't be a burden to my server, and will still run to completion regardless of user action?
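As an illustration only, here is a minimal sketch of the queue-table idea described in the question, run by a cron entry such as * * * * * php /path/to/report_worker.php. The table name report_queue, its columns, the connection details, and the generateOrderReport() helper are all made up for this example:

    <?php
    // report_worker.php: claim and process one pending report request,
    // then exit. An empty run costs almost nothing, so the every-minute
    // cron adds very little baggage.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    $job = $pdo->query(
        "SELECT id, user_id, range_days FROM report_queue
         WHERE status = 'pending' ORDER BY id LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);

    if (!$job) {
        exit; // queue is empty this minute
    }

    $pdo->prepare("UPDATE report_queue SET status = 'running' WHERE id = ?")
        ->execute([$job['id']]);

    // generateOrderReport() stands in for the existing report code.
    $json = generateOrderReport($job['user_id'], $job['range_days']);
    file_put_contents("/var/reports/{$job['id']}.json", $json);

    $pdo->prepare("UPDATE report_queue SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);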

Why does the response take a long time in ajax while the PHP code executes quickly?

I made a PHP application that requires an API response from a server and checks it against a MySQL database.
Every time it takes a long time to execute: the minimum time to get a response is 3 seconds, and it can reach 90 seconds and beyond.
I have attached screenshots of two files: one showing a continuous request with a delayed response, and the other showing the waiting time.
I want to know the reason for the waiting time and blocked time in the response, because when I check the PHP code execution it is done in less than one second.
Should I improve the PHP code or the API server?
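For illustration, one way to confirm where the time goes is to log the server-side execution time and compare it with the total time the browser reports; the log path below is an assumption:

    $start = microtime(true);

    // ... the existing API call and MySQL check go here ...

    // If this logged duration stays under a second while the browser
    // reports several seconds, the delay is in the network, the API
    // server, or request queueing, not in this PHP code.
    $elapsed = microtime(true) - $start;
    error_log(sprintf("handled in %.3f s\n", $elapsed), 3, '/tmp/timing.log');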

Cron Job timing Schedule

I have a PHP web page which serves an RSS feed, but it takes about 15-20 seconds to generate a response (which is then cached for 10 minutes on the server for faster responses).
How should I set up the cron job timing for this operation? I am having a problem with this. I think if I call the page before the 10 minutes are up it will serve the cached page, so I won't get the latest updated page; is this true? And if I call that page after 10 minutes, will I have to wait 15-20 seconds to get a response?
How do I manage this process so that I get an updated feed with a swift response? I haven't tried a cron job before; this is my first time, so I find this confusing.
My cron command is: */10 * * * * wget http://www.example.com/multifeed.php
Is it right?
You won't have a perfect cron that brings in fresh data as soon as it's available; that's a limitation you'll have to live with, I think.
What I would do is run this cron every 2 minutes and try to get new data: check whether the last update is different from what you already have, and if it is, update the file; if it's not, do nothing and wait for the next cron run.
This method will serve at most two minutes of stale data.
Another option is to check the mtime of the file: http://php.net/manual/en/function.filemtime.php
Basically, when I visit your page, we check the mtime of the file; if it's older than 10 minutes, we fetch fresh data. This, combined with the cron, gives users a way to always see fresh data. If the freshness of the information isn't that important (can you live with two minutes of stale data?), simply run the two-minute cron.
Hope it helps.
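A minimal sketch of the mtime check described above, assuming multifeed.php writes its cached output to a file; the path and the buildFeed() helper are hypothetical:

    $cacheFile = '/var/www/cache/multifeed.rss';

    // Regenerate only when the cache is older than 10 minutes (600 s).
    if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > 600) {
        $fresh = buildFeed(); // stands in for the 15-20 second generation
        file_put_contents($cacheFile, $fresh);
    }

    header('Content-Type: application/rss+xml');
    readfile($cacheFile); // visitors normally hit this fast path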

Only one call from concurrent request with 60 sec timeout

I have a function callUpdate() that needs to be executed after every update in the webpage admin.
callUpdate() executes some caching (and takes up to 30 sec), so it is not important to execute it immediately but within a reasonable amount of time after the last update, let's say 60 sec.
The goal is to skip processing if the user (or users) make several consecutive changes in a short amount of time.
Here is my current code:
// This is in a separate stand-alone script that is called in an
// asynchronous way, so hanging for 1 min does not block the app.
function afterUpdate() {
    $time = time();
    file_put_contents('timer.txt', $time);
    sleep(60);
    // Run the update only if no newer save has overwritten the timestamp.
    if (file_get_contents("timer.txt") == $time) {
        callUpdate();
    }
}
My concerns here are about the sleep() function and whether it takes too many resources
(if I make 10 quick saves, this will start 10 PHP processes running for almost 60 sec each).
And what will happen if 2 users call file_put_contents() on the same file simultaneously?
Please tell me if there is a better approach and whether there are major issues in mine.
NOTE: data between sessions can be stored only in a file,
and I have limited access to the server setup (APC settings and such).
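For what it's worth, one possible refinement (a sketch, not a drop-in fix): guard timer.txt with flock() so two simultaneous saves cannot interleave their reads and writes. callUpdate() and the file name are taken from the question:

    function afterUpdate() {
        $time = time();

        // Write the timestamp under an exclusive lock.
        $fp = fopen('timer.txt', 'c+');
        flock($fp, LOCK_EX);
        ftruncate($fp, 0);
        fwrite($fp, (string) $time);
        flock($fp, LOCK_UN);
        fclose($fp);

        sleep(60);

        // Read it back under a shared lock.
        $fp = fopen('timer.txt', 'r');
        flock($fp, LOCK_SH);
        $latest = stream_get_contents($fp);
        flock($fp, LOCK_UN);
        fclose($fp);

        // Only the process belonging to the most recent save runs the update.
        if ((int) $latest === $time) {
            callUpdate();
        }
    }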

Refresh to next page to prevent timeouts? Or after a certain number of queries?

I'm trying to manipulate my MySQL tables with PHP, but when I update them via PHP I run into timeout concerns.
Is it possible to refresh or "start a new session" while continuing on from the last session, so it doesn't time out? Or is it possible to somehow go to a new page to prevent timeouts?
What's the best possible way? Can someone help me, please?
Maybe switch over to a new page after 150 successful queries or after 10 seconds?
Thanks.
10 seconds won't usually give you a timeout. I was curious myself how long it takes a browser to time out; you can try sleeping for 20 seconds, then 30 and more, and see if your browser times out.
I have had a Rails server running on a local MacBook with many partials and queries, and at 45 seconds the page still won't time out.
If you don't need immediate feedback, you can put the task in a queue to be processed by a cron job every 10 minutes or every 30 minutes, depending on your need.
If you need immediate feedback and the task is exceedingly long, you can break it down to a safe limit and issue a redirect to a second and third page with a param like task=1000, task=2000, reporting progress on each page. Another way that might work is to fork a process and report an ID; when the task is all done, near the end, update a SQL table to mark the task with that ID as done, and have a page that displays all finished tasks and refreshes every 10 seconds.
Try calling set_time_limit() after each successful query execution so you won't run into a timeout while everything is running OK.
http://php.net/manual/en/function.set-time-limit.php
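A minimal sketch of that idea; $queries and the connection details are placeholders:

    $mysqli = new mysqli('localhost', 'user', 'pass', 'db');

    foreach ($queries as $sql) {
        if ($mysqli->query($sql) === false) {
            break; // stop on failure rather than keep extending the limit
        }
        // Restart the execution clock after each successful query, so the
        // script only times out if a single query takes too long.
        set_time_limit(30);
    }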
