Why does the AJAX response take a long time when the PHP code executes quickly? - php

I built a PHP application that requests an API response from a server and checks it against a MySQL database.
Every request takes a long time to execute: the minimum response time is 3 seconds, and it sometimes reaches 90 seconds or more.
I have attached screenshots of two requests: one shows continuous requests with delayed responses, and the other shows the waiting time.
I want to know the reason for the waiting time and the blocked time in the response, because when I check the PHP code execution itself, it finishes in under one second.
Should I improve the PHP code, or is the API server the problem?
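One way to narrow this down is to time the remote API call separately from the local database work: if the API call accounts for nearly all of the 3-90 seconds, the fix belongs on the API server, not in the PHP code. A minimal sketch, where the helper name and the commented usage (`$apiUrl`, `$pdo`) are illustrative, not from the original post:

```php
<?php
// Hypothetical helper: run a callable and return its result plus elapsed seconds.
function timeIt(callable $fn): array {
    $t0 = microtime(true);
    $result = $fn();
    return [$result, microtime(true) - $t0];
}

// Usage sketch: time the API request and the MySQL check independently,
// then log both so the slow side is obvious in the error log.
// [$apiResponse, $apiSeconds] = timeIt(fn() => file_get_contents($apiUrl));
// [$dbRows, $dbSeconds]       = timeIt(fn() => $pdo->query($sql)->fetchAll());
// error_log("api={$apiSeconds}s db={$dbSeconds}s");
```

Comparing the two numbers against the total time the browser reports also reveals any extra waiting or blocked time added outside PHP (DNS, connection queuing, the web server itself).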

Related

PHP, Mysql Notification Server Requirements

I am creating a project management system and need to push notifications when an activity takes place.
Question: if I use jQuery to poll and fetch notifications from the MySQL database, say every 30 seconds, will there be a huge impact on the server? What are the minimum requirements?
So basically, I'm looking at about 10 notifications per day for 20 employees.
Assuming you're talking about an AJAX request to the server in order to update DOM elements, most basic web servers would very much be able to handle a few requests every 30 seconds or so. More important is how well-optimized the server-side code that finds & returns the notifications is. Assuming you'll have a few clients requesting every 30 seconds, I would suggest making sure the code only takes a few seconds to process the request and respond with the updated data.
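A minimal sketch of that setup, assuming a jQuery `setInterval` poll on the client and a PHP endpoint that only returns rows newer than the client's last-seen notification id (the function, field names, and `notifications.php` are made up for illustration):

```php
<?php
// Hypothetical server-side helper: given a user's notifications and the
// highest id the client has already seen, return only the new ones.
// In production this would be a WHERE id > :lastSeenId query, not array_filter.
function newNotifications(array $notifications, int $lastSeenId): array {
    return array_values(array_filter(
        $notifications,
        fn(array $n): bool => $n['id'] > $lastSeenId
    ));
}

// Endpoint sketch (notifications.php):
// header('Content-Type: application/json');
// echo json_encode(newNotifications(fetchForUser($userId), (int) $_GET['since']));
```

On the client, a loop like `setInterval(function () { $.getJSON('notifications.php', {since: lastId}, render); }, 30000);` keeps each request cheap, because the server only looks at ids above the watermark instead of re-sending everything.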

Run PHP Script on Button Click in Background (Regardless of User Page Change, or Browser Exit)

I have a script that parses a large amount of historical data through MySQL queries and other PHP manipulation.
Essentially it is a "Order Report" generator that outputs a lot of data and forms it into an array, and then finally encoding it into json and saving it on my server.
I would like to optimize these further, but generating a 30-day report takes about 10-15 seconds, a 90-day report takes 45 seconds to a minute, a 180-day report anywhere from 2-3 minutes, and an entire year typically takes over 5 minutes for the script to finish.
Given the loading times, I am thinking a 'file queue' system like eBay's or Amazon's is needed, as the user isn't going to want to sit through a 5-minute loading screen waiting for a yearly report.
I understand AJAX requests and how this can be done, and have even read about hidden iframes that perform the request. My concern is: what if the user changes the page (with a normal AJAX request) or exits the browser? The script execution will not finish.
My only other thought is to have a cron job along with a MySQL table into which the user's request is inserted, but the cron would then have to run about once every minute, all day, every day, non-stop, just to check whether a report request was issued.
This seems plausible, but perhaps not ideal, as a cron running every minute 24/7/365 would produce a fair amount of overhead (especially since reports would not be generated very often).
Can anyone recommend/suggest a route of action to take that won't be a burden to my server, and will still run and complete regardless of user action?
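One common route that avoids the per-minute cron entirely is to launch the report generator as a detached background process when the user clicks the button: the web request returns immediately, and the worker keeps running even if the user navigates away or closes the browser. A sketch, assuming a separate `generate_report.php` worker script (the name is illustrative):

```php
<?php
// Hypothetical helper: build a shell command that runs a PHP worker in the
// background, detached from the web request. Output is discarded so the
// request does not block, and the trailing '&' detaches the process.
function detachedCommand(string $script, array $args): string {
    $quoted = array_map('escapeshellarg', $args);
    return sprintf(
        'php %s %s > /dev/null 2>&1 &',
        escapeshellarg($script),
        implode(' ', $quoted)
    );
}

// Usage sketch in the button-click handler:
// exec(detachedCommand('generate_report.php', [$userId, $days]));
// ...then poll a status row or file from the browser to know when it's done.
```

The worker writes its JSON result to the server as the question already describes; the browser just polls a lightweight status endpoint instead of holding a connection open for 5 minutes. (This assumes a Unix-like host where `exec` is permitted; shell quoting differs on Windows.)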

Long ajax respond time?

Based on my understanding, an AJAX response should not take more than a couple of seconds. So is there a way to send an AJAX request every minute or so and check whether the result is ready (i.e., the PHP script has finished)?
The reason I am asking is that my PHP code sends data over TCP, which takes around 20 minutes, and I want to check the result and show it in the browser.
Right now I am writing the result to a file, and then every minute I make an AJAX request to another PHP file that reads that file and checks it.
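That file-based approach can be wrapped in a small status endpoint: the long-running script writes its result to a file when it finishes, and the page polls a second script that reports whether the file exists yet. A sketch (the file path and JSON shape are illustrative):

```php
<?php
// Hypothetical status check: report whether the long TCP job has finished,
// based on the presence of the result file the worker writes at the end.
function jobStatus(string $resultFile): array {
    if (!is_file($resultFile)) {
        return ['done' => false];
    }
    return ['done' => true, 'result' => file_get_contents($resultFile)];
}

// status.php sketch, polled from the browser every minute:
// header('Content-Type: application/json');
// echo json_encode(jobStatus('/tmp/tcp_job_result.txt'));
```

Each poll returns in milliseconds regardless of how long the 20-minute job takes, so no single AJAX request ever has to wait on the TCP transfer.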

Call a function multiple times without waiting for it to complete

I have a cron job that calls a script that iterates through some items and submits them as posts to the Facebook Graph API every minute. The issue is that each call takes a few seconds. If there are more than 10 posts to be sent to the API in a given minute, the script runs longer than a minute and then causes issues when it starts running again at the next minute.
The general process is like this:
1. Each facebook profile posts every hour
2. Each of these profiles have a 'posting minute', which is the minute of the hour that they are posted at
3. A cron job runs every minute to see which profiles should be posted to in any given minute, and then posts to them
My question: is it possible to continue the script immediately after calling the $facebook->api(...) method below, rather than waiting for it to complete before continuing? That way it can be sure to post to all profiles within the given minute, rather than risking having too many profiles to post to and exceeding 60 seconds.
$profilesThatNeedToBePostedTo = getProfilesToPostTo(date('i'));
foreach ($profilesThatNeedToBePostedTo as $profile)
{
    $facebook->api($endPoint, 'POST', $postData); // $postData and $endPoint omitted for brevity
}
What you are asking is whether PHP can make asynchronous calls. There are some solutions here: Asynchronous PHP calls.
Just to point out, these are not the most reliable options.
Wouldn't limiting the number of posts to 9 per run reduce how often you exceed the 60 seconds? But what happens if the Facebook Graph API is slow? What do you do then?
Or run your code for 60 seconds and, once the 60-second limit is reached, stop the function and let it run again the next time.
I am not sure I understand your question one hundred percent clearly. You mean that you want to run the cron job every minute, but calling the Facebook API takes longer than one minute, so the foreach loop may delay your cron job?
If so, I wonder whether your cron job script is set up correctly; it would be better to show it, because the PHP script should not affect your shell script (the cron job) itself.
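The "run for 60 seconds, then stop and resume next minute" idea above can be sketched as a deadline-bounded loop: process profiles until the time budget runs out, and hand the remainder to the next cron run. The callable stands in for the real $facebook->api(...) call; the 55-second budget is an arbitrary choice leaving headroom before the next minute:

```php
<?php
// Hypothetical: post to profiles until the deadline; return the profiles that
// did not fit in this run so the next cron invocation can pick them up.
function postUntilDeadline(array $profiles, float $deadline, callable $post): array {
    foreach ($profiles as $i => $profile) {
        if (microtime(true) >= $deadline) {
            return array_slice($profiles, $i); // carry over to the next run
        }
        $post($profile);
    }
    return []; // everything was posted in time
}

// Cron sketch:
// $leftover = postUntilDeadline($profiles, microtime(true) + 55,
//     fn($p) => $facebook->api($p['endPoint'], 'POST', $p['postData']));
// ...persist $leftover (e.g., to a queue table) for the next minute's run.
```

This keeps each cron invocation under a minute without needing true asynchronous calls, at the cost of some posts landing a minute late when the API is slow.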

jQuery times out waiting for server to respond back

Background:
I have two pages (index.php and script.php).
I have a jQuery function that calls script.php from index.php.
script.php will do a ton of data processing and then return that data back to index.php so that it can be displayed to the user.
Problem:
index.php appears to be timing out because script.php is taking too long to finish. script.php can sometimes take up to 4 hours to finish processing before it can return the data to index.php.
The reason I say index.php is timing out is because it never updates and just sits there with an hourglass even after script.php stops processing.
I know for sure that script.php does finish processing successfully because I'm writing the output to a log file as well and can see that everything is being processed.
If there is not much data to be processed by script.php, then index.php updates as it is supposed to.
I'm not setting any timeout values in the function inside index.php that calls script.php.
Is there a better way to get index.php to update after waiting a very long time for script.php to finish? I'm using Firefox, so could it be a Firefox issue?
Do you seriously want an ajax call to take four hours to respond? That makes little sense in the normal way the web and browsers work. I'd strongly suggest a redesign.
That said, jQuery's $.ajax() call has a timeout value you can set as an option described here: http://api.jquery.com/jQuery.ajax/. I have no idea if the browser will allow it to be set as long as four hours and still operate properly. In any case, it's not a high probability operation to require keeping a browser connection open and live for four hours. If there's a momentary hiccup, what are you going to do? start all over again? This is just not a good design.
What I would suggest as a redesign is that you break the problem up into smaller pieces that can be satisfied in much shorter ajax calls. If you really want a four hour operation, then I'd suggest you start the operation with one ajax call and then you poll every few minutes from the browser to inquire when the job is done. When it is finally done, then you can retrieve the results. This would be much more compatible with the normal way that ajax calls and browsers work and it wouldn't suffer if there is a momentary internet glitch during the four hours.
If possible, your first ajax call could also return an approximation for how long the operation might take which could provide some helpful UI in the browser that is waiting for the results.
Here's a possibility:
Step 1. Send ajax call requesting that the job start. Immediately receive back a job ID and any relevant information about the estimated time for the job.
Step 2. Calculate a polling interval based on the estimated time for the job. If the estimate is four hours and the estimates are generally accurate, then you can set a timer and ask again in two hours. When asking for the data, you send the job ID returned by the first ajax call.
Step 3. As you near the estimated time of completion, you can narrow the polling interval down to perhaps a few minutes. Eventually, you will get a polling request that says the data is done and it returns the data to you. If I was designing the server, I'd cache the data on the server for some period of time in case the client has to request it again for any reason so you don't have to repeat the four hour process.
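The narrowing poll interval in steps 2-3 can be sketched as a simple function of the seconds remaining until the job's estimated completion. The halving factor and the one-minute floor are arbitrary choices for illustration, not part of the answer above:

```php
<?php
// Hypothetical: choose the next poll delay from the seconds remaining until
// the estimated finish: half the remaining time, clamped between 60 s and 2 h.
function nextPollDelay(float $secondsRemaining): int {
    $delay = (int) ($secondsRemaining / 2);
    return max(60, min($delay, 7200));
}

// Four hours out, the client asks again in two hours; as the estimate nears,
// the interval shrinks until it bottoms out at one poll per minute.
```

The client calls this after each poll response (which would carry the job ID and an updated estimate), so a job that finishes early is still picked up within a minute or two of completion.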
Oh, and then I'd think about changing the design of my server so that nothing requested regularly ever takes four hours. It should either be pre-built and pre-cached in some sort of batch fashion (e.g., a couple of times a day), or the whole scheme should be redone so that common queries can be satisfied in less than a minute rather than four hours.
Would it be possible to periodically send a response back to index.php just to "keep it alive"? If not, perhaps split your script into a few smaller scripts and run them in chunks of an hour at a time, as opposed to the 4 hours you mentioned above.
