Based on my understanding, an AJAX response can't take longer than a couple of seconds. So is there a way to make an AJAX request every couple of minutes (or seconds) to check whether the result is ready, i.e. whether the PHP has finished?
The reason I am asking is that my PHP code sends data over TCP, which takes around 20 minutes, and I want to check the result and show it in the browser.
Right now I am writing the result to a file, and then every minute I make an AJAX request to another PHP file which reads that file to check it.
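Roughly, that checker file can be as simple as this sketch (the file name and result location here are just illustrative):
<?php
// check_status.php (illustrative name): polled by the browser every minute.
$resultFile = __DIR__ . '/result.txt';   // wherever the long-running job writes its result

header('Content-Type: application/json');
if (is_file($resultFile)) {
    // The TCP job has finished: hand the result back to the browser.
    echo json_encode(['done' => true, 'result' => file_get_contents($resultFile)]);
} else {
    // Still running: tell the client to keep polling.
    echo json_encode(['done' => false]);
}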
I made a PHP application that requires an API response from a server and checks it against a MySQL database.
Every time it takes a long time to execute: the minimum time to get a response is 3 seconds, and it can go up to 90 seconds and beyond.
I have attached screenshots of two requests: one shows a continuous request with a delayed response, and the other shows the waiting time.
I want to know the reason for the waiting time and blocked time in the response, because when I time the PHP code itself, it finishes in less than one second.
Should I improve the PHP code, or is it the API server that needs improvement?
I'm building a PHP application which has a database containing approximately 140 URLs.
The goal is to download a copy of the contents of these web pages.
I've already written code which reads the URLs from my database and then uses cURL to grab a copy of each page. It then gets everything between <body> </body> and writes it to a file. It also takes redirects into account, e.g. if I go to a URL and the response code is 302, it will follow the appropriate link. So far so good.
This all works OK for a number of URLs (maybe 20 or so), but then my script times out due to the max_execution_time being set to 30 seconds. I don't want to override or increase this, as I feel that's a poor solution.
I've thought of two workarounds but would like to know if these are a good/bad approach, or if there are better ways.
The first approach is to use a LIMIT on the database query so that it splits the task up into 20 rows at a time (i.e. run the script 7 separate times, if there were 140 rows). I understand that with this approach it still needs to call the script, download.php, 7 separate times, so it would need to be passed the LIMIT figures.
The second is to have a script where I pass in the ID of each individual database record I want the URL for (e.g. download.php?id=2) and then make multiple Ajax requests to it (download.php?id=2, download.php?id=3, download.php?id=4, etc.). Based on $_GET['id'] it could do a query to find the URL in the database, and so on. In theory I'd be making 140 separate requests, as it's a one-request-per-URL setup.
I've read some other posts which have pointed to queueing systems, but these are beyond my knowledge. If this is the best way, is there a particular system worth taking a look at?
Any help would be appreciated.
Edit: There are 140 URLs at the moment, and this is likely to increase over time. So I'm looking for a solution that will scale without hitting any timeout limits.
I don't agree with your logic. If the script is running OK and just needs more time to finish, giving it more time is not a poor solution. What you are suggesting makes things more complicated and will not scale well as your URLs increase.
I would suggest moving your script to the command line where there is no time limit and not using the browser to execute it.
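For example (a minimal sketch, assuming the downloader lives in a file such as download.php):
<?php
// download.php - intended to be run from the command line, e.g.:
//     php /path/to/download.php
// (or scheduled via cron instead of being triggered from a browser)
// The CLI SAPI has no max_execution_time by default, but lifting it explicitly is harmless:
set_time_limit(0);

// ... existing code: read the URLs from the database, cURL each one,
//     follow redirects and save the <body> contents to a file ...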
When you have a list of unknown size which will take an unknown amount of time, asynchronous calls are the way to go.
Split your script into a single page download (like you proposed, download.php?id=X).
From the "main" script get the list from the database, iterate over it and send an ajax call to the script for each one. As all the calls will be fired all at once, check for your bandwidth and CPU time. You could break it into "X active task" using the success callback.
You can either have the download.php file return success data or save the result to a database along with the ID of the website. I recommend the latter, because you can then just leave the main script running and grab the results at a later time.
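A rough sketch of the single-page downloader (the table and column names here are assumptions, not from the question):
<?php
// download.php?id=X - fetch one URL and record the outcome in the database.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$id = (int)($_GET['id'] ?? 0);
$stmt = $pdo->prepare('SELECT url FROM urls WHERE id = ?');
$stmt->execute([$id]);
$url = $stmt->fetchColumn();
if ($url === false) {
    http_response_code(404);
    exit('unknown id');
}

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,   // handle 302 redirects
    CURLOPT_TIMEOUT        => 20,
]);
$html = curl_exec($ch);
curl_close($ch);

// Keep only what sits between <body> and </body>, as in the original script.
$body = '';
if ($html !== false && preg_match('#<body[^>]*>(.*)</body>#si', $html, $m)) {
    $body = $m[1];
}

$upd = $pdo->prepare('UPDATE urls SET content = ?, downloaded_at = NOW() WHERE id = ?');
$upd->execute([$body, $id]);

echo 'ok';   // the Ajax success callback can use this to kick off the next download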
You can't increase the time limit indefinitely, and you can't wait indefinitely for the request to complete, so you need "fire and forget", and that's what asynchronous calls do best.
As @apokryfos pointed out, depending on how often this sort of "backup" needs to run, you could fit it into a task scheduler (like cron). If you call it on demand, put it behind a GUI; if you call it every X amount of time, point a cron task at the main script; it will do the same thing.
What you are describing sounds like a job for the console. The browser is for the users to see; your task is something that the programmer will run, so use the console. Or schedule the file to run with a cron job or anything similar that is handled by the developer.
Execute all the requests simultaneously using stream_socket_client(), and save all the socket handles in an array.
Then loop through the array of IDs with stream_select() to read the responses.
It's almost like multi-tasking within PHP.
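A rough sketch of the idea, not production code (the hosts and paths are placeholders):
<?php
// Open a socket per URL, send the request, then multiplex the reads with stream_select().
$targets = [
    ['host' => 'example.com', 'path' => '/'],
    ['host' => 'example.org', 'path' => '/'],
];

$sockets = [];
$responses = [];
foreach ($targets as $i => $t) {
    $s = stream_socket_client('tcp://' . $t['host'] . ':80', $errno, $errstr, 5);
    if ($s === false) {
        continue;                        // could log $errno / $errstr here
    }
    fwrite($s, "GET {$t['path']} HTTP/1.0\r\nHost: {$t['host']}\r\nConnection: close\r\n\r\n");
    stream_set_blocking($s, false);
    $sockets[$i] = $s;
    $responses[$i] = '';
}

// Keep polling until every socket has been read to EOF.
while ($sockets) {
    $read = $sockets;
    $write = $except = null;
    if (stream_select($read, $write, $except, 5) === false) {
        break;
    }
    foreach ($read as $s) {
        $i = array_search($s, $sockets, true);
        $chunk = fread($s, 8192);
        if ($chunk !== false && $chunk !== '') {
            $responses[$i] .= $chunk;    // raw HTTP response, headers included
        }
        if (feof($s)) {
            fclose($s);
            unset($sockets[$i]);
        }
    }
}
// $responses now holds one raw response per target.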
Background:
I have two pages (index.php and script.php).
I have a jQuery function that calls script.php from index.php.
script.php will do a ton of data processing and then return that data back to index.php so that it can be displayed to the user.
Problem:
index.php appears to be timing out because script.php is taking too long to finish. script.php can sometimes take up to 4 hours to finish processing before it can return the data to index.php.
The reason I say index.php is timing out is because it never updates and just sits there with an hourglass, even after script.php stops processing.
I know for sure that script.php does finish processing successfully, because I'm also writing the output to a log file and can see that everything is being processed.
If there is not much data to be processed by script.php, then index.php updates as it is supposed to.
I'm not setting any timeout values within the function inside index.php when calling script.php.
Is there a better way to get index.php to update after waiting a very long time for script.php to finish? I'm using Firefox, so could it be a Firefox issue?
Do you seriously want an ajax call to take four hours to respond? That makes little sense in the normal way the web and browsers work. I'd strongly suggest a redesign.
That said, jQuery's $.ajax() call has a timeout value you can set as an option, described here: http://api.jquery.com/jQuery.ajax/. I have no idea if the browser will allow it to be set as long as four hours and still operate properly. In any case, keeping a browser connection open and live for four hours is not a high-probability operation. If there's a momentary hiccup, what are you going to do? Start all over again? This is just not a good design.
What I would suggest as a redesign is that you break the problem up into smaller pieces that can be satisfied in much shorter ajax calls. If you really want a four hour operation, then I'd suggest you start the operation with one ajax call and then you poll every few minutes from the browser to inquire when the job is done. When it is finally done, then you can retrieve the results. This would be much more compatible with the normal way that ajax calls and browsers work and it wouldn't suffer if there is a momentary internet glitch during the four hours.
If possible, your first ajax call could also return an approximation for how long the operation might take which could provide some helpful UI in the browser that is waiting for the results.
Here's a possibility:
Step 1. Send ajax call requesting that the job start. Immediately receive back a job ID and any relevant information about the estimated time for the job.
Step 2. Calculate a polling interval based on the estimated time for the job. If the estimate is four hours and the estimates are generally accurate, then you can set a timer and ask again in two hours. When asking for the data, you send the job ID returned by the first ajax call.
Step 3. As you near the estimated time of completion, you can narrow the polling interval down to perhaps a few minutes. Eventually, you will get a polling request that says the data is done and it returns the data to you. If I was designing the server, I'd cache the data on the server for some period of time in case the client has to request it again for any reason so you don't have to repeat the four hour process.
Oh, and then I'd think about changing the design on my server so that nothing that is requested regularly ever takes four hours. It should either be pre-built and pre-cached in some sort of batch fashion (e.g. a couple of times a day), or the whole scheme should be redone so that common queries can be satisfied in less than a minute rather than four hours.
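A minimal sketch of the server side of that job-ID/polling flow, assuming a jobs table with id, status, estimated_seconds and result columns, and a CLI worker.php that does the heavy lifting (all of these names are illustrative, not from the question):
<?php
// ---- start_job.php (illustrative) ----
// Creates a job record, hands the heavy work to a CLI worker, and returns immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->exec("INSERT INTO jobs (status, estimated_seconds) VALUES ('running', 14400)");
$jobId = (int)$pdo->lastInsertId();

// Launch the long-running worker in the background so this request can return at once.
exec('php worker.php ' . $jobId . ' > /dev/null 2>&1 &');

header('Content-Type: application/json');
echo json_encode(['job_id' => $jobId, 'estimated_seconds' => 14400]);

<?php
// ---- job_status.php (illustrative) ----
// Polled by the browser with ?id=<job_id>; returns the result once the worker marks the job done.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT status, result FROM jobs WHERE id = ?');
$stmt->execute([(int)($_GET['id'] ?? 0)]);
$job = $stmt->fetch(PDO::FETCH_ASSOC) ?: ['status' => 'unknown', 'result' => null];

header('Content-Type: application/json');
echo json_encode($job);   // the client keeps polling until status is 'done'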
Would it be possible to periodically send a response back to index.php just to "keep it alive"? If not, perhaps split up your script into a few smaller scripts and run them in chunks of an hour at a time, as opposed to the 4 hours you mentioned above.
I set up a very small (internal) dedicated web server, and I need to pull some energy data every 10 seconds or so from an XML file. This is the PHP code I have thus far:
<?php
// Pull the latest readings from the device's XML endpoint
$mydata = simplexml_load_file('http://192.168.x.xx:yyy/data.xml');

// Show the first device's name and current value
echo $mydata->device[0]->name;
echo $mydata->device[0]->value;
?>
I tested similar code out on my web server, PHP is installed, and I think this should work, but I'd like to have it run every 10 seconds or so, so that the data displayed on my web page is always up to date. The web page will be left running 24/7 as a sign on the wall. What's the easiest way to refresh the data?
I would simply refresh the portion of the web page that displays the data using Ajax. Trigger the refresh using a JavaScript timer.
The easiest way? Add this line to your page.
<meta http-equiv="refresh" content="10">
This may not be the best way though, especially if you have a lot of stuff on the page that you don't need reloaded every 10 seconds. If that's the case, you should look into AJAX.
If you have a page set up that returns only the data you need (like your example), you can make an asynchronous request every 10 seconds using JavaScript's setInterval() to get the latest data and show it.
If you are running this through a browser then I would go with Eric's answer.
However, if you have this running from the command line, you can do one of two things:
Have your current script, say pull_energy_data.php, run from a cron job every 10 seconds. Don't forget to add some sort of locking mechanism, in case the job takes more than 10 seconds to run and you end up with more than one copy of the script running at the same time (see the sketch after this list).
Another approach is to have a wrapper script around pull_energy_data.php that runs in a loop and executes it every 10 seconds. This is less desirable than the previous approach, as you'll need to keep track of when pull_energy_data.php last ran and work out how to stop the wrapper script.
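A minimal locking sketch using flock() (the lock file path is just illustrative):
<?php
// pull_energy_data.php - bail out early if a previous cron run is still going.
$lock = fopen('/tmp/pull_energy_data.lock', 'c');
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit;                                   // another instance holds the lock
}

// ... existing work: load the XML and store or display the readings ...
$mydata = simplexml_load_file('http://192.168.x.xx:yyy/data.xml');

flock($lock, LOCK_UN);
fclose($lock);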
Assuming you are running the server on Linux/Unix, perhaps look into writing a cron job (automated job) for it?
One solution I can think of is to run the PHP script as a cron job every 10 seconds, writing the output into a file or a database table.
You can then write a separate PHP script that reads the contents of the file/DB entry whenever anyone loads it in a browser.
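For the reading side, a tiny sketch (assuming the cron script writes its output to a file such as energy_data.json; both names are illustrative):
<?php
// show_energy.php (illustrative): serves whatever the cron job last wrote.
$cached = @file_get_contents('/var/cache/energy_data.json');
header('Content-Type: application/json');
echo $cached !== false ? $cached : json_encode(['error' => 'no data yet']);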
I have a script, let's call it linkchecker, that loops through about 10,000 URLs, checking them for HTTP status codes.
As they are checked, each URL is marked as checked in my DB.
It won't output anything until it's done, which can take many hours.
So I thought about having another script that runs the linkchecker in the background while continually polling the DB for how many URLs have been checked, so I can follow the progress and see whether any URLs are causing problems such as long connection times.
I tried just running the linkchecker in an iframe, but nothing will load until the linkchecker has finished.
How can I execute this linkchecker in the background while the main script runs normally, executing other tasks?
You can set up a cron job (if you are running Linux) that executes a curl command to access a PHP script over HTTP (e.g. curl http://domain.com/php/something.php), or one that simply runs the php command against a local file.
You can make the schedule run every minute (that's the minimum interval supported by cron) and process a "block" of your work each time. Of course, you must configure PHP to lift the 30-second execution limit used by default.
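As a rough sketch of what one cron-driven "block" might look like (the table and column names are assumptions, not from the question):
<?php
// check_block.php (illustrative): run once a minute by cron; checks the next batch of URLs.
set_time_limit(0);                           // lift the default 30-second limit for this run
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$rows = $pdo->query("SELECT id, url FROM urls WHERE checked = 0 LIMIT 50")
            ->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    // A HEAD request is enough to learn the HTTP status code.
    $ch = curl_init($row['url']);
    curl_setopt_array($ch, [
        CURLOPT_NOBODY         => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_TIMEOUT        => 10,
    ]);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    $upd = $pdo->prepare("UPDATE urls SET checked = 1, status_code = ? WHERE id = ?");
    $upd->execute([$status, $row['id']]);
}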
In the DB, create a column called "Checked". Have the PHP script mark each URL as checked in the database as it goes. Then use phpMyAdmin to view the database graphically: just sort by the "Checked" column and you can see how far it has got.
You have to do it this way because the web page will not update until the script is done. You could, however, make the script run another script that reports how far along the process is, but that might be too time-consuming?
You also have to go into php.ini and check that max_execution_time is set to several hours; 60*60*24 = 1 day = 86,400 seconds.
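For example, a quick progress check could be a tiny script like this (the table and column names are assumptions):
<?php
// progress.php (illustrative): shows how many URLs have been checked so far.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
list($done, $total) = $pdo->query("SELECT SUM(Checked = 1), COUNT(*) FROM urls")
                          ->fetch(PDO::FETCH_NUM);
echo "$done of $total URLs checked\n";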
Hope this helps! :)
I suggest making an AJAX request to a new page; let's call it ajaxChecker.php.
In this page, just check whether there are still unchecked URLs (and return the number).
If the number of unchecked URLs is zero, echo the output so it can be shown in the result div.
function checker()
{
    // Ask ajaxChecker.php for the output; it returns nothing while URLs are still being checked.
    $.post('ajaxChecker.php', function(data){
        if (data.length > 0)
            $('#result').html(data);
    });
}

setInterval(checker, 10000);   // poll every 10 seconds
And of course, make a request through AJAX or a cron job to kick off the checker in the first place.
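A minimal sketch of what ajaxChecker.php might look like (the table and column names are assumptions):
<?php
// ajaxChecker.php (sketch): returns nothing while URLs are still unchecked,
// and the collected results once everything has been processed.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$remaining = (int)$pdo->query("SELECT COUNT(*) FROM urls WHERE checked = 0")->fetchColumn();
if ($remaining > 0) {
    exit;   // still working: the JavaScript side sees an empty response and keeps polling
}

// All done: echo the output for the #result div.
foreach ($pdo->query("SELECT url, status_code FROM urls") as $row) {
    echo htmlspecialchars($row['url']) . ' => ' . (int)$row['status_code'] . "<br>\n";
}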