Are AJAX interval functions bad for servers? - php

I have this function set to run at intervals using JavaScript which points to a PHP page. I was just wondering if this is bad practice for the server load, especially at scale.
The PHP page just retrieves data from the Slack API through cURL and echoes it back.
setInterval(function(){
$.post('slackAPI.php?',function(data){
//append data received to some class
}, 'html');
}, 100);
This option works perfectly, but I'm worried that it'll cause a heavy load on the server. Is there a better option for retrieving data from an API in real time?

Your server load stems from two sources:
having to handle the incoming AJAX call,
having to issue a Slack call and get its return.
Depending on the scenario you can improve either of them (or even both):
Cache third-party calls
If the third-party information is unlikely to be updated more often than every 'x' seconds, you can store the call's result in a database, memory keystore, or cache file, and read that copy while it is still "fresh".
Even better, you can use GET to retrieve the data and issue an appropriate Expires header from the PHP side. This will prevent most AJAX libraries from issuing unnecessary calls. Of course, you now run the risk of not immediately seeing fresh information that arrived in the meantime:
// 30 seconds timeout
header('Expires: '.gmdate('D, d M Y H:i:s \\G\\M\\T', time() + 30));
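As a sketch of the caching idea (written in Node-style JavaScript for illustration; the helper names and the 30-second TTL are assumptions, not part of the original answer):

```javascript
// Hypothetical helpers: decide whether a cached third-party result is
// still fresh, and build the matching HTTP Expires header value.
const TTL_MS = 30 * 1000; // assume results stay valid for 30 seconds

function isFresh(cachedAtMs, nowMs, ttlMs = TTL_MS) {
  // Fresh while less than ttlMs has elapsed since the cache write.
  return nowMs - cachedAtMs < ttlMs;
}

function expiresHeaderValue(nowMs, ttlMs = TTL_MS) {
  // Mirrors the PHP gmdate() call above: an HTTP-date ttlMs in the future.
  return new Date(nowMs + ttlMs).toUTCString();
}
```

A request handler would serve the cached body while isFresh() holds and only re-issue the Slack call otherwise.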
Replace fixed interval with chained AJAX calls.
Instead of using a fixed interval, if you can whip up an API that waits for data to arrive, you can register the updating function as a callback of the updating function itself.
This way, you will issue a POST that takes, say, from 100 to 20000 milliseconds to complete. Before it completes there is no data, so it would have been useless to issue 199 other calls to the same API in the meanwhile. When it completes, it immediately fires off a new AJAX call, which will wait again, and so on. This is conveniently done with a jQuery promise chain; and since the next call is scheduled through setTimeout, the stack unwinds between iterations, so it does not pile up recursion the way a directly self-calling function would.
Another advantage is that you can control the client update frequency from the server.
You would do this using setTimeout, not setInterval.
function refresher() {
$.get('/your/api/endpoint', { maxtimeout: 30 })
.then(function(reply) {
if (reply.changes) {
// ...update the UI? Here you call your "old" function.
}
// Immediately issue another call, which will wait.
window.setTimeout(refresher, 1);
});
}
// Call immediately. The return callback will wait.
refresher();
The best option is the third:
Change API.
Most such poll services let you either issue a long-blocking call (the one you would use above) or register a URL which will receive the data whenever there is data to be fetched. You then store the information in a cache, which the receiving endpoint on the server keeps updated. You now only have to worry about the inbound calls, which are handled quickly.
The inbound call could then block and wait, consuming very few resources:
// $cacheFile is written by the callback API entry point
$ms = 200;     // granularity is 200 milliseconds
$seconds = 30; // max linger time is 30 seconds (could be higher)
$lastCreationTime = filemtime($cacheFile); // snapshot before waiting
$delay = floor(($seconds * 1000.0) / $ms);
while ($delay--) {
    // filemtime() must read up-to-date information, NOT a stale cache.
    clearstatcache(false, $cacheFile);
    if (filemtime($cacheFile) !== $lastCreationTime) {
        break; // the cache was refreshed: answer immediately
    }
    usleep($ms * 1000);
}
header('Content-Type: application/json;charset=UTF-8');
readfile($cacheFile);
die();
The above will have an overhead of at most 150 stat() calls (30 seconds, 5 calls per second), which is absolutely negligible, and will save up to 149 web server calls and all the related traffic and delays.

Related

php and ajax: show progress for long script

I have a PHP script which can take quite a lot of time (up to 3-5 minutes), so I would like to notify the user of its progress.
I read this question and decided to use session for keeping information about work progress.
So, I have the following instructions in php:
public function longScript()
{
$generatingProgressSession = new Zend_Session_Namespace('generating_progress');
$generatingProgressSession->unsetAll();
....
$generatingProgressSession->total = $productsNumber;
...
$processedProducts = 0;
foreach($models as $model){
//Do some processing
$processedProducts++;
$generatingProgressSession->processed = $processedProducts;
}
}
And I have a simple script for reading data from the session (the number of total and processed items), which returns them in JSON format.
So, here is js code for calling long script:
$.ajax({
url: 'pathToLongScript',
data: {fileId: fileId, format: 'json'},
dataType: 'json',
success: function(data){
if(data.success){
if(typeof successCallback == "function")
successCallback(data);
}
}
});
//Start checking progress functionality
var checkingGenerationProgress = setInterval(function(){
$.ajax({
url: 'pathToCheckingStatusFunction',
data: {format: 'json'},
success: function(data){
console.log("Processed "+data.processed+" items of "+data.total);
if(data.processed == data.total){
clearInterval(checkingGenerationProgress);
}
}
});
}, 10000)
So, the long script is called via AJAX. Then, after 10 seconds, the checking script is called for the first time, after 20 seconds for the second time, and so on.
The problem is that none of the requests to the checking script completes until the main long script has completed. So, what does that mean? That the long script consumes too many resources and the server cannot process any other request? Or do I have some wrong AJAX parameters?
See image:
-----------UPD
Here is a php function for checking status:
public function checkGenerationProgressAction()
{
$generatingProgressSession = new Zend_Session_Namespace('generating_progress');
$this->view->total = $generatingProgressSession->total;
$this->view->processed = $generatingProgressSession->processed;
}
I'm using ZF1 ActionContext helper here, so result of this function is json object {'total':'somevalue','processed':'another value'}
I'd
exec('nohup php ...');
the file and send it to the background. At set points, have the long-running script insert a progress value into the DB. You can then check every ten seconds (or whatever) whether a new value has been added and inform the user. It might even be possible to inform the user while they are on another page within your project, depending on your environment.
Yes, it's possible that the long script hogs the entire server, so any other requests made in that time are waiting for their turn. Also, I would recommend that you do not run the check script every 10 seconds regardless of whether the previous check has finished, but instead let the check script trigger itself after it has completed.
Taking your image with the pending requests as an example: instead of having 3 checking requests running at the same time, you can chain them so that at any one time only one checking request runs.
You can do this by replacing your setInterval() call with setTimeout() and re-arming the setTimeout() after the AJAX check request completes.
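A minimal sketch of that chaining (plain JavaScript; checkStatus is a hypothetical stand-in for the AJAX status call, returning a promise):

```javascript
// Re-arm the check only after the previous one has completed,
// so at most one status request is ever in flight.
function chainChecks(checkStatus, delayMs, onDone) {
  checkStatus().then(function (status) {
    if (status.processed >= status.total) {
      onDone(status); // job finished: stop scheduling checks
    } else {
      setTimeout(function () {
        chainChecks(checkStatus, delayMs, onDone);
      }, delayMs);
    }
  });
}
```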
Most likely, the subsequent calls are not completing due to session locking. When one request has the session file open, no other PHP request can open that same file; it is read/write locked until the previous request releases it.
Either that, or your Server OR Browser is limiting concurrent requests, and therefore waiting for this one to complete.
My solution would be to fork or otherwise break the long-running script off. Perhaps a call to exec() running another script with the requisite parameters, or any other way you think would work: move the long-running work into a separate process and return from the current request, notifying the user that execution has begun.
The second part is to log the progress of the script somewhere. A database, Memcache, or a file would work: simply set a value in a pre-determined location that the follow-up calls can check.
Note that "pre-determined" should not be the same for everyone. It should be a location that only the user's session and the worker know.
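As a sketch of such a shared progress location (a hypothetical in-memory store for illustration; in PHP land this would be a Memcache key, a DB row, or a file named the same way):

```javascript
// Progress store keyed by a job id that only the user's session
// and the background worker share. Follow-up AJAX calls read it.
const progress = new Map();

function setProgress(jobId, processed, total) {
  progress.set(jobId, { processed, total });
}

function getProgress(jobId) {
  // Unknown job: report zero progress rather than failing.
  return progress.get(jobId) || { processed: 0, total: 0 };
}
```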
Can you paste the PHP of "pathToCheckingStatusFunction" here?
Also, I notice that the "pathToCheckingStatusFunction" ajax function doesn't have a dataType: "json". This could be causing a problem. Are you using the $_POST['format'] anywhere?
I also recommend chaining the checks into after the first check has completed. If you need help with that, I can post a solution.
Edit, add possible solution:
I'm not sure that using Zend_Session_Namespace is the right approach. I would recommend using session_name() and session_start(), and reading the variables out of $_SESSION.
Example File 1:
session_name('test');
session_start();
$_SESSION['percent'] = 0;
...stuff...
$_SESSION['percent'] = 90;
Example File 2(get percent):
session_name('test');
session_start();
echo $_SESSION['percent'];

Securely importing contents of a protected js-file by using AJAX

To import a js-file is simple... just use <script src='file.js' type='text/javascript'></script>.
But then the source code will show a direct url to the contents of the file.
The content should be executable, but not directly viewable by using source url.
What is the best way to load the content of file.js into memory using AJAX?
I've come to the following initial way of working (just an idea; totally flawed?):
function get_contents() {
-> ajax execute PHP {copy 'file.js' to 'token.js' in tmp-directory}
-> ajax get contents of 'tmp/token.js' and load to memory
-> ajax execute PHP {delete 'tmp/token.js' in tmp-directory}
return(true); // content (ie. functions) should now be usable
}
But I'm not sure the second AJAX execute is enough to be able to successfully call the functions afterwards.
PHP returns the content, but the JavaScript 'ajax success' handler may see it (and store it) as a variable... doh!
Is this AJAX success idea going to work?
Can someone suggest a better idea ?
Edit:
According to initial responses this way of working is virtually and humanly impossible.
Will solve it by loading common functions the 'normal' unprotected way, and using Jerry's suggestion (see comment) for calculations that happen less often.
Edit #2:
The time-consuming problem mentioned below can be solved by following the next template.
Still making use of the suggested 'hidden PHP code' method.
I am making use of a buffer (of sorts), like a YouTube video... except the 'video data' is 'results from AJAX-PHP functions'.
AJAX request "30 cycle", "60 cycle", "300 cycle", "600 cycle"
store result to buffer
initiate "start cycle"
function cycle() // run every second !!
{
//do stuff... no AJAX needed
//do some more stuff... like animations and small calculations
//per 30 cycle (30 seconds)
if ($cycle==30)
{
perform last "30 cycle" AJAX result [PHP-function set "A"]
... when finished: AJAX request "30 cycle"
store result to buffer in 'background'
}
//per 60 cycle (1 minute)
if ($cycle==60)
{
perform last "60 cycle" AJAX result [PHP-function set "B"]
... when finished: AJAX request "60 cycle"
store result to buffer in 'background'
}
//and so on....
}
Initial question 99% solved (-1 because of developer tools).
Thanks for commenting and suggestions.
Even if you split up your JavaScript files into lots of individual functions it would take very little work to put it all back together again.
With modern browsers, even if a file is "loaded into memory", you can see exactly what was loaded.
Try using the developer tools that come with browsers; you can use them to see when an AJAX call is made and exactly what was loaded, in text format.
If you are mixing PHP and JavaScript then you should put anything that is sensitive in your PHP code whilst using JavaScript for your presentation of the results PHP provides.
EDIT: as per your update, instead of doing "cycles" could you not do this?
function pollAjax()
{
$.ajax({
-- ajax settings --
}).success(function(data) {
// do something with our results
doSomething(data);
// Fire again to get the next set of results.
setTimeout(function() {pollAjax()}, 10);
});
}
This means you're less likely to hang the browser with thousands of pending AJAX requests. It will ask for the results; when it gets them, it will ask for the next set of results.

jQuery AJAX Wait

I have constructed a PHP file which scrapes a web page (using cURL) to obtain some data, and outputs it to the screen in JSON format.
The target website involves some redirects which temporarily outputs data to my PHP file. Once the redirects have completed successfully, the JSON is presented as expected. The problem that I am encountering is that when I try to access the JSON using jQuery's $.ajax() method, it sometimes returns the incorrect data, because it isn't waiting for the redirects to complete.
My question is if it's possible to tell the AJAX request to wait a certain number of seconds before returning the data, thus allowing time for the redirects in the PHP script to execute successfully?
Please note that there is no cleaner solution for the page scrape, the redirects are essential and have to be outputted to the screen for the scraping to complete.
There's always timeout in the settings.
jQuery docs:
timeout Number
Set a timeout (in milliseconds) for the request. This will
override any global timeout set with $.ajaxSetup().
The timeout period starts at the point the $.ajax call is made;
if several other requests are in progress and the browser
has no connections available, it is possible for a request
to time out before it can be sent. In jQuery 1.4.x and below,
the XMLHttpRequest object will be in an invalid state if
the request times out; accessing any object members may
throw an exception. In Firefox 3.0+ only, script and JSONP
requests cannot be cancelled by a timeout; the script will
run even if it arrives after the timeout period.
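The same idea can be sketched without jQuery by racing the request promise against a timer (withTimeout is a hypothetical helper, not part of the jQuery API):

```javascript
// Reject with Error('timeout') if `promise` does not settle within ms.
function withTimeout(promise, ms) {
  return Promise.race([
    promise,
    new Promise(function (_, reject) {
      setTimeout(function () { reject(new Error('timeout')); }, ms);
    }),
  ]);
}
```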
You should use promise() in jQuery.
You could always store the result of your AJAX call and then wait for the redirects to finish, i.e.:
$.ajax({
success: function(e)
{
var wait = setTimeout(function(){ doSomethingWithData(e.data); }, 5000); //5 sec
}
})
Alternatively, you could set up an Interval to check if something happened (redirect finished) every x amount of ms. I'm assuming your redirects are letting you know they completed?
http://examples.hmp.is.it/ajaxProgressUpdater/
$i = 0;
while (true) {
    if (self::$driver->executeScript("return $.active == 0")) {
        break;
    }
    if ($i == 20) {
        break;
    }
    $i++;
    echo $i;
    usleep(10000);
}

PHP: Longpolling & Comet related

Recently, I decided to make an instant-notification system for my website. I heard COMET is essential in such cases.
I have been searching about PHP & COMET for a while already; however, the guides & articles I have found seem to be just AJAX requests in a loop. For example, a basic JavaScript snippet which gets a value from a PHP file every 2 seconds and outputs it to HTML. As far as I know, COMET should be pushing new values to the HTML, hence the loop should be on the server side, not the client. Half of the articles in my native language used setInterval() and contacted the PHP file every X seconds.
So, I have some questions to ask you.
Is there any guides or examples, which doesn't use any external framework like XAJAX/NOLOH that is easy to understand?
What is the performance difference between using COMET in server side, or requesting value from ajax.php every X seconds?
The timed requests I mentioned above can be called as COMET? (ex. Long Polling using jQuery and PHP)
Do I need any extensions to run COMET serverside? (My webhost is using Apache, I personally use Nginx)
You have to use a client-side script (AJAX), because the server has to be polled. The server cannot simply send messages to someone's browser without an open connection.
I'm not too familiar with HTML5 WebSockets, but I believe they allow you to have a persistent connection with the server; however, HTML5 browsers aren't yet widespread enough to use this as a solution on a 'public' website.
How long polling works is that an asynchronous request is sent from the browser with a long time-out (e.g. 30 seconds). When the request arrives at the server, the server checks for new messages; when there are no messages to be displayed, instead of directly outputting the result it goes into a loop, polling the database e.g. every second (using sleep to postpone the queries), until a message has been found. When a message has been found, it terminates the loop and outputs the result. If there have been no messages after 30 seconds, the script times out and sends back an empty response.
So the response can come back after anywhere between 0 and 30 seconds. As soon as it arrives in the browser, it is handled and a new 30-second request is sent.
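The server-side wait loop described above can be sketched like this (Node-style for illustration; getMessages is a hypothetical stand-in for the database query):

```javascript
// Poll a message source until something arrives or the window expires.
async function longPoll(getMessages, timeoutMs, stepMs) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const messages = await getMessages();     // e.g. a database query
    if (messages.length > 0) return messages; // found: respond at once
    await new Promise(function (r) { setTimeout(r, stepMs); });
  }
  return []; // timed out: empty response, the client simply re-polls
}
```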
As for your questions;
You will need a client-side framework to do the polling.
You cannot use Comet on the server side only. Using long polling instead of normal polling (e.g. polling every second) is significant because you make far fewer server requests.
To my understanding, yes.
You can use any server-side language, as long as it can keep the connection open while querying for messages.
Also take a look at http://nodejs.org/
I don't know exactly what COMET means, but for this purpose you have many solutions.
One, as you mentioned, is long polling via AJAX. It is simple and does not require new (HTML5) browsers.
Another option is "server-sent events". It requires a browser with HTML5 support, but it keeps the connection alive without polling:
client:
if (window.EventSource) {
window.onload = function() {
window.scrollTo(0,1);
setTimeout(
function() {
var source = new EventSource("events.php");
source.onmessage = function (event) {
document.body.innerHTML += event.data + "<br>";
};
}, 1000);
};
} else {
document.write("Please visit this page in a browser that supports EventSource to see the test");
}
server:
if ($_SERVER['HTTP_ACCEPT'] === 'text/event-stream') {
header('Content-Type: text/event-stream');
echo "data: This is the first event\n\n";
flush();
$i = 5;
while (--$i) {
sleep(1);
$time = date('r');
echo "data: The server time is: {$time}\n\n";
flush();
}
} else {
echo 'This demo is for use with an EventSource compatible browser.';
}
Good luck.

Pushing data across browsers (same session)

I've seen pages like facebook where, if you post a message in your newsfeed, it automatically pushes that across your browsers. Or like on this page... if someone has answered a question while you are typing, a bar drops down.
Are they just calling AJAX requests every 30 seconds or whatever? It seems like that would be a resource drain on your server. Is there a way to push something at the browser instead?
There are 3 options here:
Use the new (experimental) browser API (sockets)
Long polling / comet
Using / listening to cookies
Long polling / comet example in PHP / AJAX
// PHP SIDE
$max_wait_time = 30; // at most, 30 seconds
$start_time = microtime(true);
while( microtime(true) - $start_time < $max_wait_time ){
// ...check if something changed (eg, run an SQL query or something)
if($something_changed){
echo 'something changed';
die;
}
// if the user did abort, terminate immediately
if( connection_aborted() ) die;
// sleep for one second. For faster responses, keep
// splitting this suitably (eg, 0.5 of a second...)
usleep(1000000);
}
// JS SIDE
var poll = function(){
jQuery.get('the url', function(){
poll();
});
}
poll();
Cookie example in PHP / JS (you need the jQuery cookie plugin)
<?php
// PHP SIDE
setcookie('test', mt_rand(0,100));
?><!-- HTML/JS SIDE -->
Rand!
Rand=<span><?php echo $_COOKIE['test']; ?></span>
<script type="text/javascript">
var oldrand = <?php echo $_COOKIE['test']; ?>;
setInterval(function(){
var newrand = jQuery.cookie('test');
if( newrand!=oldrand ){
jQuery('span').html(newrand);
oldrand = newrand;
}
}, 500);
</script>
The cookie one is pretty good for several reasons:
it is pretty fast (no AJAX calls)
it is less resource intensive on both client and server side
it consumes less bandwidth / network resources
it is much easier to control
Even in cases where cookie polling alone is not enough, I'd still advocate using a cookie as a signal to run an AJAX call; that way you don't need to run a lot of AJAX calls just to wait for a change to happen.
On the other hand, the cookie one won't work when the change is happening by a third party, eg, it won't be suitable at all for chat systems.
Read into the differences between push and pull for more information:
In your example, the AJAX requests every 30 seconds would be a pull request - constantly asking the server if any updates are available, followed by a response.
You can set up a server/website to send push notifications to the client browser - whereby the client sits quietly, and the server sends the data/information to the client as soon as it is available (reducing network traffic etc.).
Push is much better in my opinion.
Yep, you'd have to poll with a looping AJAX script. To keep the resource drain down, you might want to send some kind of marker (the timestamp of the last news item, for instance) so the server knows whether the client is up to date. This way, it can return instantly if there are no changes to push.
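A sketch of that idea (names are hypothetical): the client sends the timestamp of its newest item, and the server answers instantly when nothing is newer:

```javascript
// Return only items newer than what the client already has.
function buildUpdate(items, clientLatestTs) {
  const fresh = items.filter(function (item) {
    return item.ts > clientLatestTs;
  });
  if (fresh.length === 0) {
    return { changed: false }; // nothing new: instant, tiny response
  }
  return {
    changed: true,
    items: fresh,
    latestTs: Math.max.apply(null, fresh.map(function (i) { return i.ts; })),
  };
}
```

The client stores latestTs from each response and sends it back on the next poll.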
