Importing a JS file is simple... just use <script src='file.js' type='text/javascript'></script>.
But then the source code will show a direct URL to the contents of the file.
The content should be executable, but not directly viewable via its source URL.
What is the best way to load the content of file.js into memory using AJAX?
I've come up with the following initial way of working (just an idea, totally flawed?):
function get_contents() {
    // 1. AJAX: execute PHP that copies 'file.js' to 'tmp/token.js'
    // 2. AJAX: get the contents of 'tmp/token.js' and load them into memory
    // 3. AJAX: execute PHP that deletes 'tmp/token.js'
    return true; // content (i.e. the functions) should now be usable
}
But I'm not sure whether the second AJAX execute is enough to be able to successfully call the functions.
PHP returns the content, but the JavaScript 'ajax success' handler may see it (and store it) as a variable... doh!
Is this 'ajax success' idea going to work?
Can someone suggest a better idea?
Edit:
According to the initial responses, this way of working is virtually impossible.
I will solve it by loading common functions the 'normal' unprotected way, and by using Jerry's suggestion (see comment) for calculations that happen less often.
Edit #2:
The time-consuming problem mentioned below can be solved by following the next template.
Still making use of the suggested 'hidden PHP code' method.
I am making use of a buffer (or something), like a YouTube video... except the 'video data' is 'results from AJAX-PHP functions'.
AJAX request "30 cycle", "60 cycle", "300 cycle", "600 cycle"
store result to buffer
initiate "start cycle"
var count = 0;
function cycle() // run every second !!
{
    count++;
    // do stuff... no AJAX needed
    // do some more stuff... like animations and small calculations

    // per 30 cycles (30 seconds)
    if (count % 30 == 0)
    {
        // perform the last "30 cycle" AJAX result [PHP-function set "A"]
        // ... when finished: AJAX request "30 cycle" again,
        //     storing the result to the buffer in the 'background'
    }
    // per 60 cycles (1 minute)
    if (count % 60 == 0)
    {
        // perform the last "60 cycle" AJAX result [PHP-function set "B"]
        // ... when finished: AJAX request "60 cycle" again,
        //     storing the result to the buffer in the 'background'
    }
    // and so on....
}
Initial question 99% solved (-1 because of developer tools).
Thanks for commenting and suggestions.
Even if you split up your JavaScript files into lots of individual functions, it would take very little work to put it all back together again.
With modern browsers, even if a file is "loaded into memory", you can see exactly what was loaded.
Try the developer tools that come with browsers: you can use them to see when an Ajax call is made and exactly what was loaded, in text format.
If you are mixing PHP and JavaScript then you should put anything that is sensitive in your PHP code whilst using JavaScript for your presentation of the results PHP provides.
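For example, here is a minimal sketch of that split (the endpoint name calculate.php, its input parameter, and the #result element are all placeholders, not part of your setup):
// The sensitive calculation lives in PHP; JavaScript only sends input and displays output.
function showResult(input) {
    // 'calculate.php' is a hypothetical endpoint that performs the hidden calculation
    $.post('calculate.php', { input: input }, function(result) {
        $('#result').text(result); // the algorithm never reaches the browser, only its output
    });
}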
EDIT: as per your update, instead of doing "cycles" could you not do this?
function pollAjax()
{
    $.ajax({
        // ... ajax settings ...
    }).done(function(data) {
        // do something with our results
        doSomething(data);
        // Fire again to get the next set of results.
        setTimeout(function() { pollAjax(); }, 10);
    });
}
This means you're less likely to hang the browser with thousands of pending Ajax requests. It will ask for the results and, when it gets them, ask for the next set.
Related
I have read many similar questions concerning cancelling a POST request with jQuery, but none seem to be close to mine.
I have your everyday form that has a PHP-page as an action:
<form action="results.php" method="post">
<input name="my-input" type="text">
<input type="submit" value="submit">
</form>
Processing results.php on the server side, based on the POST information given in the form, takes a long time (30 seconds or even more, and we expect an increase because our search space will grow in the coming weeks). We are accessing a BaseX server (version 7.9, not upgradable) that contains all the data. User-generated XPath code is submitted in a form, and the action URL then sends the XPath code to the BaseX server, which returns the results. From a usability perspective, I already show a "loading" screen so users at least know that the results are being generated:
$("form").submit(function() {
$("#overlay").show();
});
<div id="overlay"><p>Results are being generated</p></div>
However, I would also want to give users the option to press a button to cancel the request and cancel the request when a user closes the page. Note that in the former case (on button click) this also means that the user should stay on the same page, can edit their input, and immediately re-submit their request. It is paramount that when they cancel the request, they can also immediately resend it: the server should really abort, and not finish the query before being able to process a new query.
I figured something like this:
$("form").submit(function() {
$("#overlay").show();
});
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
// abort correct request
}
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
But as you can see, I am not entirely sure how to fill in abortRequest to make sure the post request is aborted, and terminated, so that a new query can be sent. Please fill in the blanks! Or would I need to .preventDefault() the form submission and instead do an ajax() call from jQuery?
As I said I also want to stop the process server-side, and from what I read I need exit() for this. But how can I exit another PHP function? For example, let's say that in results.php I have a processing script and I need to exit that script, would I do something like this?
<?php
if (isset($_POST['my-input'])) {
$input = $_POST['my-input'];
function processData() {
// A lot of processing
}
processData();
}
if (isset($_POST['terminate'])) {
function terminateProcess() {
// exit processData()
}
}
and then do a new ajax request when I need to terminate the process?
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
$.ajax({
url: 'results.php',
data: {terminate: true},
type: 'post',
success: function() { alert("terminated"); }
});
}
I did some more research and I found this answer. It mentions connection_aborted() and also session_write_close() and I'm not entirely sure which is useful for me. I do use SESSION variables, but I don't need to write away values when the process is cancelled (though I would like to keep the SESSION variables active).
Would this be the way? And if so, how do I make one PHP function terminate the other?
I have also read into Websockets and it seems something that could work, but I don't like the hassle of setting up a Websocket server as this would require me to contact our IT guy who requires extensive testing on new packages. I'd rather keep it to PHP and JS, without third party libraries other than jQuery.
Considering most comments and answers suggest that what I want is not possible, I am also interested to hear alternatives. The first thing that comes to mind is paged Ajax calls (similar to many web pages that serve search results, images, what-have-you in an infinite scroll). A user is served a page with the X first results (e.g. 20), and when they click a button "show next 20 results" those are shown are appended. This process can continue until all results are shown. Because it is useful for users to get all results, I will also provide a "download all results" option. This will then take very long as well, but at least users should be able to go through the first results on the page itself. (The download button should thus not disrupt the Ajax paged loads.) It's just an idea, but I hope it gives some of you some inspiration.
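As a rough sketch, such a paged loader could look like this (the endpoint name and its offset/limit parameters are assumptions, not part of the original setup):
var offset = 0, pageSize = 20;
$('#show-more').on('click', function() {
    // 'results_page.php' is a hypothetical endpoint returning one page of results as HTML
    $.get('results_page.php', { offset: offset, limit: pageSize }, function(html) {
        $('#results').append(html); // append the new page below the previous ones
        offset += pageSize;         // the next click fetches the following page
    });
});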
To my understanding, the key points are:
You cannot cancel a specific request once a form is submitted. The reason is that on the client side you have nothing with which to identify the state of the form request (whether it is posted, processing, etc.). The only way to cancel it is to reset the $_POST variables and/or refresh the page, so the connection is broken and the previous request is not completed.
In your alternative solution, when you send another Ajax call with {terminate: true}, results.php can stop processing with a simple die(). But as it will be an async call, you cannot map it to the previous form submit, so in practice this will not work.
A probable solution: submit the form with Ajax. With jQuery's ajax you will have an xhr object that you can abort() upon window unload.
UPDATE (upon the comment):
A synchronous request is one where your page blocks (all user actions) until the result is ready. Pressing the submit button in a form does a synchronous call to the server by submitting the form, by definition [https://www.w3.org/TR/html-markup/button.submit.html].
Once the user has pressed the submit button, the connection from browser to server is synchronous, so it will not be interrupted until the result is there. When other calls to the server are made while the submit is in progress, no reference to this operation is available to them, as it is not finished. That is why sending a termination call with Ajax will not work.
Third: for your case you can consider the following code example:
HTML:
<form action="results.php">
<input name="my-input" type="text">
<input id="resultMaker" type="button" value="submit">
</form>
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
JQUERY:
<script type="text/javascript">
var jqXhr = null;
$('#resultMaker').on('click', function() {
    $("#overlay").show();
    jqXhr = $.ajax({
        url: 'results.php',
        data: $('form').serialize(),
        type: 'post',
        success: function() {
            $("#overlay").hide();
        }
    });
});

var abortRequest = function() {
    if (jqXhr) {
        jqXhr.abort();
    }
};

$("#overlay button").on('click', abortRequest);
window.addEventListener('unload', abortRequest);
</script>
This is example code - I just used your code examples and changed a few things here and there.
Himel Nag Rana demonstrated how to cancel a pending Ajax request.
Several factors may interfere and delay subsequent requests, as I have discussed earlier in another post.
TL;DR: 1. it is very inconvenient to try to detect that the request was cancelled from within the long-running task itself, and 2. as a workaround you should close the session (session_write_close()) as early as possible in your long-running task so as not to block subsequent requests.
connection_aborted() cannot be used. This function is supposed to be called periodically during a long task (typically, inside a loop). Unfortunately there is just one single significant, atomic operation in your case: the query to the data back end.
If you applied the procedures advised by Himel Nag Rana and myself, you should now be able to cancel the Ajax request and immediately allow a new request to proceed. The only remaining concern is that the previous (cancelled) request may keep running in the background for a while (not blocking the user, just wasting resources on the server).
The problem could be rephrased to "how to abort a specific process from the outside".
As Christian Bonato rightfully advised, here is a possible implementation. For the sake of the demonstration I will rely on Symfony's Process component, but you can devise a simpler custom solution if you prefer.
The basic approach is:
Spawn a new process to run the query, save the PID in session. Wait for it to complete, then return the result to the client
If the client aborts, it signals the server to just kill the process.
<?php // query.php
use Symfony\Component\Process\PhpProcess;
session_start();
if(isset($_SESSION['queryPID'])) {
// A query is already running for this session
// As this should never happen, you may want to raise an error instead
// of just silently killing the previous query.
posix_kill($_SESSION['queryPID'], SIGKILL);
unset($_SESSION['queryPID']);
}
$queryString = parseRequest($_POST);
$process = new PhpProcess(sprintf(
'<?php $result = runQuery(%s); echo fetchResult($result);',
$queryString
));
$process->start();
$_SESSION['queryPID'] = $process->getPid();
session_write_close();
$process->wait();
$result = $process->getOutput();
echo formatResponse($result);
?>
<?php // abort.php
session_start();
if(isset($_SESSION['queryPID'])) {
$pid = $_SESSION['queryPID'];
posix_kill($pid, SIGKILL);
unset($_SESSION['queryPID']);
echo "Query $pid has been aborted";
} else {
// there is nothing to abort, send a HTTP error code
header($_SERVER['SERVER_PROTOCOL'] . ' 599 No pending query', true, 599);
}
?>
// javascript
function abortRequest(pendingXHRRequest) {
pendingXHRRequest.abort();
$.ajax({
url: 'abort.php',
success: function() { alert("terminated"); }
});
}
Spawning a process and keeping track of it is genuinely tricky, this is why I advised using existing modules. Integrating just one Symfony component should be relatively easy via Composer: first install Composer, then the Process component (composer require symfony/process).
A manual implementation could look like this (beware, this is untested, incomplete and possibly unstable, but I trust you will get the idea):
<?php // query.php
session_start();
$queryString = parseRequest($_POST); // $queryString should be escaped via escapeshellarg()
$processHandler = popen("/path/to/php-cli/php asyncQuery.php $queryString", 'r');
// fetch the first line of output, PID expected
$pid = fgets($processHandler);
$_SESSION['queryPID'] = $pid;
session_write_close();
// fetch the rest of the output
while($line = fgets($processHandler)) {
echo $line; // or save this line for further processing, e.g. through json_encode()
}
fclose($processHandler);
?>
<?php // asyncQuery.php
// echo the current PID
echo getmypid() . PHP_EOL;
// then execute the query and echo the result
$result = runQuery($argv[1]);
echo fetchResult($result);
?>
With BaseX 8.4, a new RESTXQ annotation %rest:single was introduced, which allows you to cancel a running server-side request: http://docs.basex.org/wiki/RESTXQ#Query_Execution. It should solve at least some of the challenges you described.
The current way to return only chunks of the result is to pass the indexes of the first and last result along with your request, and to do the filtering in XQuery:
$results[position() = $start to $end]
By returning one more result than requested, the client will know that there will be more results. This may be helpful, because computing the total result size is often much more expensive than returning only the first results.
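On the client, fetching such a chunk might look like the sketch below (the endpoint and parameter names are assumptions; it uses the one-extra-result trick just described to decide whether a "more" button should stay visible):
function loadChunk(start, pageSize) {
    // ask for one result more than we intend to display
    $.get('results.php', { start: start, end: start + pageSize }, function(items) {
        var hasMore = items.length > pageSize;   // the extra result signals more data
        renderResults(items.slice(0, pageSize)); // renderResults() is your own display routine
        $('#more').toggle(hasMore);
    }, 'json');
}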
I hope I understood this correctly.
Instead of letting the browser "natively" submit the FORM, don't: write JS code that does this instead. In other words (I didn't test this; so interpret as pseudo-code):
<form action="results.php" onsubmit="return false;">
<input name="my-input" type="text">
<input type="submit" value="submit">
</form>
So now, when that "submit" button is clicked, nothing will happen.
Obviously, you want your form POSTed, so write JS to attach a click handler to that submit button, collect the values from all input fields in the form (actually, it is NOT nearly as scary as it sounds; check out the first link below), and send them to the server, while saving the reference to the request (check the second link below), so that you can abort it (and maybe also signal the server to quit) when the cancel button is clicked (alternatively, you can simply abandon it by not caring about the results).
Submit a form using jQuery
Abort Ajax requests using jQuery
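Putting those two links together, a minimal sketch might look like this (untested; results.php is the form's action from the question, and the selectors are assumptions):
var pendingXhr = null;
$('form').on('submit', function(e) {
    e.preventDefault(); // stop the native submit
    $('#overlay').show();
    // save the reference so the request can be aborted later
    pendingXhr = $.post('results.php', $(this).serialize(), function(data) {
        $('#overlay').hide();
        // render the results here
    });
});
$('#overlay button').on('click', function() {
    if (pendingXhr) pendingXhr.abort(); // client-side abort only; the server may keep working
    $('#overlay').hide();
});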
Alternatively, to make that HTML markup "clearer" relative to its functionality, consider not using the FORM tag at all: otherwise, what I suggested makes its usage confusing (why is it there if it's not used; know what I mean?). But don't get distracted by this suggestion until you make it work the way you want; it's optional and a topic for another day (it might even relate to the changing architecture of the whole site).
HOWEVER, a thing to think about: what to do if the form-post already reached the server and server already started processing it and some "world" changes have already been made? Maybe your get-results routine doesn't change data, so then that's fine. But, this approach probably cannot be used with change-data POSTs with the expectation that "world" won't change if cancel-button is clicked.
I hope that helps :)
The user doesn't have to experience this synchronously.
Client posts a request
The server receives the client request and assigns an ID to it
The server "kicks off" the search and responds with a zero-data page and search ID
The client receives the "placeholder" page and starts checking if the results are ready based on the ID (with something like polling or websockets)
Once the search has completed, the server responds with the results next time it's polled (or notifies the client directly when using websockets)
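As a sketch, the client side of that flow could look like this (the endpoint names, the id field, and the response format are all assumptions):
// Kick off the search, then poll until the server reports that it is done.
$.post('start_search.php', $('form').serialize(), function(job) {
    var timer = setInterval(function() {
        $.get('search_status.php', { id: job.id }, function(status) {
            if (status.done) {
                clearInterval(timer);        // stop polling
                showResults(status.results); // showResults() is your own rendering routine
            }
        }, 'json');
    }, 2000); // poll every 2 seconds
}, 'json');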
This is fine when performance isn't quite the bottleneck and the nature of processing makes longer wait times acceptable. Think flight search aggregators that routinely run for 30-90 seconds, or report generators that have to be scheduled and run for even longer!
You can make the experience less frustrating if you don't block user interactions, keep them updated of search progress and start showing results as they come in if possible.
You must solve this conceptually first before writing any code. Here are some things that come to mind offhand:
What does it mean to free up resources on the server?
What constitutes a graceful abort that will free up resources?
Is it enough to kill the PHP process waiting for the query result(s)? If so, the route suggested by RandomSeed could be interesting. Just keep in mind that it will only work on a single server. If you have multiple load balanced servers you won't have a way to kill a process on another server (not as easily at least).
Or do you need to cancel the database request from the database itself? In that case the answer suggested by Christian Grün is of more interest.
Or is it that there is no graceful shutdown and you have to force everything to die? If so, this seems awfully hacky.
Not all clients are going to explicitly abort
Some clients are going to close the browser, but their last request won't come through; some clients will lose internet connection and leave the service hanging, etc. You are not guaranteed to get an "abort" request when a client disconnects or has gone away.
You have to decide whether to live with potentially unwanted behavior, or implement an additional active state tracking, e.g. client pinging server for keepalive.
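Such active state tracking can be as simple as a periodic ping (keepalive.php is hypothetical; the server would treat a search whose pings have stopped as abandoned and reap it):
// Ping the server every few seconds while the results page is open.
setInterval(function() {
    $.post('keepalive.php', { id: searchId }); // searchId saved when the search was started
}, 5000);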
Side notes
a query time of 30 seconds or more is potentially long; is there a better tool for the job, so that you won't have to solve this with a hack like this?
you are looking for features of a concurrent system, but you're not using a concurrent system; if you want concurrency use a better tool/environment for it, e.g. Erlang.
Thank you for reading.
I have an input field that sends its contents in an XMLHttpRequest to a php script. The script queries the database with the POST data from the field and returns the results.
Because the XMLHttpRequest is invoked using onkeyup, typing in a lengthy value sends several calls in a short period. To combat this I wrote some code that creates a timestamp, loads it into the session, sleeps, then rechecks the timestamp. If the timestamp has increased, it means a subsequent call was made and the script should abort. Otherwise the script executes. Here is the code.
session_start(); // the session must be open before $_SESSION can be used
$micro = microtime(true);
$_SESSION['micro'] = $micro;
usleep(500000); // half a second
if ($micro < floatval($_SESSION['micro']))
{
// later call has been made, abort
echo 'abort';
exit;
}
else
{
// okay to execute
}
The code appears to work as expected at first. If I add or remove a character or two from the input field the result appears quickly.
However if I type a good 12 characters as fast as I can there is a large delay, sometimes 2 or 3 seconds long.
I am working on localhost, so there are no connection issues. The query is also really small, grabbing one column containing a single word from a specific row.
I have also set XMLHttpRequest to be asynchronous, so that should also be fine.
xmlhttp.open("POST","/test/",true);
If I remove the flood prevention code, typing in the field returns results instantly - no matter how much and how quickly I type.
It's almost as if usleep() keeps stacking itself or something.
I came up with this code on my own, best I could do at my level. No idea why it isn't behaving as expected.
Help is greatly appreciated, thanks!
When you open a session using session_start(), PHP locks the session file so any subsequent requests for the same session while another request has it open will be blocked until the session closes (you were exactly right with the "stacking" you suspected was happening).
You can call session_write_close() to close the session and release the lock but this probably won't help in this situation.
What's happening is that each time a key is pressed, a request gets issued, and each one is queued up while the previous one finishes. Once the session is released, one of the other requests opens the session and sleeps, and this keeps happening until they've all finished.
Instead, I'd create a global variable in JavaScript that indicates whether or not a request is in progress. If one is, then don't send another request.
Something like this:
<script>
var requesting = false;
$('#input').on('keyup', function() {
    if (requesting) return;
    requesting = true;
    $.ajax({
        url: "/url"
    }).done(function() {
        requesting = false;
    });
});
</script>
drew010's answer explained my problem perfectly (thanks!). But their code example, from what I gather from how it was explained (I didn't try it), does the opposite of what I need. If the user types "hello", the "h" will get sent but the "ello" might not, unless the result makes it back in time. (Sorry if this was a wrong assumption.)
This was the solution I came up with myself.
<input type="text" onkeyup="textget(this.value)" />
<script>
var patience;
function ajax(query)
{
// XMLHttpRequest etc
}
function textget(input)
{
clearTimeout(patience);
patience = setTimeout(function(){ajax(input)},500);
}
</script>
when a key is pressed in the input field, it passes its current value to the textget function.
the textget function clears an existing timer if any and starts a new one.
when the timer finishes counting down, it passes the value further to the ajax function to perform the XMLHttpRequest.
because the timer is reset every time the textget function is called, if a new call is made before the timer finishes (0.5 seconds), the previous call will be lost and is replaced by the new one.
Hope this helps someone.
So I'm working on a project that uses PHP to download files with cURL and update them on the server. I've already gotten that part taken care of and now I'm working on a front-end GUI that will display the progress to the user.
So my original thought method was to use the code in this manner:
$('#startupgrade').click(function() {
$('#upgradeprogress').css({"width": "0%"});
$('#upgradestatus').append('<p>Starting Upgrade...</p>');
$('#upgradeprogress').css({"width": "10%"});
$('#currentstep').load('upgrade.php #statusreturn', 'step=1');
$('#upgradestatus').append('<p>Step 1 Complete...</p>');
$('#upgradeprogress').css({"width": "20%"});
})
The thought behind this was to load the PHP page and pull a "status message" from it, which PHP generates inside the element with ID #statusreturn, and which would then be appended to the current page whether that step fails or succeeds. In the example above I kept it simple and just appended "Step 1 Complete", but I would add some standard JS in there to check whether the first step completed or not.
I'm very new to JS and jQuery and I feel like there should be an easier or more correct way of doing this. Maybe running some standard JS using an if statement based on the number of steps?
Does anybody have any suggestions or recommendations on how this could be done with less code or in an easier manner? I want to make sure the method I end up using and learning is correct, and hopefully I'm not making a newbie mistake and using more code than necessary.
I appreciate any input and thanks for your help in advance.
The #upgradeprogress is using Twitter Bootstrap progress bar which is controlled by the width. The #upgradestatus is a Bootstrap well that will display the output almost similar to a terminal.
The load function is asynchronous. That means execution continues immediately and does not wait for the load. In order to wait for the load, place your progress-bar update in the success callback handler:
http://api.jquery.com/load/
$('#result').load('steps/step1.php', function() {
$('#upgradeprogress').css({"width": "10%"}); //update your status bar here
$('#currentstep').load('upgrade.php #statusreturn', function() {
$('#upgradeprogress').css({"width": "20%"});
//add more steps here if required
});
});
Only when that success function() is executed can you be sure the request has finished.
I have an application that rates a large set of items and outputs a score for each one.
In my PHP script I'm using ob_start and ob_flush to handle the output of data for each rating. This works great if I load the script directly. But when I try to use .get via jQuery, the entire content loads and is then placed into a container, instead of being added incrementally.
I'm wondering the following:
Is there a way to initiate data placement before the get has completed?
Do I need to continually poll the script until the process is complete?
Is there a more efficient way to display the data instead of get?
For this kind of problem, I would take this approach:
Keep the old script that uses ob_start() and ob_flush() for users who disable JavaScript in their browser.
For users who have JavaScript enabled, load a predefined number of items at a time. To differentiate between JS-enabled users and the rest, I'm thinking of two pages. The first page displays a link to the old script. Then put jQuery code in this page to intercept clicks on that link, so clicking it will display (or create) a div and load the content into that div.
You can use setTimeout to call the AJAX code continuously; once a certain condition is reached (e.g. an empty response), you can remove the timer using clearTimeout. Each AJAX request will need an offset param so it fetches content from where the last AJAX call left off. After receiving a response, increment the offset for the next AJAX call. You can use a global variable for this.
You can use a simple global variable to prevent an AJAX request from running while the last one is still awaiting a response, to avoid a race condition. Example code:
// lock variable
var is_running = false;
// offset starts at 0
var offset = 0;

function load_content() {
    // check lock
    if (!is_running) {
        // lock
        is_running = true;
        // do AJAX
        $.get(URL, { /* params, including the current offset */ }, function(resp) {
            // put data into the 'div'
            // ...
            // if the response is empty, call clearTimeout
            // ...
            // increase the offset here
            offset = offset + NUM_ITEM_FETCHED;
            // release lock
            is_running = false;
        });
    }
}
The point you must pay attention to is that with an AJAX call you must handle the response manually, since ob_start and ob_flush will have no effect in this scenario.
I hope this will help you create your own code.
jQuery will receive a success status from the Ajax call only when the complete page has finished loading, so whatever you do in the PHP will not be returned to the calling page until the whole process has finished (Ajax is a one-send/one-receive system).
You would need to complicate your system to do what you want.
For example:
your PHP updates an external progress file, and your jQuery polls this file at some interval and displays the progress.
You would initiate the interval polling on Ajax submit, and terminate it on Ajax success.
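For instance, a rough sketch (progress.txt, ratings.php, and the element IDs are placeholders for your own files):
var progressTimer = null;
function startRatings() {
    // begin polling the progress file as soon as the main request is sent
    progressTimer = setInterval(function() {
        $.get('progress.txt', function(text) {
            $('#progress').text(text); // display whatever PHP last wrote
        });
    }, 1000);
    $.get('ratings.php', function(data) {
        clearInterval(progressTimer); // main request finished: stop polling
        $('#results').html(data);
    });
}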
I had a similar problem a while back where I wanted a PHP script to send a series of emails and update the jQuery page to say something like "Sending 23/50".
What I ended up doing was setting up the php script to handle one item at a time. This might also work in your case. Could you have jquery pass an item identifier of some sort to a php script that handles just that one item? Then in the callback, you could place the data for that item in the page as well as creating a new ajax request for the next item. In other words, each callback would create a new request for the next item until the entire list of items has been looped through.
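A sketch of that callback chain (the endpoint name and the itemIds array are assumptions; each request handles exactly one item):
function processItem(index, items) {
    if (index >= items.length) return; // all items done
    // 'rate_item.php' is a hypothetical script that processes a single item per request
    $.get('rate_item.php', { id: items[index] }, function(data) {
        $('#results').append(data);    // place this item's result in the page
        processItem(index + 1, items); // the callback launches the next request
    });
}
processItem(0, itemIds); // itemIds: an array of identifiers fetched beforehand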
What do you think?
-DLH
I'm trying to create a very simple message board (author, text, and date written) that will auto-update every few moments to see if a new message has arrived, and if it has, auto load the latest message(s).
I'm proficient in PHP, but my knowledge in AJAX is lacking.
The way I see it, I would have to create a PHP file called get_messages.php that connects to a database and, via a $_GET variable, returns all posts beyond date X; then I would somehow, through jQuery, call this PHP file every few minutes with $_GET set to the current time?
Does this sound correct?
How would I go about requesting and returning the data to the web page asynchronously?
You're pretty close: you'll need a PHP script that can query the database for your results. Next, you'll want to turn those results into an array and json_encode() them:
$results = getMyResults();
/* Assume this produces the following array:
Array(
    "id" => "128", "authorid" => "12", "posttime" => "12:53pm",
    "comment" => "I completely agree! Stackoverflow FTW!"
);
*/

print json_encode($results);
/* We'll end up with the following JSON:
{
    "id": "128",
    "authorid": "12",
    "posttime": "12:53pm",
    "comment": "I completely agree! Stackoverflow FTW!"
}
*/
Once these results are in JSON format, you can handle them better with JavaScript. Using jQuery's Ajax functionality, we can do the following:
setInterval(update, 10000); /* Call the server every 10 seconds */
function update() {
$.get("serverScript.php", {}, function (response) {
/* 'response' is our JSON */
alert(response.comment);
}, "json");
}
Now that you've got your data within JavaScript ('response'), you are free to use the information from the server.
Ignore the ASP.NET stuff, this link is a good start:
http://www.aspcode.net/Timed-Ajax-calls-with-JQuery-and-ASPNET.aspx
What you're going to use is a JavaScript function called setTimeout, which asynchronously calls a function after a delay; re-arming it at the end of each call gives you the interval behaviour you need. From there, jQuery has a fancy function called "load" that will load the results of an AJAX call into a DIV or whatever element you're targeting. There are also numerous other ways to get jQuery to alter the DOM the way you'd like.
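In sketch form, the combination could look like this (get_messages.php comes from the question; the 10-second interval and the #messages div are arbitrary choices):
function refreshMessages() {
    // load() drops the returned HTML fragment straight into the div
    $('#messages').load('get_messages.php', function() {
        setTimeout(refreshMessages, 10000); // schedule the next poll once this one finishes
    });
}
refreshMessages();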
There are a hundred ways to do this, but I'd say avoid writing plain JavaScript when you can, to save yourself the headache of cross-browser quirks.
I suggest you go for the Simple AJAX Code-Kit (SACK) available on Google code.
I've been using it since before it was on Google code. It's very light and straightforward. It's one js file that you have to include. I've seen it being used in online browser games as well.
http://code.google.com/p/tw-sack/
Example for loading page contents from get_messages.php into a div (if you don't care about the page contents from get_messages.php and simply want to call the PHP file, simply remove the ajax.element line):
<script type="text/javascript" src="tw-sack.js"></script>
<script>
var ajax = new sack();
ajax.method = "GET"; // Can also be set to POST
ajax.element = 'my_messages'; // Remove to make a simple "ping" type of request
ajax.requestFile = "get_messages.php";
ajax.setVar("user_name","bobby");
ajax.setVar("other_variables","hello world");
ajax.setVar("time",dateObject.getTime());
ajax.onCompleted = whenCompleted;
ajax.runAJAX();
function whenCompleted(){
alert('completed');
}
</script>
<div id="my_messages">Loading...</div>
You don't need to specify an "ajax.element" if you want to do a simple "ping" type of request and ignore the output of the PHP file. All you have to do to implement your requirements now is to use "setTimeout" to make the Ajax calls.
There are also many other options like:
//ajax.onLoading = whenLoading;
//ajax.onLoaded = whenLoaded;
//ajax.onInteractive = whenInteractive;
No need to learn or include huge frameworks. And you'll get started in no time with tw-sack.