Check if webpages are reachable with php and refresh it with ajax - php

At the moment I've got one function that checks whether a webpage is reachable. I call this function about 100 times in a while loop, which means it sometimes takes 5 minutes to check all 100 webpages.
I've never used AJAX before, but I'd think it would be a good way to solve this problem; I just have no idea how to start. Could you give me a good hint? Thanks for every answer!

I would use jQuery's AJAX support; it makes this simpler.
So include jQuery on your site to start.
This is how a jQuery AJAX call works:
$.ajax({
    type: 'POST',
    url: '--LINK TO PHP/ASP...---', // place the link that runs the command
    data: dataString,               // dataString is a JSON encoding of the data sent to the file
    dataType: 'json',
    beforeSend: function(){
        // Before you send the request, do what you want here (e.g. show a loading gif)
    },
    success: function(data){
        // If it is successful, then it will do what you want here.
    }
});
I hope this helps.
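If you take this route, the gain comes from checking each page with its own request instead of one long server-side loop, so results appear on the page as they arrive. Below is a minimal sketch in plain JavaScript (the same idea works with $.ajax); `checkUrl` is a hypothetical stand-in for a call to a PHP endpoint that checks a single URL:

```javascript
// Fire one check per URL concurrently instead of one long loop;
// onResult runs as each check finishes, so the page can update incrementally.
// checkUrl is a stand-in for an AJAX call (e.g. $.ajax to a PHP checker).
function checkAll(urls, checkUrl, onResult) {
  return Promise.all(
    urls.map(url =>
      checkUrl(url)
        .then(ok => onResult(url, ok))       // report success/failure per URL
        .catch(() => onResult(url, false))   // a failed request counts as unreachable
    )
  );
}
```

With 100 URLs, the total time is bounded by the slowest single check rather than the sum of all checks.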

I would suggest you use jQuery's AJAX, which is easier to implement:
$.ajax({
    url: "test.html",
    context: document.body,
    success: function(){
        $(this).addClass("done");
    }
});

From your (somewhat ill-defined) description, I'd say that using AJAX to control the web site verification would be a deeply inappropriate approach.
Instead, a more sensible approach would be to "batch process" the web site data via the use of a cron triggered PHP cli script.
As such, once you'd inserted the relevant domains into a database table with a "processed" flag set to false, the background script would then:
Scan the database for web pages that aren't marked as checked within your required time period.
Carry out the CURL lookup, etc.
Update the database record accordingly with the current timestamp.
...
To ensure there's no overlap with an already-executing batch script, you should invoke the PHP script from cron only every five minutes and (within the PHP script itself) check at the start of the "scan" stage how long the script has been running, exiting if it's been running for four minutes or longer. (You might want to adjust these figures, but hopefully you can see where I'm going with this.)
By using this approach, you'll be able to leave the background script running indefinitely (as it's invoked via cron, it'll automatically start after reboots, etc.) and simply add web pages to the database/review the results of processing, etc. via a separate web front end.
You could of course use AJAX to get a regular summary of the current status from the database for the purposes of client-side display.
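As a rough illustration of the batch loop and its four-minute guard (sketched here in JavaScript purely for illustration; the answer proposes a PHP CLI script, but the structure is the same, and all names are hypothetical):

```javascript
// Hypothetical batch worker: process unchecked pages until the time
// budget is exhausted, then exit and let the next cron invocation continue.
const MAX_RUNTIME_MS = 4 * 60 * 1000; // exit before the next 5-minute cron run

async function runBatch(fetchUnchecked, checkOne, markChecked, now = Date.now) {
  const started = now();
  for (const page of await fetchUnchecked()) {        // the "scan" stage
    if (now() - started >= MAX_RUNTIME_MS) break;     // leave the rest for the next run
    const ok = await checkOne(page.url);              // the CURL lookup
    await markChecked(page.id, ok);                   // update the record and timestamp
  }
}
```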

Related

Jquery/AJAX to output data while processing long task in php

Good afternoon.
I have the following code that, on submit, sends data to a PHP file, which queries multiple network nodes for health status. The problem is that the tasks take about 40 seconds to complete, and during that time there is no output. I tried using ob_flush() and flush() with no effect, although I do call them in the PHP portion of the code. I still see the loading message and only get the complete printout once it's ready. However, flush() and ob_flush() work in general on my server (tested in a standalone script), so that's not the issue.
My understanding is that the jQuery AJAX call waits for the code to execute completely before spitting out the printout. I looked through the forums and couldn't find an applicable solution, as most of them relate to GET requests while I'm using POST.
Can someone please point me in right direction on this? Is there a way to receive printout while the PHP is still processing the request?
JS Code
$(document).ready(function(){
    $('#userForm3g').on('submit', function(e){
        e.preventDefault();
        e.stopImmediatePropagation();
        $('#response').html("<b>Loading data...</b>");
        $.ajax({
            type: 'POST',
            url: 'myphpfile.php',
            data: $(this).serialize()
        })
        .done(function(data){
            $('#response').html(data);
        })
        .fail(function() {
            alert("Posting failed.");
        });
        return false;
    });
});
Thanks!
The problem with the approach you mentioned is that there are multiple ways it can fail: you might forget to clear the PHP output buffer, the setting might be tightly controlled for some reason, a proxy or load balancer might wait for the request to complete, or it could be the web browser itself (Chrome used to render partial content, but stopped doing so).
There are a few options, though:
You could use partial responses (which is basically streaming over HTTP). There's a specialized library to do this which I've used before, but I forgot its name. I wouldn't recommend this option though: if it fails, it will be tough to find out why.
You could use any method of long polling, including Comet or sockets, but you'll need a compatible server (Node.js or ReactPHP).
You could use a third-party service like Pusher or OneSignal (they also use the previous approach, but it's more integrated and reliable).
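A minimal client-side loop for the long-polling option, assuming a server that holds each request open until there is news; `fetchStatus` is a hypothetical stand-in for the AJAX call:

```javascript
// Long-polling client: keep exactly one request open at a time; when the
// server answers (i.e. there is new data), report it and immediately ask again.
async function longPoll(fetchStatus, onUpdate, shouldStop) {
  while (!shouldStop()) {
    const update = await fetchStatus(); // server holds this request until there is news
    onUpdate(update);
  }
}
```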

php exec and running a bash script until its finished

I have a bash script that can take hours to finish.
I have a web frontend for it to make it easy to use.
On this main page, I want a URL that I can press that starts my PHP command:
<?php exec('myscript that takes a long time'); ?>
After the exec has finished, I want it to set a cookie:
setcookie('status', "done");
This is all easily done and works as is. However, the URL that loads my exec command is a blank white page. I don't want this. I want the URL to be an action that starts my PHP script and sets the cookie when the exec command returns, all in the background.
Is this possible?
If not, how close can I get to this behavior?
EDIT:
function foo(){
    var conn = new Ext.data.Connection();
    conn.request({
        url: 'request.php',
        method: 'POST',
        success: function(responseObject) {
            alert("Hello, World!");
        },
        failure: function() {
            alert("Something failed");
        }
    });
}
I have tried the above code with no luck.
I have a bash script that can take hours to finish
Stop there. That's your first problem. The WWW is not designed to maintain open requests for more than a couple of minutes. Maybe one day (since we now have WebSockets), but even if you know how to configure your webserver so that this is not an off-switch for anyone passing by, it's exceedingly unlikely that the network in between or your browser will be willing to wait this long.
This job cannot be run synchronously with a web request. It must be run asynchronously.
By all means poll the status of the job from the web page (either via a meta-refresh or an AJAX call), but I'm having trouble understanding the benefit of setting a cookie when it has completed; usually for stuff like this I'd send out an email from the task when it completes. You also need either a way to separate out concurrent tasks invoked like this or a method of ensuring that only one runs at a time.
One solution would be to pass the PHP session id as an argument to the script, then have the script write a file named with the session id on completion (or even provide partial updates via the file); then your web page can poll the status of the job using the session id. Of course, your code should check there isn't already an instance of the job running before starting a new one.
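On the page side, that polling could be sketched like this (plain JavaScript; `getStatus` is a hypothetical stand-in for an AJAX call that reads the session-id status file):

```javascript
// Poll the job status until the status file reports "done".
// sleep defaults to a real timer but can be injected for testing.
async function pollUntilDone(getStatus, intervalMs,
                             sleep = ms => new Promise(r => setTimeout(r, ms))) {
  let status = await getStatus();
  while (status !== "done") {
    await sleep(intervalMs); // wait between polls instead of hammering the server
    status = await getStatus();
  }
  return status;
}
```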
While the other answers are correct and the WWW is not meant for long-open requests, we have to consider that the WWW was never "meant" to take information either.
As programmers, we made it take information, such as logins.
Furthermore, my question was simple: how do I commit action A with result B? While sending an email would be nice and dandy, as the other post by @symcbean suggests, it's not a solution but a sidestep of the problem.
Web applications often need to communicate with the webserver to update their status. This is awkward because the webserver is stateless. Cookies are the solution.
Here was my solution:
$.ajax({
    url: url,
    data: data,
    success: success,
    dataType: dataType
});
The url points to a PHP page with an if statement acting as a switch statement, which runs my external script that takes a really long time. On the server, that call blocks: the PHP page will not reach setcookie until exec has finished. Once it does, it sets my cookie:
setcookie('status', "done");
and the rest of my web application can continue working.

How to get server time and update every second?

I am making an AJAX request to a particular server and getting the response. I am doing a cross-domain request, so I'm using JSONP. Is there a way to get the time of the server to which I am making the request? Would it be better to write a PHP script, or is just doing an AJAX request good enough? Suppose I make the following request:
$.ajax({
    dataType: 'jsonp',
    data: 'jsonp=date',
    jsonp: 'jsonp_callback',
    url: 'http://www.google.com',
    success: function (data) {
    }
});
How can I get the server time from this request? Any suggestions would help. Also, after getting the time, if I use the setInterval method to update the time every second, will it be a costly operation, or is it better to make the same AJAX request after a particular interval to update the time? I have real-time data to update along with the time.
You can't get the server's time if it doesn't explicitly provide it to you.
You can read its HTTP headers, but that's not a good approach, since the headers may not provide this information every time, and their format may not always be the same.
Also, I don't see the point of asking it every second.
Ask it one single time, calculate the difference between its time and yours, and here you go: you got the difference of time and you can use it wherever you want.
Keep in mind that even if you get the time, there probably will be a difference between the one you got and the server's real time because of the network's latency.
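The ask-once-then-compute idea above can be sketched as follows (`makeServerClock` is a hypothetical helper, not an existing API):

```javascript
// Sync once: record the server-minus-client offset, then derive
// "server now" from the local clock without asking the server again.
function makeServerClock(serverTimeMs, localNow = Date.now) {
  const offset = serverTimeMs - localNow(); // difference at sync time
  return () => localNow() + offset;         // estimated current server time
}
```

Network latency still skews the offset by up to the round-trip time, as the answer notes.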
The answer to part 1 is that you will need to output the server time in the response in order for your JavaScript to read it. For part 2, I would wait for the response to load and then use setTimeout; using setInterval means that you might fire the AJAX call a second time before the first response returns.
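That setTimeout-after-response pattern might look like this; because the next call is scheduled only in the completion handler, requests can never overlap (names here are hypothetical):

```javascript
// Schedule the next poll only after the previous one completes.
// fetchOnce is a stand-in for the AJAX call; setTimer is injectable for tests.
function scheduleNext(fetchOnce, delayMs, setTimer = setTimeout) {
  fetchOnce().then(() =>
    setTimer(() => scheduleNext(fetchOnce, delayMs, setTimer), delayMs)
  );
}
```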
When you say server time, do you mean the time in the script executing the call or the actual time on the server?
What do you need this for? Are you trying to figure out the time between calls, to automate calls every 5 seconds, for example?
If so, simply getting the local time from JavaScript and comparing it with when you get a response in AJAX would suffice.
Otherwise, JavaScript isn't going to be able to get the server time, due to its client-side nature.

The overheads of using setInterval to get latest data from database

I'm creating a messaging feature that gets the latest messages from a database every few seconds; basically, this method does a continuous database search in order to display new messages.
Example of the code:
function getLatestActivities(){
    var ignoreMessagesArr = $("input.activityId").map(function(){ return this.value; }).get().join(",");
    var profileId = $("input#userActivityId").val();
    $.ajax({
        traditional: true,
        dataType: "json",
        type: "GET",
        url: "include/process.php",
        data: {
            getLatestActivity: "true",
            toUser: profileId,
            ignoreMessages: ignoreMessagesArr
        },
        success: function(data){
            $.each(data, function (i, elem) {
                $('.commentMessage').after(elem.value);
            });
        }
    });
}
What I want to know is whether there is a more efficient way of performing this task, i.e. detecting a change in the database instead?
It sounds like you want your web app to receive data when it changes in the database, but not necessarily to send data in real time. If two-way communication is required, then you are looking for Web Sockets, which Socket.IO will help you with on the server side, but it requires that you run a Node server. There is a Google Code project that enables Web Sockets for PHP called PHPWebSocket but, if I remember right, it requires that you run it in a separate process (i.e. run it from the command line). So that kind of takes care of the server part, but now you have to worry about the front-end.
Currently only Firefox and Chrome fully support Web Sockets, according to CanIUse. For those browsers lacking support you need a polyfill. Here is a list of HTML5 polyfills. So Web Sockets can be sort of a mess to implement; make sure that's what you want to do.
On the other hand, if your webapp only needs to receive data, then EventSource (a.k.a. Server-Sent Events) is the way to go. The support is better on the front-end, and you don't really have to do much on the server. Again, for less than stellar browsers you will need a polyfill, but that pretty much just means IE. See the following sites/tutorials on how to use this feature.
http://my.opera.com/WebApplications/blog/show.dml/438711
http://dsheiko.com/weblog/html5-and-server-sent-events
http://www.html5rocks.com/en/tutorials/eventsource/basics/
If none of that works, there are a few options: constant polling using some kind of repeating structure like setTimeout, long polling (a.k.a. a hanging GET) where the server leaves the AJAX request open until there is new data, the infinite-iframe trick, or maybe even a Flash plugin that connects to the server to get data.
You might want to look into Socket.IO; it's geared toward real-time communication with the server. Then you can design the backend to push data to the client when it's available, rather than constantly polling the database for new information from the frontend.
HTML5 Web Sockets allow for two-way communication between the client and your server...
In my opinion you should request only the last ID inserted in your database and add it as a parameter to your AJAX request.
process.php should take that ID and check whether there are newer rows before doing the full search, like:
$query = mysql_query("SELECT `ID` FROM `table` WHERE `ID` > '$lastId'");
$result = mysql_num_rows($query); // use this to see if you have new rows
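On the client side, the same idea means remembering the highest message ID seen so far and sending it with each poll, so the server only returns newer rows. A hypothetical sketch (`fetchSince` stands in for the AJAX call to process.php):

```javascript
// Track the last seen ID across polls; each poll asks only for newer rows.
function makePoller(fetchSince) {
  let lastId = 0;
  return async function poll() {
    const rows = await fetchSince(lastId);                    // e.g. process.php?lastId=...
    for (const row of rows) lastId = Math.max(lastId, row.id);
    return rows;                                              // only the new messages
  };
}
```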

Grabbing data from MySQL using PHP realtime?

I am wondering how I would get data from a MySQL database and display it in real time using PHP, without having to refresh the page. Thanks!
Use AJAX (I suggest using the jQuery library for this), and have your AJAX script (written in PHP) query the MySQL database.
You can use Socket.IO for more speed, efficiency, and performance on your server:
http://socket.io/
But you need Node.js for that.
You will have to use JavaScript. You can use setInterval(function, n) to fire your update calls every n milliseconds, and a library like jQuery to handle the AJAX call and update your page.
Download jQuery or link to a CDN hosted version of jQuery in your page. Then put something like this on your page:
setInterval(function(){
    // inside here, set any data that you need to send to the server
    var some_serialized_data = jQuery('form.my_form').serialize();

    // then fire off an ajax call
    jQuery.ajax({
        url: '/yourPhpScriptForUpdatingData.php',
        success: function(response){
            // put some javascript here to do something with the
            // data that is returned with a successful ajax response,
            // as available in the 'response' param, for example:
            $('#my_html_element').html(response);
        },
        data: some_serialized_data
    });
}, 1000);
// the '1000' above is the number of milliseconds to wait before running
// the callback again: thus this script will call your server and update
// the page every second.
Read the jQuery docs under 'ajax' to understand the jQuery.ajax() call, and read about 'selection' and 'manipulation' if you don't understand how to update the HTML page with the results of your AJAX call.
The other way to continuously update your page is to use a persistent connection, like web sockets (not currently supported across all the common browser platforms) or a comet-style server-push setup. Try googling comet and/or web sockets for more info, but I think the above method is going to be much easier to implement.
