I'm having to interact with the Facebook API for this project, and it's actually a bit slower than I expected. Because of this, I'm having to do something I find rather unorthodox: I need to load the content Facebook provides back into my PHP script AS IT LOADS from Facebook. Traditionally I've loaded content into a div tag once the script succeeds; however, I need to load the content as it appears. It would be absolutely unacceptable to have a client wait nearly a minute for Facebook to load an album and all its comments before displaying anything. Hopefully I'm not being too vague; I'm not here to ask for code, but I've tried just about everything I can think of. Is this a simple concept I'm missing? I feel as though this is easier than I'm making it.
I'm using jQuery AJAX as I find this easiest to work with. Any comments and/or help would be greatly appreciated.
The root of your problem is that jQuery's AJAX methods hook into the onreadystatechange event and readyState variable. readyState is only set to 4 when the file is completely transferred, and therefore your events will only fire after the download is complete.
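For illustration, here is a bare XMLHttpRequest (no jQuery) showing where those events fire; the URL is a placeholder, and, as noted below, access to the partial response at readyState 3 is not something you can rely on in every browser:

// Sketch: watching readyState transitions on a raw XMLHttpRequest.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/fetch-facebook-data.php", true);   // placeholder URL
xhr.onreadystatechange = function () {
    if (xhr.readyState === 3) {
        // LOADING: some browsers let you read the partial response here...
        console.log("partial:", xhr.responseText.length, "bytes so far");
    }
    if (xhr.readyState === 4 && xhr.status === 200) {
        // DONE: ...but jQuery's success callbacks only fire at this point.
        console.log("complete:", xhr.responseText.length, "bytes");
    }
};
xhr.send();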
Accessing the data as it is being sent is not consistent across different browser families. Doing it this way is going to be incredibly complex and time-consuming. I would recommend approaching this a little differently, perhaps by predictively preloading the relevant Facebook data on your own server. This can be compiled into a static page, which can in turn be served to your users very quickly.
To get the data to your users faster, you'll need to work outside the box as well. There's a jQuery plugin discussed here (Does PHP flush work with jQuery's ajax?) that makes jQuery's ajax methods compatible with streamed output. Good luck.
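For what it's worth, the server side of that streamed approach is usually just a matter of echoing and flushing each chunk as soon as it is ready, rather than building the whole response first. A rough sketch, where get_album_photo_ids() and render_photo_row() stand in for your own Facebook-fetching and HTML-building code:

<?php
// Rough sketch: push each chunk to the client as soon as it is ready.
header('Content-Type: text/html; charset=utf-8');

foreach (get_album_photo_ids() as $photoId) {   // placeholder helper
    echo render_photo_row($photoId);             // placeholder helper
    if (ob_get_level() > 0) {
        ob_flush();                              // flush PHP's output buffer...
    }
    flush();                                     // ...and the web server's buffer
}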
The problem seems to stem from the fact that you're getting too much data at once. I assume you want to receive the content from Ajax as soon as it is printed out, but that content is most likely built and sent in one piece, so you won't have access to any of it until the entire response has been generated and parsed. If that's not the case, look into COMET. If it is, the solution is to limit how much data you retrieve in each request, so each one finishes quickly. For example, retrieve 5 photos per request, and add those 5 photos to the DOM while you retrieve the next 5, as in the sketch below.
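A rough sketch of that batching idea with jQuery; photos.php, its offset/limit parameters, and the #album container are all made up for illustration:

// Fetch 5 photos at a time and append each batch before asking for the next.
function loadPhotos(offset) {
    $.getJSON("photos.php", { offset: offset, limit: 5 }, function (photos) {
        $.each(photos, function (i, photo) {
            $("#album").append('<img src="' + photo.src + '" alt="">');
        });
        if (photos.length === 5) {
            loadPhotos(offset + 5);   // a full batch came back, so ask for more
        }
    });
}
loadPhotos(0);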
Instead of putting whatever code you want inside of the callback, just put it after the callback.
For example:
$("#div").load("facebook...", function() {
//do stuff
});
//put the stuff you want to load at the same time, here
I have a PHP page with a simple form. One input text field & a button. Input text field accepts user queries & on button click an HTTP GET request is made to the server & the result has to be shown back in the same page containing the form. That's too simple to do. I can do this in two ways. One is AJAX & other one is the good old sodding form-submit method.
My question is simple- Which method should I use? Since both of the roads lead us to the same place, which one should I choose to travel?
First of all, let me talk about the form-submit method. I can use <?php echo $_SERVER['PHP_SELF']; ?> as the action of the form to submit its values to the same page. Once I store those values in some variables, I can make a GET request, obtain the result, and show it to the world. This method is easy to use. Happy down-voting to all of you.
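As a rough sketch of that (the field name and the do_search() helper are only illustrative):

<?php
// Illustrative only: a self-submitting form that shows the result on the same page.
if (isset($_GET['query']) && $_GET['query'] !== '') {
    $result = do_search($_GET['query']);   // placeholder for the GET request you make
    echo '<p>' . htmlspecialchars($result) . '</p>';
}
?>
<form method="get" action="<?php echo htmlspecialchars($_SERVER['PHP_SELF']); ?>">
    <input type="text" name="query">
    <button type="submit">Search</button>
</form>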
Or I can make a GET request using AJAX and jQuery or JavaScript or whatever you wish to use & obtain the same result as in the previous case. Output is same. Only the mode of execution is different.
So which one is better? Which one fetches result faster? And why should I use that? Is there any difference? GET, POST, PUT or whatever- it doesn't really matter. AJAX or form-submit?
There shouldn't be any significant, genuine speed difference between them.
The Ajax approach will load a smaller amount of data (since you aren't loading an entire HTML document), but by the time you take into account HTTP compression and the fact that (if your system is sensibly configured) your dependencies (images, scripts, stylesheets, etc.) will be cached, it won't be significantly smaller.
By using JavaScript to create a loading indicator and not refreshing the entire window in front of the user, you can create the illusion of a faster load time though. So if feeling faster was the only concern, then Ajax is the way forward.
Using JavaScript, however, is more complicated and slightly more prone to failure. The consequences of failure are also more severe because, unless your code detects the error states and does something with them, the request will fail silently and the user won't see anything happen at all. For example, if a normal page load times out because the user is on a train and went through a tunnel, they'll see an error page provided by their browser suggesting that they refresh and try again. With Ajax, you need to write the error handling code yourself. This does give you more flexibility (such as allowing you to simply retry a few times), but the work isn't done for you.
The other consequence of using Ajax is that the address bar will not update automatically. This means that the results won't be bookmarkable or sharable unless you do something explicit to make that possible. The usual way to do that is with pushState and friends, but again, it is more work.
You should also make the site work without JavaScript, so that if the JS doesn't run for any reason the site won't break completely. If you use pushState, you have to do this anyway for the URLs you set in the address bar to be useful.
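For what it's worth, a minimal sketch of the pushState idea layered over an ordinary form (the endpoint, form id, and results container are placeholders; without JavaScript the form still submits normally):

// Intercept the form, load the results via Ajax, and record the URL in the address bar.
$(document).on("submit", "#search-form", function (e) {
    e.preventDefault();
    var url = "results.php?" + $(this).serialize();
    $("#results").load(url, function () {
        history.pushState({ url: url }, "", url);   // bookmarkable / sharable URL
    });
});

// When the user presses Back or Forward, restore the state we recorded.
window.addEventListener("popstate", function (e) {
    if (e.state && e.state.url) {
        $("#results").load(e.state.url);
    }
});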
The short answer: Use a regular form submission, then consider layering JavaScript over the top if you think it will give your visitors a worthwhile benefit.
I would stick with an Ajax request when possible.
This is because you then don't have to load every single item on the page again (all the images, the menu, and so on). You can just send back the relevant HTML and jQuery can place it inside the relevant container.
But that is just my humble opinion...
If you have to retrieve simple data from the server without reloading the page, my advice is to use jQuery's .get or .post.
It also provides a very large API that lets you cut down your programming time.
http://api.jquery.com/
Obviously the execution time increases slightly, but in my experience the user can't feel the difference with a simple Ajax request.
So in my opinion, if jQuery lets you obtain the results, it is the best solution because it halves your working time!
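For example, a minimal version of what is being suggested (search.php, the q parameter, and the #result element are placeholders):

// Fetch the result with a single GET and drop it into the page without a reload.
$.get("search.php", { q: $("#query").val() }, function (html) {
    $("#result").html(html);
});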
I think that AJAX should be used for display updates, and form submissions should be done via a page reload. Reasoning?
When submitting forms, you are telling the application to do something. Users tend to want to feel that it was done. When a page doesn't reload, users are often left wondering "Did that work?". Then they have to check to make sure what they did was right.
But when you are displaying a table or something, and the user says "display x data... now x1 data", for instance, they aren't "doing" something (creating new entities, sending emails, etc.). So AJAX can provide a nice user interface in this case; page reloads would be annoying here.
In conclusion, I think form submission should be done via page reloads (let the user see it working), whereas display updates should use AJAX (prevent annoying page reloads).
Of course, this is a preference thing. Some of my company's applications use AJAX all over. But those are the applications that are the most difficult to maintain and debug. ;)
Facebook has introduced a ticker which shows live news scrolling down. How can I have this same type of functionality on my site? I don't care to use an iframe and have it refresh, because it will (a) flicker and (b) make the page-loading icon come up (depending on the browser). How can this be done?
For this you'd want to fetch the data you're looking for with AJAX every X seconds, also known as polling.
Here's the breakdown: every X seconds, we want to query our database for new data. So we send an asynchronous POST to a PHP page, which then returns a data set of the results. We also declare a callback function (native to jQuery) that will be passed the data echoed from our PHP.
Your PHP:
if (isset($_POST['action'])) {
    if ($_POST['action'] == 'pollNewData') {
        pollNewData();
    }
}

function pollNewData() {
    // Note: escape $term or use a prepared statement in real code to avoid SQL injection.
    $term = $_POST['term'];
    $sql = "select * from TABLE where TERM = '$term'";
    $results = get_rows($sql);
    echo json_encode(array('status' => 200, 'results' => $results));
}
Your front end javascript:
// Poll every 10 seconds.
setInterval(pollForNewData, 10000);

function pollForNewData() {
    $.post('url/ajax.php', {
        action: 'pollNewData',
        term: 'your_term'
    }, function (response) {
        if (response.status == 200) {
            $.each(response.results, function (index, value) {
                $("#container").append(value.nodeName);
            });
        }
    }, 'json');
}
What's essentially going on here is that you are posting asynchronously with jQuery's ajax method. The way you trigger a function in your PHP is by including a key-value item in your POST indicating which function you want to call in your ajax request. I called this item "action", and its value is the name of the function that will be called for this specific event.
You then return the data fetched by your back end by echoing a json_encoded data set.
In the JavaScript, you are posting to this PHP function every 10 seconds. The callback that runs after the post completes is the function(response) part, with the echoed data passed in as response. You can then treat this response as a JSON object (since we declared the expected return type to be 'json' after the callback).
Pretty much the only way you can do it is with some sort of asynchronous JavaScript. The easiest way is to have JavaScript periodically poll another HTTP resource for that information and replace the current content in the DOM with the new content. jQuery and other JavaScript frameworks provide AJAX wrappers to make this process reasonably simple. You aren't limited to using XML in the request.
It's a good idea to make sure that some content is available even without JavaScript enabled; just use the JavaScript to update it without having to refresh the page.
You can do this kind of ticker using Ajax: poll a URL that returns JSON/XML containing the new updates, and once you get the data, update the DOM.
You can refer to this page for an introduction to Ajax.
Ajax is the best method, but that's what everyone else has already mentioned. I wanted to add that although I agree Ajax is the best method, there are other means too, such as Flash.
There are two approaches possible for obtaining updates like this. The first is called push and the second is called pull.
With push updates, you rely on the server to tell the client when new information is available. In your example, a push update would come in the form of Facebook telling your site that something new happened. In general, push schemes will tend to be more bandwidth friendly because you only transmit information when something needs to be said (Facebook doesn't contact your site if nothing is going on). Unfortunately, push updating requires the server to be specially configured to support it. In other words, unless the service you are asking for updates from (ex. Facebook) has a push update service available to you, you cannot implement one yourself. That's where pull techniques come in.
With pull updating, the client requests new information from the server when it deems necessary. It is typically implemented using polling such that the client will regularly and periodically query the server to see what the latest information is. Depending on how the service is oriented, you may have to do client-side parsing to determine if what you receive as a response is actually new. The downside here is of course that you likely will consume unnecessary amounts of bandwidth for your polling requests where no useful information is obtained. Similarly, you may not be as up-to-date with new information as you would like to be, depending on your polling interval.
In the end, since your site is web-based and is interfacing with Facebook, you will likely want to use some sort of AJAX system.
There are a few hybrid-ish approaches, such as long polling via Comet, which are outlined rather well on Wikipedia:
Push technology: http://en.wikipedia.org/wiki/Push_technology
Pull technology: http://en.wikipedia.org/wiki/Pull_technology
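As a concrete example of the long-polling hybrid mentioned above, a bare-bones client-side loop might look like this (updates.php and #ticker are placeholders; the server is expected to hold each request open until it has something new to report):

// Long polling: reconnect as soon as the previous request finishes.
function poll() {
    $.ajax({
        url: "updates.php",          // placeholder endpoint
        dataType: "json",
        timeout: 30000,              // give up and reconnect after 30 seconds
        success: function (items) {
            $.each(items, function (i, item) {
                $("#ticker").prepend("<li>" + item.text + "</li>");
            });
        },
        complete: poll               // fires on success, error, or timeout
    });
}
poll();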
I am hitting a lot of different sites to get a list of information and I want to display this information as I get it. Right now I am using a Smarty Template and what I would like to do is:
Pseudo code:
{foreach $page}
    $smarty_var = use AJAX to call a PHP function
    Render out a new table row on the fly w/ the newly assigned var
    <tr><td>{$smarty_var}</td></tr>
{/foreach}
I don't know much about AJAX; I used it a long time ago, and it was similar to this, but not quite (there was user action involved). No, I don't have a JS framework in place. Am I way off here on how this should go? Basically I want to display a table row as data becomes available; each table row will be a request to get the data from another site.
Sure, I will tell you about what I am trying to do: http://bookscouter.com/prices.php?isbn=0132184745+&x=19&y=6
If you click on the 'Click to view prices from all 43 links' at the bottom on that page you will see. I am using cURL to get all the pages I want a price from. Then for each page I want to get the price. So each page is gonna fire off a function that runs some fun code like this:
function parseTBDOTpageNew($page, $isbn)
{
    $first_cut  = preg_split('/<table[^>]*>/', $page);
    $second_cut = preg_split('/<td[^>]*>/', $first_cut[2]);

    if (strstr($second_cut[4], "not currently buying this book") == true) {
        return "\$0.00";
    }

    $third_cut = preg_split('/<b[^>]*>/', $second_cut[9]);
    $last_cut  = preg_split('/</', $third_cut[3]);

    return $last_cut[0];
}
This function is called from another function, which puts the price returned from the function above, the name of the company, and a link into an array that is added to another, bigger array that is sent to Smarty. Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
I will take your advice on jQuery. What I have started is an onload function that receives the $pages to be parsed, and I was just in the middle of writing: for each page, get the info and spit out some HTML with the info on the page.
Also, the function that calls the price-parsing function is in a PHP file, so I need the request to hit a function within a PHP file and NOT just call file.php?param1=foo; I need it to actually hit the function in the file. I have jQuery in place, and am now just trying to figure it out and get it to do what I need, ugh. I am searching; any help would be appreciated.
No I don't have a JS Framework in place
Fix that first. You don't want to juggle XMLHTTPRequests yourself. jQuery is SO's canonical JS library, and is pretty nifty.
Basically I want to display a table row as data comes available, each table row will be a request to get the data from another site.
How many rows will you be dealing with? Do they all have to be loaded asynchronously?
Let's tackle this in a braindead, straightforward way. Create a script that does nothing more than:
Take a site ID and fetch data from the corresponding URL
Render that data to some data transport format, either HTML or JSON.
Then it's a simple matter of making the page that the user gets, which will contain Javascript code that makes the ajax calls to the data fetcher, then either shoves the HTML in the page directly, or transforms the data into HTML and then shoves that into the page.
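A rough sketch of that page-side JavaScript, assuming a hypothetical fetch.php that takes a site ID and returns one rendered table row:

// For each site, ask fetch.php for one rendered <tr> and append it as it arrives.
var siteIds = [1, 2, 3, 4, 5];   // illustrative list of site IDs

$.each(siteIds, function (i, id) {
    $.get("fetch.php", { site: id }, function (rowHtml) {
        $("#prices tbody").append(rowHtml);
    });
});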
You'll note that at no point is Smarty really involved. ;)
This solution is highly impractical for anything more than a trivial number of sites to be polled asynchronously. If you need rows for dozens or hundreds of sites, that means each client is going to need to make dozens or hundreds of requests to your site for every single normal pageview. This is going to slaughter your server if more than one or two people load the page at once.
Can you tell us more about what you're doing, and what you're trying to accomplish? There are lots of ways to mitigate this problem, but they all depend on what you're doing.
Update for your question edit.
First, please consider using an actual HTML parser instead of regular expressions. The DOM is very powerful and you can target specific elements using XPath.
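For instance, with PHP's built-in DOM extension you can load the fetched page and pull out the cell you care about with an XPath query instead of splitting on tags; the expression below is only illustrative and would need adjusting to the page's real markup:

<?php
// Sketch: parse the fetched page with DOMDocument and query it with XPath.
$doc = new DOMDocument();
libxml_use_internal_errors(true);          // real-world HTML is rarely well-formed
$doc->loadHTML($page);
libxml_clear_errors();

$xpath = new DOMXPath($doc);
$nodes = $xpath->query('//table//td/b');   // e.g. the bold price cell

$price = $nodes->length ? trim($nodes->item(0)->textContent) : '$0.00';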
Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
So, here's the ultimate problem. You want to do something asynchronously. PHP does not have a built-in generalized way to perform asynchronous tasks. There are a few ways to deal with this problem.
The first is as I've described above. Instead of doing any of the curl requests on page load, you farm the work out to the end user, and have the end user's browser make requests to your scraping script one by one, filling in the results.
The second is to use an asynchronous work queue, like Gearman. It has excellent PHP support via a PECL extension. You'd write one or more workers that can accept requests, and keep a pool of them running at all times. The larger the pool, the more things you can do at once. Once all of the data has returned, you can throw the complete set of data at your template engine, and call it good.
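Very roughly, the Gearman split looks like the sketch below; the job name and the fetch_and_parse_price() helper are made up for illustration, and doNormal() blocks until a worker returns the result:

<?php
// worker.php -- keep one or more of these running; each handles one site per job.
$worker = new GearmanWorker();
$worker->addServer();                      // defaults to 127.0.0.1:4730
$worker->addFunction('scrape_price', function (GearmanJob $job) {
    $args = json_decode($job->workload(), true);
    return fetch_and_parse_price($args['url'], $args['isbn']);   // placeholder
});
while ($worker->work());

// client side -- submit one job per site and collect the results.
$client = new GearmanClient();
$client->addServer();
$price = $client->doNormal('scrape_price', json_encode(array(
    'url'  => 'http://example.com/buyback',
    'isbn' => '0132184745',
)));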
You can even combine the two, having the user make only one or two or three extra requests via ajax to fetch part of the returned data. Heck, you can even kick off the jobs in the background and return the page immediately, then request the results of the background jobs later via ajax.
Regardless of which way you handle it, you have a giant, huge problem. You're scraping someone's site. You may well be scraping someone's site very often. Not everyone is OK with this. You should seriously consider caching results aggressively, or even checking with each of the vendors to see if they have an API or data export that you can query against instead.
I am a little bit new to the PHP/MySQL arena, and had an idea: could I interact with my database by using a hidden iframe to run PHP pages in the background on events, without having to leave the current page?
Good? Bad? Common Practice? Opinions?
This is most of the time bad, but sometimes inevitable.
The common practice is to use AJAX; it's so common that even W3Schools has an article about it.
The advantage of using AJAX over an IFrame is that you can have several requests in flight at once, which is much more troublesome to implement with IFrames. Moreover, AJAX exposes HTTP status codes so you can detect errors, whereas with IFrames you'd have to rely on scraping the page's HTML and hoping you've determined the correct status by looking at the error page's markup.
AJAX is more JavaScript idiomatic and event driven, which means your callback will get notified automatically when there is a response. With IFrame you'd have to setTimeout() and keep polling the IFrame for a response, which may break easily.
An IFrame is sometimes inevitable, for example when you want to upload a file without leaving the current page. But that's probably not your scope, since you mentioned only database interactions.
Learn to use XMLHttpRequest, which is the foundation of AJAX. After you've become familiar with that, try making it fun by using a JavaScript framework such as jQuery, Dojo, etc.
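For reference, the raw XMLHttpRequest version of a simple database-updating request looks roughly like this (save.php, the field name, and the #status element are placeholders):

// Bare-bones XMLHttpRequest: POST to a PHP script and show its response.
var xhr = new XMLHttpRequest();
xhr.open("POST", "save.php", true);
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById("status").textContent = xhr.responseText;
    }
};
xhr.send("name=" + encodeURIComponent("some value"));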
I'd guess something is supposed to happen when your database does something, right? I.e. your page should give some sort of feedback, maybe update a number or some text.
So you're going to use Javascript anyway. In that case, skip the iframe and just send off AJAX requests.
This is commonly accomplished using AJAX. The jQuery JavaScript library makes this easy.
I don't think using iframes is a good way to accomplish this. You would still need javascript enabled to change the location of the iframe, and if javascript is available, why not just use AJAX?
If you use the iframe, you wouldn't be able to receive a response from the server in any meaningful way without doing a lot of workarounds. For example -- using jQuery, you could submit some information to the server with a single function call, and then when that request completes, a callback function can be invoked with response information from the server:
$.post("ajax.php", { var1: "data", var2: "moredata" },
function(data){
alert("Server says: " + data);
});
In this example, when the request completes, an alert box appears with the output of ajax.php.
With an iframe, you might do something like change the location of the iframe to server.com/iframe.php?var=data&var2=moredata&var3=moredata, then wait for a bit, and grab the contents of the iframe (if this is even possible) and do something with that.
Not to mention, when you run into problems doing this, you'll probably ask for advice on SO, and each time people will probably say "drop that and use jQuery!" :) You may as well skip all the pain and suffering and do it the Right Way to begin with.
The hidden iframe method was used before the adoption of the XMLHttpRequest API (maybe you have heard of it as Ajax).
Years ago I used an earlier implementation based on rslite, but nowadays this technique has, to me, only historical value.
You can get directions on using Ajax techniques in plain JavaScript at http://www.xul.fr/en-xml-ajax.html or, better, you can choose to use a common library, jQuery or MooTools among others, to deal with the different implementations in different browsers.
I have a PHP application running in iframe mode. I am rendering an <fb:multi-friend-selector condensed="true"> inside of <fb:serverfbml> tags. This is inside a PHP file that calls a function that gets a list of user IDs using $facebook->api_client->friends_get();. The multi-friend selector renders just fine, but when I leave the friends_get() call uncommented, the page takes between 15-20 seconds to load (confirmed with Firebug)! The goal is to limit the number of users displayed in the selector by building a list of user IDs not to display, for use in the friend selector's exclude_ids parameter. And since it's "exclude_ids" and not "include_ids", I can't think of a way of getting around this API call. It seems to me there must be something I can do to make the API call faster, because I've seen friend selectors that load much more quickly.
After over a month of ripping my hair out over this issue, I discovered a fairly feasible workaround. The PHP API calls run extremely slowly from any AJAX requests you make. This likely has something to do with Facebook parameters being missing, or some other such nonsense.
The workaround works like this: instead of calling the Facebook API functions from the PHP file being requested via AJAX, isolate all PHP calls to the Facebook API in the index file that is loaded when the app first loads. Save the returned values into a session variable, and you can then read those values in whatever subsequent AJAX calls you make.
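In code, the workaround looks roughly like this; the session key and the AJAX handler file are illustrative, and the important part is that friends_get() only runs on the initial, non-AJAX page load:

<?php
// index.php (initial, non-AJAX load): call the Facebook API once and cache the result.
session_start();
if (!isset($_SESSION['friend_ids'])) {
    $_SESSION['friend_ids'] = $facebook->api_client->friends_get();
}

// ajax_handler.php (later AJAX requests): reuse the cached value, never hit the API here.
session_start();
$friendIds = isset($_SESSION['friend_ids']) ? $_SESSION['friend_ids'] : array();
echo json_encode($friendIds);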