Facebook API friends_get is extremely slow - PHP

I have a PHP application running in iFrame mode. I am rendering an <fb:multi-friend-selector condensed="true"> inside of <fb:serverfbml> tags. This is inside a PHP file that calls a function that gets a list of user IDs using $facebook->api_client->friends_get();. The multi-friend selector renders just fine, but when I leave the friends_get() call uncommented, the page takes 15-20 seconds to load (confirmed with Firebug)! The goal is to limit the number of users displayed in the selector by building a list of user IDs not to display, for use in the friend selector's exclude_ids parameter. And since the parameter is "exclude_ids" and not "include_ids", I can't think of a way to get around this API call. It seems to me there must be something I can do to make the API call faster, because I've seen friend selectors that load much more quickly.

After over a month of ripping my hair out over this issue, I discovered a fairly feasible workaround. The PHP API calls run extremely slowly from any AJAX request you make; this likely has something to do with missing Facebook parameters, or some other such nonsense.
The workaround works like this: instead of calling the Facebook API function from the PHP file invoked via AJAX, isolate all PHP calls to the Facebook API in the index file that runs when the app first loads. Save the returned values into a session variable, and any subsequent AJAX call can read those values from the session.
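A minimal sketch of that workaround, assuming the old REST client from the question ($facebook is the initialized client; the file names are placeholders):

// index.php - runs on the initial app load, NOT from an AJAX request
session_start();

if (!isset($_SESSION['friend_ids'])) {
    // The expensive API call happens exactly once, on first page load.
    $_SESSION['friend_ids'] = $facebook->api_client->friends_get();
}

// ajax_handler.php - later AJAX endpoints read the cached IDs instead
// of calling the Facebook API again
session_start();
$friend_ids = isset($_SESSION['friend_ids']) ? $_SESSION['friend_ids'] : array();

The exclude_ids list for the friend selector can then be built from $friend_ids without another round trip to Facebook.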

Related

How to log activities across multiple requests

Our site makes heavy use of AJAX, and ends up calling several different PHP files in the background to fill individual tabs. Specifically, jQuery and DataTables are used. Because each PHP request runs in isolation, I'm struggling to create an activity log that works across all requests for a single session (e.g., all SQL queries performed for this page view): each PHP file executes its own queries in its own state, so they are unaware of each other.
Any tips on how to handle this? I fear I'm overcomplicating matters or missing an obvious solution.
In the end, ideally the footer of our application can say something like: Your application performed 6 SQL queries, here they are: ....
I don't require specific code, but hopefully the above makes sense and a Eureka moment can be discovered.
One solution using jQuery would be for each $.post AJAX call to pass a callback function. The callback then appends the result returned from the server to a DIV or a JS array.
The AJAX post could be:
$.post(server_url, params, my_callback, "json");
and the callback:
function my_callback(response) {
    $("#thediv").append(response);
}
This would asynchronously log each response in the div with id "thediv".
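If the goal is specifically the SQL log, the same idea works server-side, because all of a visitor's AJAX requests share one PHP session. A rough sketch, where the helper name and the example query are hypothetical:

// querylog.php - include this from every PHP file the tabs call
session_start();

function log_query($sql)
{
    // Append to a per-session log shared by all of this visitor's requests.
    if (!isset($_SESSION['query_log'])) {
        $_SESSION['query_log'] = array();
    }
    $_SESSION['query_log'][] = $sql;
}

// in each background PHP file, after running a query:
log_query("SELECT * FROM orders WHERE user_id = 42"); // hypothetical query

// in the endpoint the footer requests last:
$log = isset($_SESSION['query_log']) ? $_SESSION['query_log'] : array();
echo "Your application performed " . count($log) . " SQL queries: " . implode("; ", $log);

With the default files session handler, PHP locks the session for the duration of each request, so concurrent tab requests won't corrupt the log, though they will serialize on it.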

singleton-registry pattern and object-interaction with ajax

My problem may be very specific, I think. I have already tried to find some info about it and viewed tons of sites, but with no success. I'm a newbie in OO PHP. I will try to explain the issue without code samples:
I have developed an object-oriented PHP application. My Registry class implements the singleton pattern (it has only one instance in the whole app) and stores objects that must be accessible in any part of the application. At this point I need to make jQuery AJAX calls to interact with the user without page reloads. But calling a PHP script via AJAX gives me another instance of my Registry class (when I try to use the registry in the called PHP file), and this instance is of course empty (it has no objects in its array). I think this happens because AJAX calls work in different threads (maybe I'm mistaken). Anyway, is there some way to achieve the needed functionality using the registry pattern? Or maybe there is another way to achieve it? I know that I can make my classes static and use the objects statically, but unfortunately I can't do that. I also know about global vars; that's not an option for me... Any help, please!
So every single request to a PHP application is handled by a separate process on the server; no memory or data is shared across requests. That being the case, the AJAX request to the PHP script will not have access to your other request's info.
In order to handle this, you'll need to keep the state of the data you're trying to store in some other way: store it in the DB, in sessions, etc.
So say you have a "singleton-ish" list of objects that are items available in a store, the number in stock, and their metadata.
in pseudo code:
$inventory = array(
    "cars"   => array(new SomeCarObject(), new AnotherCarObject()),
    "trucks" => array(new SomeTruckObject(), new AnotherTruckObject())
);
Because this is not kept in shared memory across requests, every time you need to asynchronously operate on this list, you'll need to fetch it (from wherever you're keeping its state), modify it, then save it and respond appropriately.
Your flow might look something like this:
client requests $inventory => server fetches the state of $inventory, does stuff, resaves it => server sends $inventory back to the client.
You will likely need to add a locking mechanism to the DB to indicate that another process is using the data, but exercise caution with it: locking causes other requests to wait for whatever is modifying your data to finish before they can proceed, and you do not want to introduce a race condition. http://en.wikipedia.org/wiki/Race_condition
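Applied to the registry question, a minimal sketch of keeping the registry's contents in the session; it assumes everything stored is serializable, and the class is illustrative rather than a drop-in replacement for yours:

session_start();

class Registry
{
    private static $instance = null;
    private $objects = array();

    public static function instance()
    {
        if (self::$instance === null) {
            self::$instance = new self();
            // Rebuild the state a previous request saved, so the instance
            // an AJAX-called script sees is not empty.
            if (isset($_SESSION['registry'])) {
                self::$instance->objects = unserialize($_SESSION['registry']);
            }
        }
        return self::$instance;
    }

    public function set($key, $object)
    {
        $this->objects[$key] = $object;
        // Persist on every write so the next request can restore it.
        $_SESSION['registry'] = serialize($this->objects);
    }

    public function get($key)
    {
        return isset($this->objects[$key]) ? $this->objects[$key] : null;
    }
}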

Page won't display until PHP functions have completed execution

My website runs simplexml commands to pull data from 2 different websites, and the page doesn't finish loading until after the functions have their responses.
This is really only 1-2 seconds, but it is noticeable when regular webpages take milliseconds to load.
Since this code is already in PHP functions, how can I most efficiently load the page and execute the code afterwards? I'm assuming that by the time the page loads, the functions will have executed as well; it's just that the browser itself won't refresh and finish loading until execution completes.
Hope this makes sense to you.
Unfortunately, PHP runs on the server side before the page is loaded; that is what allows it to provide dynamically generated content to the page. If you want to load the page and then run the PHP functions, you should check out AJAX.
AJAX uses JavaScript to call external functions and change content on the page without a reload.
Create a webpage without calling any of these functions. Add some JavaScript to that page to make AJAX requests to PHP scripts that call the functions, then add the returned results to the page.
You have a few options.
AJAX call -- once the important stuff loads, have JS send word to the server that it needs to do some process to complete loading (rennekon and Dan Grossman seem to have already suggested this).
iframe -- similar to AJAX, but it does not require JS. Placed at the bottom of the HTML, it can let the server know something needs to finish without holding up any other rendering. (This can also be accomplished by any number of tags that make HTTP requests; img attacks are notorious for allowing this on vulnerable sites.)
Spawn a new thread -- this is a bit more difficult/annoying, but it does not rely on user feedback to finish processing. You also may not be able to do this on most servers, but it is one way to finish processing in the background (sketched just below).
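PHP has no real threading, so the usual approximation of that last option is to launch a separate process. A sketch, assuming a Unix-style host; worker.php is a placeholder for the slow simplexml work, and note that exec() is disabled on many shared hosts:

// Kick off the slow work and return immediately.
exec('php /path/to/worker.php > /dev/null 2>&1 &');
// The page can finish rendering while worker.php runs on its own.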
You can create a cron job that talks to the 2 different websites and stores the data you need periodically; then, when your page runs, it talks to the local copy that cron stored for you, taking the remote communication off of page render time.
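A rough sketch of that cron approach; the feed URLs and cache paths are placeholders:

// fetch_cache.php - run from cron, e.g.: */5 * * * * php fetch_cache.php
$feeds = array(
    'site_a' => 'http://example.com/feed-a.xml',
    'site_b' => 'http://example.org/feed-b.xml',
);

foreach ($feeds as $name => $url) {
    $xml = @file_get_contents($url); // the slow remote call happens here
    if ($xml !== false) {
        file_put_contents("/var/cache/myapp/$name.xml", $xml);
    }
}

The page then parses the fast local copy instead of the remote site:

$data = simplexml_load_file('/var/cache/myapp/site_a.xml');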

jQuery AJAX get PHP script, display the content as it loads

I'm having to interact with the Facebook API for this project, which I find to be actually a bit slower than I expected. Because of this, I'm having to do something I find rather unorthodox: I need to display the content Facebook provides back to my PHP script AS IT LOADS from Facebook. Traditionally I've loaded content into a div tag upon the success of the script; however, here I need to show the content as it appears. It would be absolutely unacceptable to have a client wait nearly a minute for Facebook to load an album and all its respective comments before displaying anything. Hopefully I'm not being too vague; I'm not here to ask for code, but I've tried just about everything I can think of. Is there a simple concept I'm missing? I feel as though this is easier than I'm making it.
I'm using jQuery AJAX as I find this easiest to work with. Any comments and/or help would be greatly appreciated.
The root of your problem is that jQuery's AJAX methods hook into the onreadystatechange event and readyState variable. readyState is only set to 4 when the file is completely transferred, and therefore your events will only fire after the download is complete.
Accessing the data as it is being sent is not consistent across different browser families, and doing it this way is going to be incredibly complex and time-consuming. I would recommend first doing this a little differently, perhaps by predictively preloading the relevant Facebook data onto your own server. This can be compiled to a static page, which can in turn be served to your users very quickly.
To get the data to your users faster, you'll need to work outside the box as well. There's a jQuery plugin discussed here ( Does PHP flush work with jQuerys ajax? ) that makes jQuery ajax methods compatible with streamed output. Good luck.
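For reference, the server side of streamed output might look like the sketch below; output buffering, gzip compression, or a proxy between PHP and the browser can all defeat flush(), so treat this as a starting point. The album IDs and the renderer are made up:

// stream.php - emit content piece by piece as it becomes available
header('Content-Type: text/html; charset=utf-8');
while (ob_get_level() > 0) {
    ob_end_flush(); // drop PHP's own buffering layers
}
ob_implicit_flush(true);

function render_album_chunk($id)
{
    // Stands in for the real Facebook fetch + markup generation.
    return "<div class=\"album\">Album $id loaded</div>\n";
}

$albumIds = array(101, 102, 103); // hypothetical Facebook album IDs
foreach ($albumIds as $id) {
    echo render_album_chunk($id);
    flush(); // push this chunk to the client immediately
}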
The problem seems to stem from the fact that you're getting too much data at once. I suppose you are talking about receiving the content from AJAX as it is printed out immediately, but it is possible this content is built and sent all at once, so that you won't have access to any of the data until the entire parse is complete. If that is untrue, look into COMET. If it is true, the solution is to put a limit on how much data you retrieve at once, in an effort to reduce the parse time: for example, retrieve 5 photos per request, and add those 5 photos to the DOM while you retrieve the next 5.
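A sketch of such a chunked endpoint; the parameter names and the stub fetcher are invented for illustration:

// photos.php - return a small chunk per request
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$limit  = isset($_GET['limit'])  ? (int)$_GET['limit']  : 5;

function fetch_photos_from_facebook($offset, $limit)
{
    // Placeholder: the real version would call the Facebook API here.
    return array_slice(range(1, 23), $offset, $limit);
}

$photos = fetch_photos_from_facebook($offset, $limit);

header('Content-Type: application/json');
echo json_encode(array(
    'photos' => $photos,
    // where the next chunk starts, or null when everything has been sent
    'next'   => count($photos) === $limit ? $offset + $limit : null,
));

The client renders each chunk as it arrives and immediately requests the 'next' offset until it comes back null.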
Instead of putting whatever code you want inside of the callback, just put it after the callback. For example:
$("#div").load("facebook...", function() {
//do stuff
});
//put the stuff you want to load at the same time, here

Call PHP Function from Smarty with AJAX with no user action

I am hitting a lot of different sites to get a list of information and I want to display this information as I get it. Right now I am using a Smarty Template and what I would like to do is:
Pseudo code:
{foreach $page}
    $smarty_var = use AJAX to call a PHP function
    Render out a new table row on the fly w/ the newly assigned var
    <tr><td>{$smarty_var}</td></tr>
{/foreach}
I don't know much about AJAX; I used it a long time ago, and it was similar to this but not quite, since there was user action taken. No I don't have a JS Framework in place. Am I way off here on how this should go? Basically I want to display a table row as data comes available, each table row will be a request to get the data from another site.
Sure, I will tell you about what I am trying to do: http://bookscouter.com/prices.php?isbn=0132184745+&x=19&y=6
If you click on the 'Click to view prices from all 43 links' at the bottom of that page you will see. I am using cURL to get all the pages I want a price from. Then for each page I want to get the price. So each page is gonna fire off a function that runs some fun code like this:
function parseTBDOTpageNew($page, $isbn)
{
    // Split the page on <table> tags, then split the second table on <td> tags.
    $first_cut  = preg_split('/<table[^>]*>/', $page);
    $second_cut = preg_split('/<td[^>]*>/', $first_cut[2]);

    // The vendor is not buying this book at all.
    if (strpos($second_cut[4], "not currently buying this book") !== false) {
        return "\$0.00";
    }

    // The price sits in a <b> tag inside the ninth cell.
    $third_cut = preg_split('/<b[^>]*>/', $second_cut[9]);
    $last_cut  = preg_split('/</', $third_cut[3]);
    return $last_cut[0];
}
This function is called from another function, which puts the price returned above, the name of the company, and a link into an array; that array is then added to a bigger array that gets sent to Smarty. Instead of doing that, I just want to take each individual array as it is returned and add its values into a table row on the fly.
I will take your advice on jQuery. What I have started is an onload function that receives the $pages to be parsed, and I was just in the middle of writing: for each page, get the info and spit some HTML with the info onto the page.
Also, the function that calls the price-parsing function lives in a PHP file, so I need the request to hit a function within a PHP file and NOT just call file.php?param1=foo; I need it to actually hit the function in the file. I have jQuery in place and am now just trying to figure it out and get it to do what I need, ugh. I am searching; any help would be appreciated.
No I don't have a JS Framework in place
Fix that first. You don't want to juggle XMLHttpRequests yourself. jQuery is SO's canonical JS library, and is pretty nifty.
Basically I want to display a table row as data comes available, each table row will be a request to get the data from another site.
How many rows will you be dealing with? Do they all have to be loaded asynchronously?
Let's tackle this in a braindead, straightforward way. Create a script that does nothing more than:
Take a site ID and fetch data from the corresponding URL
Render that data to some data transport format, either HTML or JSON.
Then it's a simple matter of making the page that the user gets, which will contain JavaScript code that makes the AJAX calls to the data fetcher, then either shoves the HTML into the page directly, or transforms the data into HTML and then shoves that into the page.
You'll note that at no point is Smarty really involved. ;)
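To make that concrete, a hedged sketch of the fetcher script; the site table, URL patterns, and JSON shape are invented, and parseTBDOTpageNew() is the parser from the question:

// getprice.php - one request fetches and parses exactly one site
require 'parsers.php'; // wherever parseTBDOTpageNew() lives

$sites = array(
    1 => 'http://example.com/price?isbn=',
    2 => 'http://example.org/buyback?isbn=',
);

$siteId = isset($_GET['site_id']) ? (int)$_GET['site_id'] : 0;
$isbn   = isset($_GET['isbn']) ? preg_replace('/[^0-9Xx]/', '', $_GET['isbn']) : '';

if (!isset($sites[$siteId]) || $isbn === '') {
    header('HTTP/1.1 400 Bad Request');
    exit;
}

$page  = file_get_contents($sites[$siteId] . $isbn); // or cURL, as in the question
$price = parseTBDOTpageNew($page, $isbn);

header('Content-Type: application/json');
echo json_encode(array('site_id' => $siteId, 'price' => $price));

The page's jQuery then calls getprice.php once per site and appends a table row from each JSON response, which is exactly the $.post-plus-callback pattern shown in the first answer above.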
This solution is highly impractical for anything more than a trivial number of sites to be polled asynchronously. If you need rows for dozens or hundreds of sites, that means each client is going to need to make dozens or hundreds of requests to your site for every single normal pageview. This is going to slaughter your server if more than one or two people load the page at once.
Can you tell us more about what you're doing, and what you're trying to accomplish? There are lots of ways to mitigate this problem, but they all depend on what you're doing.
Update for your question edit.
First, please consider using an actual HTML parser instead of regular expressions. The DOM is very powerful and you can target specific elements using XPath.
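For instance, a DOM/XPath version of the price extraction; the XPath expression is only a guess at the structure the preg_splits above imply, so it would need adjusting against the real markup:

function parsePriceWithXPath($page)
{
    if (strpos($page, 'not currently buying this book') !== false) {
        return "\$0.00";
    }

    $doc = new DOMDocument();
    @$doc->loadHTML($page); // '@' silences warnings on messy real-world HTML

    $xpath = new DOMXPath($doc);
    // Roughly: the <b> inside the ninth cell of the second table.
    $nodes = $xpath->query('(//table)[2]//td[9]//b');
    return $nodes->length ? trim($nodes->item(0)->textContent) : "\$0.00";
}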
Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
So, here's the ultimate problem. You want to do something asynchronously. PHP does not have a built-in generalized way to perform asynchronous tasks. There are a few ways to deal with this problem.
The first is as I've described above. Instead of doing any of the curl requests on page load, you farm the work out to the end user, and have the end user's browser make requests to your scraping script one by one, filling in the results.
The second is to use an asynchronous work queue, like Gearman. It has excellent PHP support via a PECL extension. You'd write one or more workers that can accept requests, and keep a pool of them running at all times. The larger the pool, the more things you can do at once. Once all of the data has returned, you can throw the complete set of data at your template engine, and call it good.
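A compressed sketch of that shape using the pecl/gearman classes; the task name and payloads are illustrative:

// worker.php - keep a pool of these running
$worker = new GearmanWorker();
$worker->addServer(); // localhost:4730 by default
$worker->addFunction('fetch_price', function (GearmanJob $job) {
    $req = json_decode($job->workload(), true);
    // ...fetch and parse the remote site here...
    return json_encode(array('site_id' => $req['site_id'], 'price' => '$1.23'));
});
while ($worker->work());

// page.php - queue one task per site and collect results as they complete
$prices = array();

$client = new GearmanClient();
$client->addServer();
$client->setCompleteCallback(function (GearmanTask $task) use (&$prices) {
    $prices[] = json_decode($task->data(), true);
});

foreach (array(1, 2, 3) as $siteId) {
    $client->addTask('fetch_price', json_encode(array('site_id' => $siteId)));
}
$client->runTasks(); // blocks until every task finishes; workers run in parallel

// $prices now holds every result; hand the array to Smarty as before.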
You can even combine the two, having the user make only one or two or three extra requests via ajax to fetch part of the returned data. Heck, you can even kick off the jobs in the background and return the page immediately, then request the results of the background jobs later via ajax.
Regardless of which way you handle it, you have a giant, huge problem. You're scraping someone's site. You may well be scraping someone's site very often. Not everyone is OK with this. You should seriously consider caching results aggressively, or even checking with each of the vendors to see if they have an API or data export that you can query against instead.
