Building infinite-scroll that facilitates Varnish caching - php

As you know, infinite scroll makes repeated AJAX requests to fetch new content, and right now those requests hit Apache directly because each call is a POST and it carries cookies. We store the last displayed item for each visitor in the session; hence the session, and hence the cookies.
We would like to take advantage of Varnish caching, so we are looking to improve this and wondering what our options are, since we need to do it without cookies and without POST (so there is no real identification of the user).

We store in the session the last displayed item for each visitor
You can pass this information as a query string in the URL of the next page. Also, avoid POST for loading the next page; use GET requests instead.
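For example, here's a rough sketch of what the client side could look like with GET and a query string; the `last` parameter name is an assumption, not something from the question's code:

```javascript
// Build the URL for the next page using a query string instead of
// session state. The "last" parameter name is an assumption; use
// whatever your backend expects.
function buildNextPageUrl(baseUrl, lastItemId) {
  const url = new URL(baseUrl);
  url.searchParams.set('last', String(lastItemId));
  return url.toString();
}

// A plain GET request like this is cacheable by Varnish, unlike a
// POST that carries cookies.
function loadNextPage(baseUrl, lastItemId) {
  return fetch(buildNextPageUrl(baseUrl, lastItemId), {
    method: 'GET',
    credentials: 'omit', // do not send cookies, so the cache can serve it
  }).then((res) => res.text());
}
```

Because all the paging state is in the URL, every visitor asking for the same `last` value gets the same cacheable response.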

I have used caching with Infinite Scroll based on the example code provided on the GitHub page. The part we specifically need to look at is as follows...
nextSelector: "div.navigation a:first",
navSelector: "div.navigation",
The next 'section' loaded by the infinite scroll is picked up by reading a link and fetching that page's contents.
As far as my knowledge goes, it uses jQuery's load feature, and that feature states the following...
Request Method
The POST method is used if data is provided as an object; otherwise, GET is assumed.
Therefore most standard caching techniques should work fine. I hope this helps; although I'm not familiar with Varnish, this should point you in the right direction.
Following the code above, each link picked up by nextSelector can contain GET parameters for dynamic content.
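As a sketch, here is one way the href that nextSelector resolves to could be read on the client; the `page` and `last` parameter names are assumptions for illustration:

```javascript
// Given the href of the "next" link (what nextSelector resolves to),
// read its GET parameters. The parameter names are assumptions.
function readPagingParams(nextHref, base) {
  const url = new URL(nextHref, base);
  return {
    page: Number(url.searchParams.get('page') || 1),
    last: url.searchParams.get('last'), // e.g. id of the last rendered item
  };
}
```

For example, `readPagingParams('/posts?page=3&last=97', 'https://example.com')` yields `{ page: 3, last: '97' }`, which the server can use to render the next slice without any session state.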

Related

dynamically load part of a website - change url

My question is about this website: http://www.bits-apogee.org/2011/
Whenever you click a link in the side navigation bar, the middle part of the website is dynamically loaded, and the URL also changes; everything else remains unchanged. How is this done?
Is this some jQuery plugin?
I totally agree with @JMCCreative. It looks like an actual refresh. You should look at the following post on Stack Overflow.
Modify the URL without reloading the page
The site is using Hashes (#) in the URL to denote the new content being loaded.
This is useful for people bookmarking dynamically loaded content (normally hashes jump to specific areas on a page: named anchors, or element IDs in newer browsers), because they don't trigger the page to refresh...
There are drawbacks (SEO concerns) to this: http://www.highrankings.com/urls-with-hashtags-307. However, you will notice more and more sites doing it, so I would assume the SEO robots will get better.
There are two possibilities:
You can use the HTML5 history.pushState capability to change the URL, but this feature isn't available in all browsers yet. For more information, look at this SO post: Is there a way to change the browser's address bar without refreshing the page?
You can use a hash (#) fragment as a fallback for browsers that don't have the above feature yet.
If you use jQuery, you can use the handy plug-in jQuery Address. This will take care of both of the above cases.
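A tiny sketch of that fallback decision; the `nextUrl` helper and the hash scheme are made-up names for illustration:

```javascript
// Feature-detect HTML5 history support; falls back to false outside
// a browser or on older browsers.
const hasPushState =
  typeof history !== 'undefined' && typeof history.pushState === 'function';

// With pushState support, navigate to the clean path; otherwise fall
// back to a hash-based URL so older browsers don't reload the page.
function nextUrl(path, supportsPushState) {
  return supportsPushState ? path : '/#' + path;
}
```

A library like jQuery Address performs essentially this branch for you, so you only write the navigation logic once.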
They're not using a plugin. They're doing an ajax request to a URL like this:
http://www.bits-apogee.org/2011/getcontent/?even=Rachel+Armstrong
and dumping the overview in the container.
The cycle of this type of process usually goes like this:
listen for link clicks
on click, prevent the default on the event
use window.history.pushState to update the URL
some other code listens for the history change
and generates a URL to fetch the content
AJAX-load that URL
dump the data into a container
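The cycle above could be sketched like this; the `getcontent/` URL pattern mirrors the example site, while `a.ajax-nav` and `#container` are assumed names, and the DOM wiring is guarded so the pure part can run anywhere:

```javascript
// Map a page URL to the URL that returns just the content fragment.
// The /getcontent/ suffix mirrors the example site above; it is an
// assumption, not a general API.
function contentUrlFor(pageUrl) {
  const url = new URL(pageUrl);
  if (!url.pathname.endsWith('/')) url.pathname += '/';
  url.pathname += 'getcontent/';
  return url.toString();
}

// Browser wiring for the list above (a no-op outside a browser).
if (typeof document !== 'undefined') {
  const load = (pageUrl) =>
    fetch(contentUrlFor(pageUrl))
      .then((res) => res.text())
      .then((html) => {
        document.querySelector('#container').innerHTML = html;
      });

  document.addEventListener('click', (event) => {
    const link = event.target.closest('a.ajax-nav'); // selector is an assumption
    if (!link) return;
    event.preventDefault();                 // prevent the full page load
    history.pushState(null, '', link.href); // update the address bar
    load(link.href);                        // fetch and dump the fragment
  });

  // pushState does not fire popstate; this handles back/forward buttons.
  window.addEventListener('popstate', () => load(location.href));
}
```

Note that calling `load()` directly after `pushState` is deliberate: pushState only records the history entry, so the content fetch has to be triggered explicitly.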
Two libraries I have used, both easier than the above, since they rely on loading a regular HTML page via AJAX, unlike the example site you point to, which uses JSON to get the data.
Pjax - updates pieces of the page by pulling that HTML node from a different URL.
Ajaxify - easiest solution, you could do that effect on an HTML site in 10 minutes.

Ajax page fetch design requires physical address

I am creating a web app in PHP and loading content through AJAX-based requests.
When I click a hyperlink, the corresponding page gets fetched through AJAX and the current content is replaced by the fetched page.
Now the issue is, I need a physical href so that I can implement Facebook "like" functionality and also maintain the browser history. I cannot do an old-school POSTBACK to the PHP page, as I am doing a transition animation in which the current page slides away and the new page slides in.
Is there a way I can keep the animation and still have a valid physical href and history?
The design of the application is as follows:
the app grabs an RSS feed.
it creates the DOM for those RSS items.
upon clicking any headline, the page animates and shows the full story for that item.
I need to create a "like" button on the full-story page, but I don't have a valid URL for it.
While Alexander's answer works great on the client side, Facebook's linter tool does not run JavaScript, so it will get the old content. Neither of the two links provides a solution to this.
What amit needs to implement is server-side parsing of the URL. See http://www.php.net/manual/en/function.parse-url.php. The fragment is what the server sees as the hash-tag value. In your PHP code, render the correct og: tags based upon the fragment.
Firstly, if you need a URL for Facebook, then think up a structure that gives you one, such that your server-side code will load the correct page when given that URL. This could be something like http://yourdomain.com/page.php?feed=<feedname>&link=<linknumber>, which would allow you to check the parameters using PHP's $_GET array. If you don't have the parameters, load the index page; if you do, load the relevant article.
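A sketch of that routing decision, written here as a framework-agnostic function rather than PHP; the `feed` and `link` parameter names follow the URL structure suggested above:

```javascript
// Decide which page to render from the request URL, following the
// ?feed=<feedname>&link=<linknumber> structure described above.
function resolvePage(requestUrl) {
  const url = new URL(requestUrl, 'http://localhost'); // base for relative URLs
  const feed = url.searchParams.get('feed');
  const link = url.searchParams.get('link');
  if (feed && link) {
    return { view: 'article', feed, link: Number(link) };
  }
  return { view: 'index' }; // no parameters: serve the index page
}
```

In PHP the same decision would read `$_GET['feed']` and `$_GET['link']`; the point is that the URL alone carries enough information to reconstruct the view.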
Secondly, use something like history.js to give you cross-browser support for the HTML5 pushState() functionality so that you can set the page URL when you do the AJAX call, without requiring the browser to do a full reload.
You have to implement hash navigation.
Here is a short tutorial.
Here is a more conceptual introduction.
If you're using jQuery, I can recommend BBQ for hash navigation:
http://benalman.com/projects/jquery-bbq-plugin/
This actually sounds pretty straightforward to me.
You have the URLs as usual, and using the hash (#) you can extract the info on both the client and server side.
There is only one thing missing: on the server side, before you return the content, check the user-agent string and compare it to the Facebook bot's (if I'm not mistaken, it's something like "facebookexternalhit"). If it turns out to be the Facebook bot, return whatever describes the URL for a like/share (Open Graph meta data); for any other user-agent string, return the content as usual.
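A sketch of that user-agent branch; the "facebookexternalhit" substring is the commonly reported one, but treat it as an assumption and verify against Facebook's current documentation:

```javascript
// Return true when the request appears to come from Facebook's
// crawler, so the server can respond with Open Graph meta tags
// instead of the full page.
function isFacebookCrawler(userAgent) {
  return /facebookexternalhit/i.test(userAgent || '');
}
```

On the server you would call this with the request's User-Agent header and branch the response accordingly.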

Save URL of AJAX loaded page, so it can be loaded after a refresh

We have an application written in PHP. Its main view is, for example, /pages/index.
When the user clicks certain links, it pulls in other views via AJAX, i.e. a call may look like /pages/publish, and the PHP outputs the relevant HTML for the publish section back into the index view.
The problem is that we'd like to give the user the option of refreshing and seeing the same view as before. My initial thought is this: when we use .load() in jQuery, take the URL it is going to load and store it somewhere to be read by the PHP if the user refreshes. Is that the best approach, or can someone think of a better way to do this whole thing?
Check out jQuery.address, which should solve your problem! It allows AJAX loading of new pages and will update the address bar accordingly. If a user saves this URL and reloads it, the script on the page can then load the correct view.
Alternatively, if you're HTML5-only, you can try history.pushState(), which will modify the URL without using the hash symbol, but support isn't 100% yet. (I don't think... it certainly behaves oddly on iPad in my experience.)
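One way to sketch the save-and-restore idea with a hash fallback; the helper names are made up for illustration:

```javascript
// Record which view is loaded so a refresh can restore it. Uses the
// hash so it also works without HTML5 history support. No-op outside
// a browser.
function rememberView(viewPath) {
  if (typeof location !== 'undefined') location.hash = viewPath;
}

// On page load, decide which view to fetch: the one saved in the
// hash, or the default index view.
function viewToRestore(hash, fallback) {
  const saved = (hash || '').replace(/^#/, '');
  return saved || fallback;
}
```

Calling `rememberView('/pages/publish')` after each `.load()` means that on refresh, `viewToRestore(location.hash, '/pages/index')` tells the client which view to re-fetch, with no server-side state needed.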

Designing a container page

Are there any potential pitfalls with the following idea?
...I want to have one container page, index.php. The header and outline will be constant, but in the middle I want one big panel which loads its content from external PHP files, one for each "slide". When a user clicks a link, the central div will update with the new content, while the outer edge remains unchanged.
Will I be able to use session variables etc. correctly with this set-up? I realise it will certainly break the browser history, but other than some possible UI issues, are there any technical barriers?
This is a common thing, as Jared stated. Session variables are always available through AJAX or frames, so it shouldn't affect anything there. And if browser history is something you would like to keep working, you could always change your location.hash when you load new content, so that some JavaScript can re-load previously rendered content if someone uses the back or forward buttons.
The session should not be lost.
The browser history does not need to be lost either - read about the onpopstate (and history.pushState) and onhashchange JS events. AJAX-heavy sites can determine the content to be loaded that way.
One of the pitfalls, if you are using a lot of JS, is that event handlers for newly loaded content need to be re-attached; but they can also be delegated from the container, which is not replaced.
jQuery's .load() function may also be useful to get you started.

PHP and Javascript / Ajax caching for load speed - JSON and SimpleXML

I have a site that gets content from other sites through some JSON and XML APIs. To prevent loading problems and problems with rate limits, I do the following:
PHP - show the cached content with PHP, if there is any.
PHP - if the content has never been cached, show an empty error page and return 404. (The second time the page loads it will be fine, "success 200".)
Ajax - if a date field does not exist in the database, or the current date is later than the stored date, load/add content from the API and store a future date in the database. (This makes the page load fast, and the Ajax caches the content AFTER the page is loaded.)
I use Ajax just to run the PHP file; I get the content with PHP.
Questions
Because I cache the content AFTER the page has loaded, the user will see the old content. What is the best way to show the NEW content to the user? I'm thinking of automatically reloading the page with JavaScript, or a message nag. Are there other preferred ways?
If I use very many APIs, the Ajax load time will be long and there is a bigger risk that some error will occur. Is there a clever way of splitting the load?
The second question is the important one.
Because I cache the content AFTER it was loaded, the user will see the old content. Which is the best way to show the NEW content to the user? I'm thinking automatically reloading the page with JavaScript, or a message nag. Other preferred ways?
I don't think you should reload the page via JavaScript; just use jQuery's .load(). This way the new content is inserted into the DOM without reloading the entire page. Maybe highlight the newly inserted content by adding some CSS via addClass().
If I use very many APIs, the Ajax load time will be long and there is a bigger risk that some error will occur. Is there a clever way of splitting the load?
You should not be splitting content in the first place; you should try to minimize the number of HTTP requests. If possible, do all the API calls offline using some sort of message queue, for example beanstalkd or redis, and cache the data in an in-memory database such as redis. You can get a free redis instance thanks to http://redistogo.com. To connect to it you should probably use predis.
Why not use the following structure:
AJAX loads content.php
And in content.php:
check if there is cached content. Yes > check if the stored date is still current. Yes > return the content.
there is content, but it is older than the stored date > reload the content from the external source > return the content.
there is no content > load the content from the external source > return the content.
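The flow above can be sketched as a read-through cache with a stored expiry date. The store is injected as a plain Map here so the sketch stays self-contained; in content.php it would be your database:

```javascript
// Read-through cache following the steps above: serve cached content
// while its stored date is still in the future, otherwise reload from
// the external API and stamp a new expiry date.
function getContent(key, store, fetchExternal, now, ttlMs) {
  const entry = store.get(key);
  if (entry && now < entry.expiresAt) {
    return entry.content;               // cached and still fresh
  }
  const content = fetchExternal(key);   // cache missing or stale
  store.set(key, { content, expiresAt: now + ttlMs });
  return content;
}
```

Injecting `now` and the store keeps the sketch deterministic and easy to test; in PHP the same check would compare the stored date column against time().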
And for your second question: it depends on how often the content of the APIs needs to be refreshed. If it's daily, you could run a script at night (or whenever the fewest people are active) to fetch all the new content, and then present that content during the day. This way you minimize the calls to external resources during peak hours.
If you have access to multiple servers, the clever way is to split the load: have each server handle a part of the requests.
