What is the best process for fetching new content using JSON? - php

What would be the best way to fetch new content on a website with JSON? I have the following system in mind (looped):
client -> (do i have new content) -> server
client <- (nope) <- server
client -> (do i have new content) -> server
client <- (yes, contentId x) <- server
client -> (get content with id x) -> server
client <- (Jim Morrison) <- server
Is this the best way? The application will be used by 10,000+ users. How does Facebook do this and still keep everyone up to date so quickly?

One modern way to do this is to use a server-push system like WebSockets or Socket.io rather than a client-pull system like occasionally requesting an "is there something new" status.
With this type of technology there is an open connection from the client (I'm assuming it's a web browser) to the server at all times, and whenever something happens on the server, the server pushes the data out to the client directly.
For code and examples check out http://socket.io/
Not all server-side technology is compatible with this type of approach, but one very good one is Node.js which allows you to write event-driven server-side code in Javascript. See http://nodejs.org
If you choose to use Node.js on the server and want to interact with browsers, there is even Now.js (see http://nowjs.com/ ), which allows you, in effect, to call functions on the client from the server!

Facebook uses peer-to-peer technology internally, meaning it's their own cloud of information. Any server can ask any other server to provide information, or where to go to get it.

There are many ways to do this. The most common is polling (which you've outlined in your example), in which the client initiates the request and then closes the connection after receiving a response. The second "class" is HTTP keep-alive, which has a growing variety of implementations.
The main advantage of HTTP keep-alive is that there is no TCP connection (or SSL handshake) overhead once the initial connection is made.
The other thing you want to look at is server clustering. This is a huge topic, but I suggest you take a look at these resources to get started.
Web Server Load Balancing:
http://httpd.apache.org/docs/2.2/mod/mod_proxy_balancer.html
http://wiki.nginx.org/HttpUpstreamModule
Scaling MySQL:
http://dev.mysql.com/doc/refman/5.5/en/replication.html
Note that there are a lot of ways to scale a site and I hope that others will post alternatives for you to evaluate.

I would use jQuery on the browser side with this really nice plugin: http://benalman.com/projects/jquery-dotimeout-plugin/
The doTimeout functionality vastly improves on the plain JavaScript setTimeout.
Then on the PHP side have a polling function that returns 0 if no new content is available. If content is available, go ahead and return it; no sense making two trips when one will do.
Just use json_encode on the PHP side if content is available. Unwrap your content in the browser and append as needed.
This is a rather general overview of how to do it, but looking at the jQuery.getJSON docs should give you all the detail you need: http://api.jquery.com/jQuery.getJSON/
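As a sketch of that single-trip idea, with the PHP endpoint mocked as a plain function (the endpoint's response shape and the `id` field are invented for illustration):

```javascript
// Mock of the PHP endpoint: returns "0" when nothing is new,
// otherwise the json_encode'd content itself (one trip, not two).
function pollEndpoint(lastSeenId, latest) {
  if (latest.id <= lastSeenId) return '0';
  return JSON.stringify(latest);
}

// Browser-side handler: ignore "0", otherwise unwrap and append.
function handleResponse(body, appended) {
  if (body === '0') return null;  // nothing new, just poll again later
  const content = JSON.parse(body);
  appended.push(content);         // stand-in for appending to the DOM
  return content.id;              // remember the newest id we saw
}

const appended = [];
const latest = { id: 5, body: 'Jim Morrison' };
handleResponse(pollEndpoint(4, latest), appended); // new content: appended
handleResponse(pollEndpoint(5, latest), appended); // nothing new: skipped
```

In the real thing, `pollEndpoint` would be a jQuery.getJSON call to your PHP script, scheduled with doTimeout.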

Related

Dynamic data updating

I'm making an app which accesses a database and updates the data every few seconds via a PHP script. The problem is that it currently always updates all the data. I would like to know how to program something that dynamically decides which data to update and which not, so it basically keeps track of changes somehow. How would I best go about doing something like this?
I think this question has probably been asked somewhere already, but I couldn't find it, so maybe someone can show me a website where to look.
In general, you will need to use either XHR requests, WebSockets, or HTTP/2 to solve this problem. Since HTTP/2 is not universally supported on the browser side, it may not work for you. Here is the outline of the solution:
1. Every few seconds, JavaScript you provide in the browser polls the server for updates using an XHR request. You can use the returned data to update the screen with JavaScript. If you only want to do some simple updates, like updating some numbers, you might use raw JavaScript or jQuery. If your polling will result in complex screen updates, or you want to move a lot of functionality into the client, you probably want to redo your client using one of the JavaScript frameworks like React or Angular.
2. Use WebSockets (or HTTP/2) to create a persistent connection to the server, and have the server send updates to the clients as the data changes. This will probably require some code in the application to broadcast or multicast the updates. The client code would be similar to case 1, except that the client would not poll for updates.
The polling solution is easier to implement, and would be a good choice as long as you don't have too many clients sending polls at too high a rate - you can overwhelm the servers this way.
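One way to "keep track of change", as the question asks, is a version counter on the server, so each poll returns only the rows changed since the client's last version. A rough sketch (the row shapes and function names are invented for illustration):

```javascript
// Server-side sketch: bump a global version on every write, and tag
// each row with the version at which it last changed.
let version = 0;
const rows = new Map(); // id -> { value, version }

function updateRow(id, value) {
  rows.set(id, { value, version: ++version });
}

// A poll sends the client's last-seen version; only newer rows come back.
function changesSince(clientVersion) {
  const changed = [];
  for (const [id, row] of rows) {
    if (row.version > clientVersion) changed.push({ id, value: row.value });
  }
  return { changes: changed, version };
}

updateRow('a', 1);
updateRow('b', 2);
const first = changesSince(0);              // both rows are new to this client
updateRow('a', 3);
const second = changesSince(first.version); // only 'a' changed since then
```

In PHP the same idea usually becomes a `last_modified` timestamp column and a `WHERE last_modified > ?` query.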

How to share real time updates on a website between users on different computers?

I'm trying to figure out a way for users of a website (say a student and teacher) to share a secure connection where real time updates on one page are viewed by both of them.
From research I've concluded that some of the real time updates could be performed using ajax and javascript.
But I'm stumped as to how users could share a connection where only the two users would be viewing the updates that take place on the website (such as flash animations of a drawing board.) I'm also confused how you would even begin to set up a connection like this.
I've looked into PHP sessions and cookies, but I'm not sure I'm doing the right research.
Any pointers as to how two specific users could share a secure connection where real-time updates are viewed only by the both of them would be appreciated. I don't want a terse response, please; I'm looking for specific details like functions and syntax specific to PHP. I appreciate the help and will rate you up if you give me good answers!
You cannot share a secure connection (e.g. HTTPS); it's one client to one server.
If both clients are logged in and have a background AJAX task running in the browser, is it acceptable to have each client "pull" every few seconds the same data to display for both users?
This would require the "drawing board" updates to also be sent continuously back to the server to share the updated data with the other client. I'm sure there will be an event you can use to trigger the post of data (e.g. on mouse up).
If performance is an issue, you'd want to use a better server technology like Java that is able to keep session state between requests without having to persist to a database.
You can look at AJAX push techniques. I used Comet once, where an administrator posted messages and everybody who was logged in saw the message appear on their screen. I don't know if Comet supports PHP; I only used it with JSP. Just search for "ajax push" in Google.
Flash allows for connections between users, I think they refer to them as sockets.
If you want to use Ajax, et al, you need a server side technology that supports push.
Node is the standard in this regard, and you can set up a Heroku instance for free.
There are others, and you need to learn tools before even beginning to learn application.
Among the many overviews, this might interest you:
http://arstechnica.com/business/2012/05/say-hello-to-the-real-real-time-web/?1
A few good examples where this is happening:
Google Docs
Etherpad
HTML5 Games: Multi player
Techniques you can use (with varying browser support)
HTML5 WebSockets (Wikipedia; MDN; HTML5 Demo)
Comet (Wikipedia)
Really pushing data to a web browser client from a server (which would do that when it receives something from another client) is only possible with WebSockets, as far as I know. Other mechanisms would require either browser plugins or a stand-alone application.
However, with Comet (through AJAX) you can get really close to pushing data by polling the server periodically. Contrary to traditional polling (e.g. where a client asks for data every 5 seconds), with the Comet principle the server will hold that periodic request hostage for, say, up to 30 seconds. The server will not send back data until either it has data or the timeout is reached. That way, during those 30 seconds, any data that the server receives can be instantly pushed back to the other clients. And right after that, the client starts a new 30-second request, and so forth.
Both Comet and WebSockets should work with a PHP backend served by Apache, but I'd recommend looking into NodeJS (as server technology) for this.
There is a lot of information regarding Comet on the internet, I suggest you google it, maybe start at Wikipedia.
The great thing about Comet is that it is more a principle than a technology. It uses what we already have (simple HTTP requests with AJAX), so browser support is very wide.
You could also do a combination, where you use WebSockets if supported and fall back to Comet.
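The hold-the-request-hostage principle can be sketched as a simple simulation, with "ticks" standing in for elapsed seconds (no real server involved; all names here are invented):

```javascript
// Simulate one held Comet request: the server "holds" it for up to
// maxWaitTicks; if data arrives during the hold it is returned at
// once, otherwise the request times out empty and the client
// immediately starts a new hold.
function holdRequest(maxWaitTicks, dataArrivalTick, data) {
  for (let tick = 0; tick < maxWaitTicks; tick++) {
    if (tick === dataArrivalTick) {
      // Pushed as soon as it arrives, not at the next poll interval.
      return { status: 'data', data, heldFor: tick };
    }
  }
  return { status: 'timeout', data: null, heldFor: maxWaitTicks };
}

// Data arrives at tick 3 of a 30-tick hold: delivered after 3 ticks.
const hit = holdRequest(30, 3, 'hello');
// No data during the hold: client reconnects and holds again.
const miss = holdRequest(30, 99, 'hello');
```

This is why Comet feels near-instant even though it is "just" AJAX: the latency is the time until data arrives, not the polling interval.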
I'm sure you have looked into this. The idea that this can be done via AJAX alone is misleading: it suggests that two users of a website can communicate directly via JavaScript.
As you are aware, javascript happens on the client and ajax is essentially 'talking to the server without a page change or refresh'.
The communication between two users of the website has to happen via the server - php and some chosen datastore.
Hope that wasn't terse.
cheers, Rob

How to Implement Real-time Form Editing, Like Google Docs

I am trying to add the feature that Google Docs has, which is real-time collaboration, to an editable textarea in HTML. For example, this would allow 2 or 3 users to edit the same textarea collaboratively. How would one go about approaching this problem, or is there a JavaScript library that can be used? (I use PHP, MySQL, and JavaScript/AJAX/jQuery.)
In order to facilitate real-time updates between more than one Web client, you'll need to use a technology that works around the request/response cycle of the Web, such as a Comet or WebSockets solution.
To keep the textarea updated, you'd need to establish a long-lived HTTP connection or a Websocket connection to your server from all 3 clients. Each textarea would need a keyup or keypress handler that, when invoked, sends the character across the stream to the server. When your server retrieves this data, it would then need to return a response to the other 2 connected clients.
The response would need to then be handled by updating the value property of the textarea with the most recent data.
I see you're using PHP, which I do know supports Comet. You'll need to set up Comet (or WebSockets) in order to implement such a solution.
With that said, a more rudimentary alternative would be to use polling to achieve the desired effect. This would involve all 3 clients making requests to the server periodically to retrieve updates. As you can imagine, the faster the polling rate, the more real-time the application would feel. However, with a faster polling rate, your application would consume more bandwidth and resources.
For 3 clients, this may be feasible, but for any serious application that involved heavy usage, you would definitely want to look into Websockets or Comet.
To answer your question of JavaScript libraries, check out the Dojo Cometd library for a Comet solution on the client-side.
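The broadcast step described above (relay each client's edit to the other connected clients, but not back to the sender) can be sketched independently of the transport; the room and client names here are invented:

```javascript
// A "room" holding the connected clients of one shared textarea.
function makeRoom() {
  const clients = new Map(); // clientId -> last value delivered to it

  return {
    join(id) { clients.set(id, ''); },
    // Called when a client's keyup handler posts the textarea's value:
    // relay it to every client except the one who typed it.
    update(fromId, value) {
      for (const id of clients.keys()) {
        if (id !== fromId) clients.set(id, value);
      }
    },
    valueAt(id) { return clients.get(id); },
  };
}

const room = makeRoom();
room.join('alice');
room.join('bob');
room.update('alice', 'Hello wor'); // alice types; bob receives it
```

With Comet or WebSockets, `update` runs on the server and the relayed value is what each client uses to set its textarea's value property. (Sending the whole value on every keystroke is the simplest scheme; Google Docs itself merges per-character operations, which is much harder.)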

Caching data retrieved by jQuery

I am using jQuery's $.ajax method to retrieve some JSON from an API.
Each time the page is loaded, a call to the API is made, regardless of whether the user has received this data before, which means that when a large number of users are on the page, the API's rate limiting would come into effect.
My thought on how to deal with this was to first push the data to a database (via a PHP script), and then check the database to see if anything is cached before going back to the API for more up-to-date information if required.
Is this a viable method? What are the alternatives?
It just seems like jQuery is actually a hurdle, rather than doing it all in PHP to begin with, but as I'm learning the language, would like to use it as much as I can!
In order to help distinguish between opinions and recommended techniques, let's first break down your problem to make sure everyone understands your scenario.
Let's say we have two servers: 'Server A' and 'Server B'. Call 'Server A' our PHP web server and 'Server B' our API server. I'm assuming you don't have control over the API server, which is why you are calling it separately and can't scale it in parallel with your demand. Let's say it's some third-party application like Flickr or Harvest or something, and let's say this third-party API server throttles requests per hour by your developer API key, effectively limiting you to 150 requests per hour.
When one of your pages loads in the end-users browser, the source is coming from 'Server A' (our php server) and in the body of that page is some nice jQuery that performs an .ajax() call to 'Server B' our API server.
Now your developer API key only allows 150 requests per hour, while hypothetically you might see 1000 requests in one hour to 'Server A', your PHP server. So how do we handle this discrepancy in load, given the assumption that we can't simply scale up the API server (the best choice, if possible)?
Here are a few things you can do in this situation:
1. Just continue as normal, and when jQuery.ajax() returns a 503 Service Unavailable error due to throttling (what most third-party APIs do), tell your end user politely that you are experiencing higher than normal traffic and to try again later. This is not a bad idea to implement even if you also add in some caching.
2. Assuming that the data being retrieved from the API is cacheable, you could run a proxy on your PHP server. This is particularly well suited when the same AJAX request would return the same response repeatedly over time (e.g. maybe you are fetching a description for an object; the same object request should return the same description response for some period of time). This could be a PHP pass-through proxy or a lower-level proxy like the Squid caching proxy. In the case of a PHP pass-through proxy, you would use the "save to DB or filesystem" strategy for caching, and could even rewrite the Expires headers to suit your desired cache lifetime.
3. If you have established that the response data is cacheable, you can allow the client side to also cache the AJAX response using cache: true. jQuery.ajax() actually defaults to cache: true, so you simply need to not set cache to false for it to cache responses.
4. If your response data is small, you might consider caching it client-side in a cookie. But experience says that most users who clear their temporary internet files will also clear their cookies, so maybe the built-in caching with jQuery.ajax() is just as good?
5. HTML5 local storage for client-side caching is cool, but lacks widespread support. If you control your user base (such as in a corporate environment) you may be able to mandate the use of an HTML5-compliant browser. Otherwise, you will likely need a cookie-based fallback or polyfill for browsers lacking HTML5 local storage, in which case you might just reconsider the other options above.
To summarize, you should present the user with a friendly service-unavailable message no matter what caching techniques you employ. In addition, you may employ server-side caching, client-side caching, or both, to reduce the impact. Server-side caching saves repeated requests to the same resource across all users, while client-side caching saves repeated requests to the same resources by the same user and browser. Given the scenario described, I'd avoid HTML5 local storage because you'll need to support fallback/polyfill options, which makes the built-in request caching just as effective in most scenarios.
As you can see jQuery won't really change this situation for you much either way vs calling the API server from PHP server-side. The same caching techniques could be applied if you performed the API calls in PHP on the server side vs performing the API calls via jQuery.ajax() on the client side. The same friendly service unavailable message should be implemented one way or another for when you are over capacity.
If I've misunderstood your question please feel free to leave a comment and clarify and/or edit your original question.
No, don't do it in PHP. Use HTML5 LocalStorage to cache the first request, then do your checking. If you must support older browsers, use a fallback (try these plugins).
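Whichever storage you pick (localStorage, a cookie, or a server-side store), the cache logic itself is the same: save the value with a timestamp and treat entries older than some TTL as misses. A sketch with an in-memory Map standing in for localStorage (same get/set shape; the key name and TTL are arbitrary):

```javascript
// Tiny TTL cache. `store` is anything with get/set of strings
// (localStorage would work); `now` is injected so it can be tested.
function makeCache(store, ttlMs, now) {
  return {
    get(key) {
      const raw = store.get(key);
      if (!raw) return null;
      const { value, savedAt } = JSON.parse(raw);
      return now() - savedAt < ttlMs ? value : null; // expired = miss
    },
    set(key, value) {
      store.set(key, JSON.stringify({ value, savedAt: now() }));
    },
  };
}

// Only the first request within the TTL would hit the rate-limited API.
let clock = 0;
const cache = makeCache(new Map(), 1000, () => clock);
cache.set('photos', [1, 2, 3]);
const hit = cache.get('photos');  // fresh: served from cache
clock = 2000;
const miss = cache.get('photos'); // stale: fall through to the API
```

In the browser you'd swap the Map for localStorage (whose getItem/setItem also take strings) and `now` for Date.now.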

How can I implement a realtime dashboard?

I would like to implement a realtime dashboard like the index page of www.foursquare.com
I checked foursquare.com's index page with Chrome's developer tools and was surprised that they don't use XHR to fetch that information approximately every 5 seconds.
AJAX polling causes memory leaks in some browsers and makes the server busier.
Is there any way that I can implement a realtime dashboard efficiently with PHP and jQuery(AJAX)?
(Perhaps I need an extra server something like a push server?) :|
Foursquare's homepage loads 30 items (id=recent29, id=recent28, ...) but displays only 11 at once. So you will have a real-time feeling for about 90 seconds (after that, the same items reappear).
...
$('#recent'+toShow).slideDown(1000, move(i));
$('#recent'+i).slideUp(1000, move(i));
...
For some bidirectional client-server communication, take a look at WebSockets; even though they are not universally supported yet, they will eventually become a standard.
The WebSocket API is being standardized by the W3C, and the WebSocket protocol is being standardized by the IETF.
One method to get data pushed to a client is a loop like this:
Open AJAX request in browser.
Server waits with the request open until it either times out or new data is available.
Server returns the request with either new data or a timeout, and client immediately opens a new request.
You could use Comet; APE is particularly easy to set up and configure:
http://www.ape-project.org/
Back-end is written in C so it's very fast.
