What method would you choose for retrieving tweets from Twitter, for example?
Using JSON with jQuery (like http://juitter.com/), where the client does the request.
Using PHP (e.g. cURL), where the server does the request.
What are the advantages/disadvantages of both?
Using the server-side approach, you may benefit from caching facilities and an overall faster page load (no round trips to Twitter, or any other third-party service, on each request).
The other benefit of the server-side approach is that users coming from countries (or even companies) where Twitter is filtered won't see a big empty pane or, even worse, a mangled layout. And even when Twitter goes down, you have a cache somewhere to show the recent tweets.
But the most obvious advantage of the client-side approach is that it's hassle-free.
There is a distinction between the two approaches. When making outbound HTTP requests server-side, you retrieve the data to the server; if you want that data to be seen by the client, you have to send it along with the next request the client makes to your server. With server-side requests you can do native HTTP requests, including cross-domain requests.
Client-side cross-domain requests retrieve the data directly to the client. You can display it to the client as soon as the request returns data, but if you want the data on the server side (e.g. storing the tweets in a DB) you have to send it from the client back to the server. JavaScript can't do cross-domain XHR requests, so in order to do that, you (or the libraries that do it) use workarounds: iframes, including JS files that already contain the data you need, etc.
If you need to consume a web service, I advise using the backend as the service client and either polling periodically from the client side, or using one of the "Comet" techniques.
I think it depends on how frequently the stream you are pulling in gets updated. If it is updated very frequently, then JS is better, because you can run the call continually without refreshing the page; if it is updated less frequently, you can pull all the data in using cURL and use JS for animations.
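A minimal sketch of that client-side polling loop with jQuery, assuming a hypothetical JSON endpoint on your own server (e.g. a PHP script that caches the tweets):

// Poll a hypothetical endpoint and re-render the list without a page reload.
function fetchTweets() {
    jQuery.ajax({
        url: '/tweets.php',      // hypothetical endpoint returning a JSON array
        dataType: 'json',
        success: function (tweets) {
            var list = jQuery('#tweets').empty();
            jQuery.each(tweets, function (i, tweet) {
                list.append(jQuery('<li>').text(tweet.text)); // .text() escapes HTML
            });
        }
    });
}
fetchTweets();                   // load once on page load
setInterval(fetchTweets, 30000); // then poll every 30 seconds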
Client-side requests are better when you have to handle many requests (e.g. on a public site): you lower your server load and avoid a bottleneck, you may benefit from content delivery networks (CDNs) caching the requests on behalf of your web clients, and you move some of the liability from yourself to the users of your site, as they are the ones actually accessing the third-party API. This is especially relevant if you really have many requests; some API terms of use even limit the number of requests per time unit, so that client-side requests are the only option for big sites.
Server-side requests have the advantage of not requiring JavaScript to be enabled on the client side and can also be logged easily for statistical purposes or processed further.
Using the server as a "man in the middle", you may cache, alter, or insert data from the third party before publishing it to your users. Your users may dislike it, though...
I would prefer the client side for displaying tweets rather than the server side, for the following reasons:
Since tweets are displayed directly in the browser in real time, jQuery complements this scenario well. The tweets on the page refresh quickly (using multiple asynchronous Ajax calls, of course).
With the server-side approach, there is an extra round trip from the browser to your server and back. Do you need that trip when all you need is to display tweets? Tweets are just social messages (saying hello!) and are not supposed to contain anything harmful.
You generally take the server-side approach when you need to validate data before displaying it in the browser. For tweets, that's not necessary.
// For the client making the request:
jQuery.ajax({
    url: 'target_url',
    type: 'get',
    dataType: 'jsonp',
    success: function (data) {
        // do something with the data
    }
});
// Note: dataType is 'jsonp', not 'json', because calling Twitter directly from
// your domain would be blocked by the browser (cross-domain XHR requests are
// not allowed), so you have to use JSONP.
I'm making an app which accesses a database and updates the data every few seconds via a PHP script. The problem is that it currently always updates all the data; I would like to know how to program something that updates data dynamically and decides what to update and what not, so it basically keeps track of changes somehow. How would I best go about doing something like this?
I think this question must already have been asked somewhere, but I couldn't find it, so maybe someone can show me where to look.
In general, you will need to use either XHR requests, WebSockets, or HTTP/2 to solve this problem. Since HTTP/2 is not universally supported on the browser side, it may not work for you. Here is the outline of the solution:
Every few seconds, JavaScript you provide in the browser polls the server for updates using an XHR request. You can use the returned data to update the screen with JavaScript. If you only want to do some simple updates, like updating some numbers, you might use raw JavaScript or jQuery. If your polling will result in complex screen updates, or you want to move a lot of functionality into the client, you probably want to redo your client using one of the JavaScript frameworks like React or Angular.
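A minimal sketch of that polling loop with jQuery; the endpoint, the since parameter, and the response shape are all hypothetical, but the idea is to let the server report only what changed since the last poll:

// Delta polling: only ask for rows changed after the last known timestamp.
var lastUpdate = 0;
function pollForUpdates() {
    jQuery.ajax({
        url: '/updates.php',         // hypothetical PHP endpoint
        data: { since: lastUpdate }, // server returns only newer changes
        dataType: 'json',
        success: function (response) {
            lastUpdate = response.timestamp; // server's current time
            jQuery.each(response.changes, function (i, row) {
                // update only the affected parts of the screen
                jQuery('#row-' + row.id).text(row.value);
            });
        },
        complete: function () {
            setTimeout(pollForUpdates, 5000); // schedule the next poll
        }
    });
}
pollForUpdates();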
Use WebSockets (or HTTP/2) to create a persistent connection to the server, and have the server send updates to the clients as the data changes. This will probably require some code in the application to broadcast or multicast the updates. The client code would be similar to case 1, except that the client would not poll for updates.
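And a minimal sketch of the client side of case 2, using the browser's WebSocket API (the URL and message format are hypothetical):

// The server pushes a message whenever the data changes; no polling needed.
var socket = new WebSocket('wss://example.com/updates'); // hypothetical URL
socket.onmessage = function (event) {
    var update = JSON.parse(event.data);
    // apply the change to the page, e.g. update a single number
    document.getElementById(update.id).textContent = update.value;
};
socket.onclose = function () {
    // in a real app you would reconnect here, with a backoff
};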
The polling solution is easier to implement and is a good choice as long as you don't have too many clients sending polls at too high a rate; you can overwhelm the servers that way.
I'm trying to figure out a way for users of a website (say a student and teacher) to share a secure connection where real time updates on one page are viewed by both of them.
From research I've concluded that some of the real time updates could be performed using ajax and javascript.
But I'm stumped as to how users could share a connection where only the two users would be viewing the updates that take place on the website (such as Flash animations of a drawing board). I'm also confused about how you would even begin to set up a connection like this.
I've looked into PHP sessions and cookies, but I'm not sure I'm doing the right research.
Any pointers as to how two specific users could share a secure connection where real-time updates are viewed by both of them only? I don't want a terse response, please; I'm looking for specific details like functions and syntax specific to PHP. I appreciate the help and will rate you up if you give me good answers!
You cannot share a secure connection (e.g. HTTPS); it's one client to one server.
If both clients are logged in and have a background AJAX task running in the browser, is it acceptable to have each client "pull" the same data every few seconds, to display it to both users?
This would require the "drawing board" updates to also be sent continuously back to the server, to share the updated data with the other client. I'm sure there will be an event you can use to trigger the posting of data (e.g. on mouse up).
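A rough sketch of that pull model with jQuery; the endpoints, the payload format, and the currentStroke/drawStroke helpers are all hypothetical:

// Push this client's strokes on mouse up; pull the peer's strokes on a timer.
var since = 0;
jQuery('#board').on('mouseup', function () {
    // currentStroke is a hypothetical object describing the latest stroke
    jQuery.post('/board/save.php', { stroke: JSON.stringify(currentStroke) });
});
setInterval(function () {
    // ask the server for strokes added to the shared session since last poll
    jQuery.getJSON('/board/poll.php', { since: since }, function (response) {
        since = response.timestamp;
        jQuery.each(response.strokes, function (i, stroke) {
            drawStroke(stroke); // hypothetical render function
        });
    });
}, 3000);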
If performance is an issue, you'd want to use a better server technology like Java that is able to keep session state between requests without having to persist to a database.
You can look at Ajax push techniques. I used Comet once, where an administrator posted messages and everybody who was logged in saw the message appear on their screen. I don't know if Comet supports PHP; I only used it with JSP. Just search for "ajax push" on Google.
Flash allows for connections between users, I think they refer to them as sockets.
If you want to use Ajax et al., you need a server-side technology that supports push.
Node is the standard in this regard, and you can set up a Heroku instance for free.
There are others, and you need to learn the tools before even beginning to learn the application.
Among the many overviews, this might interest you:
http://arstechnica.com/business/2012/05/say-hello-to-the-real-real-time-web/?1
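To make "push" concrete, here is a minimal sketch of a Node server, assuming the third-party ws package (npm install ws); it simply broadcasts every message it receives to all connected clients:

// A hypothetical broadcast server using the 'ws' package.
const { WebSocketServer, WebSocket } = require('ws');

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', function (socket) {
    socket.on('message', function (data) {
        // relay the update to every open connection, including the sender
        for (const client of wss.clients) {
            if (client.readyState === WebSocket.OPEN) {
                client.send(data.toString());
            }
        }
    });
});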
A few good examples where this is happening:
Google Docs
Etherpad
HTML5 Games: Multi player
Techniques you can use (with varying browser support)
HTML5 WebSockets (Wikipedia; MDN; HTML5 Demo)
Comet (Wikipedia)
Really pushing data to a web browser client from a server (which would do that when it receives something from another client) is only possible with WebSockets as far as I know. Other mechanisms would require either browser plugins or a stand-alone application.
However, with Comet (through Ajax) you can get really close to pushing data by polling the server periodically. Contrary to traditional polling (e.g. where a client asks for data every 5 seconds), with the Comet principle the server will hold that periodic request hostage for, say, up to 30 seconds. The server will not send back data until either it has data or the timeout is reached. That way, during those 30 seconds, any data that the server receives can be instantly pushed back to the other clients. Right after that, the client starts a new 30-second request, and so forth.
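A minimal long-polling loop with jQuery, assuming a hypothetical endpoint that holds the request open for up to 30 seconds:

// /poll.php (hypothetical) responds as soon as it has data, or after ~30s.
// The client reconnects immediately either way, so pushes arrive instantly.
function longPoll() {
    jQuery.ajax({
        url: '/poll.php',
        dataType: 'json',
        timeout: 35000, // a bit longer than the server's hold time
        success: function (data) {
            if (data) {
                showMessage(data); // hypothetical render function
            }
        },
        complete: function () {
            longPoll(); // reconnect immediately, on success or timeout
        }
    });
}
longPoll();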
Although both Comet and WebSockets should work with a PHP backend served by Apache, I'd recommend looking into Node.js as the server technology for this.
There is a lot of information about Comet on the internet; I suggest you Google it, maybe starting at Wikipedia.
The great thing about Comet is that it is more a principle than a technology. It uses what we already have (simple HTTP requests with AJAX), so browser support is very wide.
You could also do a combination, where you use WebSockets if supported and fall back to Comet.
I'm sure you have looked into this. The claim that this can be done via Ajax is misleading if it makes you believe that two users of a website can communicate directly via JavaScript.
As you are aware, JavaScript happens on the client, and Ajax is essentially "talking to the server without a page change or refresh".
The communication between two users of the website has to happen via the server: PHP and some chosen datastore.
Hope that wasn't terse.
cheers, Rob
So I'm working on an A/B testing website, similar to http://www.optimizely.com/, and I'm quite new to web development. An A/B tester pretty much allows clients to create variants of their website to make optimizations based on user response (mouse clicks, etc.). So once the variants are made on our website (e.g. a larger button size), my job is to send a package to the client which allows them to access and run the JavaScript of the variants on the client's end when their page loads. Do I need an Ajax call to send this data, or can it be done via an HTTPS request, and what are the pros and cons for what I need done? (We're using MySQL, Hadoop and PHP.) Thanks.
This question doesn't make a lot of sense to me.
HTTPS is a communication protocol. AJAX is a programming pattern (or, perhaps more cynically, buzzword). AJAX most often would use HTTPS to accomplish the actual secure communication between client and server.
If I understand correctly what you mean....
It depends on what data is being sent: if it is personal data, always use HTTPS calls (i.e. request the data from https://yourdomain.com/your_script.php); otherwise HTTP (i.e. http://yourdomain.com/your_script.php) will be OK (both of these can be done via Ajax, so that's not a problem).
I am using jQuery's $.ajax method to retrieve some JSON from an API.
Each time the page is loaded, a call to the API is made, regardless of whether the user has received this data before, which means that when a large number of users are on the page, the API rate limiting would come into effect.
My thought on how to deal with this is to first push the data to a database (via a PHP script), and then check the database to see if anything is cached before going back to the API for more up-to-date information if required.
Is this a viable method? What are the alternatives?
It just seems like jQuery is actually a hurdle here, rather than doing it all in PHP to begin with; but as I'm learning the language, I would like to use it as much as I can!
In order to help distinguish between opinions and recommended techniques, let's first break down your problem to make sure everyone understands your scenario.
Let's say we have two servers: "Server A" and "Server B". Call "Server A" our PHP web server and "Server B" our API server. I'm assuming you don't have control over the API server, which is why you are calling it separately and can't scale the API server in parallel to your demand. Let's say it's some third-party application like Flickr or Harvest or something, and let's say this third-party API server throttles requests per hour by your developer API key, effectively limiting you to 150 requests per hour.
When one of your pages loads in the end user's browser, the source comes from "Server A" (our PHP server), and in the body of that page is some nice jQuery that performs an .ajax() call to "Server B", our API server.
Now your developer API key only allows 150 requests per hour, while hypothetically you might see 1000 requests inside one hour to your "Server A" PHP server. So how do we handle this discrepancy of loads, given the assumption that we can't simply scale up the API server (the best choice if possible)?
Here are a few things you can do in this situation:
1. Just continue as normal, and when jQuery.ajax() returns a 503 Service Unavailable error due to throttling (which is what most third-party APIs do), tell your end user politely that you are experiencing higher than normal traffic and to try again later. This is not a bad idea to implement even if you also add in some caching.
2. Assuming that the data being retrieved by the API is cacheable, you could run a proxy on your PHP server. This is particularly well suited when the same Ajax request would return the same response repeatedly over time (e.g. if you are fetching the description for an object, the same object request should return the same description response for some period of time). This could be a PHP pass-through proxy or a lower-level proxy like the Squid caching proxy. In the case of a PHP pass-through proxy, you would use the "save to DB or filesystem" strategy for caching, and could even rewrite the Expires headers to suit your desired cache lifetime.
3. If you have established that the response data is cacheable, you can allow the client side to also cache the Ajax response using cache: true. jQuery.ajax() actually defaults to cache: true, so you simply need to not set cache to false for it to cache responses (see the sketch after this list).
4. If your response data is small, you might consider caching it client-side in a cookie. But experience says that most users who clear their temporary internet files will also clear their cookies. So maybe the built-in caching with jQuery.ajax() is just as good?
5. HTML5 local storage for client-side caching is cool, but lacks widespread support. If you control your user base (such as in a corporate environment) you may be able to mandate the use of an HTML5-compliant browser. Otherwise, you will likely need a cookie-based fallback or polyfill for browsers lacking HTML5 local storage, in which case you might just reconsider the other options above.
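As a rough sketch of options 1 and 3 combined (the API URL and page elements are hypothetical):

// Rely on jQuery's default cache: true behaviour, and fail politely on 503.
jQuery.ajax({
    url: 'https://api.example.com/objects/42/description', // hypothetical API
    dataType: 'json',
    cache: true, // the default; just don't set it to false
    success: function (data) {
        jQuery('#description').text(data.description);
    },
    error: function (xhr) {
        if (xhr.status === 503) {
            // the API is throttling us; show a friendly message
            jQuery('#description').text(
                'We are experiencing higher than normal traffic. ' +
                'Please try again later.');
        }
    }
});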
To summarize: you should present the user with a friendly service-unavailable message no matter what caching techniques you employ. In addition, you may employ either or both server-side and client-side caching of your API responses to reduce the impact. Server-side caching saves repeated requests to the same resource, while client-side caching saves repeated requests to the same resources by the same user and browser. Given the scenario described, I'd avoid HTML5 LocalStorage, because you'll need to support fallback/polyfill options, which makes the built-in request caching just as effective in most scenarios.
As you can see, jQuery won't really change this situation much either way versus calling the API server from PHP on the server side. The same caching techniques could be applied whether you perform the API calls in PHP on the server side or via jQuery.ajax() on the client side, and the same friendly service-unavailable message should be implemented one way or another for when you are over capacity.
If I've misunderstood your question please feel free to leave a comment and clarify and/or edit your original question.
No, don't do it in PHP. Use HTML5 LocalStorage to cache the first request, then do your checking. If you must support older browsers, use a fallback (try these plugins).
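A minimal localStorage sketch with jQuery, assuming a hypothetical API URL and a simple time-based expiry:

// Cache the API response in localStorage for 5 minutes.
var cached = JSON.parse(localStorage.getItem('apiCache') || 'null');
if (cached && Date.now() - cached.storedAt < 5 * 60 * 1000) {
    render(cached.data); // hypothetical render function
} else {
    jQuery.getJSON('https://api.example.com/data', function (data) { // hypothetical URL
        localStorage.setItem('apiCache', JSON.stringify({
            storedAt: Date.now(),
            data: data
        }));
        render(data);
    });
}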
Is it possible for a web page using JavaScript to get data from another website? In my case I want to get it for calculations and for graphing a chart, but I'm not sure whether this is possible due to security concerns. If it is considered a no-no but there is a workaround, I would appreciate being told about the workaround. I don't want to have to gather this information on the server side if possible.
Any and all help is appreciated.
Learn about the JSONP format and cross-site requests (http://en.wikipedia.org/wiki/JSON#JSONP).
You may need to use a "PHP proxy" script on your server side which fetches the information from the other websites and provides it to your JavaScript.
The only reliable way is to let "your" web server act as a proxy. In PHP you can use the cURL functions to fire an HTTP request to an external site and then just echo the response.
You can't pull data from another server due to the same-origin policy. You can do some tricks to get around it, such as putting the URL in a <script> tag, but in your case that wouldn't work for just parsing HTML.
Use simple_dom_html to parse your data server-side; it is much easier than doing it in JavaScript anyway.
A simple way you might be able to do this is to use an inline iframe. If the web page you are getting the data from sends no frame-blocking headers, or you can isolate the data being pulled in (say, to an image or SWF), this might work.
Cross-domain JavaScript used to be impossible; using a (PHP) proxy was a workaround for that.
JSONP changes this entirely: it allows you to request JavaScript from another server (if it has an API that supports JSONP, and a lot of the bigger web players like Google, Twitter, Yahoo, etc. do), specifying the callback function in your code that should be triggered to act on the response.
The JavaScript response will contain:
a call to the callback function you defined
the actual payload as a JavaScript object.
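Concretely, a JSONP exchange looks something like this (the URL and function name are hypothetical):

// You define a global callback and inject a script tag pointing at the API:
function handleTweets(payload) {
    // 'payload' is the actual data, already a JavaScript object
    console.log(payload);
}
var script = document.createElement('script');
script.src = 'https://api.example.com/tweets?callback=handleTweets';
document.head.appendChild(script);

// The server replies with JavaScript that calls your function:
//   handleTweets({"tweets": [ ... ]});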
Frameworks like jQuery offer easy support for JSONP out of the box.
Once you have the raw data, you could tie into Google Chart Tools to create graphs on the fly and insert them into your web app.
Also worth considering is support for XMLHttpRequest access control (CORS), which is supported in some modern browsers.
If the service provider you are trying to access via a web page has this set up, it is a very simple call to XMLHttpRequest, and you will get access to the resources on that site without the need for JSONP (especially useful for requests that are not GET, i.e. POST, HEAD, etc.).
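If the provider sends the right Access-Control-Allow-Origin header, the request is just a plain XHR (the URL is hypothetical):

// A plain cross-domain XHR; this only works if the remote server opts in
// with CORS response headers such as Access-Control-Allow-Origin.
var xhr = new XMLHttpRequest();
xhr.open('POST', 'https://api.example.com/data'); // hypothetical URL
xhr.onload = function () {
    var data = JSON.parse(xhr.responseText);
    // use the data for calculations or charting
};
xhr.send('query=example');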