Please look at the following examples first: which one is better?
Could you compare them on aspects such as performance, programming effort, design, server load, user experience, maintenance, security, or anything else I have not thought of?
Can these aspects help me decide which one is better?
<script>
$(document).ready(function() {
    var message = {..something..};          // request parameters to send
    $.ajax({
        url: 'get_content.php',
        data: message,
        type: 'post',
        dataType: 'json',                   // expect JSON back from the server
        cache: false,
        success: function(data) {
            // data.array is the list of values returned by get_content.php
            $.each(data.array, function(key, value) {
                $('#content').append('<tr><td>' + value + '</td></tr>');
            });
        },
        error: function() { alert('error'); }
    });
});
</script>
<table id="content">
</table>
<table id="content">
<?php
// placeholders: run your actual query and fetch calls here
$result = query($sql);              // e.g. mysqli_query($db, $sql)
while ($row = fetch($result))       // e.g. mysqli_fetch_assoc($result)
{
    echo '<tr><td>' . $row['field'] . '</td></tr>';   // '.' not '+' for PHP concatenation
}
?>
</table>
You're comparing apples to oranges.
PHP -- a server-side language
AJAX -- a client-side technique (JavaScript)
AJAX is about your only (reasonable) means to grab data from the server on behalf of the client without a full page refresh. However, you can use an assortment of different languages server-side to render the data necessary for AJAX to proceed. (ASP, PHP, ...).
It's really up to you and what the rest of your site is developed in. If you want an absolute (no-failure) method of producing content, a plain PHP dump is about the best way: it's guaranteed to be visible and won't break depending on client support (maybe the user has the NoScript plugin?).
However, if there is a lot of data, it's sometimes better to spread the load over multiple calls so that the client has at least a semi-complete portion of the page visible, and the data comes later. (This is generally how Facebook proceeds--they give you the general layout in the first fraction of a second, then the rest of the content comes once the framing is complete).
One thing to note, though, is that it's not an either/or decision. You can use AJAX with a PHP fall-back; you just have to have code in place to test the client and relay that information back to the server so it can make an informed decision.
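As a minimal sketch of the "test the client" part (the cookie name js_enabled is an assumption for illustration): if this script runs at all, JavaScript is available, and a cookie records that fact so PHP can check $_COOKIE on later requests and decide whether to dump the content itself or leave it to AJAX.
<script>
// Runs only when JavaScript is enabled; later PHP requests can check $_COOKIE['js_enabled'].
if (document.cookie.indexOf('js_enabled=1') === -1) {
    document.cookie = 'js_enabled=1; path=/';
}
</script>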
Personally, I'd take PHP. It takes so little time that the user will never notice a delay, whereas with AJAX there's a whole separate request to make.
AJAX is good for fetching data after the page is loaded. I don't think it should be used for fetching data that should be there in the first place.
My problem is - for now - not about specific code but more about basic understanding, I think.
I want to create a form and use the submitted data without refreshing the page, which brings me to AJAX.
Now, do I always have to create a separate file that works with the data that AJAX sends? Can't I just "grab" the data and work with it on the same page?
I think I misunderstood some basic concepts.
I thought about something like this:
<form id="load_filters_form">
..
</form>
<?php
var_dump($_GET); // values from <form>
?>
<!-- AJAX, jQuery -->
<script>
$("#load_filters_form").submit(function(event){
event.preventDefault();
$.ajax({
type: 'get',
data: $(this).serialize()
success: function() {
$("#load_filters_form")[0].reset();
}
});
});
</script>
What you're proposing is certainly possible, it's exactly how AJAX works. You make an AJAX request from JavaScript code, sending any data the server-side code will need, and handle the response from the server in your JavaScript code.
The problem with what you're proposing is that you're making it unnecessarily complex for yourself. Consider what your code in the question would return to the JavaScript code in the AJAX response. It returns an entire HTML page, most of which is already on the client.
Why re-transmit all of that data that the client already has? Why have code on the client to parse out the data it's looking for from all of the unnecessary markup around that data?
Keep your operations simple. If you need a server-side operation which receives data, performs logic, and returns a result then create an operation which does exactly that. Call that operation in AJAX and use the resulting data.
Now maybe that response is structured JSON data, which your client-side code can read and update the UI accordingly. Or maybe that response is raw HTML (not an entire page but perhaps a single <div> or any kind of container which presents an updated version of a section of the page), which your client-side code can swap into the UI directly.
The AJAX interactions with the server should generally be light. If you're intentionally re-loading the entire page in an AJAX operation then, well, why use AJAX in the first place? The point is to send to the server only the data it needs, and receive back from the server only the data you need. For example, if all you need is to update a list of records displayed on the page then you don't need the whole page or even the HTML table of records, you just need the records. JSON is useful for exactly that, returning structured data and only structured data to the client. Then the client-side code can render that data into the structure of the page.
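To make that concrete, here is a minimal sketch of the "JSON only" approach (the endpoint name records.php, the field names and the #records table are assumptions for illustration, not anything from the question):
<script>
// Ask the server for just the records, as JSON, and render them client-side.
$.getJSON('records.php', function(records) {
    var $table = $('#records');
    $table.empty();
    $.each(records, function(i, record) {
        $('<tr>')
            .append($('<td>').text(record.name))
            .append($('<td>').text(record.value))
            .appendTo($table);
    });
});
</script>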
Now do I always have to create a separate file that works with the data that AJAX sends?
Yes and no. You may choose not to have a specific file which your AJAX pulls from, but you do need some sort of routing/controller relationship, like the one most frameworks provide.
You could in theory send the request to self (the same page), but that's bad design. You would be mixing backend logic with frontend code, and it will get messy very quickly. You really need to separate the three elements:
PHP takes the data and processes it.
JavaScript takes the data and displays it.
Your HTML should be code-free; just a pretty, finalized product.
The best design pattern is to keep those files in their proper environments.
Can't I just "grab" the data and work with it on the same page?
Not really, at least not consistently. You also have to keep in mind that trying to serve two different kinds of content from the same route/file is a potential problem:
// jQuery's $.ajax() sets the X-Requested-With header, so you could branch on it:
if (!empty($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest') {
    // do something (serve just the data for the AJAX caller)
} else {
    // do the other thing (serve the full HTML page)
}
AJAX shouldn't be fed fully rendered HTML pages; that takes too long to transfer. It's best to serve JSON objects/arrays, which are then rendered in your frontend by the same JavaScript that made the request, in the user's browser, without the latency caused by their network or your server.
There's no sure way of knowing which kind of request you received, since no data from the client is trustworthy, including HTTP headers; they are easy to fake and could potentially lead to security problems or unwanted results.
Thus, the best solution is to have a separate file which you make the requests to, instead of having the page request itself.
I have heard about technologies like socket.io, but it is too advanced for me.
I created a PHP file that outputs JSON-formatted data as an array.
I use the jQuery function .get() with the URL datafile.php to get the string and display it in the page. I run this .get() in a timer loop every few seconds. This is how I simulate updating text without refreshing the page.
But I really wonder whether this is the right way of doing it. Is there some better approach?
Here is my current script (the highlighted class is only there to make the element flash a little, to show that the value has changed):
setInterval(function() {
    // flash the number briefly to show it is being refreshed
    $('.total-xp .number').addClass('highlighted');
    setTimeout(function() {
        $('.total-xp .number').removeClass('highlighted');
    }, 1500);
    // fetch the latest value and display it
    $.get("data.php", function(data) {
        $(".total-xp .number").text(data.total_xp_data);
    }, "json");
}, 10000);
Two comments. First, you should look at socket.io. Socket.io allows your server to push data updates to subscribed clients when they happen, saving the wasteful polling for updated data from every attached client; this is pretty much accepted practice. I have no personal experience with socket.io, but I've used SignalR in .NET for this kind of problem and found it works well.
Second, I would be tempted to return JSON to your client, so the client can render the data as appropriate. Then if you need another interface to present the data you don't have to write another server-side method, and if you need to change your UI you don't need to redeploy the server-side components.
If you really can't face the effort of socket.io then your solution will work, but it will be hard to get anyone to agree that it is best practice.
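If you do stay with polling, one small refinement is to flash the number only when the value has actually changed. A minimal sketch reusing the selectors and data.php endpoint from the question (the lastValue variable is an assumption for illustration):
var lastValue = null;
setInterval(function() {
    $.get("data.php", function(data) {
        // only update and flash when the value differs from the last one we saw
        if (data.total_xp_data !== lastValue) {
            lastValue = data.total_xp_data;
            $('.total-xp .number').text(data.total_xp_data).addClass('highlighted');
            setTimeout(function() {
                $('.total-xp .number').removeClass('highlighted');
            }, 1500);
        }
    }, "json");
}, 10000);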
Facebook has introduced a ticker which shows live news scrolling down. How can I have this same type of functionality on my site? I don't want to use an iframe and have it refresh because it will (a) flicker and (b) make the page-loading icon come up (depending on the browser). How can this be done?
For this you'd want to fetch the data you're looking for with AJAX every X seconds, also known as polling.
Here's the breakdown: every X seconds, we want to query our database for new data. So we send an asynchronous POST to a PHP page, which then returns a data set of the results. We also declare a callback function (native to jQuery) that will be passed the data echoed from our PHP.
Your PHP:
if (isset($_POST['action'])){
    if ($_POST['action'] == 'pollNewData'){
        pollNewData();
    }
}

function pollNewData(){
    // NOTE: escape or parameterize $term before putting it in SQL,
    // otherwise this query is wide open to SQL injection
    $term = $_POST['term'];
    $sql = "select * from TABLE where TERM = '$term'";
    $results = get_rows($sql);   // get_rows() is a placeholder for your own query helper
    echo json_encode(array('status'=>200, 'results'=>$results));
}
Your front-end JavaScript:
// setInterval, not setTimeout, so the poll repeats every 10 seconds
setInterval(pollForNewData, 10000);

function pollForNewData(){
    $.post('url/ajax.php', {
        action: 'pollNewData',
        term: 'your_term'
    }, function(response){
        if (response.status == 200){
            $.each(response.results, function(index, value){
                // append whichever field of each row you want to display
                $("#container").append(value.nodeName);
            });
        }
    }, 'json');
}
What is essentially going on here is that you post asynchronously with jQuery's ajax method. The way you trigger a function in your PHP is by including a key-value item in your POST indicating which function you want to call. I called this item "action", and its value is the name of the function that will be called for this specific event.
You then return the data fetched by your back end by echoing a json_encoded data set.
In the JavaScript, you post to this PHP function every 10 seconds. The callback that runs after the post completes is the function(response) part, with the echoed data passed in as response. You can then treat this response as a JSON object (since, after the callback, we declared the return type to be 'json').
Pretty much the only way you can do it is with some sort of asynchronous JavaScript. The easiest way is to have JavaScript periodically poll another HTTP resource for that information and replace the current content in the DOM with the new content. jQuery and other JavaScript frameworks provide AJAX wrappers to make this process reasonably simple. You aren't limited to using XML in the request.
It's a good idea to make sure that some content is available even without JavaScript enabled, and just use the JavaScript to update it without having to refresh the page.
You can do this kind of ticker using AJAX: you poll a URL that returns JSON/XML containing the new updates, and once you get the data, you update the DOM.
You can refer to this page for an introduction to AJAX.
AJAX is the best method, but everyone else has already mentioned that. I wanted to add that although I agree AJAX is the best method, there are other means too, such as Flash.
There are two approaches possible for obtaining updates like this. The first is called push and the second is called pull.
With push updates, you rely on the server to tell the client when new information is available. In your example, a push update would come in the form of Facebook telling your site that something new happened. In general, push schemes will tend to be more bandwidth friendly because you only transmit information when something needs to be said (Facebook doesn't contact your site if nothing is going on). Unfortunately, push updating requires the server to be specially configured to support it. In other words, unless the service you are asking for updates from (ex. Facebook) has a push update service available to you, you cannot implement one yourself. That's where pull techniques come in.
With pull updating, the client requests new information from the server when it deems necessary. It is typically implemented using polling such that the client will regularly and periodically query the server to see what the latest information is. Depending on how the service is oriented, you may have to do client-side parsing to determine if what you receive as a response is actually new. The downside here is of course that you likely will consume unnecessary amounts of bandwidth for your polling requests where no useful information is obtained. Similarly, you may not be as up-to-date with new information as you would like to be, depending on your polling interval.
In the end, since your site is web-based and is interfacing with Facebook, you will likely want to use some sort of AJAX system.
There are a few hybrid-ish approaches, such as long polling via Comet, which are outlined rather well on Wikipedia:
Push technology: http://en.wikipedia.org/wiki/Push_technology
Pull technology: http://en.wikipedia.org/wiki/Pull_technology
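For reference, a bare-bones client-side sketch of the long-polling (Comet-style) hybrid mentioned above (the /updates URL and the JSON shape are assumptions; the server would need to hold each request open until it has something new to say):
function poll() {
    $.ajax({
        url: '/updates',        // hypothetical endpoint that blocks until new data exists
        dataType: 'json',
        timeout: 30000,         // give up after 30 seconds and reconnect
        success: function(update) {
            $('#ticker').prepend('<li>' + update.text + '</li>');
        },
        complete: function() {
            poll();             // immediately open the next long-poll request
        }
    });
}
poll();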
I've always wondered how to choose between using server-side code and client-side code to build HTML pages. I'll use a very simple PHP vs JavaScript/jQuery example to further explain my question. Your advice and comments are very much appreciated.
Say I'm about to present a web page where the user selects a type of report. Which makes more sense?
For server-side creation, I'd do this:
<div id="reportChoices">
<?php
// filename: reportScreen.php
// just for the sake of simplicity, say a database returns the following rows
// that indicates the type of reports that are available:
$results = array(
array("htmlID"=>"battingaverage", "htmlLabel"=>"Batting AVG report"),
array("htmlID"=>"homeruntotals", "htmlLabel"=>"Home Run Totals report"),
);
foreach ($results AS $data)
echo "<input type='radio' name='reportType' value='{$data['htmlID']}'/>{$data['htmlLabel']}";
?>
</div>
Using client-side code, I'd get the JavaScript to build the page like the following:
<!-- filename: reportScreen.html -->
<div id="reportChoices">
</div>
<!-- I could put this in the document.ready handler, of course -->
<script type="text/javascript">
$.getJSON("rt.php", {}, function(data) {
    var mainDiv = $("#reportChoices");
    $.each(data, function(idx, jsonData) {
        var newInput = $(document.createElement('input'));
        newInput
            .attr("type", "radio")
            .attr("name", "reportType")
            .attr("value", jsonData["htmlID"]);
        mainDiv.append(newInput).append(jsonData["htmlLabel"]);
    });
});
</script>
All I would need on the server is a data dump php script such as:
<?php
// filename: rt.php
// again, let's assume something like this was returned from the db regarding available report types
$results = array(
array("htmlID"=>"battingaverage", "htmlLabel"=>"Batting AVG report"),
array("htmlID"=>"homeruntotals", "htmlLabel"=>"Home Run Totals report"),
);
echo json_encode($results);
?>
This is a very simple example, but from this, I see pros and cons in different areas.
1 - The server-side solution has the advantage of being able to hide most of the actual programming logic behind how everything is built. When the user looks at the page source, all they see is the already-built web page. In other words, the client-side solution gives away all your source code and programming logic on how certain things are built. But you could use a minifier to make your source look more cryptic.
2 - The client-side solution transfers the "resource load" onto the client system (i.e. the browser needs to use the client's computer resources to build most of the page) whereas the server side solution bogs down, well, the server.
3 - The client-side solution is probably more elegant when it comes to maintainability and readability. But then again, I could have used php libraries that modularize HTML controls and make it a lot more readable.
Any comments? Thanks in advance.
Con (client solution): The client-side solution relies on the client to execute your code properly. As you have no control over what client system will execute your code, it's much harder to ensure it will consistently give the same results as the server-side solution.
This particular problem doesn't really seem to need a client-side solution, does it? I'd stick with the server-side solution. The only extra work there is a foreach loop with one echo and that's not really so resource heavy is it (unless you've profiled it and know that it IS)? And the resulting code is all in one place and simpler.
I'm sceptical that moving the report generation on to the client side really saves any resources - remember that it's still doing an HTTP request back to your (?) server, so the database processing still gets done.
Also, giving away your database schema on the client side could be a recipe for database attacks.
Perhaps you should use a model-view-controller pattern to separate the business logic from the presentation on the server? At least this keeps all the code in one place but still lets you logically separate the components. Look at something like Zend Framework if this sounds useful to you.
Typically, it's best not to depend on JavaScript being enabled on the client. In addition, content generated client-side will not be crawled by most search engines. You also expose information about your server/server-side code (unless you explicitly abstract it).
If you want to transform data into the view, you might want to take a look at XSLT. Another thing to read up on if you have not already, is progressive enhancement.
http://alistapart.com/articles/understandingprogressiveenhancement/
The client-side solution you presented is actually less efficient because there's an extra HTTP request, and the data-dump script is possibly not very efficient either, in that all the data must be processed with json_encode.
However, if what you're working on is a rich web application that depends on Javascript, I see no problem with doing everything with Javascript if you want to.
You can maintain a better separation of concerns by building it on the client side, but that can come at a cost of user experience if there is a lot to load (plus you have to consider what FrustratedWithForms mentioned). To me it's easier to build it on the server side, which means that becomes a more desirable option if you are on a strict timeline, but decide based on your skill set.
I am a little bit new to the PHP/MySQL arena, and had an idea: interact with my database by using a hidden iframe to run PHP pages in the background on events, without having to leave the current page.
Good? Bad? Common practice? Opinions?
This is most of the time bad, but sometimes inevitable.
The common practice is to use AJAX; it's so common that even W3Schools has an article about it.
The advantage of using AJAX over an iframe is that AJAX requests are asynchronous: you can send several requests in a row, which is more troublesome to implement with iframes. Moreover, AJAX exposes HTTP status codes so you can detect errors, whereas with iframes you'd have to rely on scraping the page's HTML and hoping you've determined the correct status from the error page's markup.
AJAX is more idiomatic, event-driven JavaScript, which means your callback is notified automatically when there is a response. With an iframe you'd have to setTimeout() and keep polling the iframe for a response, which may break easily.
An iframe is sometimes inevitable, for example when you want to upload a file without leaving the current page. But that's probably not your case since you mentioned only database interactions.
Learn to use XMLHttpRequest, which is the foundation of AJAX. After you've become familiar with that, try making it fun by using a JavaScript framework such as jQuery, Dojo, etc.
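For example, a bare-bones XMLHttpRequest sketch to start from (save_data.php is a placeholder name for whatever script does your database work):
var xhr = new XMLHttpRequest();
xhr.open('POST', 'save_data.php', true);   // true = asynchronous
xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
xhr.onreadystatechange = function() {
    if (xhr.readyState === 4) {            // request finished
        if (xhr.status === 200) {
            alert('Server says: ' + xhr.responseText);
        } else {
            alert('Request failed: ' + xhr.status);
        }
    }
};
xhr.send('var1=data&var2=moredata');       // form-encoded request body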
I'd guess something is supposed to happen when your database does something, right? I.e. your page should give some sort of feedback, maybe update a number or some text.
So you're going to use Javascript anyway. In that case, skip the iframe and just send off AJAX requests.
This is commonly accomplished using AJAX. The jQuery JavaScript library makes this easy.
I don't think using iframes is a good way to accomplish this. You would still need javascript enabled to change the location of the iframe, and if javascript is available, why not just use AJAX?
If you use the iframe, you won't be able to receive a response from the server in any meaningful way without a lot of workarounds. For example, using jQuery you can submit some information to the server with a single function call, and when that request completes, a callback function is invoked with response information from the server:
$.post("ajax.php", { var1: "data", var2: "moredata" },
function(data){
alert("Server says: " + data);
});
In this example, when the request completes, an alert box appears with the output of ajax.php.
With an iframe, you might do something like change the location of the iframe to server.com/iframe.php?var=data&var2=moredata&var3=moredata, then wait for a bit, and grab the contents of the iframe (if this is even possible) and do something with that.
Not to mention, when you run into problems doing this, you'll probably ask for advice on SO, and each time people will probably say "drop that and use jQuery!" :) You may as well skip all the pain and suffering and do it the Right Way to begin with.
The hidden-iframe method was used before the adoption of the XMLHttpRequest API (maybe you have heard of it as Ajax).
Years ago I used an early implementation based on rslite, but nowadays this technique has, to me, only historical value.
You can find directions on using Ajax techniques in plain JavaScript at http://www.xul.fr/en-xml-ajax.html or, better, you can choose a common library, jQuery or MooTools among others, to deal with the different implementations in different browsers.