JavaScript-based graphing / charting toolkits vs PHP-based ones - php

I'm contemplating using http://pchart.sourceforge.net/ for our graphing / charting requirements, but another developer suggested using a JavaScript/jQuery-based toolkit like http://dojotoolkit.org/
While the look and feel of the two differ, and a JavaScript-based chart is perhaps more easily manipulated, I'm not convinced it would be the faster solution.
Wouldn't a PHP-based toolkit always be faster for the end user, with less data going back and forth between our server and the client machine?
Our charting requirements are for reporting purposes - we don't require users to manipulate the charts 'live' at all.

Leaving presentation aside, the two approaches each have their pros and cons, and one approach's pro is generally the other's con.
The PHP approach:
arguably more consistent results across clients.
the resulting chart(s) can be saved once the data is entered, which avoids running the same rendering process multiple times and gives better overall performance (see the sketch below).
The JavaScript approach:
less computing power needed on the server.
the resulting chart(s) can be generated dynamically, delivering a more interactive user experience.
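A minimal sketch of the server-side caching idea: the chart image is rendered once per report and reused on later requests. The renderChartPng() helper is hypothetical and stands in for whatever pChart (or other library) rendering code you use.

<?php
// chart.php?report=42 : serves the chart image, rendering it only once per report.

$reportId  = isset($_GET['report']) ? (int) $_GET['report'] : 0;
$cacheFile = dirname(__FILE__) . "/cache/chart-$reportId.png";

// Hypothetical helper: fetches the report data, draws the chart with your
// charting library (pChart or similar) and returns the raw PNG bytes.
function renderChartPng($reportId)
{
    // ... build the chart here ...
    return '';
}

// The report data is static once entered, so re-render only when no cached copy exists.
if (!is_file($cacheFile)) {
    file_put_contents($cacheFile, renderChartPng($reportId));
}

header('Content-Type: image/png');
readfile($cacheFile);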

I believe that if the view is built on the client side, it means less processing on the server, so client-side graphing/charting (using JavaScript) would be better.

With the requirements you have stated, the PHP approach seems better. As you already mentioned, if your chart is static and you aren't sending the data over to the client, it doesn't make sense to render the chart in JS. Your PHP server will outperform browser rendering any day, since otherwise you depend on clients running recent browsers and having adequate memory to do the rendering.
Why is your teammate thinking of the JS approach? Does he have a reason for it?

I recently found jqPlot and used it in a corporate project, to my boss's delight. A PHP solution will not be as dynamic as a JavaScript one. Moreover, PHP graphing approaches are sometimes bulky and require more code.
Once the JavaScript code is downloaded onto the client's machine it is cached for further requests, so only the graph's initialization code (a few bytes plus the series data) is downloaded. This results in fewer bytes being sent and lower bandwidth consumption.
As for the report data, a caching mechanism is probably the best solution, since you can reuse the same data for different views (downloadable as CSV, etc.); a sketch of this follows below.
All in all, I personally prefer to keep server-side processing for the data and client-side processing for the view.
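A minimal sketch of that split, assuming a hypothetical fetchReportData() helper: one PHP endpoint caches the report data and serves it either as JSON for the client-side chart or as CSV for download.

<?php
// report-data.php?id=42&format=json|csv

// Hypothetical stand-in for the real database query.
function fetchReportData($id)
{
    return array(
        array('2011-09-01', 10),
        array('2011-09-02', 14),
    );
}

$id     = isset($_GET['id']) ? (int) $_GET['id'] : 0;
$format = isset($_GET['format']) ? $_GET['format'] : 'json';
$cache  = dirname(__FILE__) . "/cache/report-$id.json";

// Cache the report data so every view (chart, CSV download) reuses it.
if (is_file($cache)) {
    $rows = json_decode(file_get_contents($cache), true);
} else {
    $rows = fetchReportData($id);
    file_put_contents($cache, json_encode($rows));
}

if ($format === 'csv') {
    header('Content-Type: text/csv');
    $out = fopen('php://output', 'w');
    foreach ($rows as $row) {
        fputcsv($out, $row);
    }
    fclose($out);
} else {
    header('Content-Type: application/json');
    echo json_encode($rows);  // the jqPlot initialization code reads this as its series data
}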

Related

How to avoid duplicating a formula in both JavaScript and PHP?

I'm currently writing a web application in HTML5/JavaScript and PHP.
Some employees of the company will need to use it to enter their work schedule. The application will calculate some complicated legal information in real time (using JavaScript) and display the result at the bottom of the page. When they're done, the user clicks the "save" button and everything is sent to the database.
The problem is that I need both the user to see the output in real time and the managers to get the same output from what was saved in the database. I also need to support the case where the user has JavaScript disabled. In other words, I need to perform the same calculation in both JavaScript and PHP.
To make things more complicated, the formula is very complex and will change often (every month or so), so I'd like to avoid maintaining two different versions. Testing one will already be hard enough.
I'd also like to avoid asking the server for the output via AJAX, because:
the user will often build their schedule according to the result of the real-time calculation, so even a 1-2 second lag could be really annoying
if possible I have to support HTML5's offline features, so the user can load the app on a mobile phone, fill in the schedule while offline, and then upload it when back online
For the moment the only solution I've found is to write the formula in a language-agnostic way and then use some means of turning it into PHP code and JavaScript code, but that's not simple.
It sounds like you have essentially two approaches:
You can write a language-independent formula (in a database or config file, since it changes so frequently) and two code generators (JavaScript, PHP). This is actually less scary than it sounds, as long as you pick your input format sanely and you actually mean a formula, not some arbitrary computation (a sketch of this follows below).
You can use server-side JavaScript and write the formula once in JavaScript. Wikipedia has a comparison of server-side JavaScript implementations. There are even JavaScript interpreters written in PHP, including at least phpjs and J4P5.
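Option 1 is less work than it sounds if the formula really is an arithmetic expression over named fields: PHP and JavaScript share the same operator syntax, so the generator only has to translate field references. A minimal sketch; the field names, output file names, and the computeResult function are illustrative assumptions:

<?php
// Hypothetical generator: the formula is an arithmetic expression over named
// fields, stored in a config file or database, and is turned into one PHP and
// one JavaScript function so the logic is written exactly once.
function generateFunctions($formula, array $fields)
{
    $phpExpr = $formula;
    $jsExpr  = $formula;
    foreach ($fields as $field) {
        $phpExpr = preg_replace('/\b' . $field . '\b/', "\$data['$field']", $phpExpr);
        $jsExpr  = preg_replace('/\b' . $field . '\b/', "data.$field", $jsExpr);
    }
    return array(
        'php' => "function computeResult(array \$data) { return $phpExpr; }",
        'js'  => "function computeResult(data) { return $jsExpr; }",
    );
}

// Example formula (illustrative): it changes every month and lives outside the code.
$code = generateFunctions(
    'hours * rate + overtime * rate * 1.5',
    array('hours', 'rate', 'overtime')
);
file_put_contents('compute.php', "<?php\n" . $code['php'] . "\n");
file_put_contents('compute.js',  $code['js'] . "\n");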
Persisting data with language-agnostic storage techniques and interpretive conversions is not the first thing that comes to mind when the term "real-time" is mentioned. It is very possible, but usually much more challenging and time-consuming to design infinitely scalable and easily maintainable software.
Personally, I weigh the cost versus the benefit of every known solution and decide on an approach that, first and foremost, benefits the end user and meets or exceeds the key deliverables. Once I've made a decision that satisfies the major stakeholders' expectations, it is then my job to design and develop the most scalable and maintainable solution possible within the confines of those expectations.
Based on the details you've provided, I would absolutely write the equations in native JavaScript and PHP for optimal performance. If the software specifications literally (not figuratively) require real-time communication, I would provide it by implementing WebSockets; otherwise long polling would suffice.
Please note: not all web browsers support WebSockets, which may require end users to use a supported browser. You can avoid this conundrum by using something like Socket.IO, which provides a seamless client-side fallback.
This morning I got the idea to try XSLT for the calculation. This is really interesting, because it is supported by both JavaScript and PHP, it's reliable (unlike a code generator I would have to write myself), and it's appropriate because the input data is already in XML.
Example input data :
<schedule>
<element start="2011-09-19 08:00:00" end="2011-09-19 17:00:00" type="work" />
</schedule>
Example XSLT (written from memory, may not work):
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:template match="/">
<output>
<out name="totalWorkHour"><xsl:value-of select="sum(/schedule/element[@type='work']/(@end - @start))" /></out>
</output>
</xsl:template>
</xsl:stylesheet>
I'll try to code everything with XSLT, and I'll keep this solution if I manage to do that.
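For reference, running the same stylesheet server-side in PHP needs only the standard xsl extension; the file names below are illustrative, and browsers expose a similar XSLTProcessor object to JavaScript for the client-side run.

<?php
// Server-side run of the shared stylesheet (requires PHP's xsl extension).
$xml = new DOMDocument();
$xml->load('schedule.xml');        // the same schedule data the client transforms

$xsl = new DOMDocument();
$xsl->load('schedule.xsl');        // the shared stylesheet

$proc = new XSLTProcessor();
$proc->importStylesheet($xsl);

echo $proc->transformToXML($xml);  // e.g. <output><out name="totalWorkHour">...</out></output>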

How do I process large amounts of HTML data in an AJAX-heavy application?

Should I have my server return JSON data and have the JavaScript parse it to create/render the HTML, or should I have my server-side code return HTML directly, which the JavaScript can simply insert?
Thoughts?
Render the markup server-side (e.g. as is done in Rails' AJAX), then return the view to the client, where it will simply be inserted.
Then profile your code. If it turns out to be too slow, return JSON and think of a way to render it client-side.
Your priority should be to not make the whole thing too complicated.
I'm not a fan of returning generated HTML. IMHO I'd return JSON and use something like JQOTE 2 to handle the rendering. Let the client's resources handle that work.
(Side note: JQOTE is an amazing tool!)
I think that if you won't need the data later, e.g. for filtering, on-the-fly search, etc., then you should return HTML.
Premature optimization is the root of all evil. Start with whatever is easier. If it's just too slow, find a way to optimize (perhaps by using an alternative).
If one is not easier than the other for you, go with the server side. I can't imagine a circumstance where a server-side scripting language operation would be slower than JavaScript in a browser.
If all you have to do is render HTML, then it's probably much easier to do it directly on the server (PHP). Otherwise, you have to convert the data to JSON with PHP first and then convert it back with JS later. That's at least one extra step and extra work on the JavaScript side.
I'll vote for your first proposed approach.
JSON-serialized data is smaller than the equivalent (X)HTML, and your first approach saves a lot of CPU cycles, network traffic and memory and speeds up the client, which results in a responsive user interface.
Just send the data in JSON format and parse it with JavaScript on the client side; things will be simpler on the server, and rendering will be delegated to the client's web browser.
There is no one right answer; it depends on your expectations.
If you want the application to be accessible (i.e. processed by a screen reader), picked up by search engine bots, or want the UI to be cacheable between requests and users, you will have to use server-generated HTML and no dynamic loading. If you use a cache for the generated HTML, you get a lot of mileage without the constant re-rendering. There are more server-side tools than client-side ones, but that is becoming less true as JS grows.
OTOH, producing JSON that is rendered by the client using some JS library can really reduce your server's load. You're distributing the work of rendering to the client, but that does take control out of your hands. If you have a JS-heavy solution and the client can't process JS (screen readers, search engine bots, etc.), then the site should degrade gracefully, or you should expect some of your audience not to be able to view it. That audience might be minuscule for you, but it's something to know. As long as you manage the rendering well (set minimum sizes for areas, wait icons, etc.), client-side rendering can be as smooth as server-side rendering (comparing visual rendering steps). Producing JSON also gives you more flexibility as more interfaces are defined or other non-UI clients become important.
It depends on what you are trying to achieve. If you are writing a mobile application you may want to save bandwidth and work with client-side templates (just as an example: John Resig's micro templates). If bandwidth is not that important to you, I would just use server-side templates to generate the HTML you need.
In my opinion it's all about responsiveness. Your server is ALWAYS going to be able to process data faster than the UA, however the difference between the two may be negligible. If that's the case, then I'd recommend passing JSON to the UA and then use client-side code to do the dirty work. That way, you'll have clear separation of concerns between the server and the client, allowing you to deliver JSON data to different client endpoints in the future without having to modify your server-side code.
However, if there is a significant performance hit with doing the data processing on the client side then it might make more sense to deliver HTML directly to the client. HOWEVER I highly recommend that you still deliver JSON, only to your server-side HTML creation function (rather than the UA) so that you can still deliver JSON data to multiple endpoints without having to alter core code in the future.
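A minimal sketch of that layering, with hypothetical loadItems() and renderHtml() helpers: the same data routine feeds both the JSON endpoint and the server-side HTML creation function, so switching between the two later only touches the last few lines.

<?php
// items.php?format=json|html : one data routine, two output formats.

// Hypothetical stand-in for the real query.
function loadItems()
{
    return array(
        array('id' => 1, 'label' => 'First item'),
        array('id' => 2, 'label' => 'Second item'),
    );
}

// Server-side HTML creation function; the data itself stays JSON-shaped.
function renderHtml(array $items)
{
    $html = '<ul>';
    foreach ($items as $item) {
        $html .= '<li data-id="' . (int) $item['id'] . '">'
               . htmlspecialchars($item['label']) . '</li>';
    }
    return $html . '</ul>';
}

$items = loadItems();

if (isset($_GET['format']) && $_GET['format'] === 'html') {
    header('Content-Type: text/html; charset=utf-8');
    echo renderHtml($items);   // the client just inserts the fragment
} else {
    header('Content-Type: application/json');
    echo json_encode($items);  // the client renders it with a JS template
}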

Is it better to generate html for an ajax function in the JS handler or in the PHP ajax function?

I'm designing some UIs for a product and we want to use jQuery and PHP. The content to generate is a list of checkboxes (10-100) that a user will be modifying (removing multiple at a time and changing the entries). I thought I'd try something new and ask StackOverflow (read: you) what is preferred: generate the HTML in the PHP call and return it, or return JSON data that jQuery can then use to generate the HTML checkboxes.
I appreciate the feedback! My preferred method so far is to let PHP generate the HTML, because it knows more about the data at the time of modification (it's interacting with the database and could build the HTML easily enough without having to pass back IDs, names, etc. in JSON).
Thanks!
[Edit] Bandwidth is not a constraint. This is an internal intranet application. The information displayed to the user will not require DOM modification after the fact (outside of checkboxes, but that's built into the browser...). Some good points have been made about the amount of data being passed back, though:
passing back
Label
vs.
{ "Label": "Unique_ID" }
obviously involves a lot of redundancy.
There's really no right/wrong way to do this. Passing JSON back and then using client-side processing to turn it into HTML uses less bandwidth but more local processing power. Passing HTML back uses more bandwidth and less local processing (these are fairly minor points; only for extremely popular or frequently changing sites might they even be relevant).
Return Flexibility - HTML
One of the benefits of passing HTML is that you can return anything: if the request causes an error, or could produce different types of data, you just return different HTML. If you're returning JSON, the parsing script has to deal with these alternate structures (i.e. error handling and/or multiple structure-parsing paths).
Local Processing - JSON
If you're localizing, sorting, or framing the data from the user's point of view, it may well be simpler to return JSON and then use client side scripts to interpret. For example when user=2, reporting "You" instead of "Mike" might be a nice personalization touch. You could do this server side, but now the script needs to take that into account, so the same query needs to return different data based on context (again not impossible). You can keep your server code more generic by using client side scripts to perform this.
Local Presenting - JSON
Perhaps a single request collects the data, but there are multiple parts of the page that should be updated with what's returned. With an HTML approach you either need separate requests, or some sort of delimiter in your response (with escapes!) and a local processing script to decide what goes where; with a JSON approach, the local processing script can update all the locations from the same single response (a tiny sketch follows below).
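A tiny sketch of that, with hypothetical section names: one PHP response carries every part of the page that needs updating, and the client picks out the pieces it wants, with no delimiters or escaping needed.

<?php
// dashboard-update.php : one response, several independently updatable page areas.
header('Content-Type: application/json');

echo json_encode(array(
    'summary'       => array('total' => 1234, 'open' => 56),  // updates the header widget
    'rows'          => array(                                 // updates the main table
        array('id' => 1, 'label' => 'First'),
        array('id' => 2, 'label' => 'Second'),
    ),
    'notifications' => array('3 items need review'),          // updates the sidebar
));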
You could approach the question both from the aspect of server burden and in terms of client performance.
If your server is having to dynamically generate the HTML output for every user, it will endure a somewhat higher burden than if you delegated the content-generation to client-side JavaScript. Clients have abundant computing power at their disposal, so feel free to have them collectively shoulder the burden rather than having your server do all the work (which could easily add up, depending on how busy your server is).
Likewise, generating the HTML markup on the server results in a significantly larger page download for the client. The markup for a hundred checkboxes could easily add kilobytes to the size of the page, while the data itself (which is all you would send using the JSON approach) is much smaller. Of course, larger page downloads mean longer download times for the client. We as web developers often forget that there are still quite a few people on dial-up internet connections.
For these reasons, I would personally opt for sending the data via JSON and doing DOM-modification via JavaScript.
Cheers,
Josh
The answer is: it depends. If you are going to be doing DOM manipulation on the new data, then you pretty much have to append the elements using jQuery. If no such manipulation is needed, then you can just print the markup with PHP and insert the blob.
I think the latter is much easier and simpler, so if you don't need to do DOM manipulation on the elements, just insert the HTML blob from PHP (see the sketch below).
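A minimal sketch of the PHP-generates-HTML option for the checkbox list, with a hypothetical getEntries() standing in for the database query; the commented-out last line shows how little changes if you later switch to returning JSON instead.

<?php
// checkboxes.php : returns the checkbox list as an HTML fragment for the AJAX call.

// Hypothetical stand-in for the real database query.
function getEntries()
{
    return array(
        array('id' => 101, 'label' => 'Entry one'),
        array('id' => 102, 'label' => 'Entry two'),
    );
}

$html = '';
foreach (getEntries() as $entry) {
    $id    = (int) $entry['id'];
    $label = htmlspecialchars($entry['label']);
    $html .= '<label><input type="checkbox" name="entries[]" value="' . $id . '"> '
           . $label . '</label><br>';
}

echo $html;                        // jQuery side: $('#list').html(response);
// echo json_encode(getEntries()); // the JSON alternative: the client builds the checkboxes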

Client-side or server-side processing?

So, I'm new to dynamic web design (my sites have been mostly static with some PHP), and I'm trying to learn the latest technologies in web development (which seems to be AJAX). I was wondering: if you're transferring a lot of data, is it better to construct the page on the server and "push" it to the user, or is it better to "pull" the data needed and create the HTML around it on the client side using JavaScript?
More specifically, I'm using CodeIgniter as my PHP framework and jQuery for JavaScript, and if I wanted to display a table of data to the user (dynamically), would it be better to format the HTML using CodeIgniter (create the tables, add CSS classes to elements, etc.), or would it be better to just serve the raw data as JSON and then build it into a table with jQuery? My intuition says to do it client-side, as it would save bandwidth and the page would probably load quicker with the new JavaScript optimizations all these browsers have now; however, the site would then break for anyone not using JavaScript...
Thanks for the help
Congratulations for moving to dynamic sites! I would say the following conditions have to be met for you to do client-side layout (it goes without saying that you should always be doing things like filtering DB queries and controlling access rights server side):
Client browser and connection capabilities are up to snuff for the vast majority of use cases
SEO and mobile/legacy browser degradation are not a big concern (much easier when you synthesize HTML server side)
Even then, doing client-side layout makes testing a lot harder. It also produces rather troublesome synchronization issues. With an AJAX site that loads partials, if part of the page screws up, you might never know, but with regular server-side composition, the entire page is reloaded on every request. It also adds additional challenges to error/timeout handling, session/cookie handling, caching, and navigation (browser back/forward).
Finally, it's a bit harder to produce perma-URLs in case someone wants to share a link with their friends or bookmark a link for themselves. I go over a workaround in my blog post here, or you can have a prominent "permalink" button that displays a dynamically rendered permalink.
Overall, especially when starting out, I would say go with the more kosher, better supported, more tutorialed, traditional approach of putting together the HTML server side. Then dip in some AJAX here and there (maybe start out with form validation or auto-completion), and then move on up.
Good luck!
It is much better to do the heavy lifting on the server side.
In CodeIgniter you create a view, looping through all the rows in the table and adding the classes or whatever else you need (a sketch follows below). There is no reason at all to do this in JavaScript.
JavaScript is a sickly abused language with unfortunate syntax. Why on earth you would want to load a page and then issue an AJAX call to load up some JSON objects to push into a table is beyond me. There is little reason to do that.
JavaScript (and jQuery) is for end-user enhancement. Make things slide, flash, disappear! It is not for data processing under even the mildest of loads. The end-user experience would be crap because you're relying on their machine to process all the data when you have a server that is far more capable and designed specifically for this.
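A CodeIgniter-flavoured sketch of that pattern, shown as a controller excerpt plus a view file; the model call, field names, and CSS classes are illustrative assumptions, not part of CodeIgniter itself.

// Controller method, e.g. in application/controllers/reports.php
public function table()
{
    // Hypothetical model call; fetch the rows however you normally would.
    $data['rows'] = $this->report_model->get_rows();
    $this->load->view('report_table', $data);
}

<!-- View: application/views/report_table.php -->
<table class="report">
<?php foreach ($rows as $row): ?>
    <tr class="<?php echo $row->overdue ? 'overdue' : 'ok'; ?>">
        <td><?php echo htmlspecialchars($row->name); ?></td>
        <td><?php echo (int) $row->amount; ?></td>
    </tr>
<?php endforeach; ?>
</table>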
It depends on your target market and the goal of your site.
I strongly believe in using the client side wherever you can to offload work from the server. Obviously it's important to do it correctly so it remains fast for the end user.
On sites where no-JS support is important (public websites, etc.), you can fall back to the server. You end up duplicating code in these situations, but the gains are worth it.
For advanced web applications, you can decide whether making JS a requirement is worth the trade-off of losing a (very) few users. For me, if I have some control over the target market, I make it a requirement and move on. It almost never makes sense to spend a ton of time supporting a small percentage of the potential audience (unless the time is spent on accessibility, which is different, and VERY important regardless of how many people fall into this group on your site).
The important thing to remember is to touch the DOM as little as possible to get the job done. This often means building up an HTML string and adding it to the page with a single append, versus looping through a large table and adding one row at a time.
It's better to do as much as possible on the server-side because 1) you don't know if the client will even have JavaScript enabled and 2) you don't know how fast the client-side processing will be. If they have a slow computer and you make them process the entire site, they're going to get pretty ticked off. JavaScript/jQuery is only supposed to be used to enhance your site, not process it.
You've got the trade-off right. However, keep in mind that you can enable compression on the server side, which will probably make the repetitive markup used to format the table a small bandwidth cost.
Also keep in mind that writing JavaScript that works in all browsers (including handhelds) is more complicated than doing the same server-side in PHP. And don't forget that the "new JavaScript optimizations" do not apply to the same extent to the browsers of handheld devices.
I do a great deal of AJAX app development, and I can tell you this from experience: a good balance between the two is key.
Produce the raw data server-side, but use JavaScript for any modifications you need to make to it, such as paging, column sorting, row striping, etc.
I absolutely love doing everything with AJAX, but there is one shortfall of doing it that way, and that's SEO. Search engines do not read JavaScript, so for the sake of your website's page rank I would have all data served up server-side and then formatted and made to look good client-side.
The reason I love AJAX so much is that it drastically speeds up the application for the user, as it only loads the data you need, where you need it, versus reloading the entire page every time you do something. You can do a whole bunch of things, such as hiding/showing rows and columns (we are talking about table manipulation here because you mentioned a table), and even attach delete actions so that clicking a delete row or button removes that row not only visually but also in the database, all via AJAX calls to server-side code.
In short:
Raw data: server-side, sending the client the raw data in HTML layout (tables for table-structured data; I do everything else in divs and other flexible HTML tags and only use tables for column/row-style data).
Data formatting: client-side, which also includes any means of interacting with the data: adding to it, deleting from it, sorting it differently, etc. This achieves two things: SEO and user experience (UX).

Performance considerations of JSON vs. XML

I am using a webservice which provides a large result set either in XML or JSON format.
Which format will be faster or better performance-wise? Also, which language should I use to parse the XML/JSON? Should I use PHP or JavaScript?
"PHP or JavaScript" sounds like an odd choice to offer: PHP is usually used as a server-side language, whereas JavaScript is usually used as a client-side language.
What's your situation? What makes you suggest those two languages in particular? If you could give more information about what you're trying to do, that would help a lot. (We don't know whether you're developing a web app, a batch processing tool, a GUI application, etc.)
I suspect JSON will be a bit more compact than XML, although if the data is compressed you may well find they end up taking the same bandwidth (as a lot of the "bloat" of XML is easily compressible).
As ever, the best way to find out is to test the specific web service with some realistic data. Generalities aren't a good basis for decision-making.
Both have their advantages:
JSON
easy to handle: $dataStructure = json_decode($serializedString); done.
XML
partial data handling: if your result set is too big to be parsed all at once, this may be the way to go. Note: SimpleXML is the easiest XML library to work with, but it also parses the whole XML file at once, so in that case there's no benefit over JSON (a streaming sketch follows below).
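A minimal sketch of the difference on the PHP side, with illustrative file and element names: json_decode loads the whole payload at once, while XMLReader streams through a large XML result set one element at a time.

<?php
// JSON: the whole payload is decoded in one go.
$rows = json_decode(file_get_contents('resultset.json'), true);

// XML: XMLReader walks the document node by node, so a huge result set never
// has to fit into memory at once. The 'row' element name is illustrative.
$reader = new XMLReader();
$reader->open('resultset.xml');

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'row') {
        $row = new SimpleXMLElement($reader->readOuterXml());
        // ... process one <row> at a time ...
    }
}
$reader->close();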
The question of which language to handle your result set with is a bit nonsensical: JavaScript is client-side*, PHP is server-side, so it depends on what you want to do with the result set.
You can pass the result straight on to the browser/JS without doing anything on the server side and let the client do the filtering and rendering. This may make sense in certain situations, but normally it's not what you want.
My advice: if possible, use JSON.
*: you can use JavaScript on the server side (Rhino, v8cgi, ...), but that's not what you have in mind.
I would go for JSON; you're not paying the "angle bracket tax". The choice between PHP and JavaScript comes down to the amount of processing required on the data (I'm taking a leap here).
Lots of processing: use PHP, so it happens server-side. Little processing: use JavaScript for a more responsive page (load the data via AJAX).
Although performance really varies a lot between language/tool combinations, in the end XML and JSON tend to have similar performance characteristics when the best tools of the platform are used. That is, you won't find one twice as fast (or more) than the other; the theoretical limits are similar for textual formats. Both are "good enough" in this regard for almost any use case.
So I would focus more on tool support, for the task you have. Other than that, format choice is unlikely to be the most important aspect to consider.
And as Jon mentioned, comparing PHP and JavaScript really sounds odd... apples and oranges.
One thing that has perhaps been missed is that a JavaScript client does not have to parse JSON, so you get a performance win there. The browser will parse the XML into a DOM and hand it to your callback; you then need to extract the info you need from that DOM. With JSON you get back an object directly, and the DOM-extraction step is gone.
I think you'll have to measure it yourself. The performance will depend on:
the size of the data
the complexity of the data
its format (JSON or XML)
So you can see there are a number of variables.
Does the web service that you're using take longer to assemble the data in one format vs. another?
There are a sizable number of options for parsing JSON and XML, so don't restrict yourself to PHP or JavaScript (if you have a choice). And finally, if you're requesting from a web service, you'll have the overhead of network transport costs, connection setup, etc., so any time savings in parsing performance may be negligible. Your efforts may be better spent elsewhere.
I don't think I've answered your question, other than give you more things to think about!
You should use Python, preferably :)
As for format, I guess JSON would be a bit faster, but it depends on the details, and this task would be network-bound anyway.
If your application uses AJAX, then you should choose JavaScript to process the data on the client side, which reduces the PHP work on the server and makes the server side more efficient. You can store your result set in a JSON file and then fetch it on the client side with JavaScript; this is arguably the best way, because it doesn't consume resources on the server and the data is processed on the client side. Here I would prefer JSON over XML, because XML takes more space to store due to its tag syntax, whereas JSON maps directly onto JavaScript arrays and objects, which makes it faster to work with. The same goes for the server side: you can easily convert a JSON string into a PHP array (see the json_decode function in PHP; a sketch follows below).
Nowadays JSON is in fashion because it is easy to use and fast.
For better performance you should reduce the data processing on the server side and use client-side resources instead; this approach will make your application faster and more cost-effective.
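A minimal sketch of that pattern, with a hypothetical fetchResultSet() standing in for the web-service call: the result set is stored once as JSON, served to the client as-is, and decoded back into a PHP array whenever the server needs it.

<?php
// Hypothetical stand-in for the real web-service request.
function fetchResultSet()
{
    return array(
        array('id' => 1, 'value' => 42),
        array('id' => 2, 'value' => 7),
    );
}

// Store the result set once as JSON (e.g. from a cron job or on the first request).
file_put_contents('resultset.json', json_encode(fetchResultSet()));

// The client can fetch resultset.json directly via AJAX and use it as-is,
// while server-side code decodes the same file back into a PHP array:
$rows = json_decode(file_get_contents('resultset.json'), true);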
