In an admin panel made with PHP, what is the best way to do paging? Client-side (jQuery) or server-side?
Short answer, yes.
A bit longer answer: it depends on what you are paging.
If you are paging large amounts of data, I'd go with a combination, using AJAX to fetch the data and letting PHP sort out which data should be sent.
If you are simply paging a couple of panels with controls, I'd just do it with JavaScript, but that's my personal preference.
For 9/10 solutions, go with whatever you are more comfortable with.
Think your question through again: what happens when, and how?
Client side:
To achieve client-side pagination you have to serve all rows (data) to the client, because at that point you have no information about which page is needed. Then a JavaScript solution splits the served data into smaller chunks, mimicking pagination.
Server side:
You serve only the first (or currently requested) page.
I'd go with server side pagination of course.
Your decision will depend on how the data is likely to be used.
When a user is on that page, are they likely to stay on the first page of results most of the time? In that case, server-side paging works well, as you only go through the trouble of assembling that one page of data.
Or are they likely to constantly page back and forth through the results? In that case, you might as well build all the results efficiently in a single shot and let the client do the paging, since you're going to need all the data eventually anyway.
It depends on several factors: (incomplete list)
How often do people want to see other pages than the first?
If they often browse around, you'd want the interaction to be entirely on the client => instant feedback.
If, OTOH, the rest of the pages are seldom used, there's no reason to send them to the client in the first place.
I.e., send what most users want, nothing more.
How big is the total data set?
How do you weight initial load time (higher with client side) vs. time to serve a new page (higher with server side)?
...
I'd go for server side, but:
Serve first page only, thus minimizing initial load time
Fetch other pages, when requested, via AJAX - serving only what's necessary in order to minimize the "page" load time
And it's an admin panel, so you should really follow @Kristoffer S Hansen's advice: do whatever you're more comfortable with.
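For the "first page now, other pages via AJAX" approach above, a minimal sketch could look like this (the PDO connection details and the `items` table/columns are placeholders for whatever your admin panel actually uses):

```php
<?php
// page.php - returns one page of rows as JSON for the AJAX caller.
// Connection details, table and column names are hypothetical.
$pdo = new PDO('mysql:host=localhost;dbname=admin_panel', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$perPage = 25;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

// Bind LIMIT/OFFSET explicitly as integers so MySQL doesn't see quoted strings.
$stmt = $pdo->prepare('SELECT id, name, created_at FROM items ORDER BY id LIMIT :limit OFFSET :offset');
$stmt->bindValue(':limit',  $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset,  PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```

The jQuery side only has to request `page.php?page=2` and render the returned rows, so each "page" load moves a single screenful of data.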
I need some advice on website design.
Let's take Twitter as an example for my question. Let's say I am making Twitter. Now on home_page.php I need both data about tweets (tweet id, who tweeted, tweet time, etc.) and data about the user (userId, username, user profile pic).
Now to display all this, I have two options in mind:
1) Making separate PHP files like tweets.php and userDetails.php. By using AJAX requests, I can get the data onto home_page.php.
2) Adding all the PHP code (connecting to the DB, fetching data) to home_page.php itself.
With option one, I need to make many HTTP requests, which (I think) will add load to the network, so it might slow down the website.
But with option one, I will also have a defined REST API, which will be good for adding more features in the future.
Please give me some advice on picking the best one. Also, I am still a learner, so if there are more ways of implementing this, please share.
In option 1 you're reliant on JavaScript, which doesn't follow progressive enhancement or graceful degradation; if a user doesn't have JS they will see zero content, which is obviously bad.
Split your code into manageable PHP files to make it easier to read and require them all in one main PHP file; this won't take any extra HTTP requests because all the includes are done server-side and one page is sent back.
You can add additional JavaScript to grab more "tweets" like Twitter does, but don't make the main functionality rely on JavaScript.
Don't think of PHP applications as a collection of PHP files that map to different URLs. A single PHP file should handle all your requests and include functionality as needed.
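As a rough sketch of that idea (every file and function name below is made up for illustration), the includes happen server-side, so the browser still makes a single request:

```php
<?php
// index.php - one entry point that pulls in whatever the request needs.
// db.php, lib/*.php and the template are hypothetical helpers.
require __DIR__ . '/db.php';           // sets up $pdo
require __DIR__ . '/lib/tweets.php';   // defines get_latest_tweets($pdo, $limit)
require __DIR__ . '/lib/users.php';    // defines get_user($pdo, $id)

$tweets = get_latest_tweets($pdo, 20);
$user   = get_user($pdo, 1);           // e.g. the logged-in user's id

require __DIR__ . '/templates/home_page.php';  // renders HTML from $tweets and $user
```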
In network programming, it's usually good to minimize the number of network requests, because each request introduces an overhead beyond the time it takes for the raw data to be transmitted (due to protocol-specific information being transmitted and the time it takes to establish a connection for example).
Don't rely on JavaScript. JavaScript can be used for usability enhancements, but must not be used to provide essential functionality of your application.
Adding to Kiee's answer:
It can also depend on the size of your content. If your tweets and user info are very large, the response from the single PHP file will take considerable time to prepare and deliver. In that case you should go for a "minimal viable response" (e.g. the last 10 tweets + the 10 most popular users, or similar).
But one thing you will definitely have to do is create an API to bring your page to life, no matter which approach you use.
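For instance, a minimal JSON endpoint along these lines could deliver that "minimal viable response" in a single round trip (table and column names here are invented):

```php
<?php
// api.php - returns the last 10 tweets and the 10 most popular users as one JSON payload.
// Table and column names are placeholders for whatever your schema uses.
require __DIR__ . '/db.php';   // sets up $pdo

$tweets = $pdo->query(
    'SELECT id, user_id, body, created_at FROM tweets ORDER BY created_at DESC LIMIT 10'
)->fetchAll(PDO::FETCH_ASSOC);

$users = $pdo->query(
    'SELECT id, username, avatar_url FROM users ORDER BY follower_count DESC LIMIT 10'
)->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode(array('tweets' => $tweets, 'popular_users' => $users));
```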
I am developing a website where every front-end thing is written in JavaScript and communication with the server is done through JSON. So I am hesitating: is the fact that I'm asking for every single piece of data with a separate HTTP request OK, or is it completely unacceptable? (After all, many web developers turn multiple image requests into CSS sprites.)
Can you give me a hint please?
Thanks
It really depends upon the overall server load and bandwidth use.
If your site is very low traffic and is under no CPU or bandwidth burden, write your application in whatever manner is (a) most maintainable (b) lowest chance to introduce bugs.
Of course, if the latency involved in making thirty HTTP requests for data is too awful, your users will hate you :) even if your server is very lightly loaded. Thirty times even 30 milliseconds equals an unhappy experience. So it depends very much on how much data each client will need to render each page or action.
If your application starts to suffer from too many HTTP connections, then you should look at bundling together the data that is always used together -- it wouldn't make sense to send your entire database to every client on every connection :) -- so try to hit the 'lowest hanging fruit' first, and combine the data together that is always used together, to reduce extra connections.
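One way to pick that low-hanging fruit is to let the client ask for several resources in one request instead of one request each. A sketch, with hypothetical resource names and loader functions:

```php
<?php
// bundle.php?need=profile,notifications,settings
// Returns several resources in a single response instead of one HTTP request per resource.
// The resource names and load_*() functions are made up for illustration.
require __DIR__ . '/loaders.php';  // defines load_profile(), load_notifications(), load_settings()

$available = array(
    'profile'       => 'load_profile',
    'notifications' => 'load_notifications',
    'settings'      => 'load_settings',
);

$wanted = isset($_GET['need']) ? explode(',', $_GET['need']) : array();
$result = array();

foreach ($wanted as $name) {
    if (isset($available[$name])) {
        $result[$name] = call_user_func($available[$name]);
    }
}

header('Content-Type: application/json');
echo json_encode($result);
```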
If you can request multiple related things at once, do it.
But there's no real reason against sending multiple HTTP requests - that's how AJAX apps usually work. ;)
The reason for using sprites instead of single small images is to reduce loading times since only one file has to be loaded instead of tons of small files at once - or at a certain time when it'd be desirable to have the image already available to be displayed.
My personal philosophy is:
The initial page load should be AJAX-free.
The page should operate without JavaScript well enough for the user to do all basic tasks.
With JavaScript, use AJAX in response to user actions and to replace full page reloads with targeted AJAX calls. After that, use as many AJAX calls as seem reasonable.
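One simple way to follow that philosophy in PHP is to let the same script serve the full page normally and only the fragment when it is called via AJAX (jQuery sends the `X-Requested-With: XMLHttpRequest` header by default); the include and helper names below are placeholders:

```php
<?php
// results.php - works with and without JavaScript.
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

$rows = get_results();          // hypothetical data-access helper

if (!$isAjax) {
    include 'header.php';       // full page chrome only for normal requests
}

include 'results_table.php';    // the fragment both kinds of request need

if (!$isAjax) {
    include 'footer.php';
}
```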
So, I'm new to dynamic web design (my sites have been mostly static with some PHP), and I'm trying to learn the latest technologies in web development (which seems to be AJAX), and I was wondering, if you're transferring a lot of data, is it better to construct the page on the server and "push" it to the user, or is it better to "pull" the data needed and create the HTML around it on the clientside using JavaScript?
More specifically, I'm using CodeIgniter as my PHP framework, and jQuery for JavaScript, and if I wanted to display a table of data to the user (dynamically), would it be better to format the HTML using CodeIgniter (create the tables, add CSS classes to elements, etc..), or would it be better to just serve the raw data using JSON and then build it into a table with jQuery? My intuition says to do it clientside, as it would save bandwidth and the page would probably load quicker with the new JavaScript optimizations all these browsers have now, however, then the site would break for someone not using JavaScript...
Thanks for the help
Congratulations for moving to dynamic sites! I would say the following conditions have to be met for you to do client-side layout (it goes without saying that you should always be doing things like filtering DB queries and controlling access rights server side):
Client browser and connection capabilities are up to snuff for the vast majority of use cases
SEO and mobile/legacy browser degradation are not a big concern (much easier when you synthesize HTML server side)
Even then, doing client-side layout makes testing a lot harder. It also produces rather troublesome synchronization issues. With an AJAX site that loads partials, if part of the page screws up, you might never know, but with regular server-side composition, the entire page is reloaded on every request. It also adds additional challenges to error/timeout handling, session/cookie handling, caching, and navigation (browser back/forward).
Finally, it's a bit harder to produce perma-URLs in case someone wants to share a link with their friends or bookmark a link for themselves. I go over a workaround in my blog post here, or you can have a prominent "permalink" button that displays a dynamically rendered permalink.
Overall, especially when starting out, I would say go with the more kosher, better supported, more tutorialed, traditional approach of putting together the HTML server side. Then dip in some AJAX here and there (maybe start out with form validation or auto-completion), and then move on up.
Good luck!
It is much better to do the heavy lifting on the server side.
In CodeIgniter you create a view, looping through all the rows in the table and adding in the classes or whatever else you need. There is no reason at all to do this in JavaScript.
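A rough sketch of such a view, assuming CodeIgniter 3 and invented table/field names:

```php
<?php
// application/views/users_table.php (CodeIgniter 3 style; names are illustrative).
// The controller loads it with: $this->load->view('users_table', array('rows' => $rows));
// Row striping and CSS classes are added here, server-side, instead of with jQuery.
?>
<table class="user-list">
  <tr><th>Name</th><th>Email</th></tr>
  <?php foreach ($rows as $i => $row): ?>
    <tr class="<?php echo ($i % 2) ? 'odd' : 'even'; ?>">
      <td><?php echo htmlspecialchars($row->name); ?></td>
      <td><?php echo htmlspecialchars($row->email); ?></td>
    </tr>
  <?php endforeach; ?>
</table>
```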
JavaScript is a sickly abused language with unfortunate syntax. Why on earth you would want to load a page, and then issue an AJAX call to load up some JSON objects to push into a table, is beyond me. There is little reason to do that.
Javascript (and jQuery) is for end user enhancement. Make things slide, flash, disappear! It is not for data processing in even the mildest of loads. The end user experience would be crap because you're relying on their machine to process all the data when you have a server that is infinitely more capable and even designed for this specifically.
It depends on your target market and the goal of your site.
I strongly believe in using the client side wherever you can to offload work from the server. Obviously it's important you do it correctly so it remains fast for the end user.
On sites where no-js support is important (public websites, etc), you can have fallbacks to the server. You end up doubling code in these situations but the gains are very beneficial.
For advanced web applications, you can decide if making JS a requirement is worth the trade-off of losing a (very) few users. For me, if I have some control over the target market, I make it a requirement and move on. It almost never makes sense to spend a ton of time supporting a small percentage of the potential audience. (Unless the time is spent on accessibility, which is different, and VERY important regardless of how many people fit into this group on your site.)
The important thing to remember, is touch the DOM as little as possible to get the job done. This often means building up an HTML string and using a single append action to add it to the page vs looping through a large table and adding one row at a time.
It's better to do as much as possible on the server-side because 1) you don't know if the client will even have JavaScript enabled and 2) you don't know how fast the client-side processing will be. If they have a slow computer and you make them process the entire site, they're going to get pretty ticked off. JavaScript/jQuery is only supposed to be used to enhance your site, not process it.
You got the trade-off right. However, keep in mind that you can enable compression on the server side, which will probably make the repetitive markup used to format the table a small bandwidth cost.
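If output compression isn't already handled by the web server, a couple of lines at the top of the script (before any output) can turn it on; repetitive table markup compresses very well:

```php
<?php
// Enable gzip output compression if the server isn't already doing it
// (alternatively, set zlib.output_compression = On in php.ini).
if (!ini_get('zlib.output_compression')) {
    ob_start('ob_gzhandler');
}
```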
Also keep in mind that writing JavaScript that works in all browsers (including handhelds) is more complicated than doing the same server-side in PHP. And don't forget that the "new JavaScript optimizations" do not apply to the same extent to the browsers of handheld devices.
I do a great deal of AJAX app development and I can tell you this from my experience: a good balance between the two is key.
Do the raw data server-side, but use JavaScript to make any modifications you need to it, such as paging, column sorting, row striping, etc.
I absolutely love doing everything in AJAX, but there is one big shortfall to doing it that way, and that's SEO. Search engines do not read JavaScript, so for the sake of your website's page rank, I would say have all data served up server-side and then formatted and made to look cool client-side.
The reason I love AJAX so much is that it drastically speeds up app usage for the user, as it only loads the data you need, where you need it, instead of loading the entire page every time you do something. You can do a whole bunch of stuff, such as hiding/showing rows and columns (we are talking about table manipulation here because you mentioned a table), and on top of those show/hide actions you can add delete actions, where clicking a delete row or button removes that row not only visually but also in the database, all done via AJAX calls to server-side code.
In short:
Raw data: server-side, sending the client the raw data in an HTML layout (tables for table-structured data; I do everything else in divs and other flexible HTML tags, and only use tables for column/row style data).
Data formatting: client-side, which also includes any means of interacting with the data: adding to it, deleting from it, sorting it differently, etc. This achieves two things: SEO and user experience (UX).
I have a large database of links, which are all sorted in specific ways and are attached to other information, which is valuable (to some people).
Currently my setup (which seems to work) simply calls a PHP file like link.php?id=123, which logs the request with a timestamp into the DB. Before it spits out the link, it checks how many requests were made from that IP in the last 5 minutes. If it's greater than x, it redirects you to a captcha page.
That all works fine and dandy, but the site has been getting really popular (as well as getting DDoSed for about 6 weeks), so PHP has been getting floored, and I'm trying to minimize the number of times I have to hit up PHP to do something. I wanted to show links in plain text instead of through link.php?id= and have an onclick function simply add 1 to the view count. I'm still hitting up PHP, but at least if it lags, it does so in the background, and the user can see the link they requested right away.
Problem is, that makes the site REALLY scrapable. Is there anything I can do to prevent this, while still not relying on PHP to do the check before spitting out the link?
It seems that the bottleneck is at the database. Each request performs an insert (logs the request), then a select (determine the number of requests from the IP in the last 5 minutes), and then whatever database operations are necessary to perform the core function of the application.
Consider maintaining the request throttling data (IP, request time) in server memory rather than burdening the database. Two solutions are memcache (http://www.php.net/manual/en/book.memcache.php) and memcached (http://php.net/manual/en/book.memcached.php).
As others have noted, ensure that indexes exist for whatever keys are queried (fields such as the link id). If indexes are in place and the database still suffers from the load, try an HTTP accelerator such as Varnish (http://varnish-cache.org/).
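A rough sketch of that in-memory throttle, assuming the pecl memcached extension and a local memcached server (the key name, limit and window are arbitrary; note this is a fixed 5-minute window rather than a true sliding one):

```php
<?php
// throttle.php - counts requests per IP in memcached instead of the database.
function too_many_requests($ip, $limit = 60, $window = 300)
{
    $m = new Memcached();
    $m->addServer('127.0.0.1', 11211);

    $key = 'hits_' . $ip;

    // add() only succeeds if the key doesn't exist yet; the TTL makes the
    // counter expire on its own at the end of the window.
    $m->add($key, 0, $window);
    $count = $m->increment($key);

    return $count !== false && $count > $limit;
}

if (too_many_requests($_SERVER['REMOTE_ADDR'])) {
    header('Location: /captcha.php');
    exit;
}
// ...otherwise look up the link and redirect as before.
```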
You could do the IP throttling at the web server level. Maybe a module exists for your web server, or, as an example, using Apache you can write your own RewriteMap and have it consult a daemon program so you can do more complex things. Have the daemon program query a memory database. It will be fast.
Check your database. Are you indexing everything properly? A table with this many entries will get big very fast and slow things down. You might also want to run a nightly process that deletes entries older than 1 hour etc.
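That cleanup could be as small as a cron-driven script like the sketch below (table and column names are hypothetical; the point is to keep the log table, and therefore the throttle query, small):

```php
<?php
// cleanup.php - run from cron (hourly or nightly) to prune the request log.
// Make sure requested_at and the link id columns are indexed.
$pdo = new PDO('mysql:host=localhost;dbname=links', 'user', 'pass');

$deleted = $pdo->exec(
    'DELETE FROM request_log WHERE requested_at < NOW() - INTERVAL 1 HOUR'
);

echo "Removed $deleted old log rows\n";
```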
If none of this works, you are looking at upgrading/load balancing your server. Linking directly to the pages will only buy you so much time before you have to upgrade anyway.
Anything you do on the client side can't be protected, so why not just use AJAX?
Have an onClick event that calls an AJAX function which returns just the link and fills it into a DIV on your page; because the size of the request and the answer is small, it will work fast enough for what you need. Just make sure the function you call checks the timestamp, since it is easy to make a script that calls that function many times to steal your links.
You can check out jQuery or other AJAX libraries (I use jQuery and sAjax). I have lots of pages that dynamically change content very fast; the client doesn't even know it is not pure JS.
Most scrapers just analyze static HTML so encode your links and then decode them dynamically in the client's web browser with JavaScript.
Determined scrapers can still get around this, but they can get around any technique if the data is valuable enough.
I need kayak.com-like functionality.
That is, the user enters a keyword and I will need to display results as they become available. The important thing is that data should be displayed AS it BECOMES available. Kind of a progressive display? I don't know if this is the right term.
Kayak.com displays data, or gives the impression that data is displayed, as it becomes available after an asynchronous call.
Can anyone give directions on this topic? (PHP on the server side.) Is this a case of PUSH?
Thank you.
Push is a nice way of keeping your clients up to date; however, the more users the push server is pushing data to, the more resource-intensive it gets. If this is an application where you expect to have many users, pushing may be a bit intensive for your needs, and polling regularly might be better. But give push a try; these guys (ICEfaces) have a nice AJAX push implementation.
Hope this helps!
It seems that the server is sending its response in paginated sections. This enables the client side to begin rendering much sooner. After the client receives the first page, it renders it and begins the request for the subsequent page, and so on, until there are no more pages remaining.
You may also be interested in reading up on Comet which can be thought of as an "AJAX Push". Not sure on any PHP implementations but I know there are several other solutions out there which you could potentially tie PHP into.
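If you just want the progressive-display effect without a full Comet setup, plain PHP can also flush partial output as each chunk of results arrives; `search_providers()` and `fetch_results()` below are hypothetical stand-ins for whatever actually produces your data:

```php
<?php
// search.php - sends results to the browser as they become available.
header('Content-Type: text/html; charset=utf-8');

// Disable PHP output buffering so each chunk actually leaves the server
// (the web server or a proxy may still buffer, which would hide the effect).
while (ob_get_level() > 0) {
    ob_end_flush();
}

$query = isset($_GET['q']) ? $_GET['q'] : '';

echo "<ul id='results'>\n";
flush();

foreach (search_providers() as $provider) {
    $results = fetch_results($provider, $query);   // the slow part
    foreach ($results as $r) {
        echo '<li>' . htmlspecialchars($r) . "</li>\n";
    }
    flush();   // push this provider's results out immediately
}

echo "</ul>\n";
```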
I've thought about this before but never really built anything. My thought process was to have your script grab the latest record and get some kind of unique identifier (ID or date). You log this ID with JavaScript and have a request try to get results that are newer than the current ID every second (or longer if you don't need the data instantly). If results are returned, you once again log the latest result for the next request.
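A minimal PHP endpoint for that kind of polling might look like this (table and column names are placeholders); the client remembers the highest ID it has seen and passes it as `since` on the next request:

```php
<?php
// latest.php?since=123 - returns rows newer than the id the client last saw.
require __DIR__ . '/db.php';   // sets up $pdo

$since = isset($_GET['since']) ? (int) $_GET['since'] : 0;

$stmt = $pdo->prepare('SELECT id, body, created_at FROM results WHERE id > :since ORDER BY id');
$stmt->bindValue(':since', $since, PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```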