I'm building a web app. For testing purposes I started by loading lots of data into session arrays from my database so I can use the values easily throughout the pages. I have one page that has numerous selects on it, and each time the PHP page loops through all the values, chooses the selected one, and outputs the dropdown. One of my arrays, though, has just under 3000 values, and loading this dropdown slows the page from about 300ms to 1-1.2s. Not terrible, but noticeably less responsive. So I'd like to know if there is any way for me to improve the load speed, or any thoughts on a substitute for the dropdown.
What I have tried so far:
Session arrays hold all the values; when the page is loaded through jQuery's AJAX method, the PHP page loops through these values and echoes the dropdowns.
PHP include - create PHP or HTML pages with all the values pre-written as selects; this produces a ~100kb page for the problem dropdown, which is then pulled in with include. It takes roughly the same amount of time, and I'd also have to use JavaScript to set the selected value, but I'd do this if it could be improved. I thought perhaps some caching could provide improvements here. There seemed to be no significant difference between HTML and PHP pages for include, though I'd assume HTML would be better. I'm also assuming that I cannot use regular browser caching because I am using a PHP function to include these pages.
I have tried just loading the HTML page directly: it takes about 1 sec on first load, and after browser caching it is back down to 100-350ms, so I imagine caching could provide a huge boost in performance.
I have considered:
Creating a cached version of the whole page, but this will be quite a pain to implement, so I'd only do it if people think it's the right way to go. I would have to use AJAX to retrieve some data for the inputs, which I was originally doing with PHP echoes.
Just removing the problem dropdown.
Just to clarify something I've never had clarified: am I correct in thinking PHP pages can never be cached by the browser, and by extension that any PHP-included files can't be either? But then how come a JavaScript file linked to from a PHP page can be cached - is it because it is requested separately by the browser?
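For reference, the echo-loop pattern described above can be sketched like this (the function and field names are made up for illustration; a plain array stands in for the session array). Building the markup in one string and echoing it once can be a bit cheaper than echoing inside the loop when there are thousands of options:

```php
<?php
// Minimal sketch: build one <select> from an array of option values,
// marking the previously chosen value as selected.
function build_dropdown(string $name, array $options, ?string $selected = null): string {
    $parts = ['<select name="' . htmlspecialchars($name) . '">'];
    foreach ($options as $value) {
        $attr = ($value === $selected) ? ' selected' : '';
        $safe = htmlspecialchars($value);
        $parts[] = "<option value=\"$safe\"$attr>$safe</option>";
    }
    $parts[] = '</select>';
    return implode("\n", $parts);
}

// In the real page, $options would come from $_SESSION:
// echo build_dropdown('person', $_SESSION['people'], $_POST['person'] ?? null);
```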
The data being returned and parsed into a dropdown is probably your bottleneck. However, if the bottleneck is actually the PHP code, you could try installing an opcode cache like APC (http://php.net/manual/en/book.apc.php). It will speed up your PHP. (Zend Optimizer is also available at http://www.zend.com/en/products/guard/runtime-decoders.)
If your bottleneck is the database the dropdown items are coming from, you may want to try setting MySQL to cache the results.
You may also want to try an alternative dropdown that uses AJAX to populate itself as the user scrolls, a few records at a time. You could also make it a text field that suggests possible matches as the user types. Either of these may be faster.
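The type-ahead idea needs a small server-side endpoint that returns only the options matching what the user has typed so far. A sketch of what that endpoint's core could look like (the endpoint name, cap of 10 results, and in-memory option list are assumptions; in reality the options would come from the database):

```php
<?php
// Hypothetical "suggest.php" core: return at most $limit options whose
// text starts with the typed prefix (case-insensitive).
function filter_options(array $options, string $prefix, int $limit = 10): array {
    $matches = [];
    foreach ($options as $opt) {
        if ($prefix !== '' && stripos($opt, $prefix) === 0) {
            $matches[] = $opt;
            if (count($matches) >= $limit) {
                break;
            }
        }
    }
    return $matches;
}

// The endpoint itself would then do something like:
// header('Content-Type: application/json');
// echo json_encode(filter_options($allOptions, $_GET['q'] ?? ''));
```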
I suspect the problem is the raw size of the data you're transmitting, based on the results of number 2 in "What I have tried so far." I don't think you can rely on browser caching, and server-side caching won't change the size of the data transmitted.
Here are a couple of ideas to reduce the amount of data transmitted during page load:
Load the select box separately, after the main page has been delivered, using an asynchronous javascript call.
Break the choice into a hierarchical series of choices. User chooses the top-level category, then another select box is populated with matching sub-categories. When they choose a sub-category, the third box fills with the actual options in that sub-category. Something like this.
Of course, this only works if those 2nd and 3rd controls are filled in using an async javascript call.
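The cascading-select idea implies an endpoint that returns only the children of the chosen parent, so each request transfers a fraction of the full option list. A minimal sketch, with an illustrative category map (the real data would come from the database):

```php
<?php
// Given a parent category, return only its sub-options.
function children(array $tree, string $parent): array {
    return $tree[$parent] ?? [];
}

// Illustrative data; in practice this is a database lookup.
$tree = [
    'fruit'     => ['apple', 'banana', 'pear'],
    'vegetable' => ['carrot', 'leek'],
];

// The async endpoint would then do something like:
// header('Content-Type: application/json');
// echo json_encode(children($tree, $_GET['parent'] ?? ''));
```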
Either way, make sure gzip compression is enabled on your server.
Edit: More on browser caching
The browser caches individual files, and you typically don't ask it to cache PHP pages because they may be different next time. (Individual php includes are invisible to the browser, because PHP combines their contents into the HTML stream.) If you use a browser's developer console (hit f12 on Chrome and go to Network, for example), you can see that most pages cause multiple requests from the browser to the server, and you may even see that some of those files (js, css, images) are coming from the cache.
What the browser caches and for how long is controlled by various HTTP response headers, like Cache-Control and Expires. If you don't override these in php by calling the header function, they are controlled by the web server (Apache) configuration.
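Overriding those headers from PHP looks something like the following (the one-hour lifetime is just an example value; these calls must run before any output is sent):

```php
<?php
// Ask the browser to cache this response for an hour.
$maxAge = 3600;
header('Cache-Control: public, max-age=' . $maxAge);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');

// Conversely, to make sure a page is never cached:
// header('Cache-Control: no-store, no-cache, must-revalidate');
```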
Related
I've just started learning PHP and have just finished with $_POST/$_GET.
Now I want to know: what are the pros and cons of having PHP process the data from a form inside the same file, versus sending the data to another file (action="anotherfile")?
Logically I would think that sending it to another file would increase the processing time, but is that true?
When I have the PHP script inside the same file, the page doesn't seem to reload when I hit the submit button (but the content changes). Or does it? If it does, wouldn't the only difference be that I would have to write the script for the menu (say you have the same menu on all pages) in both files? Which would lead to more code duplication?
What are the pros and cons of having PHP process the data from a form inside the same file, versus sending the data to another file (action="anotherfile")?
You are conflating files and urls.
By having the logic split between different files (and then included where appropriate) you separate concerns and make your code easier to manage.
By having a single URL be responsible for both displaying the form and processing the form data you don't end up in the awkward situation where the result of processing the form data requires that you redisplay the form with error messages in it. If you used two different URLs there you would need to either display the form on the processing URL (so you have two different URLs which display the form) or perform an HTTP redirect back to the original URL while somehow passing details of the errors to it.
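The single-URL pattern described above can be sketched as follows (the field name and validation rule are made up; the testable core is the validator, with the request handling shown as comments):

```php
<?php
// Return a list of error messages; empty array means the input is valid.
function validate(array $post): array {
    $errors = [];
    if (trim($post['name'] ?? '') === '') {
        $errors[] = 'Name is required.';
    }
    return $errors;
}

// In form.php, the same URL both shows and processes the form:
// if ($_SERVER['REQUEST_METHOD'] === 'POST') {
//     $errors = validate($_POST);
//     if (!$errors) {
//         // save the data, then redirect (Post/Redirect/Get)
//     }
// }
// ...then render the form, echoing $errors above it if there are any.
```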
Logically I would think that sending it to another file would increase the processing time, but is that true?
No. It makes no difference on the time scales being dealt with.
When I have the PHP script inside the same file, the page doesn't seem to reload when I hit the submit button (but the content changes).
It does reload.
If it does, wouldn't the only difference be that I would have to write the script for the menu (say you have the same menu on all pages) in both files?
That's what includes are for.
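A tiny demonstration of the point: an include behaves as if its code were pasted in place, so both "pages" can share one menu file instead of duplicating the markup. (The temp-file setup here just makes the sketch self-contained; in a real project menu.php would simply live next to your other scripts.)

```php
<?php
// Write a shared menu file once...
$menuFile = sys_get_temp_dir() . '/menu.php';
file_put_contents(
    $menuFile,
    '<?php function menu(): string { return "<nav>Home | About</nav>"; }'
);

// ...and every page that includes it gets menu() as if it were defined inline.
include $menuFile;
$page = '<body>' . menu() . '<h1>Form page</h1></body>';
```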
In any language we try to write clean code; that's why we follow MVC.
Logically I would think that sending it to another file would increase the processing time, but is that true? I think no.
If you send the data to another page and, at the top of that page, simply echo the POST data and exit, you will see it takes no noticeable time. The time goes into whatever HTML page is redirected to or loaded afterwards.
So it does not matter where you send the data (same page or another page); what matters is what is loaded after that.
There is no difference about speed.
Whichever way you post the content of your form with a standard submit, the data is sent to the server and a response (after processing) is downloaded.
The only real difference is the organization of your code. The logic that draws the page template (menu or other fixed parts) should be stored in a file that you can include separately or call through a function.
It's also true that when you post your data you do it for some reason - registering a user, for example. It's good practice that the PHP file handling user registration does just that and outputs its messages, and nothing else.
If your file has logic switches that make it output either an empty form or a registration message depending on the presence of POST or GET variables, you will notice that as you scale to more complex tasks this adds complexity and makes the code harder to maintain.
I'll try to make sure I understand your question by restating it.
If you have a form (/form.php), and the "action" of that submit button leads you to a separate php page (/form_action.php), there is absolutely no difference in speed. Each HTTP request (form.php and form_action.php) is independent - "form_action.php" doesn't remember anything about "form.php" unless you pass that information through (as parameters). This is what people mean when they say that HTTP is stateless. It's worth learning about how HTTP works in general alongside the details of PHP.
If you have a PHP script which in turn includes other PHP scripts, there is a tiny performance impact - too small to measure in pretty much any case I've ever come across.
However, using includes allows you to separate your markup (the HTML) from the logic (the PHP). This is a really good thing if you are doing anything other than tinkering. It allows you to re-use functionality, it makes it much easier to change and maintain the code over time, and it helps you think through what you're trying to achieve.
There are many different ways people have solved the "how do I keep my code clean" puzzle; the current orthodoxy is "Model-View-Controller" (as #monty says). There are also PHP frameworks which make this a little easier to implement - once you've got the basics of the language, you might want to look at Zend or TinyMVC (there are several others, each with their benefits and drawbacks).
I'm developing a website where you can order a product. This product has a lot of options etc so the user has to go through 10 steps to complete his/her order.
At the moment I have one file (index.php) which handles all steps. The code is a mess and the file is getting pretty big. I thought maybe splitting everything up into separate files would be better. However, I don't want the user to reload the page at every step.
My first thoughts were PHP's file_get_contents() or using AJAX to load every file. But I don't know which method achieves the best performance. Or maybe you have an alternative?
Keep one file, make a class where each step is a method of that class, and then invoke the right method via AJAX. If you create several files it will be even messier.
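A sketch of that structure (the class and method names are illustrative): one class, one method per step, and the AJAX handler dispatches on a "step" parameter.

```php
<?php
class OrderWizard {
    public function step1(): string { return 'choose product'; }
    public function step2(): string { return 'choose options'; }

    // Dispatch to the matching stepN() method, falling back safely
    // if the client sends a step that doesn't exist.
    public function render(int $step): string {
        $method = 'step' . $step;
        if (!method_exists($this, $method)) {
            return 'unknown step';
        }
        return $this->$method();
    }
}

// The AJAX endpoint would then be just:
// echo (new OrderWizard())->render((int)($_GET['step'] ?? 1));
```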
If the pages are just a bunch of forms/options, you could use jQuery/JavaScript to AJAX them in. And each time the user advances to another step, you could save the data into a cookie. Then at the end of all of the steps, submit all of the data at a single time to PHP. That would make your PHP code less scattered and would offload most of the processing work, for the individual steps, to the client's browser, which would reduce the server load caused by PHP processing each individual step (for all users buying your product).
You could store each form's HTML in separate HTML files on the server, or you could echo the HTML for a given step from your PHP script based on a $_GET['step'] value, etc.
Instead of hiding/displaying your divs with jQuery, this method would actually replace the HTML for a given step with the next/previous step's HTML.
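If you do serve one file per step, the step parameter should be whitelisted before it becomes part of a filename, so a crafted "step" value can't include arbitrary files. A sketch (the file-naming convention stepN.html is an assumption):

```php
<?php
// Map a user-supplied step value to a file path, or null if invalid/missing.
function step_file(string $step, string $dir): ?string {
    if (!preg_match('/^\d{1,2}$/', $step)) {   // allow only 1-2 digit numbers
        return null;
    }
    $path = $dir . '/step' . $step . '.html';
    return is_file($path) ? $path : null;
}

// The endpoint would then do something like:
// $file = step_file($_GET['step'] ?? '', __DIR__ . '/steps');
// readfile($file ?? __DIR__ . '/steps/step1.html');
```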
Using PHP's include function will not improve performance: it behaves as if the code of the included file were written in place of the include statement, but it adds a little overhead because PHP has to load and evaluate the extra file.
Apologies in advance for the lengthy post, just trying to explain the situation clearly.
I've created a PHP-driven website for searching a big (millions of records) MySQL database of people. On the search page you have your usual form for search criteria. Due to the way the people often use the site, the search criteria are saved into session variables so that if the search page is reloaded the previous criteria remain in the form fields (there's a button to manually reset the criteria, of course). This in itself works fine.
I also have two language selection links that store the language selection in a session variable (making the page header load an appropriate localization file), and as above, this in itself also works fine.
What's problematic is when a user gets the search result (a list of people), opens detailed info on a person (thus going from search.php to info.php), and then goes back to the listing via the back button: it takes too long to reload the previous page, as it re-runs the MySQL query etc. instead of coming back from a cached page. It can take 5 seconds or more sometimes, as the queries produce up to 5000 results - but even, say, 200-500 results take long to reload because the database itself is big and not the fastest in the world. And limiting the number of results isn't really practical.
The obvious solution at first glance would SEEM to be enabling the browser cache, which is exactly what I did via PHP header and pragma controls. And all seemed well: going back to the list was basically instantaneous. However, I realized that enabling the cache means the updated session variables stop working. New search criteria don't properly replace the old ones when reloading the search page after visiting a different page, and even when you select another language, pages open in the language you were previously using, because that's how they were cached! You can force the language to update via F5, but that doesn't seem to help the search criteria much. And even if it did, F5-spam isn't really an answer; it needs to work automatically.
So long story short, how do I make the search result list open quickly without making session variables useless? Or will I simply have to make do with sluggish page loads when using back button, thus annoying users? I really don't want to open the info.php in a new page, either.
Have you considered caching the database results on the file system? I have found the Zend Framework caching class to be very easy to use. You can use any information you want, to differentiate cached results from other cached results. So the caching can be as fine grained as required.
http://framework.zend.com/manual/1.12/en/zend.cache.introduction.html
You don't need to use the whole of Zend Framework to use the class. You can use it on its own.
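The same idea can be sketched without the framework: a minimal file cache keyed by the search parameters, so repeating a search (e.g. via the back button) skips the database entirely. This is only a sketch; Zend's class adds proper lifetime handling, locking, and cache cleaning on top of this.

```php
<?php
// Return the cached value for $key if it is younger than $ttl seconds,
// otherwise call $compute (e.g. the expensive MySQL query) and cache it.
function cached_result(string $key, int $ttl, callable $compute, string $dir) {
    $file = $dir . '/cache_' . md5($key);
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file));
    }
    $value = $compute();
    file_put_contents($file, serialize($value));
    return $value;
}

// Usage sketch: key the cache on the full set of search criteria.
// $results = cached_result(json_encode($_SESSION['criteria']), 300,
//     fn() => run_search_query($_SESSION['criteria']), '/tmp/search-cache');
```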
I'm considering doing capability/feature tests with JS and sending the results back to the server so it knows what it can/cannot send to client.
It's the same idea as modernizr server - https://github.com/jamesgpearce/modernizr-server
The problem I'm running into is communicating the JavaScript results back to the server on initial page load. I'm not sure what options are available, but I know there are not a lot. I've tested setting a cookie and instantly reloading the page so PHP can read the results, yet I'm concerned from an SEO standpoint. It just seems like instantly reloading a page will have some adverse effects, and I'm particularly worried if the refresh is on a page that has a form. Once the cookie is set and the user goes to another page, it works fine; it's just figuring out how to serve the content on the initial page load based on the capability tests. I've had a few different thoughts, like using JS to serve the markup on the initial page load and letting PHP read the cookie on subsequent page loads, but I'm just not sure what might be the best solution.
I'm just at a loss as to what other options there are. I don't know which direction I should be looking in or if there is any direction at all. I don't know if AJAX would be able to help with this or not. I feel like I'm close, but figured maybe if I asked the community someone might have a good solution.
Thanks!
modernizr-server uses the method you described: it sets a cookie and then reloads the page. (Actually, it doesn't output any content; the only thing on the initial page load is the JavaScript itself, with an exit; statement if the cookie it's looking for isn't found.) Anyone with JavaScript off can probably expect a blank page.
It looks like you have a couple of options (this list is non-exhaustive):
Set a cookie and then reload.
Set a cookie, and then use AJAX to fetch the initial page's content. (As you mentioned)
Set an expected baseline of support (perhaps, expect no JavaScript support whatsoever), and serve that on your initial page load. If they have JavaScript on, you can either reload, or use AJAX to tell your server what things it supports and then reload chunks (or all) of the initial page.
Serve no-javascript content to just search engines, use option 1 or 2 for everyone else.
Option three here is the most work-intensive, but probably the most inclusive option. (Edit: options 3 and 4 will make sure search engines see your content.)
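For options 1 and 2, the server side just has to parse the capability cookie on later requests. A sketch, assuming a made-up cookie format like "caps=webgl:1,touch:0" (this format is an assumption for illustration, not part of modernizr-server):

```php
<?php
// Parse "name:flag" pairs from the cookie into a map of booleans.
function parse_caps(string $cookie): array {
    $caps = [];
    foreach (explode(',', $cookie) as $pair) {
        [$name, $flag] = array_pad(explode(':', $pair, 2), 2, '0');
        if ($name !== '') {
            $caps[$name] = ($flag === '1');
        }
    }
    return $caps;
}

// Usage in a page:
// $caps = parse_caps($_COOKIE['caps'] ?? '');
// if ($caps['webgl'] ?? false) { /* serve the rich version */ }
```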
I have a web app where I need to change a drop down list dynamically depending on another drop down list.
I have two options:
Get all the data beforehand with PHP and "manage" it later with Javascript.
Or get the data the user wants through AJAX.
The thing is that the page loads with all the data by default, and the user can later select a sub-category to narrow the dropdowns.
Which of the two options are better (faster, less resource intensive)?
The less resource intensive option is clearly AJAX since you're only transferring the required information and no more.
However, AJAX can make the page less responsive if the latency is high for the client (having to wait for connections to fetch data between drop-down choices).
So: Load everything up front if latency is a bigger issue, and use AJAX if bandwidth is more of an issue.
It depends on your main goal:
1. With AJAX you'll be able to get the data you want without a page refresh, fetching it only as needed, so your app will feel faster. It also lets you keep a single block of code in an independent file to be called by AJAX when needed, reusing that code across your app without loading it constantly.
2. With PHP you will have to prepare all the data beforehand, writing a little more code and sending more up front, which makes your app slower. Though the performance difference is nothing a user will notice unless we are talking about a large amount of data.
Concluding, AJAX is the better way when talking about performance and code effectiveness!
PS: Personal opinion, of course!
If there is a considerable number of possible select options, I would use AJAX to get them dynamically. If you only have a very small set of select options, it would be worth considering embedding them in the page. Embedding in the page means no latency, and a snappier interface.
However, as stated previously, dynamic retrievals are very useful if you have a large set of options, or if the options are subject to changing dynamically.
As with any ajax request, remember to display some form of visual feedback while the request is underway.