I have a web app where I need to change a drop down list dynamically depending on another drop down list.
I have two options:
Get all the data beforehand with PHP and "manage" it later with Javascript.
Or get the data the user wants through AJAX.
The thing is that the page loads with all the data by default, and the user can later select a sub-category to narrow the drop-downs.
Which of the two options is better (faster, less resource intensive)?
The less resource intensive option is clearly AJAX since you're only transferring the required information and no more.
However, AJAX can make the page less responsive if the latency is high for the client (having to wait for connections to fetch data between drop-down choices).
So: Load everything up front if latency is a bigger issue, and use AJAX if bandwidth is more of an issue.
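If you go the AJAX route, the server side can be a tiny script that takes the selected parent value and returns the matching options as JSON for the front end to rebuild the second drop-down. A minimal sketch (the category names and values here are made up for illustration; real data would come from your database):

<?php
// options.php - returns the options belonging to one parent category as JSON.
// The categories and values below are placeholders; in practice they would be queried.
$options = [
    'fruit'      => ['Apple', 'Banana', 'Cherry'],
    'vegetables' => ['Carrot', 'Leek', 'Potato'],
];

$category = isset($_GET['category']) ? $_GET['category'] : '';

header('Content-Type: application/json');
echo json_encode(isset($options[$category]) ? $options[$category] : []);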
It depends on your main goal:
1. With AJAX you'll be able to get the data you want without a page refresh, fetching it only as needed, so your app will run faster...
It will also allow you to keep a single block of code in an independent file to be "called by AJAX" when needed, thus using that code across your app without loading it constantly!
2. With PHP, you will have to prepare all the data beforehand, thus writing a little more code and making your app slower...
The performance difference is nothing a user will notice unless we are talking about a large amount of data.
Concluding, AJAX is the best way when talking about performance and code effectiveness!
PS: Personal opinion of course!
If there is a considerable number of possible select options, I would use AJAX to get them dynamically. If you only have a very small set of select options, it would be worth considering embedding them in the page. Embedding in the page means no latency, and a snappier interface.
However, as stated previously, dynamic retrievals are very useful if you have a large set of options, or if the options are subject to changing dynamically.
As with any ajax request, remember to display some form of visual feedback while the request is underway.
I've just started learning PHP and have just finished with $_POST/$_GET.
Now I want to know: what are the pros and cons of having the PHP that processes the data from a form inside the same file, versus sending the data to another file (action="anotherfile")?
Logically I would think that sending it to another file would increase the processing time, but is that true?
When I have the PHP script inside the same file, the page doesn't seem to reload when I hit the submit button (but the content changes). Or does it? If it does, wouldn't the only difference be that I would have to type the script for the menu (let's say you have the same menu on all pages) in both files? Which would lead to more code/less space?
What are the pros and cons of having the PHP that processes the data from a form inside the same file, versus sending the data to another file (action="anotherfile")?
You are conflating files and URLs.
By having the logic split between different files (and then included where appropriate) you separate concerns and make your code easier to manage.
By having a single URL be responsible for both displaying the form and processing the form data, you don't end up in the awkward situation where the result of processing the form data requires that you redisplay the form with error messages in it. If you used two different URLs, you would need to either display the form on the processing URL (so you have two different URLs which display the form) or perform an HTTP redirect back to the original URL while somehow passing details of the errors to it.
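A minimal sketch of that single-URL pattern, assuming a made-up registration form with one field (the field name, file names, and validation rule are placeholders):

<?php
// register.php - one URL both displays the form and processes the submission.
$errors = [];

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $email = isset($_POST['email']) ? trim($_POST['email']) : '';
    if ($email === '') {
        $errors[] = 'Email is required.';
    }
    if (!$errors) {
        // ...save the data, then redirect so a refresh doesn't resubmit the form
        header('Location: thanks.php');
        exit;
    }
}
?>
<?php foreach ($errors as $error): ?>
    <p><?php echo htmlspecialchars($error); ?></p>
<?php endforeach; ?>
<form method="post" action="">
    <input type="text" name="email"
           value="<?php echo isset($_POST['email']) ? htmlspecialchars($_POST['email']) : ''; ?>">
    <button type="submit">Register</button>
</form>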
Logically I would think that sending it to another file would increase the processing time, but is that true?
No. It makes no difference on the time scales being dealt with.
When I have the PHP script inside the same file, the page doesn't seem to reload when I hit the submit button (but the content changes).
It does reload.
If it does, wouldn't the only difference be that I would have to type the script for the menu (let's say you have the same menu on all pages) in both files?
That's what includes are for.
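For example (the file names here are hypothetical), the shared menu lives in one file and every page simply pulls it in:

<?php
// menu.php contains the menu markup exactly once.
// Any page that needs it includes it alongside the other shared pieces:
include 'header.php'; // shared <head> and opening markup
include 'menu.php';   // the menu, written once and reused everywhere
?>
<p>Page-specific content goes here.</p>
<?php include 'footer.php'; ?>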
In any language we always try to write clean code. That's why we follow MVC.
Logically I would think that sending it to another file would increase the processing time, but is that true? I think no.
When we send data to another page, and at the top of that page we simply echo the posted data and exit, you will see it takes no noticeable time. It takes time when we redirect to or load some HTML page after that.
It does not matter where we send the data (the same page or another page); what matters is what is loaded after that.
There is no difference in speed.
Wherever you post the content of your form with a standard submit, the data is sent to the server and a response (after processing) is downloaded.
The only difference is in the organization of your code. The logic that draws the template of the page (the menu or other fixed parts) should be stored in a file that you can include separately or call through a function.
It is also true that when you post your data you do it for some reason - registering a user, for example. It is good practice that the PHP file which handles user registration does just that and outputs the messages, rather than also doing other things.
If your file has logic switches that make it output either an empty form or a registration message based on the presence of POST or GET variables, you will notice that when you scale to more complex tasks this adds complexity and makes code maintenance harder.
I'll try to make sure I understand your question by restating it.
If you have a form (/form.php), and the "action" of that submit button leads you to a separate php page (/form_action.php), there is absolutely no difference in speed. Each HTTP request (form.php and form_action.php) is independent - "form_action.php" doesn't remember anything about "form.php" unless you pass that information through (as parameters). This is what people mean when they say that HTTP is stateless. It's worth learning about how HTTP works in general alongside the details of PHP.
If you have a PHP script which in turn includes other PHP scripts, there is a tiny performance impact - too small to measure in pretty much any case I've ever come across.
However, using includes allows you to separate your markup (the HTML) from the logic (the PHP). This is a really good thing if you are doing anything other than tinkering. It allows you to re-use functionality, it makes it much easier to change and maintain the code over time, and it helps you think through what you're trying to achieve.
There are many different ways people have solved the "how do I keep my code clean" puzzle; the current orthodoxy is "Model-View-Controller" (as #monty says). There are also PHP frameworks which make this a little easier to implement - once you've got the basics of the language, you might want to look at Zend or TinyMVC (there are several others, each with their benefits and drawbacks).
I have a PHP page with a simple form. One input text field & a button. Input text field accepts user queries & on button click an HTTP GET request is made to the server & the result has to be shown back in the same page containing the form. That's too simple to do. I can do this in two ways. One is AJAX & other one is the good old sodding form-submit method.
My question is simple- Which method should I use? Since both of the roads lead us to the same place, which one should I choose to travel?
First of all, let me talk about form submit method. I can use <?php echo $_SERVER['PHP_SELF'] ; ?> as the action of the form for submitting the values of my form to the same page. Once I store those values into some random variables, I can make a GET request & obtain the result & show it to the world. This method is easy to use. Happy Down Voting to all of you.
Or I can make a GET request using AJAX and jQuery or JavaScript or whatever you wish to use & obtain the same result as in the previous case. Output is same. Only the mode of execution is different.
So which one is better? Which one fetches result faster? And why should I use that? Is there any difference? GET, POST, PUT or whatever- it doesn't really matter. AJAX or form-submit?
There shouldn't be any significant, genuine speed difference between them.
The Ajax approach will load a smaller amount of data (since you aren't loading an entire HTML document), but by the time you take into account HTTP compression and the fact that (if your system is sensibly configured) your dependencies (images, scripts, stylesheets, etc.) will be cached, it won't be significantly smaller.
By using JavaScript to create a loading indicator and not refreshing the entire window in front of the user, you can create the illusion of a faster load time though. So if feeling faster was the only concern, then Ajax is the way forward.
Using JavaScript, however, is more complicated and slightly more prone to failure. The consequences of failure states are more severe because, unless your code detects and does something with them, the user won't see anything: the request fails silently. For example, if a normal page load times out because the user is on a train and went through a tunnel, they'll see an error page provided by their browser suggesting that they refresh and try again. With Ajax, you need to write the error handling code yourself. This does give you more flexibility (such as allowing you to simply try again a few times) but the work isn't done for you.
The other consequence of using Ajax is that the address bar will not update automatically. This means that the results won't be bookmarkable or sharable unless you do something explicit to make that possible. The usual way to do that is pushState and friends, but again, it is more work.
You should also make the site work without JavaScript so that if the JS doesn't run for any reason the site won't break completely. If you use pushState then you have to do this anyway for the URLs you point the address bar at to be useful.
The short answer: Use a regular form submission, then consider layering JavaScript over the top if you think it will give your visitors a worthwhile benefit.
I would stick to an Ajax request when possible.
This is because you then don't have to load every single item on the page again (like all the images, the menu and so on). You can just send the relevant HTML back and jQuery can place it inside the relevant container.
But that is just my humble opinion...
If you have to retrieve simple data from the server without reloading the page, my advice is to use jQuery's .get or .post.
jQuery also provides a very large API that allows you to reduce your programming time.
http://api.jquery.com/
Obviously the execution time increases, but in my experience the user can't feel the difference with a simple AJAX request.
So in my opinion, if jQuery allows you to obtain the results, it is the best solution because it halves your working time!
See the edited answer; it may help you.
I think that AJAX should be used for display updates, and form submissions should be done via a page reload. Reasoning?
When submitting forms, you are telling the application to do something. Users tend to want to feel that it was done. When a page doesn't reload, users are often left wondering "Did that work?". Then they have to check to make sure what they did was right.
But when you are displaying a table or something, and the user says "display x data... now x1 data", for instance, they aren't "doing" something (creating new entities, sending emails, etc.). So AJAX can provide a nice user interface in this case; page reloads would be annoying here.
In conclusion, I think form submission should be done via page reloads (let the user see it working), whereas display updates should use AJAX (prevent annoying page reloads).
Of course, this is a preference thing. Some of my company's applications use AJAX all over. But those are the applications that are the most difficult to maintain and debug. ;)
I'm building a web app; the way I started off the app for testing purposes is to load lots of data into session arrays from my database so I can use the values easily throughout the pages. I have one page that has numerous selects on it, and each time the PHP page loops through all the variables, chooses the selected one, and outputs the dropdown. One of my arrays, though, has just under 3000 values, and loading this dropdown slows the page down from about 300ms to 1-1.2s. Not terrible, but easy to tell that it is less responsive. So I'd like to know if there is any way for me to improve the load speed, or any thoughts on a substitute for the dropdown.
What I have tried so far:
Session arrays hold all the values, when the page is loaded through jquery ajax method the php page loops through these values and echos the dropdowns.
PHP include - create PHP or HTML pages with all the values pre-written as selects; this creates a ~100kb page for the problem dropdown, which is then pulled in with include. It takes roughly the same amount of time, plus I'd then have to use JavaScript to set the selected value, but I'd do this if it could be improved. I thought perhaps some caching could provide improvements here. There seemed to be no significant difference between HTML and PHP pages for include, but I'd assume HTML would be better. I'm also assuming that I cannot use regular caching because I am using a PHP function to include these pages.
I have tried just loading in the html page and it takes about 1 sec on first load, after browser caching it is back down to 100-350ms so I imagine caching could provide a huge boost in performance.
I have considered:
Creating a cached version of the whole page but this will be quite the pain to implement so I'd only do it if people thought it is the right way to go with this. I would have to use ajax to retrieve some data for the inputs which I was originally doing with php echos.
Just removing the problem dropdown.
Just to clarify something I've never had clarified: am I correct in thinking PHP pages can never be cached by the browser, and so by extension any PHP-included files can't be either? But then how come a JavaScript file linked to from a PHP page can be cached - because it is requested through an HTML tag?
The data being returned and parsed into a dropdown is probably your bottleneck. However, if the bottleneck is actually the PHP code, you could try installing an opcode cache like APC at http://php.net/manual/en/book.apc.php. It will speed up your PHP. (Zend Optimizer is also available at: http://www.zend.com/en/products/guard/runtime-decoders)
If your bottleneck is the database where the items in the dropdown come from, you may want to try setting MySQL to cache the results.
You may also want to try an alternative dropdown that uses AJAX to populate the dropdown as the user scrolls down, a few records at a time. You could also create it as a text field that prompts the user for possible matches as they type. These things may be faster.
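A sketch of what the "prompt as they type" variant could look like on the server, assuming the values live in a table reachable over PDO (the table, column and connection details are all assumptions); the front end would send the typed prefix and rebuild the list from the JSON response:

<?php
// suggest.php - return at most 20 matches for the typed prefix instead of all ~3000 options.
$term = isset($_GET['term']) ? $_GET['term'] : '';

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
$stmt = $pdo->prepare('SELECT id, name FROM items WHERE name LIKE ? ORDER BY name LIMIT 20');
$stmt->execute([$term . '%']);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));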
I suspect the problem is the raw size of the data you're transmitting, based on the results of number 2 in "What I have tried so far." I don't think you can rely on browser caching, and server-side caching won't change the size of the data transmitted.
Here are a couple of ideas to reduce the amount of data transmitted during page load:
1. Load the select box separately, after the main page has been delivered, using an asynchronous JavaScript call.
2. Break the choice into a hierarchical series of choices. The user chooses the top-level category, then another select box is populated with matching sub-categories. When they choose a sub-category, the third box fills with the actual options in that sub-category. Something like this.
Of course, this only works if those 2nd and 3rd controls are filled in using an async JavaScript call.
Either way, make sure gzip compression is enabled on your server.
Edit: More on browser caching
The browser caches individual files, and you typically don't ask it to cache PHP pages because they may be different next time. (Individual php includes are invisible to the browser, because PHP combines their contents into the HTML stream.) If you use a browser's developer console (hit f12 on Chrome and go to Network, for example), you can see that most pages cause multiple requests from the browser to the server, and you may even see that some of those files (js, css, images) are coming from the cache.
What the browser caches and for how long is controlled by various HTTP response headers, like Cache-Control and Expires. If you don't override these in php by calling the header function, they are controlled by the web server (Apache) configuration.
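For instance, a PHP script can opt in to browser caching by sending those headers itself; a sketch (the one-hour lifetime is just an illustrative value):

<?php
// Tell the browser it may reuse this response for up to an hour.
header('Cache-Control: public, max-age=3600');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 3600) . ' GMT');

// ...then emit the page (or the pre-built <select> markup) as usual.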
For example, you are building a dictionary app where the entries are objects, and all the values are stored in a server-side database. The entries are visible client-side, so they should eventually be JavaScript objects. However, since the data is server-side, there are two ways to construct the object:
Construct the entries objects via PHP, then pass the result to a .js script, which makes JavaScript objects from it.
Construct the entries via JavaScript, calling AJAX methods on the object to request the specific information about the entry (e.g. definition, synonyms, antonyms, etc.) from the server.
The first way ends up constructing each entry twice, once via PHP and once via JavaScript. The second way ends up calling several AJAX methods for every construction, and opening and closing the database connection each time.
Is one preferable to the other, or is there a better way to do this?
I use a rule of thumb: the less AJAX on initial page load, the better.
If you can push all the information to the user on page load, do it, and then use AJAX for subsequent calls. Otherwise the user experience will suffer from AJAX (rather than benefit) because the page will take longer to load.
Another option would be, if you're not tied to PHP, to have a JS-based back end like Node.js. That way you can transmit everything in one format. In some cases you can even store the JS object directly in the database. An example of this kind of back end would be Node.js + MongoDB, if a document database suits your needs.
If you're tied to PHP/JS, I'd go for minimizing AJAX calls. Making transfers asynchronous (even if it means duplicating objects) should be done to improve the user experience, and the choices you make should aim for that. Too many HTTP requests usually end up making the site slow to react, which is one of the things we usually try to get rid of by using AJAX.
One approach that's sometimes useful is to render the JavaScript object with PHP; that works well when the data is going to be needed but should not be directly (or at all) shown to the user.
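One common way to do that is to have PHP embed the data as a JavaScript literal with json_encode, so the client has everything at page load without an extra request. A sketch (the fetchEntries() helper and variable names are made up):

<?php
// $entries is assumed to be the array of dictionary entries built server-side.
$entries = fetchEntries(); // hypothetical helper that queries the database
?>
<script>
// One embedded JSON blob; no AJAX round-trips needed for the initial data.
var entries = <?php echo json_encode($entries); ?>;
</script>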
Totally depends on the project. There are just too many variables to say 'you should do it this way'.
Instead, test it. Do it one way, push it to a high number of requests, and profile it. Then switch and try the other way. Keep it easy to switch the output from the PHP by following the MVC pattern.
The only general rule is 'minimise the number of HTTP requests', as HTTP is by far the biggest bottleneck when a page is loading.
I am hitting a lot of different sites to get a list of information and I want to display this information as I get it. Right now I am using a Smarty Template and what I would like to do is:
Pseudo code:
{foreach $page}
$smarty_var = use AJAX to call a PHP function
Render out a new table row on the fly w/ the newly assigned var
<tr><td>{$smarty_var}</td></tr>
{/foreach}
I don't know much about AJAX, I used it a long time ago, and it was similar to this, but not quite, there was user action taken. No I don't have a JS Framework in place. Am I way off here on how this should go? Basically I want to display a table row as data comes available, each table row will be a request to get the data from another site.
Sure, I will tell you about what I am trying to do: http://bookscouter.com/prices.php?isbn=0132184745+&x=19&y=6
If you click on the 'Click to view prices from all 43 links' at the bottom on that page you will see. I am using cURL to get all the pages I want a price from. Then for each page I want to get the price. So each page is gonna fire off a function that runs some fun code like this:
function parseTBDOTpageNew($page, $isbn)
{
    // Split the fetched HTML on <table> tags; index 2 is the chunk after the second table tag.
    $first_cut = preg_split('/<table[^>]*>/', $page);
    // Split that chunk on <td> tags to reach the individual cells.
    $second_cut = preg_split('/<td[^>]*>/', $first_cut[2]);

    // The site shows this message when it isn't buying the book at all.
    if (strstr($second_cut[4], "not currently buying this book") !== false)
    {
        return "\$0.00";
    }

    // Otherwise the price sits inside a <b> tag further along the row.
    $third_cut = preg_split('/<b[^>]*>/', $second_cut[9]);
    $last_cut = preg_split('/</', $third_cut[3]);
    return $last_cut[0];
}
This function is called from another function which puts the price returned from the function above, the name of the company, and a link in an array to be added to another bigger array that is sent to smarty. Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
I will take your advice on jQuery; what I have started is an onload function that receives the $pages to be parsed, and I was just in the middle of writing: for each page, get the info and spit out some HTML with the info on the page.
Also, the function that calls the function to get the price is in a PHP file, so I need the request to hit a function within a PHP file and NOT just call file.php?param1=foo; I need it to actually hit the function in the file. I have jQuery in place, now just trying to figure it out and get it to do what I need, ugh. I am searching, any help would be appreciated.
No I don't have a JS Framework in place
Fix that first. You don't want to juggle XMLHTTPRequests yourself. jQuery is SO's canonical JS library, and is pretty nifty.
Basically I want to display a table row as data comes available, each table row will be a request to get the data from another site.
How many rows will you be dealing with? Do they all have to be loaded asynchronously?
Let's tackle this in a braindead, straightforward way. Create a script that does nothing more than:
Take a site ID and fetch data from the corresponding URL
Render that data to some data transport format, either HTML or JSON.
Then it's a simple matter of making the page that the user gets, which will contain Javascript code that makes the ajax calls to the data fetcher, then either shoves the HTML in the page directly, or transforms the data into HTML and then shoves that into the page.
You'll note that at no point is Smarty really involved. ;)
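A sketch of that fetcher script, assuming the parsing functions from the question live in an includable file and that each site is identified by a short key (the parameter names, URL map and file names are assumptions):

<?php
// fetch_price.php?site=tbdot&isbn=... - fetches one site's page and returns the parsed price as JSON.
require 'parsers.php'; // assumed to contain parseTBDOTpageNew() and the other per-site parsers

$site = isset($_GET['site']) ? $_GET['site'] : '';
$isbn = isset($_GET['isbn']) ? $_GET['isbn'] : '';

$urls = [
    'tbdot' => 'http://example.com/buyback?isbn=', // placeholder buyback URL
];

if (!isset($urls[$site]) || $isbn === '') {
    http_response_code(400); // unknown site or missing ISBN
    exit;
}

$ch = curl_init($urls[$site] . urlencode($isbn));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$page = curl_exec($ch);
curl_close($ch);

header('Content-Type: application/json');
echo json_encode([
    'site'  => $site,
    'price' => parseTBDOTpageNew($page, $isbn),
]);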
This solution is highly impractical for anything more than a trivial number of sites to be polled asynchronously. If you need rows for dozens or hundreds of sites, that means each client is going to need to make dozens or hundreds of requests to your site for every single normal pageview. This is going to slaughter your server if more than one or two people load the page at once.
Can you tell us more about what you're doing, and what you're trying to accomplish? There are lots of ways to mitigate this problem, but they all depend on what you're doing.
Update for your question edit.
First, please consider using an actual HTML parser instead of regular expressions. The DOM is very powerful and you can target specific elements using XPath.
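As a rough illustration of the DOM approach (the XPath expression here is invented; you would point it at the real elements on the page):

<?php
// Parse the fetched HTML properly instead of splitting on tags with regexes.
$doc = new DOMDocument();
libxml_use_internal_errors(true); // real-world HTML is rarely well-formed
$doc->loadHTML($page);            // $page is the HTML string returned by curl
libxml_clear_errors();

$xpath = new DOMXPath($doc);
// Hypothetical expression: the bold price cell inside the buyback table.
$nodes = $xpath->query('//table//td/b');

$price = $nodes->length ? trim($nodes->item(0)->textContent) : '$0.00';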
Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
So, here's the ultimate problem. You want to do something asynchronously. PHP does not have a built-in generalized way to perform asynchronous tasks. There are a few ways to deal with this problem.
The first is as I've described above. Instead of doing any of the curl requests on page load, you farm the work out to the end user, and have the end user's browser make requests to your scraping script one by one, filling in the results.
The second is to use an asynchronous work queue, like Gearman. It has excellent PHP support via a PECL extension. You'd write one or more workers that can accept requests, and keep a pool of them running at all times. The larger the pool, the more things you can do at once. Once all of the data has returned, you can throw the complete set of data at your template engine, and call it good.
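A minimal sketch of that setup with the Gearman extension; the function name, payload format and placeholder result are illustrative, not a drop-in implementation:

<?php
// worker.php - keep one or more of these running in the background.
$worker = new GearmanWorker();
$worker->addServer(); // defaults to 127.0.0.1:4730

$worker->addFunction('fetch_price', function (GearmanJob $job) {
    $params = json_decode($job->workload(), true); // e.g. ['site' => ..., 'isbn' => ...]
    // ...curl the site and parse the price here...
    return json_encode(['site' => $params['site'], 'price' => '$0.00']); // placeholder result
});

while ($worker->work());

// client.php - fan the jobs out, then collect the results as they complete.
$client = new GearmanClient();
$client->addServer();

$results = [];
$client->setCompleteCallback(function (GearmanTask $task) use (&$results) {
    $results[] = json_decode($task->data(), true);
});

foreach ($sites as $site) { // $sites assumed to be the list of vendors to poll
    $client->addTask('fetch_price', json_encode(['site' => $site, 'isbn' => $isbn]));
}
$client->runTasks(); // blocks until every task has finished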
You can even combine the two, having the user make only one or two or three extra requests via ajax to fetch part of the returned data. Heck, you can even kick off the jobs in the background and return the page immediately, then request the results of the background jobs later via ajax.
Regardless of which way you handle it, you have a giant, huge problem. You're scraping someone's site. You may well be scraping someone's site very often. Not everyone is OK with this. You should seriously consider caching results aggressively, or even checking with each of the vendors to see if they have an API or data export that you can query against instead.