Should I only be loading JavaScript files on the pages that need them? - php

So I am starting to get interested in optimization, and I was wondering: is it worth it to include JavaScript files only on the pages that need them and exclude them from the rest using PHP?
Thanks

Loading only the scripts you need will definitely help.
Another huge improvement can be made by combining all your JavaScript into one file and (if possible) minifying and caching the output. The fewer requests a browser has to make, the faster your page will load.
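As a rough sketch of per-page loading in PHP (the page names and file paths here are made up for the example):

<?php
// Hypothetical map of page => the scripts it needs; only those get emitted.
$scriptsPerPage = [
    'index'   => ['js/common.js', 'js/slider.js'],
    'contact' => ['js/common.js', 'js/form-validation.js'],
];

$page = basename($_SERVER['SCRIPT_NAME'], '.php'); // e.g. "index" for index.php

foreach ($scriptsPerPage[$page] ?? ['js/common.js'] as $src) {
    echo '<script src="' . htmlspecialchars($src) . '"></script>' . "\n";
}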

Ideally, yes.
Each page should not load more than it needs. Minimising traffic is always good.
It is a good idea to have a framework for generating pages, and only add the resources you need for each page.

If you include all or most of the JavaScript that your site needs on your main page (in a minified and compressed form, like what is provided by YUI Compressor), it will increase the load time of your main page but will decrease the load time of every other page, because the JavaScript will already be in the cache. A lot of web applications use this technique so that the user experience after the initial load is smoother.
Either technique has its merits; it really just depends on the type of application/page you are building. But like dirkbonhomme said, you should be minifying, compressing, and caching the JavaScript (and CSS) no matter which way you go.
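To sketch the "combine, compress, cache" idea in PHP (the file names are hypothetical, and in practice you would minify the sources first with something like YUI Compressor):

<?php
// combined-js.php: serve several scripts as one gzipped, cacheable response.
$files = ['js/common.js', 'js/menu.js', 'js/forms.js']; // hypothetical paths

header('Content-Type: application/javascript');
header('Cache-Control: public, max-age=604800'); // let browsers cache it for a week

ob_start('ob_gzhandler'); // gzip the output if the client supports it
foreach ($files as $file) {
    readfile($file);
    echo ";\n"; // guard against files that lack a trailing semicolon
}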

Related

create html page from php

I am the webmaster of a dynamic website, and because of the many complicated queries I have to run on the front page and some other pages, the server sometimes gets overloaded when the number of visitors to our website is high.
So I got the idea of periodically generating (every 2 minutes) a static HTML snapshot of these pages. This would hit the server only once every 2 minutes, and only for one visitor.
My question is: is this a good idea? I plan to generalize it to many other pages, and I don't want to be surprised and have to go back and redo it.
If it isn't, are there any good ways to avoid this load?
Thank you in advance
PS: I would maybe publish the method I use to do this, to see if there is a better way.
I don't think it's a bad idea, but you should use an existing caching solution rather than implementing your own. Why not use memcached? I think that's what you are looking for; just use it for those parts of your code that are taking a long time.
Caching is a good idea to protect your server from overload. Many CMSes (content management systems) use this technique.
Sure, it's called caching :)
However, most sites are caching just parts of their content. You can't cache a whole page if you are using user specific content, for example the name of a logged-in user. But you can cache the heavy parts of your site and combine it with a dynamic page.
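For example, a minimal sketch of caching only the heavy, shared part of the front page with Memcached while the user-specific part stays dynamic (the key name and the buildFrontPageHtml() helper are invented for this example):

<?php
session_start();

$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$html = $memcached->get('front_page_articles');          // shared, expensive part
if ($html === false) {
    $html = buildFrontPageHtml();                         // hypothetical: runs the heavy queries
    $memcached->set('front_page_articles', $html, 120);   // regenerate at most every 2 minutes
}

echo $html;
echo 'Logged in as ' . htmlspecialchars($_SESSION['username'] ?? 'guest'); // stays per-user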
Your idea is really good, and many big websites use this concept. You can also use caching techniques: if you want to avoid database hits, caching will serve you better. You can use Memcached: http://memcached.org/.

How to reduce php page loading time?

I have created web pages for a magazine in PHP. I include header and footer files on each page. Apart from the header and footer includes, my pages contain only HTML, yet the page loading time is still too long. Can anyone help me solve this problem? I am very new to PHP, so I don't know the tricks.
Use a good web server and an opcode cache like OPcache. Reduce the file sizes of your images, JavaScript, and CSS. Avoid unneeded loops.
Those are just some examples; it depends on what you are doing.
You can try this page: http://www.webpagetest.org/. It measures the performance of your generated output (HTML).
Everything else, like slow PHP execution times, cannot be measured properly just by observing the frontend. You will have to pick up some skills and tools to profile your PHP application and see what is actually slow. The standard procedure is to search for things like "optimizing PHP scripts" or "profiling a PHP script". You will probably need to invest money to get proper tools.
PHP may not even be the culprit here. Slow server responses, bad MySQL queries, and so on can all affect execution and delivery speed.
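If you want a first, crude measurement before reaching for a real profiler (Xdebug, XHProf, Blackfire, etc.), you can time the suspect sections yourself; header.php and footer.php here stand in for whatever your pages actually include:

<?php
$start = microtime(true);

include 'header.php';                  // the header include mentioned in the question
$afterHeader = microtime(true);

// ... the page body would be generated here ...

include 'footer.php';
$end = microtime(true);

error_log(sprintf(
    'header: %.1f ms, body and footer: %.1f ms',
    ($afterHeader - $start) * 1000,
    ($end - $afterHeader) * 1000
));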

Loading multiple embedded Flash apps onto an HTML page - problem with ordering

We need to load an embedded version of a site written in Flash, which was not originally designed to load multiple instances of itself, on an HTML page. The specific issue is how to get the instances to load in order when embedded, given that they are all being opened by the same instance of the Flash player.
It's a complicated mapping application, and at the moment, the maps and data get intermixed as the session variables are overwritten by another instance starting to load before the previous one has finished. We need a way to have them load sequentially, one finishing before another starts to load.
The most we can specify in the URL is an &order=1 or similar. We have PHP and SQL on the backend.
Edit:
The embedded versions are being loaded in an iFrame of a parent site. One php file loads one swf, as many times as the parent site desires.
I would use JavaScript and swfobject to load the Flash apps sequentially onto the page. The JavaScript code would hold the order in which to load the Flash apps. Each Flash app should notify the JavaScript on the containing page when it has finished loading, by calling a function via ExternalInterface; that function would then trigger the loading of the next Flash app in the list.
I'm sorry to say that my suggestion may not seem the simplest, but it is probably the most realistic way to keep a stable version up: spend a couple of days refactoring the setup core of the application so that it matches the current requirements.
It's often not a good idea to look for an external 'hack' to make things kind-of-work, as it may be thrown away later and complicate matters even further. In my experience, all summed up, you always spend more time (and more painful time) tweaking that kind of quick fix than you would have spent sitting down and thinking it over for a while.

When is it appropriate to use AJAX?

When is it appropriate to use AJAX?
what are the pros and cons of using AJAX?
In response to my last question: some people seemed very adamant that I should only use AJAX if the situation was appropriate:
Should I add AJAX logic to my PHP classes/scripts?
In response to Chad Birch's answer:
Yes, I'm referring to developing a "standard" site that would employ AJAX for its benefits, and wouldn't be crippled by its application. Using AJAX in a way that would kill search rankings would not be acceptable. So if "keeping the site intact" requires more work, then that would be a "con".
It's a pretty large subject, but you should be using AJAX to enhance the user experience, without making the site totally dependent on it. Remember that search engines and some other visitors won't be able to execute the AJAX, so if you rely on it to load your content, that will not work in your favor.
For example, you might think that it would be nice to have users visit your blog and then have the page dynamically load the newest article(s) with AJAX once they're already there. However, when Google tries to index your blog, it's just going to get a blank page.
A good search term to find resources related to this subject is "progressive enhancement". There's plenty of good stuff out there, spend some time following the links around. Here's one to start you off:
http://www.alistapart.com/articles/progressiveenhancementwithjavascript/
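One hedged sketch of that idea on the PHP side: always render the latest articles into the HTML, so crawlers and no-JavaScript visitors see them, and expose the same data separately for an optional AJAX refresh (fetchLatestArticles() is an invented placeholder for your real query code):

<?php
$articles = fetchLatestArticles(5); // hypothetical helper wrapping the real query

// Optional JSON view for the AJAX enhancement; crawlers never need it.
if (($_GET['format'] ?? '') === 'json') {
    header('Content-Type: application/json');
    echo json_encode($articles);
    exit;
}
?>
<ul id="latest-articles">
<?php foreach ($articles as $article): ?>
    <li><?= htmlspecialchars($article['title']) ?></li>
<?php endforeach; ?>
</ul>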
When you are only updating part of a page, or perhaps performing an action that doesn't update the page at all, AJAX can be a very good tool. It's much more lightweight than an entire page refresh for something like this. Conversely, if your entire page reloads or you change to a different view, you really should just link (or post) to the new page rather than download it via AJAX and replace the entire contents.
One downside to using AJAX is that it requires JavaScript to be working, or requires you to construct your view in such a way that the UI still works without it. This is more complicated than doing it just via normal links/posts.
AJAX is usually used to perform an HTTP request while the page is already loaded (without loading another page).
The most common use is to update part of the view. Note that this does not include refreshing the whole view since you could just navigate to a new page.
Another common use is to submit forms. In all cases, but especially for forms, it is important to have good ways of handling browsers that do not have JavaScript or where it is disabled.
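As a rough sketch of handling a form both ways on the server, one common convention is to check the X-Requested-With header that libraries such as jQuery add to their XHR requests (the file names and the one-line "validation" are placeholders):

<?php
// save-comment.php: the same endpoint answers AJAX and plain form posts.
$isAjax  = (($_SERVER['HTTP_X_REQUESTED_WITH'] ?? '') === 'XMLHttpRequest');
$comment = trim($_POST['comment'] ?? '');
$saved   = ($comment !== '');   // stand-in for real validation plus a database insert

if ($isAjax) {
    header('Content-Type: application/json');
    echo json_encode(['saved' => $saved]);                 // the page updates itself with JS
} else {
    header('Location: thread.php?saved=' . (int) $saved);  // no JS: a normal redirect
    exit;
}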
I think the advantage of using AJAX technologies isn't only creating a better user experience; the ability to make server calls for only specific data is a huge performance benefit.
Imagine a site that eats a lot of bandwidth (like Stack Overflow): most of the navigation done by users happens through page reloads, with data continuously sent over HTTP.
Of course caching and other techniques help with this bandwidth overhead, but personally I think that sending huge chunks of HTML every time is really a waste.
Cons are SEO (which doesn't work well with heavily AJAX-based sites) and users who have JavaScript disabled.
When your application (or your users) demand a richer user experience than a traditional webpage is able to provide.
Ajax gives you two big things:
Responsiveness - you can update only parts of a web page at a time if need be (saving the time to reload a page). It also makes it easier to page data that is presented in a table, for instance.
User Experience - this goes along with responsiveness. With AJAX you can add animations, cooler popups and special effects to give your web pages a newer, cleaner and cooler look and feel. If no one thinks this is important, then look at the iPhone. User experience draws people into an application and makes them want to use it, one of the key steps in ensuring an application's success.
For a good case study, look at this site. AJAX effects like animating your new Answer when posted, popups to tell you you can't do certain things and hints that new answers have been posted since you started your own answer are all part of drawing people into this site and making it successful.
Javascript should always just be an addition to the functionality of your website. You should be able to use and navigate the site without any Javascript involved. You can use Javascript as an addition to existing functionality, for example to avoid full-page reloads. This is an important factor for accessibility. Javascript should never be used as the only possibility to reach or complete a request on your site.
As AJAX makes use of Javascript, the same applies here.
Ajax is primarily used when you want to reload part of a page without reposting all the information to the server.
Cons:
More complicated than doing a normal post (working with different browsers, writing server-side code to handle partial postbacks).
Introduces potential security vulnerabilities: you are introducing additional code that interacts with the server, and this can be a problem on both the client and the server. On the client, you need ways of sending requests and receiving responses; that is another way of interacting with the browser, which means another point of entry that has to be guarded (executing arbitrary code, posting data to an unintended destination, and so on). Several exploits in AJAX apps have been plugged over time, but there will always be more.
Pros:
It looks flashier to end users
Allows a lot of information to be displayed on the page without having to load it all at the same time
Page is more interactive.

Creating a two-pass PHP cache system with mutable items

I want to implement a two-pass cache system:
The first pass generates a PHP file with all of the common stuff (e.g. news items) hardcoded. The database has a cache table linking these files to the pages (e.g. "index.php page=1 style=default"); it also stores an up-to-date flag which, if false, causes the first pass to rerun the next time the page is viewed.
The second pass fills in the minor details, such as how long ago something happened, and mutable items like "You are logged in as...".
However, I'm not sure of an efficient implementation that supports both cached and non-cached (e.g. search) pages without a lot of code and several queries.
Right now, each time the page is loaded, the PHP script runs and regenerates the page. For pages like search this is fine, because most searches are different, but for other pages such as the index the output is virtually the same for each hit, yet it generates a large number of queries and is quite a long script.
The problem is that some parts of the page do change on a per-user basis, such as the "You are logged in as..." section, so simply saving the generated pages would still result in tens of thousands of nearly identical pages.
The main concern is reducing the load on the server, since I'm on shared hosting and at this point can't afford to upgrade, but the site is using a sizeable portion of the server's CPU and putting a fair load on the MySQL server.
So basically, minimising how much has to be done for each page request, and not regenerating things like the news items on the index all the time, seems a good start, compared to, say, search, which is a far less static page.
I actually considered hardcoding the news items as plain HTML, but that would mean maintaining them in several places (since they may be used for searches, the comments are on a page dedicated to that news item (i.e. news.php), etc.).
I second Ken's recommendation of PEAR's Cache_Lite library; you can use it to easily cache either parts of pages or entire pages.
If you're running your own server(s), I'd strongly recommend memcached instead. It's much faster since it runs entirely in memory and is used extensively by a lot of high-volume sites. It's a very easy, stable, trouble-free daemon to run. In terms of your PHP code, you'd use it much the same way as Cache_Lite, to cache various page sections or full pages (or other arbitrary blobs of data), and it's very easy to use since PHP has a memcache interface built in.
For super-high-traffic full-page caching, take a look at running Varnish or Squid as a caching reverse proxy server. (Pages served by Varnish will easily come out 100x faster than anything that hits the PHP interpreter.)
Keep in mind that with caching, you really only need to cache things that are frequently accessed. It can be a trap to develop a really sophisticated caching strategy when you don't need it. For a page like your home page that gets hit several times a second, you definitely want to optimize for speed; for a page that gets maybe a few hits an hour, like a month-old blog post, caching is a bad idea: you only waste your time and make things more complicated and bug-prone.
I recommend not reinventing the wheel... there are template engines that support caching, like Smarty.
For server-side caching, use something like Cache_Lite (and let someone else worry about file locking, expiry dates, and file corruption).
You want to save the results to a file and use logic like this to pull them back out:
<?php
$cacheFile = 'cache/index.html';               // hypothetical path for this page's snapshot

if (is_file($cacheFile)) {
    include $cacheFile;                        // cache hit: just include the file
} else {
    ob_start();
    generate_results();                        // your existing page-building code
    $html = ob_get_clean();                    // render to HTML (as a string)
    file_put_contents($cacheFile, $html);      // write it to the cache file
    echo $html;                                // output the string (or include the file)
}
To be clear, you don't need two passes because you can save parts of the page and leave the rest dynamic.
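If you do go with Cache_Lite as suggested above, the usage has roughly the same shape; this follows the pattern from the PEAR documentation, so check it for the exact options and signatures (buildNewsBlockHtml() is a made-up stand-in for the expensive part of your page):

<?php
require_once 'Cache/Lite.php';

$cache = new Cache_Lite([
    'cacheDir' => '/tmp/',   // where the cache files live
    'lifeTime' => 120,       // seconds before an entry is considered stale
]);

if ($html = $cache->get('index_news_block')) {
    echo $html;                      // cache hit: no queries run
} else {
    $html = buildNewsBlockHtml();    // hypothetical: the expensive part of the page
    $cache->save($html, 'index_news_block');
    echo $html;
}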
As always with this type of question, my response is:
Why do you need the caching?
Is your application consuming too much IO on your database?
What metrics have you run?
You are talking about adding an extra level of complexity to your app, so you need to be very sure that you actually need it.
You might actually benefit from the built-in MySQL query cache, if the database is the contention point in your system. The other option is to use Memcache.
I would recommend using an existing caching mechanism. Depending on what you really need, you might be looking at APC, memcached, or various template caching libraries... It's easier and faster to tune already-written, tested code to fit your needs than to write everything from scratch (usually, although there may be situations where you don't have a choice).
