My website is taking a long time processing and opening pages. Why? - php

I have developed a web portal in Joomla which is heavily image-oriented, so there are lots of images on the site. The website is taking a long time to process and open pages. What could be the reason, and how can I resolve it?

Get Firebug for Firefox; it is a very useful utility that I use every day. It will show you exactly what is loading, how large it is, and how long it is taking.
Yahoo has a good article on best practices.
There are a ton of things you can do, but I would start with the Yahoo article above and then come back and ask any questions you still have.

One major thing to check is simply that you are optimizing all your images. The link to Yahoo's best practices that Nick provided in a previous answer has some good information. If you are working with Joomla out of the box, as they say, the most likely culprits are the images you are using. Make sure your images are the correct display size, not larger. Don't use the HTML height and width attributes to scale an image down, like this:
<img src="someimage.jpg" height="400" width="600" >
Use a photo editor to make the image the actual size you need.
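If resizing every image by hand is impractical, a server-side pass with PHP's GD extension can produce a correctly sized file instead; a minimal sketch, where the file names, dimensions, and quality setting are only examples:

<?php
// Resize a source JPEG to the exact display size once, server-side,
// so the browser never downloads more pixels than it will show.
// File names and dimensions are placeholders.
$src  = imagecreatefromjpeg('someimage.jpg');
$dstW = 600;
$dstH = 400;

$dst = imagecreatetruecolor($dstW, $dstH);
imagecopyresampled(
    $dst, $src,
    0, 0, 0, 0,
    $dstW, $dstH,
    imagesx($src), imagesy($src)
);

imagejpeg($dst, 'someimage-600x400.jpg', 85); // 85 = JPEG quality
imagedestroy($src);
imagedestroy($dst);

The page then references the resized file directly, with no height/width scaling in the markup.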

My first test would be to run the website on a local network and then check the performance. If it is still slow, it has nothing to do with the images; there is something wrong with the code, and you will need to trace it to find out which path is taking too long.
If images are the problem, I suggest you host them on a fast server such as Amazon's AWS. Even then, images are cached by the browser, so subsequent visits should be fast.
Also use Google Chrome to find out what is taking time: press Ctrl-Shift-J in the Chrome window and you'll get some pretty interesting statistics on what is consuming time and network resources.

Related

How bad is it to have image strings embedded directly in my HTML?

Let's say I'm making some kind of PHP-based single-page image board that shows 100 small (<10kb) user-uploaded images. The oldest image is deleted from the server as soon as a new image is uploaded. Because the images come and go so rapidly I feel like there's not much of an advantage to caching them. It occurs to me that I could just embed the images as strings directly in the page's HTML, and that would also minimize the number of requests to my server. However, I feel like this must be a "bad thing". I'm wondering what input you knowledgeable folks have? Thank you!
It will be faster, as far fewer connections are used. But you need to ask yourself whether one particular user browsing your site will see those images more than once; if yes, then let the web server do the rest (a 304 response or similar).
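For reference, "embedding images as strings" in the HTML usually means base64 data: URIs; a rough PHP sketch of how that could look, with the file name assumed:

<?php
// Emit an <img> tag with the image embedded as a base64 data: URI,
// so the browser needs no extra request for it.
// 'thumb.png' is a placeholder path. Note that base64 inflates the size
// by roughly a third, and the embedded bytes cannot be cached separately
// from the page itself.
$path = 'thumb.png';
$data = base64_encode(file_get_contents($path));
$mime = mime_content_type($path); // requires the fileinfo extension

echo '<img src="data:' . $mime . ';base64,' . $data . '" alt="thumbnail">';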

How can I restrict a user to viewing a slideshow only, so they can't download it?

I am developing a CakePHP application in which I want to show a PowerPoint slideshow to the end user, but the condition is that the user should only be able to view the show, not download it.
Can anyone please suggest the best way to do this?
If the slideshow is based on images, you can split each image into 9, 16, or more tiles and display the tiled image. That way, if the user decides to 'Save as' an image, he will get only 1/9th or 1/16th of the real slide. If the slideshow is quite big, it will be a pain to put all the pieces together, which will discourage users from trying to save the slides.
You can see such an implementation here - http://whatismycar.com/info/16540/ - the four images below the header are in a fancybox, and if you try to 'Save as' one of them you will save only a small tile of the original image.
Hope this helps.
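A rough sketch of the tiling idea with PHP's GD extension, splitting one slide into a 3x3 grid of tile files; the paths, grid size, and quality are assumptions:

<?php
// Split slide.jpg into a 3x3 grid of tiles (tile_0_0.jpg ... tile_2_2.jpg)
// that the page can reassemble with positioned <img> tags or CSS,
// so a plain "Save as" only captures one ninth of the slide.
$src   = imagecreatefromjpeg('slide.jpg');
$cols  = 3;
$rows  = 3;
$tileW = (int) (imagesx($src) / $cols);
$tileH = (int) (imagesy($src) / $rows);

for ($row = 0; $row < $rows; $row++) {
    for ($col = 0; $col < $cols; $col++) {
        $tile = imagecreatetruecolor($tileW, $tileH);
        imagecopy($tile, $src, 0, 0, $col * $tileW, $row * $tileH, $tileW, $tileH);
        imagejpeg($tile, "tile_{$row}_{$col}.jpg", 85);
        imagedestroy($tile);
    }
}
imagedestroy($src);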
It is impossible to prevent images from being downloaded from the internet, but you can make it hard for users with an approach like that. You can also hide the source image path in your HTML by serving the image through a PHP script.
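Hiding the path usually means streaming the file through a small PHP script so the real location never appears in the HTML; a minimal sketch, with the ID-to-path map and directory invented for illustration:

<?php
// image.php?id=42 — streams an image without exposing its real path.
// The lookup table and storage directory are placeholder examples;
// the markup would reference <img src="image.php?id=42">.
$images = [
    42 => '/var/private/slides/slide42.jpg',
];

$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
if (!isset($images[$id])) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($images[$id]));
readfile($images[$id]);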
While I am no expert on this subject, something worth noting is what Youtube seems to be doing.
Ever notice how the whole video never loads if you pause it?
If you monitor the network tab during a video, you will see that they are actually making hundreds or even thousands of requests for video segments from their server, most likely using JS to clear already-watched parts from the cache.
This is why going back to an earlier point in the video causes it to stall for a bit while it re-downloads the segment you wish to see.
At the end of the day, PrtScn will trump all of your efforts, because the web browser does not have the privilege to control the keyboard outside of its own environment.

How do sites like Bing Search, Imgur, and Reddit generate a thumbnail of the website from a URL?

On Imgur, you can input an image URL and a few seconds later there's a thumbnail of the image. In Bing Search, you can (or used to be able to) view a thumbnail of the website in the search results before visiting it.
I would love to implement something similar for my website, but I can't wrap my head around how it is done. Moreover, are there not security concerns? I'd imagine the servers have to at least download the website, render it, and take a screenshot. What if it's a malicious website, and you end up downloading something malicious onto your server?
A headless web browser engine like PhantomJS can be used for this; see the example on their wiki. Yes, it would be prudent to run this in some sort of sandbox, feeding a queue of URLs into it and then taking the generated thumbnails from the file system.
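One possible way to wire this up from PHP is to shell out to PhantomJS using the rasterize.js script shipped in its examples directory and then shrink the capture with GD; a sketch only, where the binary path, script path, output paths, and thumbnail size are assumptions:

<?php
// Render a URL to a full-page screenshot with PhantomJS (via its bundled
// examples/rasterize.js script), then downscale the capture to a thumbnail.
// Paths and sizes are placeholders; in production this should run inside
// a sandbox, fed from a queue of URLs.
$url  = 'https://example.com/';
$shot = '/tmp/shot.png';

$cmd = 'phantomjs /opt/phantomjs/examples/rasterize.js '
     . escapeshellarg($url) . ' ' . escapeshellarg($shot);
exec($cmd, $output, $status);
if ($status !== 0) {
    exit('Rendering failed');
}

$src   = imagecreatefrompng($shot);
$thumb = imagecreatetruecolor(200, 150);
imagecopyresampled($thumb, $src, 0, 0, 0, 0, 200, 150, imagesx($src), imagesy($src));
imagepng($thumb, '/tmp/thumb.png');
imagedestroy($src);
imagedestroy($thumb);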
While I don't know the internal workings of any of the aforementioned services, I'd guess that they download/create a local copy of the images and generate a thumbnail from that.
Imgur, as an image hosting service, definitely needs a copy of the image prior to being able to generate thumbnails or anything else from it. The image may be stored locally or just in memory, but either way, it must be downloaded.
The search engines displaying screenshots of the sites likely have services that periodically take a screenshot of the viewable area when the content is getting indexed, and then serve those screenshots (or derivatives) along with the search results. Taking a screenshot really isn't dangerous, so there's nothing to worry about there, and whatever tools are used to load/parse/index the websites will obviously be written with security considerations in mind.
Of course, there are security concerns about the data you're downloading, too; the images can easily contain executable code (such as PHP) in their EXIF data, so you need to be careful about what you do with the images and how.
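One common precaution along those lines (not necessarily what these services do) is to re-encode every downloaded or uploaded image, which discards the original EXIF blocks in the process; a minimal sketch with placeholder file names:

<?php
// Re-encode a JPEG with GD. The new file contains only pixel data,
// so any EXIF metadata (including anything hidden in it) from the
// original is dropped. 'upload.jpg' / 'clean.jpg' are placeholders.
$img = imagecreatefromjpeg('upload.jpg');
if ($img === false) {
    exit('Not a valid JPEG');
}
imagejpeg($img, 'clean.jpg', 90);
imagedestroy($img);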

Gallery Images Ideas

Since multiple requests can slow down the speed at which a site loads, I was thinking that in the case of a gallery, would one large image containing all the thumbnails be better than loading individual thumbnails?
The large image would then use PHP to "chop up" the thumbnails and place them in the relevant locations on the page.
My main concern is would this have a negative impact on SEO? Since Google would only see one large image file, instead of many smaller ones. Would a way around this be to set the src of all thumbnails to redirect to the script that handles the thumbnail generation, where the image file name refers to a particular set of coordinates for that image?
As a rule of thumb: for buttons, icons, and things like that, use image sprites (one large image combining all the small ones, with CSS showing only the part between specific coordinates); for 'real' content images, just use separate images.
This has several reasons. Icons, buttons, and so on appear on almost every page of your site, and often multiple times on the same page, so it is really useful to combine them: it is very inefficient to open a new HTTP connection to download an icon of 1 KB (or less), and imagine what happens if you use hundreds. Furthermore, this type of image is not important at all for your SEO rank, only for the look of your site (and Google doesn't care whether your site is ugly as hell or beautiful as a princess).
On the other hand, 'content' images, such as thumbnails or photos of your holiday or baseball tournament, are usually big enough to rule out the efficiency argument. As you can see in the Chrome developer tools or Firebug, the browser starts downloading all images simultaneously, so downloading one image is pretty much as fast as downloading a hundred. But if you combine a hundred images, the download will be slower, because you have to download a larger chunk of data in one piece. By comparison, pushing 2 gallons of water through one hose takes longer than pushing the same 2 gallons through 10 hoses (of course this metaphor has its holes, but it illustrates the point).
More importantly, Google reads the img tags and uses the filename (src), the title, and (less importantly) the alt attributes to determine how your image should relate to your SEO rank. Images do have a relevant effect on your SEO rank! But Google also knows whether it is the same image showing or a different one, so a sprite wouldn't help you here. A script with parameters saying which part of the image has to be loaded wouldn't help you at all; I believe if you think it over you can figure out why ;)
So don't bother merging the thumbnails and things like that. If you want to improve speed, turn your attention to caching and speeding up transmission. Some really simple improvements can be made by using, for example, gzip compression (Google ".htaccess gzip" for that), proper caching headers, etc.
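If you cannot change the server configuration, both of those can also be approximated from PHP itself; a minimal sketch, with the cache lifetime chosen arbitrarily:

<?php
// Compress the page output and mark it cacheable for a day.
// Values are illustrative; .htaccess / server-level configuration is
// usually the better place for this.
ob_start('ob_gzhandler');                       // gzip the response if the client accepts it
header('Cache-Control: public, max-age=86400'); // cache for 24 hours
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 86400) . ' GMT');

echo '<!doctype html><html><body>...page content...</body></html>';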
You got it right: it is always better to download one large image and get all the images from there. I guess you meant JavaScript for the 'chop up' part, because you have to do that on the client side. In terms of performance this is a very good idea, and a lot of sites do it. Another idea is to use smaller images and resize them on the client side; just be careful that the resizing doesn't affect the quality of the image.
I'm not so sure about this being negative for SEO, and as far as I know Google doesn't execute any JavaScript, so I don't think that workaround would work. But in all honesty I'm not sure about this; in my last job we never considered images to have a big impact on SEO.

How can I speed up image load time in my web site?

I am currently developing a website with PHP + MySQL and jQuery. So far I have been doing it on my local machine. I notice that when I view a page, the images on it take some time to load (not much, but it's visible). All the images are small (PNGs under 3 KB). When I load the page, there are also some database connections happening in order to get the data I display.
I am not sure if this loading-time issue has something to do with the number of images, or with the time that the PHP script and the DB connections take to execute. (I have very little data in my database, so I wouldn't assume the latter.)
My question is: is it a good approach to preload all the images at the beginning of each page? I tried it with jQuery and it works fine. I'm just not sure what disadvantages it brings. For example, to do so, do I need to include the jQuery library at the beginning of the page? I thought that was bad practice.
If these PNGs are stored in the database as BLOBs — not clear from your question — don't do that. Serving images from a DB through PHP is not as efficient as letting the web server serve them straight from the filesystem. If the images are tied to particular records, just name the PNG after the row ID, so you can find it in a directory dedicated to storing those images. The PHP code then just generates the URL that points to the PNG file on disk, so the web server can send them statically.
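A sketch of that pattern, with the table, columns, and image directory invented for illustration:

<?php
// Store each image on disk, named after the row ID, and let the web server
// serve it statically; PHP only builds the URL.
// Table name, column names and the /images/products/ directory are assumptions.
$pdo  = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$stmt = $pdo->query('SELECT id, name FROM products');

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // e.g. /images/products/42.png lives on the filesystem, not in the DB
    $src = '/images/products/' . (int) $row['id'] . '.png';
    echo '<img src="' . $src . '" alt="' . htmlspecialchars($row['name']) . '">';
}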
I don't think preloading the images within the same page is going to buy you anything. If anything, it might slow the apparent overall page load time because the browser can only retrieve a fixed number of resources concurrently, typically 2-4. Loading images at the top of the <body> means there are other things at the top of the page "above the fold" that have to wait for some HTTP connection slots to free up. Better to let the images load in their natural order.
Preloading makes sense in two situations:
The image isn't shown by default, but is expected to be needed as the user interacts with the page. Good examples of this are the hover and click state images for rollovers.
The image isn't used on this page, but will be needed on the next. Good examples of this are any site where there is a clear progression from one page to the next, like in a shopping cart.
Either way, do the preload at the very bottom of the <body>, so everything else loads first.
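For illustration, a tiny PHP helper that could emit such a preload block just before the closing </body> tag; the function name and file list are made up:

<?php
// Called just before </body>: preload images that the *next* page (or a
// hover state) will need, after everything on the current page has already
// been requested. The file list is a placeholder.
function preload_images(array $files)
{
    echo '<div style="display:none">';
    foreach ($files as $file) {
        echo '<img src="' . htmlspecialchars($file) . '" alt="">';
    }
    echo '</div>';
}

preload_images(['cart-step2-banner.png', 'button-hover.png']);
// ... then close </body></html>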
Having addressed those two issues, run YSlow on your site. It started out as a plugin for Firebug, which in turn is a plugin for Firefox, but it's since been ported to all major browsers except IE.
The beauty of YSlow is that it detects common slowdowns automatically, just by loading the page while the extension is active. It then gives you a clear grade for the page, so you can judge when you're "done" optimizing. If you're below an A, you're not done yet. :) It's not uncommon to see sites rating a D or worse, because the default configuration for web servers is conservative to avoid causing problems. Fixing YSlow warnings is generally pretty easy, but you have to be careful to avoid creating caching and other problems, which is why the default server config doesn't do these things.
Another answer recommended the Google PageSpeed offering. It's available as a plugin for Chrome and Firefox, as a server-side Apache module, and as a Google-hosted service. I have no idea how it compares to YSlow, but it looks interesting.
Also consider using the browser's debugger to get a waterfall graph of resource load times:
In Firebug you get this in the Net tab.
In Safari, you get to it via the Develop menu, which is normally disabled in Preferences. Turn it on if needed, then say Develop > Start Timeline Recording. That puts you into the Network Requests instrument. You can also get to it through Develop > Show Web Inspector.
In Chrome, say View > Developer > Developer Tools, then go to the Network tab.
IE has a very weak form of this, via Tools > Developer Tools > Profiler. It just gives a table of numbers, rather than a waterfall graph, so the information is there, but you can't just visually scan for long bars to find the slowest resources.
You should use the PageSpeed plugin from Google to check which data takes the most time to load. It will show separate load times for images as well.
If you're using lots of small PNGs, I suggest combining them into one image and controlling what is displayed via the CSS background property, since they are part of the styling and not the content. In that case, instead of several images only one will be loaded and reused across all elements, and even preloading that single image is really easy.
Have you considered using CSS Sprites to combine all of your images into a single download? There are a number of tools online to help you do this, and it will significantly reduce the number of HTTP requests required by your page.
Make sure you have set the correct Expires header on your images to allow them to be cached.
Finally, take a look at YSlow, which can provide you with further optimisation tips.
