I'm building a travel blog (PHP) where a page might load dozens of pictures (500x375, 150-200 KB each), so the page weighs more than 4-5 MB.
What's the way to go, apart from caching/gzip, to decrease waiting time and make a better user experience?
I'm on a shared server, as my budget is very low.
Thanks.
Some options:
split the images up across multiple pages
use a 'lazy load' script that only requests images as they come into the viewport (see the sketch after this list)
use AJAX to request images as needed via a user action
leverage external hosting of the images (Flickr, etc.) to split the requests among different servers.
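For the lazy-load option, a minimal jQuery sketch, assuming hypothetical markup like <img class="lazy" data-src="photo.jpg" src="placeholder.gif"> where data-src holds the real URL:

function loadVisibleImages() {
    var viewportBottom = $(window).scrollTop() + $(window).height();
    $('img.lazy').each(function () {
        if ($(this).offset().top < viewportBottom) {
            // swap in the real image and stop tracking this element
            $(this).attr('src', $(this).data('src')).removeClass('lazy');
        }
    });
}
$(window).scroll(loadVisibleImages);
$(loadVisibleImages); // run once on DOM ready for images already in view

An off-the-shelf lazy-load plugin does essentially this, with throttling and more edge cases handled.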
If you're displaying dozens of images on one page, I would consider just showing small images / thumbnails that get enlarged when the visitor clicks on them.
Some approaches that address this:
1) Show a few images, with a 'more' link or icon below them
2) On click, make an AJAX call and show the other images (see the sketch below)
3) You can also use a jQuery lazy-loading plugin; it's very easy to integrate
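A rough sketch of points 1 and 2, assuming a hypothetical server-side script more-images.php that returns the next batch of image URLs as JSON:

var offset = 10; // however many images the page starts with
$('#more-link').click(function () {
    $.getJSON('more-images.php', { offset: offset }, function (urls) {
        // e.g. urls = ["img/11.jpg", "img/12.jpg", ...]
        $.each(urls, function (i, url) {
            $('<img>').attr('src', url).appendTo('#gallery');
        });
        offset += urls.length;
    });
});

The #more-link and #gallery ids are placeholders for your own markup.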
I'm building a form page for creating an ad, which requires adding one or more images, so the user must fill in the form, add the images, and send it with POST to reach the preview page before the ad is published.
The problem is the time the POST takes to upload the images; it's too long. Is there a way to reduce it? I know there is a method for resizing the images with Canvas before uploading, but what about the original file sent by the form with POST?
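For reference, the Canvas resizing I'm referring to looks roughly like this sketch (the #photo input id and the 800px target width are placeholders):

$('#photo').on('change', function () {
    var file = this.files[0];
    var img = new Image();
    img.onload = function () {
        // draw the image onto a smaller canvas, then export it as a JPEG blob
        var canvas = document.createElement('canvas');
        var scale = 800 / img.width;
        canvas.width = 800;
        canvas.height = img.height * scale;
        canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
        canvas.toBlob(function (blob) {
            var form = new FormData();
            form.append('image', blob, file.name);
            // send 'form' via $.ajax({ processData: false, contentType: false, ... })
        }, 'image/jpeg', 0.8);
    };
    img.src = URL.createObjectURL(file);
});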
Sometimes the I/O speed of virtual machines is not good. Sometimes web servers allow only one connection from the browser. You have to analyze what's going on on your server.
I am creating a website with many pages to scroll through using next and prev buttons. I started by creating a JavaScript slideshow, but there were many problems creating hash URLs and the subsequent social 'like' buttons for each image. I ditched that approach because 1) users with JS turned off could not enjoy the images at all and 2) it was a pain in my ass.
What I have now is a collection of many pages, each with an image on it, where each next button links to the next page.
What is the best way to make the page transition look like nothing changes except for a preloaded image (the header, asides and footer don't 'blink')? Much like funnycatpix.com/_pics/Cat_Lookout.htm
Keep in mind I don't want to use iframes or AJAX, and I want to keep jQuery to a minimum. I have PHP 5 on my server as well.
Do I cache the images of the next page? Do I cache the entire next page? Do I change the headers to extend the cache lifetime?
Without seeing any code I can't offer a lot of specific help. You can cache an image very easily just by using this:
$("<img>").attr('src', imageSource);
This can be used to preload images, so you can preload the images that will appear on the "next" page when the current page loads, or each time "next" is clicked.
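For example, a sketch that preloads a list of image URLs (the nextPageImages array is a placeholder for however you know which images the next page needs):

var nextPageImages = ['photos/12.jpg', 'photos/13.jpg']; // placeholder URLs
$.each(nextPageImages, function (i, src) {
    // creating the element makes the browser fetch and cache the image;
    // it never has to be added to the document
    $('<img>').attr('src', src);
});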
I have a web application, something like an image gallery, with a carousel at the bottom. I was thinking of a scenario: for example, what if the user uploaded 1000 images to the gallery?
I was just wondering what the proper way or technique is to load 1000 images fast, so that the user can view them immediately.
Please help me.
1000 images is a lot. I would definitely not load them all at once, but rather in batches somehow, kind of like Facebook does (I think): it fills up the view, and when you scroll down it keeps loading more, but only as you scroll. You could also simplify it by using a "Load more" button.
If it really is a carousel as you say, I would expect it to show only a certain number of images at a time. In that case I would load just the visible ones and, for example, twice that number into a buffer. Then when the user goes to the next page you can replace the current ones with the first ones from the buffer and load some more into the buffer.
Loading 1000 images at one time would be a ridiculous waste of bandwidth, not to mention it would cause major lag for the client. Some client-side JavaScript/jQuery would be required, rather than PHP, to load the images dynamically as the user scrolls through them, so no more resources/images are requested than are needed at a given time, say 10 at a time. Using the .click() event in jQuery on an element, checking for the end of the 'carousel' so to speak, and then using something like $("#element").attr("src", "imagePath.jpg"); to swap the images shown in the carousel should work.
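A rough sketch of that idea, assuming the full list of URLs is already available client-side (e.g. emitted into a JS array by PHP) and a hypothetical #next button:

var allImages = [];            // e.g. filled by PHP: <?php echo json_encode($urls); ?>
var batchSize = 10, loaded = 0;

function loadNextBatch() {
    $.each(allImages.slice(loaded, loaded + batchSize), function (i, src) {
        $('<img>').attr('src', src).appendTo('#carousel');
    });
    loaded = Math.min(loaded + batchSize, allImages.length);
}

// load the first batch up front, then another batch on each
// "next" click until the list is exhausted
loadNextBatch();
$('#next').click(function () {
    if (loaded < allImages.length) loadNextBatch();
});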
I'm working on a website for a specific client. He wants to be able to add links to the website and, on mouse hover, have an image of the linked website appear.
Now, he doesn't want to take an image of the website himself; he only wants to input the link and have the website do everything else.
So my question is:
Is there a way (e.g. a Google API) to get an image of a website just by providing the URL via PHP?
Sort of like in Google, when you hover over a link to a page, a tooltip pops up to the right with an image.
Any help is, as always, appreciated :)
Here is a list of 10 free thumbnail services
http://www.webresourcesdepot.com/10-free-website-thumbnail-generation-services/
You can simply refer to the URLs of these services, e.g.
<img src="http://SnapCasa.com/Get.aspx?code=[code]&size=[size]&url=[url]" />
or make a cURL call from one of your PHP scripts and temporarily store or permanently save the image that was generated.
I have recently developed Thumbnailspro.com. It is currently free to use while in beta testing as we work out the bugs, but so far it's getting quite popular. You can request thumbnails directly from your website using the code below:
http://thumbnailspro.com/thumb/http://msn.com&s=150
s = size; it can be anywhere from 10 to 1000 pixels, so just add s=300 to display a thumbnail 300 pixels in width. We are trying to add more options for thumbnail requests as we go, while keeping it as simple as possible, so you don't have to enter something like the code below to get your thumbnails:
http://somethumbnailsite.com/viewurl.php?url=http://msn.com&x=200&y=300&bwidth=1024&bheight=768&rotate=76&what_the_hell%20_is_all_this_crap!
So it's much more efficient!
If you like the service or find any bugs, contact us at admin#thumbnailspro.com!
No. The only way to do this is to request the HTML for the page, render the page and then create a thumbnail from that page render. Google does this because in the process of spidering the web, they already get all that data, and they've got a nice optimized rendering engine (Chrome) that they can put the data through, and then they've got tons of online storage space to store the cached image. There's a lot of work there, though.
So I have my own webpage, a sortable thumbnails page. The load() event activates each thumbnail when the first related image has loaded. Since I'm grabbing <img> tags and text content from a hidden div on the page, the thumb activation prevents the user from clicking through to a yet-unloaded image and then waiting while the preload takes place in the background.
The call is pretty simple:
$('#content img:first-child').load(activateThumb).each(function () {
    // cached images may never fire 'load', so trigger it manually
    if (this.complete || this.complete === undefined) $(this).trigger('load');
});
The .each() catches any cached images and manually triggers the load() event. It worked great and it was a fast, lean website. Now, as the site continues to grow, there are over 100 <img> tags in the single HTML file and I'm wondering if there's a conventional limit that I'm approaching. Should I split the page into 35 different HTML files? Should I lose the tags and the slick preloading effect in favor of a server-side request for the images on demand?
What's your instinct, as a good programmer?
Well, there is no clear limit; you can continue doing it the way you have for as many images as you like.
But the user might get frustrated while waiting for all the images to get 'activated'.
So instead, you could paginate and display, say, 20 images per page. This way you make the image loading relatively faster.
Also, after page 1 loads, while the user is still on page 1, you could start pre-fetching page 2. When the user clicks page 2, they see a very responsive site :) A sketch of that idea follows below.
There is no one rule here. In fact, Google Images nowadays does something like what you have done.
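That pre-fetching might look like this (page2Images is a placeholder; you would emit the next page's image URLs into it server-side):

$(window).load(function () {
    // once page 1 has fully loaded, quietly warm the browser cache for page 2
    var page2Images = ['gallery/p2-01.jpg', 'gallery/p2-02.jpg']; // placeholders
    $.each(page2Images, function (i, src) {
        $('<img>').attr('src', src);
    });
});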
You can do it the way many Web 2.0 sites do:
In the beginning, load only the images in the currently visible part of the page.
Then load the other images as the user scrolls down.
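A bare-bones sketch of that pattern, appending the next image whenever the user nears the bottom of the page (the remaining queue and #gallery id are placeholders):

var remaining = ['img/21.jpg', 'img/22.jpg', 'img/23.jpg']; // placeholder queue
$(window).scroll(function () {
    var nearBottom = $(window).scrollTop() + $(window).height()
                     > $(document).height() - 200;
    if (nearBottom && remaining.length) {
        // take the next image off the queue and append it to the page
        $('<img>').attr('src', remaining.shift()).appendTo('#gallery');
    }
});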