I'm building a web app where users can store links, with 200x200 pictures associated. By default, I'd like to crawl the link for images and then return thumbnails of the biggest ones (from which the user can select the "official" thumbnail). I want this all to happen via AJAX. My question is: what is the best way to do this?
Currently, I'm using the PHP Simple HTML DOM Parser to scan a URL. I then find the src attribute of all the <img> tags, use getimagesize() to get the size of the image at each URL, sort the array from biggest to smallest, and return the top 5 biggest image URLs via AJAX to the client. The client then sends a separate AJAX request for each one, which triggers a server-side ImageMagick script to download the image, cut it down to a thumbnail, save it in a temporary folder, and return the URL of the thumbnail, which the client finally loads in their browser.
Needless to say, this is a little complicated and probably really inefficient. Running this process on http://en.wikipedia.org takes about 10-15 seconds from start to finish. I'm not certain there are any more efficient ways, however.
I'd do it in one AJAX request, with the script automatically resizing the biggest 5 images on the first pass, saving them, and returning a JSON array with the resized image URLs for the client.
You should probably use PHP's DOMDocument class to grab and parse the HTML page.
getimagesize() means you have to download and process each image. Perhaps you should consider simply showing the user ALL of the images, by placing <img> tags that point back at the images on the original page. You can size these however you like using the tags' width/height attributes. That way you don't have to download or process a single image until the user has actually selected one for the thumbnail.
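A minimal sketch of the DOMDocument half of that (the function name is illustrative, and relative URLs would still need to be resolved against the page URL):

```php
<?php
// Sketch: pull every <img> src out of an HTML string with DOMDocument.
function extract_img_srcs(string $html): array
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // @ silences warnings on real-world markup

    $srcs = [];
    foreach ($doc->getElementsByTagName('img') as $img) {
        $src = trim($img->getAttribute('src'));
        if ($src !== '') {
            $srcs[] = $src;
        }
    }
    return array_values(array_unique($srcs));
}
```

Each returned URL could then be fed to getimagesize() (or simply echoed into an <img> tag, as suggested above).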
Interested if / how you solved this?
In the end I looped through the images calling getimagesize() until both height and width were over a certain size, and then broke out of the loop.
This way it's slightly more efficient, as it only downloads as many images as it needs.
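A sketch of that early-exit loop (the 200x200 threshold and function name are just illustrative):

```php
<?php
// Stop downloading as soon as one image is "big enough".
function first_big_image(array $urls, int $minWidth = 200, int $minHeight = 200): ?string
{
    foreach ($urls as $url) {
        $size = @getimagesize($url); // downloads the image to measure it
        if ($size !== false && $size[0] >= $minWidth && $size[1] >= $minHeight) {
            return $url; // big enough: break out, skip the remaining URLs
        }
    }
    return null; // nothing matched
}
```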
Related
I'm currently rewriting a website that needs a lot of different sizes for each image. In the past I did this by creating the thumbnail images for every size at upload time. But now I have doubts about its performance, because I've had to change my design and half of my images are no longer the right size. So I'm considering 2 solutions:
Keep doing this, and add a button in the backend to re-generate all the images. The problem is that I'd always need to know every size needed by every part of the site.
Only upload the full-size image, and when displaying it, point the src attribute at something like src="thumbs.php?img=my-image-path/image.jpg&width=120&height=120". Then create the thumb and display it. The script would also check whether the thumb already exists; if it does, there's no need to recreate it, so it's just displayed. Every 5 days, a cron task would launch a script to delete all the thumbs (to be sure only the useful ones are kept).
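A minimal sketch of what that thumbs.php could look like (GD, JPEG-only; the cache directory and helper names are illustrative, and a real version must validate the img parameter against a whitelist to block path traversal):

```php
<?php
// thumbs.php — on-demand thumbnail sketch for option 2.

// Deterministic cache file name per (image, size) pair.
function thumb_cache_path(string $img, int $w, int $h): string
{
    return 'cache/' . md5("$img|{$w}x{$h}") . '.jpg';
}

function serve_thumb(string $img, int $w, int $h): void
{
    $cache = thumb_cache_path($img, $w, $h);

    if (!file_exists($cache)) {           // only generate once
        [$srcW, $srcH] = getimagesize($img);
        $src = imagecreatefromjpeg($img); // assumes a JPEG source
        $dst = imagecreatetruecolor($w, $h);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $w, $h, $srcW, $srcH);
        imagejpeg($dst, $cache, 85);
    }

    header('Content-Type: image/jpeg');
    readfile($cache);
}

if (PHP_SAPI !== 'cli') {
    serve_thumb($_GET['img'], (int) ($_GET['width'] ?? 120), (int) ($_GET['height'] ?? 120));
}
```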
I think the second solution is better, but I'm a little concerned that I'd need to invoke PHP every time an image is shown; even if the thumb has already been created, it's still PHP that serves it.
Thanks for your advice
Based on the original question and subsequent comments, it sounds like on-demand generation would suit you, since you don't seem to have a demanding environment that requires absolutely minimizing download time to the end client.
It seems you already have a grasp of the option of giving your <img> tags a src value that points at a PHP script, with that script either serving a cached thumbnail if it exists, or generating it on the fly, caching it, and then serving it, so let me give you another option.
Generally speaking, using PHP to serve up static resources is not a great idea as you begin to scale your site, for two reasons:
It adds the overhead of invoking PHP to serve requests that a basic web server like Apache or Nginx handles far more efficiently. Your site will be able to handle less traffic per server because it is burning extra memory, CPU, etc. to serve this static content.
It makes it hard to move those static resources into a single repository outside of the web servers (such as a CDN). You would have to duplicate your files on each and every web server powering the site.
As such, my suggestion would be to still serve up the images as static image files via the webserver, but generate thumbnails on the fly if they are missing. To achieve this you can simply create a custom redirect rule or 404 handler on the web server, such that requests in your thumbnail directory which do not match an existing thumbnail image could be redirected to a PHP script to automatically generate the thumbnail and serve up the image (without the browser even knowing it). Future requests against this thumbnail would be served up as a static image.
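A sketch of that fallback generator, assuming Apache with mod_rewrite and GD (the rewrite rule, paths, and JPEG-only handling are all illustrative):

```php
<?php
// generate_thumb.php — invoked only when a thumbnail is missing, e.g. via
// an Apache rule like (illustrative):
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^thumbs/(.+)$ /generate_thumb.php?file=$1 [L]
// The thumbnail is written to disk, so every later request is served as a
// plain static file by the web server, with PHP never involved again.

// Fit (srcW x srcH) inside (maxW x maxH) while keeping the aspect ratio.
function scaled_box(int $srcW, int $srcH, int $maxW, int $maxH): array
{
    $ratio = min($maxW / $srcW, $maxH / $srcH);
    return [max(1, (int) round($srcW * $ratio)), max(1, (int) round($srcH * $ratio))];
}

function generate_thumb(string $source, string $target, int $maxW = 200, int $maxH = 200): void
{
    [$srcW, $srcH] = getimagesize($source);
    [$w, $h] = scaled_box($srcW, $srcH, $maxW, $maxH);

    $src = imagecreatefromjpeg($source); // assumes JPEG; switch on MIME type in real code
    $dst = imagecreatetruecolor($w, $h);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $w, $h, $srcW, $srcH);
    imagejpeg($dst, $target, 85);        // persist so future requests are static
}
```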
This scales quite nicely as, if in the future you have the need to move your static images to a single server (or CDN), you can just use an origin-pull mechanism to try to get the content from your main servers, which will auto-generate them via the same mechanism I just mentioned.
Use the second option if you don't have too much storage, and the first if you don't have too much CPU.
Or you can combine them: generate and store the image the first time the PHP thumbnail generator is called, and on subsequent requests just return the cached image.
With this solution you'll have only the necessary images, and if you want you can periodically delete the older ones.
I've built a CMS with a photo album. Pretty simple, mostly static: static HTML pages, no database, just (as little as possible) text files containing some JSON.
The web interface for the admin panel is all jQuery, with a PHP (Zend Framework) based backend. As much as possible is done within the browser, so the backend is pretty bare.
Now the photo album works currently like this:
Clicking link 'Media'
Fetching a JSON string from the backend containing an object with all albums and, for every album, all the photos
Rendering an unordered list with all the albums
Rendering an unordered list within each album list item with all the pictures
Uploading:
Drop one or more jpeg/png files into the browser to create a new album
Drop one or more jpeg/png files into an album to append those files to the album under the cursor
Send all dropped files (using this jQuery drag drop upload plugin) to the backend
Backend receives all files (while displaying a nice progress bar)
Backend loops through all uploaded files, while the web interface displays a nice spinner
Each file is resized to a specified maximum size, and a thumbnail is rendered at max 133x133 px
Each file is appended to an array with the serverside filename and thumbnail name
[Not yet implemented: rendering the (updated) static html page for the album overview and for each image]
Array with all newly uploaded files is converted to JSON and sent to client
Web interface appends all new files as list items (displaying the thumbnails)
Uploading done
This is all going pretty well, up to about 600 images or about 900 MB. That's fine by me; if the user wants to upload more files at once, well, do it in two stages. The problem is that the backend processing is a bitch. Converting 100+ images of a decent size (1.5 MB each) down to the maximum size and generating the thumbnails takes way too long. I'm doing this with PHP GD. It didn't take me much time (or any time at all) to find out that that's the problem. My guess is that there's no way I'm going to speed this up within PHP.
So here are a few questions:
Will ImageMagick be faster? I'm not a fan, so please say no; also, I really don't want to install it on my server.
Is there a really, really lightweight command-line program that does the same with just a few commands (and no, I'm not alluding to ImageMagick)?
If the answer to the previous question is no: what would be the best way to do this? Don't say Java; I'm not that big a fan of Java either. Some C(-dialect)? Preferably one with a powerful yet lightweight image library offering nearest-neighbor, bilinear and bicubic interpolation.
Could my architecture be changed? At the moment, the images only start appearing in the browser once the thumbnails are inserted, i.e. after the whole JSON array is received, so the entire action has to complete and generate all the image data before the browser gets any kind of feedback. This means the spinner (with no indication of how long the process will take or how many images have been completed) is displayed for a long, long time. Would it be an idea to use JavaScript's FileReader to preload the images from the user's system, generate the thumbnails in the browser, and display them immediately after uploading is done? And on the backend: just receive the files, write them to disk, execute a command-line command, immediately send the response to the browser, and do the conversion in the background?
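The backend half of that last idea ("respond first, convert later") can be sketched like this; the `convert` invocation is illustrative, and any command-line resizer could be substituted:

```php
<?php
// Build a shell command that resizes to a 133x133 thumbnail and detaches
// itself, so exec() returns immediately and PHP can respond to the browser
// while the conversion runs in the background.
function build_resize_cmd(string $in, string $out): string
{
    // Trailing "&" backgrounds the process; output is discarded.
    return sprintf(
        'convert %s -thumbnail 133x133 %s > /dev/null 2>&1 &',
        escapeshellarg($in),
        escapeshellarg($out)
    );
}

function queue_resize(string $in, string $out): void
{
    exec(build_resize_cmd($in, $out)); // returns at once because of the "&"
}
```

The browser can then poll for (or be pushed) the thumbnails as they appear, instead of waiting behind one long request.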
How do I prevent the client side from aborting the AJAX request? While uploading and converting, a warning should be displayed when the user tries to close the page or change the #hash.
Thanks. Hope you guys can help me. Just so you know: the client side is pretty complex, with way too much code. I'd rather change the backend.
I'm creating a web app that allows users to specify an image url to display an image along with their submission.
To avoid potentially large images and odd sizes, I'd like to use PHP to limit/resize the image before it gets output to the browser (I know I could use max-width in CSS, but that means unnecessary downloading).
Is this possible?
PHP has a library named GD. It's fairly easy to check the size of an image with its getimagesize() function.
You can find complete code for doing this with 2 modules/extensions: http://www.fliquidstudios.com/2009/05/07/resizing-images-in-php-with-gd-and-imagick/
A comparison of them can be found here: http://dreamfall.blogspot.com/2008/02/php-benchmarks-gd-vs-imagemagick.html which may be of use if you want to optimize.
I have read a few similar questions and answers, but none completely address my issue.
Here is My Scenario:
I have what is similar to a tinyMCE (a home-brew version though) kind of editor. It lets users enter some text, and an image or two, etc. I have code that takes the items in there and renders them into a smaller div (what is essentially a thumbnail) in real time.
Here is What I Want to Do
Ultimately, the user may want to use their 'page' somewhere else, so I would like to let them go to a screen, view thumbnails of each page, and pick one.
Here is the Problem
Obviously, I could just use the same thumbnail code to render each page thumbnail. However, it can be bandwidth-intensive (each page could have several images, not to mention the calculation would have to be performed many times; we're talking perhaps 40 to 50 thumbnails on a preview page).
So I want to take the thumbnail div and somehow create a png or jpg from it when they save the page in the editor (so I'd have the code for the page and also a thumbnail image), and push it up to my PHP script to save the image to the server.
My first thought was that maybe canvas could do that, but there is the issue of translating the text and images onto the canvas first, which may or may not be possible.
So there it is. I am interested in any and all options, including commercial libraries if available that will do this; only thing is, I'd like it to be in JavaScript.
You may want to look at:
http://html2canvas.hertzen.com/
A similar question was already asked:
Screen Grab with PHP and/or Javascript?
I'm optimizing my website according to Google's site optimization standards:
http://code.google.com/speed/page-sp...mageDimensions
For those who are familiar with it, I'm using Firebug -> "Page Speed" tool to analyse my site's 'weak' areas.
About the above link (image dimensions): I have lots of dynamic images on my site that are uploaded via a CMS and therefore vary in height/width. So how important is it to specify image dimensions? If a page had, say, 5-10 images from the CMS, which option below is better:
a) Don't specify image dimensions
b) Use PHP's getimagesize() function to get the image dimensions dynamically, and put them in the IMG tag as "width" and "height"
c) Update our database to store the width/height per image (I'm already querying other info from the image table in my database, such as the text for the "alt" attribute) and then access those columns for the IMG tag on the front end?
I think "c" is the best option but I'd like to hear if anyone has any recommendation or statistics on which option is better. Obviously "c" means we need to query more data from the database.
We're also using a separate database server, and making use of browser caching. (http://code.google.com/speed/page-speed/docs/caching.html)
Thanks
c
Setting the image size will give you better page-load behaviour. If you do not set the size, some parts of the page will resize while the page loads, and that is just plain ugly.
If you already query your database for the alt text, I doubt you will notice any performance problems if you load two more columns.
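With the dimensions stored per image, rendering option (c) reduces to echoing the stored columns into the tag; a sketch (the column names are illustrative):

```php
<?php
// Option (c): dimensions come from the DB row, so there is no
// getimagesize() call (and no file access at all) at render time.
function img_tag(array $row): string
{
    return sprintf(
        '<img src="%s" width="%d" height="%d" alt="%s">',
        htmlspecialchars($row['path'], ENT_QUOTES),
        $row['width'],
        $row['height'],
        htmlspecialchars($row['alt'], ENT_QUOTES)
    );
}
```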
I actually tried option B on a large page as a test and found it created massive performance issues. I would say you're best to run with option C: update your CMS to add these dimensions with each image added, and maybe run a cron during a quiet time to update all existing images without dimensions.