Users on my site upload images of various dimensions. Some might be huge and some might be small, so resizing everything to a default size might work, but small images would look ugly when scaled up. Is there anything I can do to resize all images to one size and keep the quality clear without making them look crappy?
You will not be able to make small images "High Definition". No image can be scaled to a larger size without losing clarity (with the exception of vector graphics, but I'm guessing that isn't what you're working with). You can, however, resize larger graphics into smaller ones.
Try phpThumb for an excellent image manipulation library, or see JohnPS's answer for a good solution for re-sizing larger images.
There is a PHP PEAR package Image_Transform that you can install to do basic resizing of images.
This package includes a function fit that will shrink an image to fit inside a box (width and height), but leave it the same size if it's already small enough. See scaling function docs.
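For example, a minimal sketch of that fit approach, assuming the PEAR Image_Transform API with the GD driver (the paths are just placeholders):

<?php
// Rough sketch, assuming the PEAR Image_Transform package is installed
require_once 'Image/Transform.php';

$img = Image_Transform::factory('GD');   // or 'Imagick' if available
$img->load('/path/to/upload.jpg');       // placeholder path

// Shrink to fit inside an 800x600 box, keeping proportions;
// images already smaller than the box are left untouched
$img->fit(800, 600);

$img->save('/path/to/upload_fit.jpg');
$img->free();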
I have the following working fine:
User uploads an image and the image is resized using imagecopyresampled and imagejpeg (keeping the proportions) so it fits the cropping DIV.
The user then crops the image with JCrop and then the resized above image is then cut into the 5 required crop sizes using imagecopyresampled and imagejpeg.
The problem I've found is that cropping from the resized image doesn't take advantage of the original image, which is often larger and better quality.
So I want to display an image that fits the DIV but crop from the original image to get the best quality.
When I just point the cropping at the original image, the crop covers the wrong part of the image because the sizing is different. This is to be expected, I assume, given the different image sizes.
How can I display an image that fits the DIV for Jcrop but then actually crop from the original, which is often larger?
Ideas I'm testing:
Displaying the original image in the DIV (not resized by PHP but resized by CSS). I'm not sure if this will make any difference.
Trying to use some math-based equation to get the correct crop. I'm not sure how I'll do this yet.
Ideas would be great...
Thanks
Personally I would go with the maths-based solution if you have to work with the full-sized image server side. Re-scaling in the browser directly or through CSS is not ideal IMO and differs greatly between browsers. If you go the maths route, you can calculate the crop co-ordinates on the smaller image and scale them up to get the right portion of the original image.
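For example, a rough sketch of that scaling maths in PHP with GD (the paths, the preview width, and the form field names are placeholder assumptions, not your actual setup):

<?php
// Dimensions of the full-size original and of the resized preview shown to JCrop
list($origW, $origH) = getimagesize('/path/to/original.jpg');
$previewW = 500; // placeholder: the width the preview was resized to

// Proportions were kept, so one scale factor covers both axes
$scale = $origW / $previewW;

// JCrop coordinates as submitted from the form (placeholder field names)
$x = (int) round($_POST['x'] * $scale);
$y = (int) round($_POST['y'] * $scale);
$w = (int) round($_POST['w'] * $scale);
$h = (int) round($_POST['h'] * $scale);

// Crop the corresponding region from the original at full quality
$src = imagecreatefromjpeg('/path/to/original.jpg');
$dst = imagecreatetruecolor($w, $h);
// args: dst, src, dst_x, dst_y, src_x, src_y, dst_w, dst_h, src_w, src_h
imagecopyresampled($dst, $src, 0, 0, $x, $y, $w, $h, $w, $h);
imagejpeg($dst, '/path/to/crop.jpg', 90);

The five required crop sizes can then be resampled from this full-quality crop instead of from the preview.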
On a side note, you may find (depending on the image sizes and types you allow to be uploaded) that copying and resizing on the PHP side is very CPU- and memory-intensive. A PNG file of roughly 1024 x 768 at high resolution, for example, can take 60MB to 80MB of RAM just to resize (per image re-sample, due to the compression on the PNG file), and that is regardless of which PHP image manipulation module (ImageMagick, etc.) you use. There is no perfect way to handle image uploads on the PHP server end (other than throwing heaps of memory and CPU at the task). There are some good jQuery solutions, however, that resize on the client side before the upload (e.g. Plupload), which means you will only be working with a reduced-size image on the server side. There are also some good jQuery client-side cropping scripts. Either way, a combination of PHP and jQuery would be best IMO.
I currently have a website that aggregates images. The problem is that those images are very large, and even when I display them at smaller sizes they still eat up a lot of computation.
I'm curious as to how to actually force reduced size/quality without just smooshing it into a <div> element.
Here is my site in question, you can see how 'laggy' it gets when you produce images:
http://newgameplus.nikuai.net/TEST/index.html
I was using timthumb.php to resize the images, but the host doesn't allow that script for some reason.
The best way to do this is to use some sort of image re-factoring service.
I have written my own that uses ffmpeg and ImageMagick to resize images on the fly and to generate arbitrarily sized thumbnails from videos. I memcache the results to make subsequent requests super snappy, and have some interesting additions such as automatic point-of-interest detection using face detection and image entropy, with the aim being "nice thumbnails, no matter the size".
An example of such a service is src.sencha.io - the documentation for this service is here but I have included the important bits below.
Specify Image Size
<img src='http://src.sencha.io/320/200/http://yourdomain.com/path/to/image.jpg'
alt='My constrained image'
width='320'
height='200' />
This will take your image (http://yourdomain.com/path/to/image.jpg) and run it through the resizing service, returning a 320x200 version of your image. You cannot set the gravity/point-of-interest using this service, though (as far as I can tell).
You can also use this service to change formats, resize dataurls, do percentage resizes and use the resizing service to detect the width/height of the user agent requesting the image.
There are many such services available on the web.
I agree with slash: it depends on how the images are being resized. One thing I do for a site is use Photoshop (or GIMP) to resize the image to the exact dimensions I need for the page I'm using the image on. Then I also include the same dimensions in the width/height attributes on the image itself.
Additionally, you can use your photo editing software to check the size of your image if you were to save it with a different file extension, and (specifically with JPEG and PNG files) Photoshop will let you reduce the quality, which lowers the file size and speeds up page loading.
How does Facebook and other image intensive sites maintain a thumbnail size of the full image without shrinking or distorting the thumbnail?
Are these thumbs cropped versions of the original, stored so that when the thumb is clicked they reference the full-size image?
My images are stretched or shrunk if I simply try to confine them to a preset size in my img tag.
Ideally I would like to crop each image to fit a preset size without distorting the aspect ratio. If this can be done on the fly is this an efficient way to handle images in high volumes?
It is considered bad practice to resize images with your HTML markup or CSS styles. Scaling them up means bad quality; scaling them down means that your users have to download a larger file than necessary, which hurts speed.
There are libraries built for image resizing and caching for almost any language or framework. They mostly feature cropping as well, in order to maintain a standard aspect ratio. For PHP, have a look at phpThumb, which is based on GD/ImageMagick.
The resulting resized versions of your images are saved in a cache folder, so they don't need to be regenerated every time the image is requested. This way, the performance is almost as good as serving static files. The only overhead is a small check if the resized version exists in the cache.
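As a rough illustration of that cache check, here is a hand-rolled sketch using plain GD rather than any particular library (paths and sizes are made up):

<?php
// Serve a cached 200x150 thumbnail if it exists, otherwise generate it once
$source = '/images/original/photo.jpg';    // placeholder paths
$cached = '/images/cache/photo_200x150.jpg';

if (!file_exists($cached)) {
    list($w, $h) = getimagesize($source);
    $src = imagecreatefromjpeg($source);
    $dst = imagecreatetruecolor(200, 150);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, 200, 150, $w, $h);
    imagejpeg($dst, $cached, 85);
}

// From here on, it is effectively a static file
header('Content-Type: image/jpeg');
readfile($cached);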
I can't speak directly for Facebook, but most sites upload a large image, and then smaller preset sizes are automatically created (usually by the scripting language and some kind of library, like PHP/GD) and saved with a similar file name pattern, so that you use as little bandwidth as possible, improve loading times, and avoid manipulating images with CSS.
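A minimal sketch of that upload-time approach with GD (the size list, paths, and naming pattern are just examples):

<?php
// Generate a fixed set of preset sizes once, at upload time
$presets = array('thumb' => 100, 'medium' => 400, 'large' => 800); // example widths

$source = '/uploads/original/photo.jpg'; // placeholder path
list($w, $h) = getimagesize($source);
$src = imagecreatefromjpeg($source);

foreach ($presets as $name => $newW) {
    $newH = (int) round($h * ($newW / $w)); // keep the aspect ratio
    $dst  = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    // Similar file name pattern: photo_thumb.jpg, photo_medium.jpg, photo_large.jpg
    imagejpeg($dst, "/uploads/photo_{$name}.jpg", 85);
    imagedestroy($dst);
}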
I'm implementing an image upload facility for my website. The uploading facility is complete but what I'm working on at the moment is manipulating the images. For this task I am using PHPThumb (http://phpthumb.gxdlabs.com).
Anyway, as I go along I'm coming across potential issues to do with resizing and compression. Basically I want to achieve the following results:
The ideal image dimensions are 800px width, 600px height. If an uploaded image exceeds either of these dimensions, it will need to be resized to meet the requirements. Otherwise, leave it as it is.
The ideal file size is 200kb. If an uploaded image exceeds this then it will need to be compressed to meet this requirement. Otherwise, leave as it is.
So in a nutshell: 1) Check the dimensions, resize if required. 2) Check the filesize, compress if required.
Has anybody done anything like this / could you give me some pointers? Is PHPThumb the right tool for this?
As far as I can see from the docs, phpThumb should be good enough for this task.
phpThumb can be used in an easy way like this:
Call phpThumb() just like you would a normal image.
Examples:
<IMG SRC="phpThumb.php?src=/image.jpg&w=100">
<IMG SRC="phpThumb.php?src=http://example.com/foo.jpg">
See the "demo" link on
http://phpthumb.sourceforge.net for
more usage examples). Parameters that
can be passed are listed below under
"URL Parameters".
The phpThumb() website says that it has the following image-processing feature:
Quality can be auto-adjusted to fit a certain output byte size.
This looks like what you want. It works by passing a maximum byte size parameter. The readme tells you how:
maxb = MAXimum Byte size - output quality is auto-set to fit thumbnail into "maxb" bytes (compression quality is adjusted for JPEG, bit depth is adjusted for PNG and GIF)
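So, going by that readme, a call along these lines ought to do it (the values are just an example; 200kb is roughly 204800 bytes):
<IMG SRC="phpThumb.php?src=/image.jpg&w=800&h=600&maxb=204800">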
Edit: it seems we were talking about two different utilities with the same name. I did not find the same functionality in the other one, so I can only recommend using what I've found.
See the gd extension, in particular the functions getimagesize and imagecopyresized.
Enforcing a file size is more difficult. The best you can do is stick with JPEG and some arbitrary level of compression; for a certain combination of dimensions and compression you can more or less predict the final file size of the image.
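A rough sketch of both checks with plain GD (the paths are placeholders, and the stepped quality loop is only an approximation for the 200kb target, not an exact method):

<?php
$in  = '/path/to/upload.jpg';       // placeholder paths
$out = '/path/to/upload_final.jpg';

list($w, $h) = getimagesize($in);
$img = imagecreatefromjpeg($in);

// 1) Check the dimensions, resize if required (max 800x600, keep proportions)
if ($w > 800 || $h > 600) {
    $scale = min(800 / $w, 600 / $h);
    $newW  = (int) round($w * $scale);
    $newH  = (int) round($h * $scale);
    $dst   = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $img, 0, 0, 0, 0, $newW, $newH, $w, $h);
    $img = $dst;
}

// 2) Check the file size, compress if required: step the JPEG quality down
//    until the output fits under ~200kb (or a floor quality is reached)
$quality = 90;
do {
    imagejpeg($img, $out, $quality);
    clearstatcache(true, $out);
    $quality -= 10;
} while (filesize($out) > 200 * 1024 && $quality >= 30);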
As to PHPThumb, I have no idea, as I've never used it.
A lot of users open my site on mobile.
Which image type will load quickly on a mobile device: JPG, GIF, or PNG?
Which one is best?
Go with PNG8 where possible and limit the color palette. Try to use only as many colors as strictly needed, not more.
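If you happen to generate the PNGs with PHP, a quick sketch of that palette reduction with GD (the 64-color figure and the paths are just examples):

<?php
$img = imagecreatefrompng('/path/to/icon.png'); // placeholder path

// Convert the truecolor image to a palette (PNG8-style) image,
// with dithering enabled and at most 64 colors
imagetruecolortopalette($img, true, 64);

// Save with maximum zlib compression (level 9)
imagepng($img, '/path/to/icon_8bit.png', 9);
imagedestroy($img);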
There are PNG tweaking tools which allow you to get rid of unnecessary chunks from the PNG. For more information on PNG chunks, see the PNG chunks specification: http://www.libpng.org/pub/png/spec/1.2/PNG-Chunks.html
The tweaking tool I was talking about can be found at http://entropymine.com/jason/tweakpng/. It runs on Linux under Wine, as far as I have tested it.
Additionally, SVG Tiny is a pretty interesting format. SVG graphics can be rescaled losslessly, and because an SVG is in fact an XML file you can modify it programmatically.
EDIT: One note on JPEG graphics: if the file size exceeds 10kb, save it as progressive; if it is under 10kb, save it as baseline. It is a small optimization for JPEGs.
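If you are saving JPEGs from PHP, that switch is just GD's interlace flag, for example (path and quality are placeholders; interlaced JPEGs are written as progressive, non-interlaced as baseline):

<?php
$file = '/path/to/photo.jpg'; // placeholder
$img  = imagecreatefromjpeg($file);

// Enable interlacing (progressive JPEG) only if the file is over ~10kb
imageinterlace($img, filesize($file) > 10 * 1024);
imagejpeg($img, $file, 85);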
There is no single "best" image type. The image that loads the quickest is the smallest one. But quick loading does you no good if your image is distorted and unidentifiable.
Pick an image format based on the type of image you are trying to render:
JPEG is hands down the best for photos 99% of the time when you consider perceived image quality relative to file size; but JPEG's lossy compression algorithm relies on the existence of noise (lots of soft/subtle color transitions) and a lack of sharp lines/edges in the source image for the best effect. Never use JPEG for rasterized text!
PNG is the best option for rasterized images comprised of sharp edges and smooth/solid color blocks. Basically, anytime you need clean/lossless images (logos, icons, UI elements, text, etc.—anything but photos/paintings basically), you'd use PNG. It is also the only real option when you need alpha channel/partial transparency.
GIF is the only universally supported animated image format, but aside from that I would just use PNG in most cases.
SVG is for vector images, but I'm not sure how many mobile browsers support it.
From a quick read, it seems that JPEG would be the best option. However, that depends on any specific needs you may have for some images (e.g. JPEG does not support transparent backgrounds).
Also, especially for a mobile site, if the images are frequently used parts of the website's layout (e.g. buttons), then using CSS sprites is good practice, as it reduces the number of files being downloaded by the user (at the tradeoff, of course, that the image(s) which are downloaded are larger).