Getting an image with PHP - php

Is it bad practice to retrieve images this way? I have a page that calls this script about 100 times (there are 100 images). Can I cause server overload or too many HTTP requests or something? I'm having problems with the server and I don't know if this is causing it :(
// SET THE CONTENT TYPE HEADER
header('Content-type: image/jpeg');
// GET THE IMAGE TO DISPLAY
$image = imagecreatefromjpeg('../path/to/image/' . $_SESSION['ID'] . '/thumbnail/' . $_GET['image']);
// OUTPUT IMAGE AND FREE MEMORY
imagejpeg($image);
imagedestroy($image);
I call the script from regular <img> tags. The reason I serve them through PHP is that the images are private to each user.
All help greatly appreciated!!

With this, you are:
Reading the content of a file
Decoding that content into an in-memory image
Re-encoding that image as a new JPEG
If you just want to send an image (that you have on disk) to your users, why not just use readfile(), like this:
header('Content-type: image/jpeg');
readfile('../path/to/image/' . $_SESSION['ID'] . '/thumbnail/' . $_GET['image']);
With that, you'll just:
Read the file
and send its content
without decoding it into an image, eliminating some useless computation in the process.
As a side note: you should not use $_GET['image'] like that in your path; you must make sure no malicious data is injected via that parameter!
Otherwise, anyone could potentially access any file on your server; they would just have to pass some relative path in the image parameter...
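For example, something along these lines (an untested sketch; the file-name pattern is just an assumption, adjust it to your own naming scheme):
// Reject anything that is not a plain file name (e.g. "../../etc/passwd")
$name = basename($_GET['image']);                       // strips any directory parts
if (!preg_match('/^[A-Za-z0-9_-]+\.jpe?g$/', $name)) {  // assumed naming scheme
    header('HTTP/1.1 404 Not Found');
    exit;
}
$path = '../path/to/image/' . $_SESSION['ID'] . '/thumbnail/' . $name;
if (!is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-type: image/jpeg');
readfile($path);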

Yes, it's very bad. You're decoding a .jpg into an in-memory bitmap (which is "huge" compared to the original binary .jpg). You then recompress the bitmap into a new JPEG.
So you're:
a) wasting a ton of memory
b) wasting CPU time
c) losing even more image quality, because JPEG is a lossy format.
why not just do:
<?php
header('Content-type: image/jpeg');
readfile('/path/to/your/image.jpg');
instead?

To answer two particular questions from your post:
Can I cause server overload or too many HTTP requests or something?
Yes, of course: through both the sheer number of HTTP requests and the image processing itself.
You have to reduce the number of images and implement some pagination to show them in smaller batches.
You may also implement some Conditional GET functionality to reduce bandwidth and load.
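A minimal sketch of what that could look like here (assuming the file's modification time is an acceptable cache key; the path is reused from the question):
$path = '../path/to/image/' . $_SESSION['ID'] . '/thumbnail/' . basename($_GET['image']);
$mtime = filemtime($path);

// If the browser already has this version, answer 304 and skip the body entirely
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    && strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('Content-type: image/jpeg');
readfile($path);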
If things keep getting worse and you have some resources to spare, consider installing a content-delivery proxy; nginx with the X-Accel-Redirect header is a common example.
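The X-Accel-Redirect idea, roughly (the /protected/ location name is an assumption; it has to be declared internal in the nginx config and mapped to the image directory):
// PHP only performs the access check; nginx streams the file afterwards,
// so the PHP worker is released immediately.
$name = basename($_GET['image']);
header('Content-type: image/jpeg');
header('X-Accel-Redirect: /protected/' . $_SESSION['ID'] . '/thumbnail/' . rawurlencode($name));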
I'm having problems with the server and I don't know if this is causing it :(
You shouldn't shoot in the dark then. Profile your site first.

Related

What is the fastest way to get images from an external webpage?

I need a way to get the biggest 5 images from a generic external webpage.
I know that I can't do this with Ajax alone (maybe I am wrong) due to cross-site security restrictions.
So I must use PHP + JavaScript.
I have just written this PHP code to get all images from external url:
$html = file_get_contents($link);
$dom = new DOMDocument;
$dom->preserveWhiteSpace = false;   // must be set before loadHTML() to have any effect
@$dom->loadHTML($html);             // suppress warnings from malformed markup
$images = $dom->getElementsByTagName('img');
foreach ($images as $image) {
    echo $image->getAttribute('src');
}
So now, what is the fastest way to get only the biggest 5 images on that page?
By biggest I mean the images with the highest resolution.
If you mean "biggest" as in in largest file size, then I think you are somehwat on the right track already. You would just need to find all the images in the source document, then likely make a HEAD request to the server where the image lies to (hopefully) get the file size information from the headers without downloading the file.
If "fastest" really is your concern, you could use cURL which has "multi" support for making parallel requests. Once you get the header information from the requests, you can determine the 5 biggest files and display the URL to them.
If the URL you are calling doesn't change much, you could probably cache the results locally to prevent the need to parse through the page and/or make HEAD requests on the images.
If "biggest" as in largest image size, then you are likely going to need to inspect the images on your server using an image library.
What is the fastest way to get images from an external webpage?
With any method that you use, the network connection is by far your limiting factor. It makes no sense to optimize.
I need a way to get the biggest 5 images from a generic external webpage.
An HTTP HEAD request should give you information about how many bytes would need to be transferred to download the image. The response to a HEAD request should consist of the HTTP headers that would have been sent had it been a GET request; in particular, the HTTP body (which contains the actual image data) is omitted. Notice the word should instead of the (IMHO preferable) word must.
Furthermore, the number of bytes is not an adequate measure of the number of pixels in the image. You might employ some heuristics based on the content type (PNG, GIF, and JPEG each produce a different file size for the same number of pixels). I don't know if this is accurate enough for you; JPEG images, for example, can vary widely due to different compression levels.

Faster: fopen or file_get_contents?

I am running multiple websites with high traffic. As a requirement, all images are downloaded via image.php?id=IMAGE_ID_HERE.
If you have ever done that before, you know that such a script reads the image file and echoes it to the browser with special headers.
My problem is that the load on the server is very high (150-200), and the top command shows multiple instances of image.php, so image.php is running slowly!
The problem is probably that fopen loads the whole image into memory before sending it to the client. How can I read a file and pass it through directly?
Thank you guys
UPDATE
After you have optimized the code and used caching wherever possible, build a CDN: a couple of servers, a sync mechanism, load balancers, and you no longer need to worry about requests :)
fopen and file_get_contents are nearly equivalent
To speed up the page load consistently, you can use
http://www.php.net/fpassthru
or, even better
http://www.php.net/readfile
With those functions, the content of the file is printed directly, byte by byte,
as opposed to file_get_contents, for example, where you store the whole data in a variable:
$var = file_get_contents($filename);
So, to make these work correctly, you will need to disable output buffering in the page that serves the images (otherwise buffering would defeat the point of readfile()).
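A rough sketch along those lines (assuming $path is the already-resolved, validated image path; the variable name is mine):
// Drop any active output buffers so the bytes go straight to the client
while (ob_get_level() > 0) {
    ob_end_clean();
}
header('Content-type: image/jpeg');
header('Content-Length: ' . filesize($path));
$fp = fopen($path, 'rb');
fpassthru($fp);                           // streams the file directly to output
fclose($fp);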
hope this helps!
Why don't you cache the image content with APC?
if (!apc_exists('img_'.$id)) {
    apc_store('img_'.$id, file_get_contents(...));
}
echo apc_fetch('img_'.$id);
This way the image content will not be read from disk more than once.

Resize an image before the user downloads it?

So this will no doubt come as a stupid question from an ignorant person, but I was wondering if there is any easy way out there of resizing an image BEFORE a user downloads it?
I am pulling in images from a 3rd party database, which I have no control over. I'm also not allowed to cache anything from it under their T&C.
They provide a few different sizes for each image, but I end up resizing half of them on my pages with CSS.
So I was wondering if maybe using php or javascript or something! (I really have no clue do I), I could resize these images before my users waste time downloading much bigger versions.
The only reason I ask really, is that I know the Manchester United website kind of does it (with the aid of a piece of Adobe stuff I think), so I thought that maybe there might be something out there that anyone could use?
http://www.manutd.com/en/News-And-Features/Football-News/2011/May/Sir-Alex-Blackburn-reaction.aspx
http://www.manutd.com/~/media/64B766EE4A37488AA65DC7B08E5ABC1B.ashx?h=179&la=ar-SA&w=480&rgn=0,78,1200,524
-> 18kb compared to 160kb -> http://www.manutd.com/~/media/64B766EE4A37488AA65DC7B08E5ABC1B.ashx
(obviously I don't want the cropping technique)
If the images are coming from a server you don't control, the short answer is NO. You can't resize an image until you've downloaded it. Without caching a resized version, you are at the mercy of the 3rd-party server, unless you use a server-side proxy program, and that is probably more trouble than it is worth.
Yet as I've pointed out in the comments, http://www.manutd.com will resize their images for you. In the link, h=height, w=width and rgn=region (left,top,right,bottom).
http://www.manutd.com/~/media/64B766EE4A37488AA65DC7B08E5ABC1B.ashx?h=80&la=ar-SA&w=120&rgn=0,0,1200,800
You only need the h and the w. If your h and w don't match the aspect ratio of the image, it will crop rather than skew. Look at both of these. The image is 1200x800, aspect ratio 3x2.
w=240, h=160 3x2 (whole image)
http://www.manutd.com/~/media/64B766EE4A37488AA65DC7B08E5ABC1B.ashx?w=240&h=160
w=160, h=160 1x1 (cropped)
http://www.manutd.com/~/media/64B766EE4A37488AA65DC7B08E5ABC1B.ashx?w=160&h=160
After playing with it more, I found you can get by with just the width (w), and I'm assuming this also applies to just the height. (EDIT: yes, it does)
Whole image, 480px wide...
http://www.manutd.com/~/media/64B766EE4A37488AA65DC7B08E5ABC1B.ashx?w=480
MORE EDITING: Understand that any time you see a '?' in a URL, you are requesting a page from a program; the stuff after the '?' consists of parameters for that program, and '&' separates the parameters. The server at manutd.com is using a program to resize their images, just like a proxy program would resize images for you. If you did resort to a proxy program, a decent one would take a link like http://YourServer.host/proxyProgram.php?img=imageHost.org/imageName.jpg&w=240&h=160, and given a link like that there are all sorts of server-side solutions to resize the image.
Yet without a cache there is the possibility that you will resize the same image many times, and just the thought of that turns me off.
I'm gonna quit editing now!
Have fun!
Skip
Try using ImageMagick for PHP. It allows you to resize and modify images server-side. There are some examples here.
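For instance, a small sketch with the Imagick extension ($sourcePath and the 480-pixel target width are assumptions of mine):
// Requires the Imagick PECL extension
$img = new Imagick($sourcePath);
$img->thumbnailImage(480, 0);             // height 0 keeps the aspect ratio
$img->setImageFormat('jpeg');             // normalize the output format
header('Content-type: image/jpeg');
echo $img->getImageBlob();
$img->clear();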
Just off the top of my head, here's a pretty ugly way to get it done:
Create a PHP page that takes all of your requests. So instead of doing this...
<img src="http://other.domain/img.jpg" />
Do this:
<img src="http://your.domain/images.php?name=img.jpg" />
Then have your PHP page grab the image from the 3rd party site, and recreate it at whatever size you need:
$old = imagecreatefromjpeg($fullpath);                  // $fullpath is the 3rd-party image URL
$newe = imagecreatetruecolor($width, $height);          // blank canvas at the target size
imagecopyresampled($newe, $old, 0, 0, 0, 0, $width, $height, imagesx($old), imagesy($old));
header('Content-type: image/jpeg');
imagejpeg($newe);                                       // encode and output the resized image
imagedestroy($old);
imagedestroy($newe);
Some stuff that might trip this up:
If the T&C doesn't allow caching of images, they might not allow you to access their images with server-side code.
You'll have to adjust some of the code if you're serving other image types (GIF, PNG); a rough dispatch sketch follows below.
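To illustrate that last point, a dispatch on the detected type (variable names follow the snippet above; this is an illustration rather than drop-in code):
$info = getimagesize($fullpath);          // index 2 holds the IMAGETYPE_* constant
switch ($info[2]) {
    case IMAGETYPE_JPEG:
        $old = imagecreatefromjpeg($fullpath);
        $output = 'imagejpeg';
        $mime = 'image/jpeg';
        break;
    case IMAGETYPE_PNG:
        $old = imagecreatefrompng($fullpath);
        $output = 'imagepng';
        $mime = 'image/png';
        break;
    case IMAGETYPE_GIF:
        $old = imagecreatefromgif($fullpath);
        $output = 'imagegif';
        $mime = 'image/gif';
        break;
    default:
        exit('Unsupported image type');
}
$newe = imagecreatetruecolor($width, $height);
imagecopyresampled($newe, $old, 0, 0, 0, 0, $width, $height, imagesx($old), imagesy($old));
header('Content-type: ' . $mime);
$output($newe);                           // call the matching encoder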
This has to be done on the server. The technique used depends on what you're using on your server. If it's .NET, there is built-in functionality to handle resizing.
@Taze: well, server-side, yes, but he gets the images from another server, and it seems you have not read what he is doing and what he is allowed to do.
Theoretically you could cache the image on your own server (which you are not allowed to do), resize it with GD or ImageMagick, give it to the user, and then delete the copy from the server,
but that's against the rules of the 3rd party.

Having an image stripped of metadata upon upload in PHP

A certain site I know recently upgraded their bandwidth from 2.5 TB monthly to 3.5 TB.
The reason is that they went over the 2.5 TB limit recently. They're complaining that they don't know how to get the bandwidth usage down.
One thing I haven't seen them consider is the fact that JPEG and other images displayed on the site (and it is an image-heavy site) can contain metadata: where the picture was taken and such.
Fact of the matter is, this information is of no importance whatsoever on that site. It's never going to be used. Yet it still adds to the bandwidth, since it increases the file size of every image by anything from a few bytes to a few kilobytes.
On a site that uses more than 2.5 TB per month, stripping the several thousand images of their metadata should decrease the bandwidth usage by at least a few gigabytes per month, I think, if not more.
So is there a way to do this in PHP? And also, for the already existing files, does anybody know a good automatic metadata remover? I know of JPEG & PNG Stripper, but it's not very good... Might be useful for initial cleaning, though...
It's trivial with GD:
$quality = 85;                            // 0-100; lower means smaller files (pick your own value)
$img = imagecreatefromjpeg("myimg.jpg");
imagejpeg($img, "newimg.jpg", $quality);  // re-encoding drops the EXIF block
imagedestroy($img);
This won't transfer the EXIF data. I don't know how much bandwidth it will actually save, but you could also use the code above to increase the compression of the images. That would save a lot of bandwidth, although it possibly won't be very popular.
I seriously doubt image metadata is the root of all evil here.
Some questions to take into consideration:
How is the webserver configured?
Does it issue HTTP 304 responses properly?
Isn't there some kind of hand-made caching/streaming of data through PHP scripting that prevents said data from being cached by the browser? (In which case, URL rewriting and HTTP redirections should be considered.)
Check out Smush.it! It will strip all unnecessary info from an image. They have an API you can use to crunch the images.
Note: By Design, it may change the filetype on you. This is on purpose. If another filetype can display the same image with the same quality, with less bytes it will give you a new file.
I think you need to profile this. You might be right about it saving a few GB, but that's relatively little out of 2.5 TB of bandwidth. You need real data about what is being served most and work on that. If you do find it is images that push your bandwidth usage so high, you should first check your caching headers and 304 responses; you might also want to investigate using something like Amazon S3 to serve your images. I have managed to reduce bandwidth costs a lot by doing this.
That said, if the EXIF data is really making that much of a difference then you can use the GD library to copy a jpeg image using the imagejpeg function. This won't copy EXIF data.
Emil H's answer probably addresses the question best.
But I wanted to add that this will almost certainly not save you as much as you may think. This type of metadata takes up very little space; I would think that
Re-compressing the images to a smaller file size, and
Cropping or resizing to reduce the resolution of the images
are both going to have a much greater effect. With point one alone you could probably drop bandwidth by 50%, and with both you could drop it by 80%, provided you are willing to sacrifice some image quality and resolution.
If not, you could always have the default view at a smaller size, with an 'enlarge' link. Most people just browsing will see the smaller image, and only those who want the largest size will click to enlarge it, so you'll still get almost all the bandwidth saving. This is what Flickr does, for example.
Maybe some sort of hex-level data manipulation would help here. I'm facing the same problem and investigating some sort of automated solution.
I'm just wondering whether it can be done; if it is possible, I'll write a PHP class for it.
It might be smart to do all the image manipulation on the client side (using a Java applet, as Facebook does) and then, once the image is compressed, resized, and fully stripped of unnecessary pixels and content, upload it at its optimal size, saving you bandwidth and server-side processing (at the cost of initial development).

scripts embedded in images

I run a small browser MMO, and I have a problem where a couple users are embedding scripts into their profile images, and using them to make attacks against said users, and my game in general. Is there a way to protect against this, or do I need to start blocking people from being able to use their own custom images?
If it helps any, it's done in PHP/MySQL.
Most likely what is happening is that they are giving you a link to a script that builds the image and returns it on the fly. Aside from not allowing users to use external images, there is little you can do about it; one option to prevent it is to download and store the image on your server instead of linking to the external image.
--I decided to provide a sample
This image is created on the fly; the URL I'm giving is http://unkwndesign.com/profilePic.png.
Now, profilePic.png is actually a folder that, when requested, serves an index.php which uses GD to grab the SO logo and superimpose your IP address over it. To be very clear here, I AM NOT LOGGING THIS OR ANY OTHER DATA. The source for the index.php is:
<?php
$image = imagecreatefrompng("http://stackoverflow.com/Content/Img/stackoverflow-logo-250.png");
$font_size = 12;
$color = imagecolorallocate($image, 0, 0, 0);
imagettftext($image, $font_size, 0, 55, 35, $color, "arial.ttf", $_SERVER['REMOTE_ADDR']);
header("Content-type: image/png");
imagepng($image);
imagedestroy($image);
?>
Since I am returning an image with a proper extension and the proper MIME type, there is no way to detect what I am doing.
If the server had downloaded my image and stored it locally, the IP address would be that of the server, which would ruin the fun of doing it and would likely prove to be enough of a discouraging factor to stop the behavior.
Try having GD process those images. If it throws errors, you know you have a problem. Since image upload is a relatively rare operation, it shouldn't cause load problems to do some kind of arbitrary manipulation.
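As a rough sketch of that idea (the form field name, target path, and $userId are placeholders of mine):
// Re-encode the upload through GD so that only pixel data survives
$raw = file_get_contents($_FILES['avatar']['tmp_name']);   // 'avatar' is an assumed field name
$img = @imagecreatefromstring($raw);                        // returns false for anything GD can't parse
if ($img === false) {
    exit('Not a valid image');
}
imagejpeg($img, '/path/to/profile/images/' . (int) $userId . '.jpg', 90);  // placeholder path
imagedestroy($img);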
Some of the most common practices for validating image integrity include checking the MIME type or reading the first few bytes of the image. Although these are not foolproof, it's worth a try to fend some of the attacks off.
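A minimal sketch of those two checks (again, the upload field name is a placeholder):
$tmp   = $_FILES['avatar']['tmp_name'];
$info  = getimagesize($tmp);                   // false if GD can't parse an image header
$finfo = new finfo(FILEINFO_MIME_TYPE);        // sniffs the MIME type from the first bytes
$mime  = $finfo->file($tmp);
$allowed = array('image/jpeg', 'image/png', 'image/gif');
if ($info === false || !in_array($mime, $allowed, true)) {
    exit('Rejected: not a recognised image');
}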
Are you talking about the issue where IE will interpret an image with HTML tags in it as being an HTML page, thus allowing HTML and script injection from user-submitted images?
(The bug being that IE will do this even if you tell it the Content-Type is an image/ type. Microsoft have caused endless security disasters with this attempt to be ‘helpful’.)
If so, the usual solution is to serve user-submitted images from a different hostname, one which does not have access to cookies or scripting at the main hostname from which you serve your web application.
Be sure to lock down your virtual servers so that the image server and the app server are each only available from one particular hostname (and the app server must not be accessible via IP address).
This will fix the cross-site-scripting issues. You may still have cross-site-request-forgery requests to deal with, but that's a different problem and can be exploited without image-wrapped script-injection.
