Performance hit of using PHP versus HTML with 'img src' - php

I'm displaying a gallery of pictures which I store below the web root for security. There are thumbnails for each JPEG. When displaying the gallery, I have successfully set
<img src='./php/getfile.php?file=xyz-thumb.jpg'>
getfile.php processes each thumbnail with the code below. When a thumbnail is clicked, the same code loads the larger version of the image.
I can already tell this code is slower than plain HTML, and with potentially 20-30 thumbnails on a page I am debating whether I need to keep the thumbnails visible under public_html for performance's sake. Is there a quicker way to display the thumbnails? Is fpassthru() any quicker, or more desirable for other reasons?
// File Exists?
if (file_exists($fullfilename)) {
    // Parse Info / Get Extension
    $fsize = filesize($fullfilename);
    $path_parts = pathinfo($fullfilename);
    $ext = strtolower($path_parts["extension"]);
    // Determine Content Type
    switch ($ext) {
        case "pdf": $ctype="application/pdf"; break;
        case "exe": $ctype="application/octet-stream"; break;
        case "zip": $ctype="application/zip"; break;
        case "doc": $ctype="application/msword"; break;
        case "xls": $ctype="application/vnd.ms-excel"; break;
        case "ppt": $ctype="application/vnd.ms-powerpoint"; break;
        case "gif": $ctype="image/gif"; break;
        case "png": $ctype="image/png"; break;
        case "jpeg":
        case "jpg": $ctype="image/jpeg"; break;
        default: $ctype="application/force-download";
    }
    header("Pragma: public"); // required
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: private", false); // required for certain browsers
    header("Content-Type: $ctype");
    if ($mode == "view") {
        // View file
        header('Content-Disposition: inline; filename='.basename($fullfilename));
    }
    else {
        // Download file
        header('Content-Disposition: attachment; filename='.basename($fullfilename));
    }
    header("Content-Transfer-Encoding: binary");
    header("Content-Length: ".$fsize);
    ob_clean();
    flush();
    readfile($fullfilename);
} else {
    die('File Not Found: ' . $fullfilename);
}

Based on your comments above, I would say this sounds like a very inefficient way to do it, mostly because it prevents normal caching. If somebody is determined to automate scraping of your full-size images, they will find a way around it (e.g. Selenium RC).
If your only concern is someone scraping the images, use another approach. Here are some related discussions:
How do I prevent site scraping?
Protection from screen scraping
The honeypot is a very common implementation.

Anything involving PHP will have a noticeable performance hit, especially if you are checking the database to verify login credentials, etc.
You can improve your code by setting the correct cache headers, etc. But really the best solution is to let Apache serve the images as static files. Apache is incredibly good at serving static files; you are never going to write a PHP script that works as well as Apache.
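If you do keep serving the thumbnails through PHP, a minimal sketch of friendlier cache headers might look like this (the one-day lifetime and the path are assumed values, not something from the question):

// Hypothetical sketch: let the browser cache a thumbnail instead of re-downloading it every time.
$fullfilename = '/path/below/root/xyz-thumb.jpg';    // assumed; resolve and validate it yourself
$lastModified = filemtime($fullfilename);

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($fullfilename));
header('Cache-Control: private, max-age=86400');     // cache for one day
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');
header('ETag: "' . md5($lastModified . $fullfilename) . '"');
readfile($fullfilename);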
One way to provide reasonable security for static files is to put a very long and random string in the URL. So instead of:
./php/getfile.php?file=xyz-thumb.jpg
Use this as the URL:
./files/usBmN5CssIL47qRroC77n90juaQoREThBbFZUddGneEH5jOuX6JpU5cH6zH1Xa5-thumb.jpg
And make sure directory indexes are forbidden (so a user can't just visit ./files/).
With a random filename that long, even if a hacker were able to guess a billion URLs per second, the universe would have ended long before they guessed every possible URL.
If you are worried about search engines etc. somehow indexing the file URLs after some other security breach, you could place all the files in a directory with another random name - and change the name of this directory regularly (perhaps daily, perhaps every 10 minutes). In this case you should leave the old URL functional for a brief period after you rename it (perhaps with a symlink to the new directory name?).
At first glance this may sound less secure than checking a user's login credentials. But realistically, a random filename like that is much more secure than any username/password.
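As a rough sketch of the random-name idea (assuming PHP 7+ for random_bytes(); the ./files/ directory and naming scheme are illustrative, not from the question), the name would be generated once, when the thumbnail is created:

// Hypothetical sketch: give each thumbnail an unguessable public name at creation time.
$token = bin2hex(random_bytes(32));                    // 64 hex characters, effectively unguessable
$publicName = $token . '-thumb.jpg';
$publicPath = __DIR__ . '/files/' . $publicName;       // assumed public directory, indexes disabled

copy($thumbnailSource, $publicPath);                   // $thumbnailSource is your existing thumbnail
// Store $publicName with the image record so you can build the <img src='./files/...'> tag later.

An .htaccess containing Options -Indexes (or the equivalent in the vhost config) in ./files/ takes care of forbidding directory listings.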

Related

Force image download - .php files download

This is my download.php:
session_start();
$file = $_GET['file'];
download_file($file);

function download_file($fullPath)
{
    // Must be fresh start
    if (headers_sent())
        die('Headers Sent');
    // Required for some browsers
    if (ini_get('zlib.output_compression'))
        ini_set('zlib.output_compression', 'Off');
    // File Exists?
    if (file_exists($fullPath)) {
        // Parse Info / Get Extension
        $fsize = filesize($fullPath);
        $path_parts = pathinfo($fullPath);
        $ext = strtolower($path_parts["extension"]);
        // Determine Content Type
        switch ($ext) {
            case "pdf": $ctype="application/pdf"; break;
            case "exe": $ctype="application/octet-stream"; break;
            case "zip": $ctype="application/zip"; break;
            case "doc": $ctype="application/msword"; break;
            case "xls": $ctype="application/vnd.ms-excel"; break;
            case "ppt": $ctype="application/vnd.ms-powerpoint"; break;
            case "gif": $ctype="image/gif"; break;
            case "png": $ctype="image/png"; break;
            case "jpeg":
            case "jpg": $ctype="image/jpeg"; break;
            default: $ctype="application/force-download";
        }
        header("Pragma: public"); // required
        header("Expires: 0");
        header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
        header("Cache-Control: private", false); // required for certain browsers
        header("Content-Type: $ctype");
        header("Content-Disposition: attachment; filename=\"".$_REQUEST["isim"]."\";");
        header("Content-Transfer-Encoding: binary");
        header("Content-Length: ".$fsize);
        ob_clean();
        flush();
        readfile($fullPath);
    } else {
        die('File Not Found');
    }
}
This forces a JPG download, but it can be used to download any .php file as well.
Normally I use a download link like this to download an image:
http://domain.net/download.php?file=wp-content/uploads/2016/04/10/126379-fantasy_art.jpg
But then I tested this link and it downloaded my config file...
http://domain.net/download.php?file=wp-config.php
I think this is a big vulnerability.
How can I fix this? I don't want to allow downloading any .php files...
Thanks.
Use the default case in your switch to reject unknown extensions and avoid this problem.
Remove this:
default: $ctype="application/force-download";
and replace it with: default: die('File not found'); or default: return false;
Also you could check whether the path makes sense, e.g. it should be a subfolder of uploads. This post has some info for you: Test if a directory is a sub directory of another folder
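A minimal sketch of that path check, assuming the uploads directory location and using realpath() to defeat ../ tricks (this is not the original poster's code):

// Hypothetical sketch: only serve files that really live under the uploads directory.
$uploadsDir = realpath(__DIR__ . '/wp-content/uploads');             // assumed base directory
$requested  = realpath($uploadsDir . '/' . $_GET['file']);

if ($requested === false || strpos($requested, $uploadsDir . DIRECTORY_SEPARATOR) !== 0) {
    die('File Not Found');                                            // missing, or outside uploads
}
download_file($requested);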
I think it would be good for you to step back and consider what this script actually does, because it is still a gigantic security hole. Here is what it does:
Take the user's input (which is always untrustworthy)
See if its extension is allowed according to a small list of possible extensions
If so, pass it off to the user
Now that you have it die for unrecognized file extensions, it won't let them download your actual php files. But it will still let the user do all sorts of terrible things, all of which comes down to one very key issue:
You make no attempt to verify that the file being requested is actually reasonable for the person to view!!!
A key point is that readfile() doesn't care where the file is. Nor does it even assume that the file is in your website's public directory. The only reason it is downloading files from your web directory is because you didn't start the filename with a slash. However, readfile() will happily pass along anything on the server that it has read access to. Without your recent change a user could have just as easily done this:
http://domain.net/download.php?file=/etc/passwd
Moreover, it doesn't even have to be an actual file on the server. In most PHP installations PHP will happily load up URLs as actual files. So someone could also use your script as a proxy:
http://domain.net/download.php?file=http://theFBIwillArrestYouIfYouLoadThis.com/secrets.pdf
That sort of vulnerability (the ability to use your script as a proxy) is still present in your current solution. Anytime I see a website take file paths like this I love to see just how much it will let me get away with. You could set yourself up for a world of hurt in the worst case scenario.
You have to look at it from a defense-in-depth scenario. What it boils down to is the difference between blacklisting (what is the user not allowed to do) and whitelisting (what should this user be allowed to do). Good security practices rely on the latter method of thinking exclusively, because it is impossible to come up with a completely exhaustive blacklist that covers all possible scenarios.
In a case like this if you want a user to be able to download files you need some sort of list of files that are allowed to be downloaded. One example would be to place any file that is supposed to be downloaded into a specific directory. If a user requests a file then your script can use realpath() to make sure that file is actually in your public directory and otherwise forbid the download. Although if they are all in one directory you could just as easy change a configuration rule in your webserver (e.g. apache or nginx) to have it automatically add the 'content-disposition: attachment' header to anything in that directory. Then you just have to make sure that you never put the wrong files in that public directory.
Me personally though, I would approach it with a complete white-list. I would never let someone specify a filename and then use it to download a file. Rather I would have an administrative area where I manage files that are marked for download: the list of allowed files would be stored in the database and managed by me. When the user downloads a file they don't do it by specifying a filename but rather by specifying the id from the database that corresponds to the file they want to download (a simple user interface is necessary to facilitate this). The ID is used to lookup the file path, and the file can then be downloaded safely. You can then even store the files in directories outside the public area of your website so that you have full control over who can access the files.
That last suggestion is probably overkill for what you are trying to do, but the short of this is simple: you have to think carefully about the security implications of your code and make sure you are giving the user the minimum amount of privileges possible.
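To make that concrete, a hedged sketch of the ID-based approach, assuming a PDO connection and a hypothetical downloads table (id, path, display_name) that only an administrator populates:

// Hypothetical sketch: the client sends only an id; the path comes from the database.
$id = (int) ($_GET['id'] ?? 0);

$stmt = $pdo->prepare('SELECT path, display_name FROM downloads WHERE id = ?');   // assumed table
$stmt->execute([$id]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row === false || !is_file($row['path'])) {
    die('File Not Found');
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($row['path']));
header('Content-Disposition: attachment; filename="' . basename($row['display_name']) . '"');
readfile($row['path']);   // the stored path can point outside the public web root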

Laravel Image thumb

I am making a web app where uploaded photos are stored in /app/storage.
To show the file I am using a route, e.g. showphoto/{id} (paths are stored in the DB)
public function showphoto($id){
    $photo = Photo::findOrFail($id);
    return $this->getFile($photo);
}

private function getFile($f){
    if($f->path){
        $file = storage_path($f->path.'/'.$f->origin_name);
        if (File::exists($file)) {
            $contents = File::get($file);
            switch($f->ext) {
                case "gif": $ctype="image/gif"; break;
                case "png": $ctype="image/png"; break;
                case "jpeg":
                case "jpg": $ctype="image/jpeg"; break;
                case "pdf": $ctype="application/pdf"; break;
                default:
            }
            $response = Response::make($contents, 200);
            $response->header('Content-Type', $ctype);
            return $response;
        }
    }
}
To show image I am using
{{ HTML::image(route('showphoto', $photo->id), $photo->getName(), array('class'=>'img-thumbnail', 'width' => '100', 'height'=>'100')) }}
Question: Some files are more than 2 MB, and when I show a list of them they load very slowly, so in the list I want to show just a thumb of each photo.
Is it possible to create a temporary thumb that is not stored anywhere?
Or is it a bad idea to create a temporary thumb each time I load a page?
Thank you in advance.
If you have a bunch of large images that you are going to be processing in some way when a page is loaded you're probably going to have a bad time. All that extra processing is going to really slow everything down.
What you could do is create a thumbnail for an image when it is uploaded and store that somewhere. That way, you need only load your thumbnails instead of the larger images.
Alternatively, if you prefer to be able to specify the size of the thumbnail in your page, another solution could be to generate your thumbnails at the specified size when the page loads. You'll need to make sure that you cache these thumbnails so you can just load them in going forward, though. If you opt for this approach the first time you load your page will probably take a while, but after that subsequent page loads will be much quicker, since they will be using pre-created, cached images.
A great package to use with Laravel when manipulating images is Intervention Image:
http://image.intervention.io/
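For illustration, a rough sketch of the cached-thumbnail idea with Intervention Image (the thumbs directory, the 100x100 size and the route name are assumptions; this targets the v2 Image::make() API):

// Hypothetical controller method: serve a cached 100x100 thumb, creating it on first request.
public function showthumb($id)
{
    $photo = Photo::findOrFail($id);
    $original = storage_path($photo->path . '/' . $photo->origin_name);
    $thumb    = storage_path('thumbs/' . $photo->id . '.jpg');        // assumed cache location

    if (!File::exists($thumb)) {
        File::makeDirectory(dirname($thumb), 0755, true, true);       // create the cache dir if needed
        Image::make($original)->fit(100, 100)->save($thumb, 80);      // resize once, cache on disk
    }

    return Response::make(File::get($thumb), 200)
        ->header('Content-Type', 'image/jpeg');
}

You would then point HTML::image() at this route instead of showphoto, so list pages only ever transfer the small cached files.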

PHP Securing Temp Files for Download

I'm semi-new to PHP and I'm starting to dive into file downloading. I create .xlsx and .csv files using PHPExcel and place them in a temp directory to be downloaded. I found a nice script for doing the download and I added some tweaks to it that I needed. The script is below. I've already read these posts:
Secure file download in PHP, deny user without permission
...and...
Secure files for download
...and...
http://www.richnetapps.com/the-right-way-to-handle-file-downloads-in-php/
download.php
<?php
/*====================
START: Security Checks
====================*/
//(1) Make sure it's an authenticated/signed-in user with permissions to do this action.
require("lib_protected_page.php");
//(2) Make sure they can ONLY download .xlsx and .csv files
$ext = pathinfo($_GET['file'], PATHINFO_EXTENSION);
if($ext != 'xlsx' && $ext != 'csv') die('Permission Denied.');
//(3) Make sure they can ONLY download files from the tempFiles directory
$file = 'tempFiles/'.$_GET['file'];
//ABOUT ITEM 3 - I still need to change this per this post I found....
/*
http://www.richnetapps.com/the-right-way-to-handle-file-downloads-in-php/
You might think you’re being extra clever by doing something like
$mypath = '/mysecretpath/' . $_GET['file'];
but an attacker can use relative paths to evade that.
What you must do – always – is sanitize the input. Accept only file names, like this:
$path_parts = pathinfo($_GET['file']);
$file_name = $path_parts['basename'];
$file_path = '/mysecretpath/' . $file_name;
And work only with the file name and add the path to it yourself.
Even better would be to accept only numeric IDs and get the file path and name from a
database (or even a text file or key=>value array if it’s something that doesn’t change
often). Anything is better than blindly accept requests.
If you need to restrict access to a file, you should generate encrypted, one-time IDs, so you can be sure a generated path can be used only once.
*/
/*====================
END: Security Checks
====================*/
download_file($file);
function download_file($fullPath)
{
    // Must be fresh start
    if (headers_sent()) die('Headers Sent');
    // Required for some browsers
    if (ini_get('zlib.output_compression'))
        ini_set('zlib.output_compression', 'Off');
    // File Exists?
    if (file_exists($fullPath))
    {
        // Parse Info / Get Extension
        $fsize = filesize($fullPath);
        $path_parts = pathinfo($fullPath);
        $ext = strtolower($path_parts["extension"]);
        // Determine Content Type
        switch ($ext) {
            case "csv": $ctype="text/csv"; break;
            case "pdf": $ctype="application/pdf"; break;
            case "exe": $ctype="application/octet-stream"; break;
            case "zip": $ctype="application/zip"; break;
            case "doc": $ctype="application/msword"; break;
            case "xls": $ctype="application/vnd.ms-excel"; break;
            case "xlsx": $ctype="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"; break;
            case "ppt": $ctype="application/vnd.ms-powerpoint"; break;
            case "gif": $ctype="image/gif"; break;
            case "png": $ctype="image/png"; break;
            case "jpeg":
            case "jpg": $ctype="image/jpeg"; break;
            default: $ctype="application/force-download";
        }
        header("Pragma: public"); // required
        header("Expires: 0");
        header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
        // The use of Cache-Control is wrong in this case, especially with both values set to zero, according to Microsoft, but it works in IE6 and IE7, and later versions ignore it, so no harm done.
        header("Cache-Control: private", false); // required for certain browsers
        header("Content-Type: $ctype");
        header("Content-Disposition: attachment; filename=\"".basename($fullPath)."\";");
        // Note: the quotes around the filename are required in case it contains spaces.
        header("Content-Transfer-Encoding: binary");
        header("Content-Length: ".$fsize);
        ob_clean();
        flush();
        readfile($fullPath);
    }
    else
        die('File Not Found');
}
?>
My questions are...
Are my security checks enough? I only want authenticated users with proper permissions to be able to download .xlsx and .csv files, and only from the tempFiles directory. I've read that downloadable files should be kept outside the webroot - why? With these checks I don't see why that would matter.
The tempFiles directory is forbidden if you type it into the address bar (www.mySite.com/tempFiles), but if the user somehow guesses a filename (which would be difficult, they are long and unique) then they could type that into the address bar and get the file (www.mySite.com/tempFiles/iGuessedIt012345.csv). Is there a way to disallow that (I'm running Apache), so they are forced to go through my script (download.php)?
Thank you! Security is my number 1 concern, so I want to learn every little thing I can about this before going live. Some of the example download scripts I've seen literally let you pass in a PHP filename, thus allowing people to steal your source code. FYI, I do clean up the tempFiles directory fairly regularly; just leaving files there forever would be a security issue.
Another option is to generate the file content without saving it on the server, and push that content to the client browser with the proper headers so the browser interprets it as a file to download. In the end the client gets the file without accessing your temp folder, and you don't have to worry about cleaning it up. It's also more secure, because you are not saving anything on your server: data that you don't have cannot be stolen.
Example for pdf:
....
$content = $MyPDFCreator->getContent();
header('Content-Type: application/pdf');
header('Content-Length: '.strlen( $content ));
header('Content-disposition: inline; filename="downloadme.pdf"');
header('Cache-Control: public, must-revalidate, max-age=0');
header('Pragma: public');
header('Expires: Sat, 26 Jul 1997 05:00:00 GMT');
header('Last-Modified: '.gmdate('D, d M Y H:i:s').' GMT');
echo $content;
I suggest that you should not let the user request a file by providing a full path.
Filter the 'file' parameter. Make sure it doesn't start with dots, to avoid people using relative paths to reach other files.
In your line:
$file = 'tempFiles/'.$_GET['file'];
if the user requests the file "../../var/www/my-site/index.php", for example, the value of your $file variable becomes the path to the index.php file, given that your tempFiles/ directory is located two levels deeper than your /var/www.
This is just an example; you should get the idea.
So the most important thing, in my humble opinion, is to filter the file parameter received. You can check for the presence of two dots (..) in the file parameter this way:
if (strpos($_GET['file'], "..") !== false) {
    // file parameter contains ..
}
If, as suggested by Ashish, you populate a database table with a token associated with a file and a user, then you could increment the number of times that user requests the file. After a certain number of downloads, you could then deny the request.
This approach lets you keep a certain amount of control over file downloads, while still giving your user some flexibility, for example if the user is accessing your web application from different locations/devices and needs to download the same file a few times.
Having your files in the webroot will allow visitors to directly access them (as they can 'run' anything in the webroot - within reason). What would be best would be to have this kind of set-up, or one like it:
/var
    /www
        /html
            /my-site
                index.php
                download.php
                ...
        /tmpFiles
            iGuessedIt012345.csv
This way - with some configuration - the outside world can't get to tmpFiles/.
This would also allow you to do your checks for authenticated users with correct permissions in download.php
Another way in which you could keep them out of the tmpFiles/ directory would to have a .htaccess file in that directory, with the following;
deny from all
This will yield a 403 Forbidden message to anyone who tries to access that directory.
I think you should provide a unique URL with a temporary token to download the file. This should be a one-time-use token. Once the user has used it, it should be invalidated, and if the user wants to download the file again they need to regenerate the download link through the provided, authenticated way.
For example you can give a url like:
http://www.somedomain.com/download.php?one_time_token=<some one time token>
Once the url is visited, you should invalidate the given token.
I think using this token method you can secure your file download process.
For the file location, you should avoid storing files in publicly accessible places. Store files somewhere else in your system and read them from there only.
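As a rough sketch of that token flow, assuming a PDO connection and a hypothetical download_tokens table (token, file_path, used) that you fill in when rendering the download link:

// Hypothetical sketch: creating the link for an authenticated user.
$token = bin2hex(random_bytes(32));
$pdo->prepare('INSERT INTO download_tokens (token, file_path, used) VALUES (?, ?, 0)')
    ->execute([$token, $pathToFile]);
// Emit: http://www.somedomain.com/download.php?one_time_token=<token>

// Hypothetical sketch: download.php validating and then invalidating the token.
$stmt = $pdo->prepare('SELECT file_path FROM download_tokens WHERE token = ? AND used = 0');
$stmt->execute([$_GET['one_time_token'] ?? '']);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
if ($row === false) {
    die('Invalid or expired token');
}
$pdo->prepare('UPDATE download_tokens SET used = 1 WHERE token = ?')
    ->execute([$_GET['one_time_token']]);
download_file($row['file_path']);   // the existing download_file() function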
An authenticated user can access a page (download.php) where he can view files in tempFiles.
Set .htaccess to "deny from all" in tempFiles so no one can access it directly; then in download.php every file should be downloadable with a token, as suggested by Ashish Awasthi.
If you don't like tokens you can do something like download?file=iGuessedIt012345.csv, but if you go this way, use a whitelist regex to check that everything is right.
example:
$var="iGuessedIt012345.csv";
if (preg_match('#^[[:alnum:]]+\.csv$#i', $var)){
    echo "ok";
}else{
    echo "bad request";
}
example 2:
$var="iGuessed_It-012345.csv";
if (preg_match('#^[a-zA-Z0-9\-\_]+\.csv$#i', $var)){
    echo "ok";
}else{
    echo "bad request";
}

"Large" S3 Images Not Able to Load in PHP

I'm trying to use WordPress on the AppFog PaaS system. Unfortunately, AppFog doesn't have persistent storage so all content outside of the database needs to be stored on some external system (like S3). I'm successfully using a plugin which pushes all my WordPress media out to S3, but am having problems loading some of the images.
To investigate, I deployed the following script:
<?php
// get the image name from the query string
// and make sure it's not trying to probe your file system
if (isset($_GET['pic'])) {
    $pic = $_GET['pic'];
    // get the filename extension
    $ext = substr($pic, -3);
    // set the MIME type
    switch ($ext) {
        case 'jpg':
            $mime = 'image/jpeg';
            break;
        case 'gif':
            $mime = 'image/gif';
            break;
        case 'png':
            $mime = 'image/png';
            break;
        default:
            $mime = false;
    }
    // if a valid MIME type exists, display the image
    // by sending appropriate headers and streaming the file
    if ($mime) {
        header('Content-type: '.$mime);
        header('Content-length: '.filesize($pic));
        $file = fopen($pic, 'rb');
        if ($file) {
            fpassthru($file);
            exit;
        }
    }
}
?>
This allows me to directly test my ability to read and serve an image in PHP. The proxy script works perfectly for images under around 10KB -- i.e. when I open the script in a browser pointing it at some "small" image file in my S3 bucket, I'm able to see it.
However, when I attempt to load a "large" file (anything over 10KB), I get an error. In Firefox, that's:
The image “http://myssite.com/iproxy.php?pic=http://aws.amazon.com%2Fmybucket%2Fwp-content%2Fuploads%2F2013%2F01%2Fmylargeimage.png” cannot be displayed because it contains errors.
I've been wrestling with this for hours and can't seem to figure anything out. I've tried changing output_buffering to a larger value, but that hasn't helped.
Any tips would be appreciated!

Serve image with PHP script vs direct loading an image

I want to monitor how often some external images are loaded.
So my idea is, instead of giving a URI directly like this:
www.site.com/image1.jpg
I can create a PHP script which reads the image, so I built a PHP file and my HTML would look like this:
<img src="www.site.com/serveImage.php?img=image1.jpg">
but I don't know how to read the image from disk and return it. Would I return a byte array or set the content type?
Kind regards,
Michel
Sending images through a script is nice for other things like resizing and caching on demand.
As answered by Pascal MARTIN, the readfile function and these headers are the requirements:
Content-Type
The mime type of this content
Example: header('Content-Type: image/gif');
See the function mime_content_type
Types
image/gif
image/jpeg
image/png
But beside the obvious content-type you should also look at other headers such as:
Content-Length
The length of the response body in octets (8-bit bytes)
Example: header('Content-Length: 348');
See the function filesize
Allows the connection to be better used.
Last-Modified
The last modified date for the requested object, in RFC 2822 format
Example: header('Last-Modified: Tue, 15 Nov 1994 12:45:26 GMT');
See the function filemtime and date to format it into the required RFC 2822 format
Example: header('Last-Modified: '.date(DATE_RFC2822, filemtime($filename)));
You can exit the script after sending a 304 if the file modified time is the same.
status code
Example: header("HTTP/1.1 304 Not Modified");
you can exit now and not send the image one more time
For last modified time, look for this in $_SERVER
If-Modified-Since
Allows a 304 Not Modified to be returned if content is unchanged
Example: If-Modified-Since: Sat, 29 Oct 1994 19:43:31 GMT
It is in $_SERVER with the key HTTP_IF_MODIFIED_SINCE
List of HTTP header responses
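Putting the Last-Modified / If-Modified-Since pieces together, a minimal sketch could look like this (the image path is assumed; resolve and validate it yourself):

// Hypothetical sketch: reply 304 Not Modified when the browser already has the current image.
$filename = '/path/to/image1.jpg';                        // assumed
$mtime = filemtime($filename);

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    && strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit;                                                 // nothing else to send
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($filename));
header('Last-Modified: ' . date(DATE_RFC2822, $mtime));
readfile($filename);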
To achieve something like this, your script will need to:
send the right headers, which depend on the type of the image : image/gif, image/png, image/jpeg, ...
send the data of the image
making sure nothing else is sent (no white space, no nothing)
This is done with the header function, with some code like this:
header("Content-type: image/gif");
Or
header("Content-type: image/jpeg");
or whatever, depending on the type of the image.
To send the data of the image, you can use the readfile function:
Reads a file and writes it to the
output buffer.
This way, in one function, you both read the file, and output its content.
As a sidenote:
you must put some security in place to ensure users can't request anything they want via your script: make sure it only serves images, from the directory you expect; nothing like serveImage.php?file=/etc/passwd should be OK, for instance.
If you just want to get the number of times a file was loaded each day, parsing Apache's log file might be a good idea (via a batch run by cron each day at 00:05 that parses the log of the day before, for instance); you won't have real-time statistics, but it will require fewer resources on your server (no PHP to serve static files)
I use the "passthru" function to call "cat" command, like this:
header('Content-type: image/jpeg');
passthru('cat /path/to/image/file.jpg');
Works on Linux. Saves resources.
You must set the content type:
header("Content-type: image/jpeg");
Then you load the image and output it like this:
$image=imagecreatefromjpeg($_GET['img']);
imagejpeg($image);
Instead of changing the direct image URL in the HTML, you can put a line in the Apache configuration or .htaccess to rewrite all requests for images in a directory to a PHP script. Then in that script you can make use of the request headers and the $_SERVER array to process the request and serve the file.
First in your .htaccess:
RewriteRule ^(.*)\.jpg$ serve.php [NC]
RewriteRule ^(.*)\.jpeg$ serve.php [NC]
RewriteRule ^(.*)\.png$ serve.php [NC]
RewriteRule ^(.*)\.gif$ serve.php [NC]
RewriteRule ^(.*)\.bmp$ serve.php [NC]
The script serve.php must be in the same directory as .htaccess. You will probably write something like this:
<?php
$filepath = $_SERVER['REQUEST_URI'];
$filepath = '.'.$filepath;
if (file_exists($filepath))
{
    touch($filepath, filemtime($filepath), time()); // this will just record the time of access in the file inode. you can write your own code to do whatever
    $path_parts = pathinfo($filepath);
    switch (strtolower($path_parts['extension']))
    {
        case "gif":
            header("Content-type: image/gif");
            break;
        case "jpg":
        case "jpeg":
            header("Content-type: image/jpeg");
            break;
        case "png":
            header("Content-type: image/png");
            break;
        case "bmp":
            header("Content-type: image/bmp");
            break;
    }
    header("Accept-Ranges: bytes");
    header('Content-Length: ' . filesize($filepath));
    header("Last-Modified: Fri, 03 Mar 2004 06:32:31 GMT");
    readfile($filepath);
}
else
{
    header("HTTP/1.0 404 Not Found");
    header("Content-type: image/jpeg");
    header('Content-Length: ' . filesize("404_files.jpg"));
    header("Accept-Ranges: bytes");
    header("Last-Modified: Fri, 03 Mar 2004 06:32:31 GMT");
    readfile("404_files.jpg");
}
/*
By Samer Mhana
www.dorar-aliraq.net
*/
?>
(This script can be improved!)
Also, if you want the user to see a real filename instead of your script name when they right-click the image and select "Save As", you'll need to also set this header:
header('Content-Disposition: filename='.$filename);
You're probably better off examining your server access logs for this. Running all images through PHP might put a bit of load on your server.
I serve my images with readfile as well, but I have gone the extra mile both for security and extra functionality.
I have a database set up which stores the image id, its dimensions and file extension. This also means that images need to be uploaded (allowing optional resizing), so I only use the system for content and not images needed for the website itself (like backgrounds or sprites).
It also does a very good job at making sure you can only request images.
So, for serving, the simplified workflow would be like this (I cannot post production code here):
1) get the ID of the requested image
2) Look it up in the database
3) Throw headers based on the extension ("jpg" gets remapped to "jpeg" on upload)
4) readfile("/images/$id.$extension");
5) Optionally, protect the /images/ dir so it cannot be indexed (not a problem in my own system, as it maps URLs like /image/view/11 to something like /index.php?module=image&action=view&id=11)
There are a lot of good answers above, but none of them provide working code that you can use in your PHP app. I've set mine up so that I look up the name of the image in a database table based on a different identifier. The client never sets the name of the file to download, as this is a security risk.
Once the image name is found, I explode it to obtain the extension. This is important to know what type of header to serve based on the image type (i.e. png, jpg, jpeg, gif, etc.). I use a switch to do this for security reasons and to convert jpg -> jpeg for the proper header name. I've included a few additional headers in my code that ensure the file is not cached, that revalidation is required, to change the name (otherwise it will be the name of the script that is called), and finally to read the file from the server and transmit it.
I like this method since it never exposes the directory or actual file name. Be sure you authenticate the user before running the script if you are trying to do this securely.
$temp = explode('.', $image_filename);
$extension = end($temp); // jpg, jpeg, gif, png - add other flavors based off your use case
switch ($extension) {
    case "jpg":
        header('Content-type: image/jpeg');
        break;
    case "jpeg":
    case "gif":
    case "png":
        header('Content-type: image/'.$extension);
        break;
    default:
        die; // avoid security issues with prohibited extensions
}
header('Content-Disposition: filename=photo.'.$extension);
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
readfile('../SECURE_DIRECTORY/'.$image_filename);
PHP 8 lets you use the match expression, which will further tidy the code by getting rid of the switch and the ugly-looking stacked cases.
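For illustration, a sketch of the same extension-to-MIME mapping with match (PHP 8+), collapsing the jpg/jpeg special case:

// Hypothetical PHP 8 sketch: map the extension to a MIME type, or stop for anything else.
$mime = match ($extension) {
    'jpg', 'jpeg' => 'image/jpeg',
    'gif'         => 'image/gif',
    'png'         => 'image/png',
    default       => null,
};
if ($mime === null) {
    die; // avoid security issues with prohibited extensions
}
header('Content-type: ' . $mime);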
