Storing and reading images above public_html - php

I am trying to secure my PHP image upload script, and the last hurdle I have to jump is making it so that users cannot directly execute the uploaded images, while the server can still serve them in web pages. I tried changing ownership and permissions of the folders to no avail, so now I am trying to store the images above public_html and display them in pages that are stored in public_html.
My File Structure:
- userimages
    image.jpg
    image2.jpg
- public_html
    filetoserveimage.html
I tried linking to an image in the userimages folder like this:
<img src="../userimages/image.jpg">
But it does not work. Is there something I am missing here? If you have any better suggestions, please let me know. I am trying to keep public users from executing potentially dangerous files they may have uploaded, just as an extra security measure. Thanks!

You want something that's basically impossible.
The way a browser loads a page (in a very basic sense) is this:
Step 1: Download the page.
Step 2: Parse the page.
Step 3: Download anything referenced in the content of the page (images, stylesheets, JavaScript files, etc.)
Each "Download" event is atomic.
It seems like you want to only serve images to people who have just downloaded a page that references those images.
As PHP Jedi illustrated, you can pass the files through PHP. You could expand on his code, and check the HTTP_REFERER on the request to ensure that people aren't grabbing "just" the image.
Now, serving every image through a PHP passthru script is not efficient, but it could work.
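A minimal sketch of what that expansion might look like is below; serveimage.php, the img parameter, and example.com are placeholder names I've invented, and the userimages path is the one from the question.
<?php
// serveimage.php - sketch of a referer-checking passthru script (names and paths are placeholders)
// Assumes the images live in a userimages directory one level above public_html.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (strpos($referer, 'example.com') === false) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
$file = basename(isset($_GET['img']) ? $_GET['img'] : ''); // basename() strips any "../" path tricks
$path = __DIR__ . '/../userimages/' . $file;
if ($file === '' || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: image/jpeg');
readfile($path);
Bear in mind the referer header can be empty or spoofed, so treat this as a deterrent rather than real protection.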
The most common reason people want to do this is to avoid "hotlinking" -- when people create image tags on other sites that reference the image on your server. When they do that, you expend resources handling requests that get presented on someone else's page.
If that's what you're really trying to avoid, you can use mod_rewrite to check the referer.
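As a rough sketch (assuming Apache with mod_rewrite enabled; example.com is a placeholder for your own domain), an anti-hotlinking rule set in .htaccess might look something like this:
RewriteEngine On
# allow empty referers (direct visits, privacy proxies) and requests from your own site
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# anything else asking for an image gets a 403
RewriteRule \.(jpe?g|png|gif)$ - [F,NC]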
A decent-looking discussion of hotlinking/anti-hotlinking can be found here

Use an image relay script!
To serve an image file that is outside the public_html folder, you have to do it through a PHP script. E.g. make an image-relay.php that reads the image from outside the public html...
<?php
// image-relay.php - reads an image stored outside public_html and echoes it back to the browser
header('Content-Type: image/jpeg');
$_file = 'myimage.jpg'; // or $_GET['img'] - but validate it first!
echo file_get_contents('/myimages/' . $_file);
?>
Now, $_file could be a $_GET parameter, but it is absolutely important to validate that input parameter...
Now you can use <img src="image-relay.php?img=flower.jpg"> to access a flower.jpg image that is located at /myimages/flower.jpg ...

Well, a web browser will only be able to access files and folders inside public_html.

If the public_html directory is the root of the server for your users, Apache cannot serve anything that is not inside/below that directory.
If you want a file to be served by Apache directly, you'll have to put it in/below public_html.

I think the misunderstanding is this: when you include an image via an <img> tag, your browser sends the exact same request to the web server that it would send if you opened the image's src URL in your browser directly.
Therefore, either both things work, or neither does.
There are hacks around this, involving a (PHP or other) script that checks that an IP which requests the image has also requested the HTML page within the last few seconds (which will not work if the user is behind a proxy that rotates outgoing IPs), or that checks the referer (which does not work with HTTPS, nor if the user has the referer disabled).
If you want to make sure that only some users can see the image (both via <img> tag and directly), you can put the image outside public_html and have a (php or other) script that verifies the user's credentials before serving the image.

If you are using Apache or lighttpd, you can use the X-Sendfile header to send files that are not in the web root (provided you haven't changed the configuration of mod_xsendfile).
To learn more about X-Sendfile, see this site.
This solution gives you the best possible performance, because the web server sends the file rather than PHP, so the PHP process can exit while the file is still being served.
Hope that helps.

Related

How to access a video file in a directory located in my server's home directory

Unable to correctly use php code to load a video mp4 file stored in home directory.
Hi, I am building a WP site that sells instructional mp4 video files. To protect the files from being downloaded for free, I have placed them in a directory called videos in the home directory (outside of the public_html directory). I am trying to write PHP code to load the video; however, I can't access the video in /home/username/videos.
My code:
add_action('template_redirect', 'video_redirect', 5);
function video_redirect() {
    if (is_admin())
        return;
    if (!is_page('videoplayerpageonmysite'))
        return;
    $filename = "/home/username/videos/videofile.mp4";
    echo "<video controls><source src='$filename' type='video/mp4'>Your browser does not support the video tag.</video>";
}
Each time I run the code I get a No video with supported format... error.
I'm only able to get it to load the video file when it is in the public_html folder (it works perfectly then), but not when it is located in /home/username/videos/
Please help!
You can't give the browser a file path on your server's hard disk and expect it to be able to load it. It will resolve it as a relative URL, ask the HTTP server for it, and then get a 404.
You need to give the browser a URL that actually loads the file.
If you want to limit who can access it (e.g. people who have paid), then you could write a PHP program that checks to see if the request is coming from someone who has paid (i.e. Authentication + Authorization), then reads the file and outputs it in the HTTP response.
You have to use a PHP script as the source URL in the video tag and specify some identifier for which video it should load e.g. src="loadvideo.php?id=1" or something. (This is because the source must be a valid URL which is actually accessible on the webserver - the browser, which don't forget runs on the user's machine not the server, cannot navigate to a path on disk. If it could, then moving your files outside the public_html folder would not provide any security!)
And then when the video tag is loaded into the page, it will make a request to that URL, which causes the PHP script to run. The script must associate the provided ID with the correct file on disk, fetch the contents of that file and echo them as the response, along with appropriate headers (e.g. mime type etc). You can probably find examples of this pattern online already.
Of course the PHP script will also need to authenticate the request to make sure the requestor is a signed-in, paid user, otherwise you still aren't protecting anything. Without this last step, anyone could just visit the PHP script's URL and provide an ID until they got a result, and download the videos just as if you'd put them in the public_html folder. As long as you implement security correctly though, only users who already paid for the videos would be able to download them.
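A rough sketch of that pattern is below; loadvideo.php, the has_paid session flag, and the id-to-file map are invented for illustration and would need to be replaced with your real authentication and storage logic.
<?php
// loadvideo.php - illustrative sketch only; all names and paths are placeholders
session_start();
if (empty($_SESSION['has_paid'])) {        // your real "has this user paid?" check goes here
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$videos = array(                            // map public ids to files stored outside public_html
    1 => '/home/username/videos/videofile.mp4',
);
$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
if (!isset($videos[$id]) || !is_file($videos[$id])) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($videos[$id]));
readfile($videos[$id]);
Note that a plain readfile() like this does not handle HTTP Range requests, so seeking inside the video may be limited in some browsers; the X-Sendfile approach mentioned elsewhere on this page deals with that more gracefully.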

How to provide a local path for downloading a file from a server?

I am trying to download an object/file from AWS S3 to the local computer, and I would like to give the user the opportunity to choose the local path. In HTML we have the <form> and <input type="file"> elements to let the user select a file from the file system for upload. How do we do the reverse? Any pointers would be greatly appreciated.
You can't.
Being able to do so would risk websites trying to put stuff in /etc/hosts, ~/.bash_profile, C:\Windows\System32, etc. You can set a (suggested) filename, but it's going to go to the browser's preferred Downloads folder.
It's a security issue to write anywhere you want to the filesystem. You can present it as a download (application/octet-stream / Content-Disposition) but the user's browser gets the right to choose in the end regardless.
You can force a file to download instead of display on a page from the server with a specific filename only, but it stops there. The browser has the choice of popping up a Save As dialog or saving it in the default Downloads folder.
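For example, on the PHP side the most the server can do is suggest a name (download.php-style handler; report.pdf and the path below are made-up names):
<?php
// suggest a filename; the browser still decides where (and whether) the file is saved
$path = '/path/on/server/report.pdf';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . filesize($path));
readfile($path);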
Incidentally, when a user chooses to upload a file, you don't actually get the path either - you get a fake path plus the user's real filename. On Windows, Chrome typically sends something like c:\fakepath\ so it doesn't reveal overly personal information in the path.

Dangerous file types to avoid in file-sharing website

I am making a small file-sharing website where users can upload content. Recently somebody uploaded a PHP script that was presumably intended to harm the site. It got me thinking: what file types should I block users from uploading? I have already blocked .exe files. What other file types could cause harm to either my website or its users?
This script can be viewed here.
Don't store the files where they're directly accessible - only provide access via a script YOU control. Don't store the files under their user-supplied filename - use a filename YOU generate (the best option is to store file details in a database, including the original filename, and store the actual file under that DB record's primary key field).
With those two measures, people can upload anything they want, and there'll be zero chance of the file being executed/interpreted on your server, because it's never in a position where it CAN be executed/interpreted.
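A sketch of that idea follows; the uploads table, the userfile form field, the paths and the database credentials are all assumptions for illustration.
<?php
// sketch: store the upload under a name WE generate; keep the user's filename only in the database
$pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass'); // placeholder credentials
if (!isset($_FILES['userfile']) || $_FILES['userfile']['error'] !== UPLOAD_ERR_OK) {
    exit('Upload failed');
}
$stmt = $pdo->prepare('INSERT INTO uploads (original_name) VALUES (?)');
$stmt->execute(array($_FILES['userfile']['name']));   // the user-supplied name is only data, never a path
$storedName = $pdo->lastInsertId();                    // the primary key becomes the on-disk filename
move_uploaded_file($_FILES['userfile']['tmp_name'], '/home/username/uploads/' . $storedName);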
It looks like the script is cut off while it's still defining functions, so I can't make out what it's doing.
However, if you're doing things correctly you should have an .htaccess file in your "uploaded files" directory with:
Header set Content-Disposition "attachment"
This will ensure that accessing any file in that directory will result in a download, and that script will not be run.
(Actually even better is to have the files outside the webroot, and have a "downloader" php script echoing the file contents)
That script could euphemistically be described as a remote administration script.
You should always use a whitelist, not a blacklist. Instead of "enumerating badness", make a list of allowed file types and reject everything else.
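In PHP that could look roughly like this (the allowed list is only an example):
<?php
// sketch: accept only a known-good list of extensions and reject everything else
$allowed = array('jpg', 'jpeg', 'png', 'gif', 'pdf', 'zip');
$ext = strtolower(pathinfo($_FILES['userfile']['name'], PATHINFO_EXTENSION));
if (!in_array($ext, $allowed, true)) {
    exit('File type not allowed');
}
(An extension check alone is not bulletproof, which is why the points about script handlers and a separate subdomain below still matter.)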
Also, all files uploaded should be put in a directory which does not run the PHP handler, or any other script handlers at all (check for instance what other content management systems written in PHP do in the .htaccess for their upload directories).
It is also a good idea to put the uploaded files in a separate subdomain which does not have any access to the cookies of the main domain, to avoid attacks which attempt to run JavaScript code on the same origin as the main site (a whitelist of content types is not enough for this, since some browsers are known to guess the content type and treat non-HTML files as HTML).

PHP - Question about uploading & uploaded image file

I have read the following tutorial "Uploading Files To the Server Using PHP"
and have several questions related to the topics.
Q1> The tutorial mentions: "Note that PHP must have write access to $uploadDir or else the upload will fail."
For me, I only allow the user to upload a file after the user has logged in to the website.
If we set the $uploadDir permission to 777, then everyone has write permission to that folder. How do I avoid this problem?
Also, I am using WAMP as my test bed; can I simulate the same setup as on a real web server?
Q2> In order to prevent direct access, the tutorial mentions:
"A better approach is to move the
upload directory away from your web
root. For example, the web root for
this site is:
/home/arman198/public_html/ to prevent
direct listing i can set the upload
directory to /home/arman198/upload/."
Now my problem is: how can I display the uploaded images on other pages of the website, since the upload directory is not directly accessible anymore? I need to display an uploaded image (say, a personal headshot) dynamically on another page. Is that possible?
Thank you
It's a common problem.
All modern computers have a temporary files directory. On Linux/Unix it's /tmp, on Windows it's usually c:\temp. The OS install will have set permissions on that directory so that anyone can write files there but only privileged users can delete files that don't belong to them. This is where PHP will want to put an uploaded file; your application then has to move it elsewhere (this is the purpose of the move_uploaded_file() function). PHP under Windows may need upload_tmp_dir actually set in the php.ini file.
Once you have an uploaded file, you can shift it wherever you like, including to somewhere the webserver can read it in order to serve it. The biggest problem with that is that it is awfully easy to put this directory inside your codebase. Don't do that. As soon as you do anything beyond editing the files inside the directory they are served from, it becomes problematic. Trust me: I've dealt with this a few times in code I've inherited. It's easy to let your webserver load files from a location outside your codebase.
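As a small sketch of that temp-directory-to-final-location step (paths and the userfile field are placeholders; in real code you would generate the destination filename yourself, as discussed in the file-sharing question above):
<?php
// sketch: PHP put the upload in the OS temp dir; move_uploaded_file() relocates it to a directory we choose
$uploadDir = '/home/username/uploads/';               // outside public_html and outside the codebase
$target = $uploadDir . basename($_FILES['userfile']['name']);
if (!move_uploaded_file($_FILES['userfile']['tmp_name'], $target)) {
    exit('Could not move the uploaded file');
}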
The other alternative is to produce a download script. That way the file need not be servable by the webserver at all. One disadvantage is that you don't get to leverage the web server's MIME translation, but then, that lets you control which types of image files are permitted.
For the second question, you can use a PHP script instead of direct access to the directory. Let's name it image.php and assume it takes an id parameter, like image.php?id=image_id. In that script you can read the id from the superglobal array $_GET, look up the image with that id, and send it as the response.
For the first one I'm not sure, but maybe play with the .htaccess file.
And for the first question, try setting your permissions to 775. That should allow PHP to write the file to the directory without giving the general public write access.

Performance-oriented way to protect files on PHP level?

I am looking for some input on something I have been thinking about for a long time. It is a very general problem, maybe there are solutions out there I haven't thought of yet.
I have a PHP-based CMS.
For each page created in the CMS, the user can upload assets (Files to download, Images, etc.)
Those assets are stored in a directory, let's call it "/myproject/assets", on a per-page basis (1 subdirectory = 1 page, e.g. "/myproject/assets/page19283")
The user can "un-publish" (hide) pages in the CMS. When a page is hidden, and somebody tries to access it because they have memorized the URL or they come from Google or something, they get a "Not found" message.
However, the assets are still available. I want to protect those as well, so that when the user un-publishes a page, they can trust it is completely gone. (Very important on judicial troubles like court orders to take content down ... Things like that can happen).
The most obvious way is to store all assets in a secure directory (= not accessible by the web server) and use a PHP "front gate" that passes the files through after checking. When a project needs to be watertight this is the way I currently go, but I don't like it because the PHP interpreter runs for every tiny image, script, and stylesheet on the site. I would like to have a faster way.
.htaccess protection (Deny from all or similar) is not perfect because the CMS is supposed to be portable and able to run in a shared environment. I would like it to even run on IIS and other web servers.
The best way I can think of right now is moving the particular page's asset directory to a secure location when it is un-published, and move it back when it's published. However, the admin user needs to be able to see the page even when it's un-published, so I would have to work around the fact that I have to serve those assets from the secure directory.
Can anybody think of a way that allows direct Apache access to the files (=no passing through a PHP script) but still controlling access using PHP? I can't.
I would also consider a simple .htaccess solution that is likely to run on most shared environments.
Anything sensitive should be stored in a secure area like you suggested.
If your website is located at /var/www/public_html,
you put the assets outside the web-accessible area, in /var/www/assets.
PHP can then trigger a download, or you can feed the files through PHP, depending on your need.
If you kept the HTML in the CMS DB, that would leave only non-sensitive images & CSS.
If you absolutely have to turn on and off all access to all materials, I think your best bet might be symlinks. Keep -everything- in a non-web-accessible area, and symlink each folder of assets into the web area. This way, if you need to lock people out completely, just remove the symlink rather than removing all the files.
I don't like it, but it is the only thing I can think of that fits your criteria.
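A sketch of the symlink approach in PHP (paths invented; note that symlink behaviour differs on Windows/IIS, which may matter given the portability requirement):
<?php
// publish: expose one page's asset folder by symlinking it into the web root
symlink('/var/www/assets/page19283', '/var/www/public_html/assets/page19283');

// un-publish: remove only the symlink; the real files stay untouched outside the web root
unlink('/var/www/public_html/assets/page19283');
Apache also needs FollowSymLinks enabled for the directory for this to work.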
I'd just prevent hotlinking of any non-HTML file, so all the "assets" stuff is accessible only from the HTML page. Removing (or protecting) the page just removes everything without having to mess up the whole file system.
Use X-Sendfile
The best and most efficient way is using X-Sendfile. However, before using X-Sendfile you will need to install and configure it on your webserver.
The method on how to do this will depend on the web server you are using, so look up instructions for your specific server. It should only be a few steps to implement. Once implemented don't forget to restart your web server.
Once X-Sendfile has been installed, your PHP script will simply need to check for a logged in user and then supply the file. A very simple example using Sessions can be seen below:
session_start();
// only serve the file to logged-in users
if (empty($_SESSION['user_id'])) {
    exit;
}
$file = "/path/to/secret/file.zip";
$download_name = basename($file);
header("X-Sendfile: $file"); // the web server, not PHP, streams the file from here on
header("Content-type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . $download_name . '"');
Important note:
If you are wanting to serve the file from another webpage such as an image src value you will need to make sure you sanitize your filename. You do not want anyone overriding your script and using ".." etc. to access any file on your system.
Therefore, if you have code that looks like this:
<img src="myscript.php?file=myfile.jpg">
Then you will want to do something like this:
session_start();
if (empty($_SESSION['user_id'])) {
    exit;
}
// strip everything except letters, digits, dashes, underscores and dots from the requested name
$file = preg_replace('/[^-a-zA-Z0-9_\.]/', '', $_GET['file']);
$download_name = basename($file);
header("X-Sendfile: /path/to/secret/" . $file); // prepend the storage directory; $file is only a bare filename
header("Content-type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . $download_name . '"');
EDIT: How about a hybrid for the administrative interface? In the ACP you could route all file requests through the PHP auth script, but for the public side you can use HTTP auth/.htaccess to determine the availability of the result. This gives you the performance on the public side, but the protection on the ACP side.
OLD MESSAGE:
.htaccess is compatible with most Apache and IIS (<7) environments (using various ISAPI modules) when using mod_rewrite type operations. The only exception is IIS7 + the new rewrite module, which uses the web.config file. HOWEVER, I'd be willing to bet that you could efficiently generate/alter the web.config file for this instance instead of using .htaccess.
Given that, you could set up redirects using the rewrite method and redirect to your custom 404 Page (that hopefully sends the proper 404 header). It is not 100% appropriate because the actual asset should be the one giving a 403 header, but... it works.
This is the route I would go unless you want to properly create HTTP AUTH setups for every server platform. Plus, if you do it right, you could make your system extendable to allow other types in the future by you or your users (including a php based option if they wanted to do it).
I'm assuming the 'page' is being generated by PHP and the 'assets' should not require PHP. (Let me know if I got that wrong.)
You can rename the assets folder. For example, rename '/myproject/assets/page19283' to '/myproject/assets/page19283-hidden'. This will break all old, memorized links. When you generate the page for admin users that can see it, you just write the urls using the new folder name. After all, you know whether the page is 'hidden' or not. The assets can be accessed directly if you know the 'secret' url.
For additional security, rename the folder with a bunch of random text and store that in your page table (wherever you store the hidden flag): '/myproject/assets/page19283-78dbf76B&76daz1920bfisd6g&dsag'. This will make it much harder to guess at the hidden url.
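A sketch of generating and applying such a suffix (the paths are from the example above; random_bytes() needs PHP 7+, older setups could use openssl_random_pseudo_bytes()):
<?php
// hide a page's assets by renaming its folder to something unguessable
$suffix = bin2hex(random_bytes(16));    // 32 hex characters of randomness
rename('/myproject/assets/page19283', '/myproject/assets/page19283-' . $suffix);
// store $suffix in the page table next to the hidden flag, so admin-generated URLs still resolve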
Just prepend or append a GUID to the page name in the database and the resource directory in the filesystem. The admin will still be able to view it from the admin interface because the link will be updated but the GUID effectively makes the page undiscoverable by an outside user or search engine.
Store your information in a directory outside the web root (i.e.: one directory outside of public_html or htdocs). Then, use the readfile() function in a PHP script to proxy the files out when requested. readfile() basically takes a single parameter--the path to a file--and prints the contents of that file.
This way, you can create a barrier where if a visitor requests information that's hidden behind the proxy, even if they "memorized" the URL, you can turn them down with a 404 or a 403.
I would go with implementing a gateway. You set up an .htaccess rule for the /assets/ URL that points to a gateway.php script, which serves the file if the credentials are valid and that particular file is published, and denies the request otherwise.
I'm a little confused. Do you also need to protect the stylesheet files and images? Perhaps moving this folder is the best alternative.
