I have a growing website with around 30k images in 4 sizes, for a total of 120k files. My current solution is storing them in a database, and from what I've read on Stack Overflow this seems to be a very bad idea.
I'm starting to see why: I've noticed a drastic decrease in performance. I think the obvious solution is to move these images to a folder setup, using the user's ID as a directory name with public and private subdirectories.
The structure would look like this:
/54869
    /public
        /size_1/img1.jpg
        /size_2/img1.jpg
        /size_3/img1.jpg
        /size_4/img1.jpg
    /private
        /size_1/img2.jpg
        /size_2/img2.jpg
        /size_3/img2.jpg
        /size_4/img2.jpg
What is the best way to secure this private folder and only provide other users access if the owner of the file has granted permission?
I'd also like to prevent users from simply viewing the contents of any folder but I suppose I could perform a check client side to fix this.
Any thoughts or suggestions?
You could make that folder not accessible from the web (e.g. place the folder outside htdocs or add .htaccess rules).
Create a PHP script which handles all requests to the private images.
This script would have to do the following:
check if the user is authenticated
check if the user is authorized to view the requested image
open the image and output it to the browser (you need to set the correct HTTP headers to make sure the content is treated as an image)
Then, in your HTML, simply point to /secret_image.php?id=3 or maybe you want to use the path of the image /secret_image.php?p=/54869/public/size_1/img1.jpg
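A minimal sketch of such a script, assuming the images are stored outside the web root and that a database maps image IDs to paths and owners (the `lookup_image()` and `user_may_view()` helpers are assumptions, not part of the original answer):

```php
<?php
// secret_image.php?id=3 — serve a private image after permission checks.
session_start();

// 1. Check that the user is authenticated.
if (!isset($_SESSION['user_id'])) {
    http_response_code(403);
    exit;
}

// 2. Check that the user is authorized to view this image.
// lookup_image() is a hypothetical helper returning ['path' => ..., 'owner' => ...]
// from your database; the stored path points outside the web root.
$image = lookup_image((int) ($_GET['id'] ?? 0));
if (!$image || !user_may_view($_SESSION['user_id'], $image)) {
    http_response_code(403);
    exit;
}

// 3. Send the correct headers and stream the file to the browser.
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($image['path']));
readfile($image['path']);
```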
You can try this:
imageViewer.php?imgID=XXX, where XXX = hashFunction(salt . imgName)
Then do all the processing in imageViewer.php, such as checking whether the user has permission.
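As a sketch, the non-guessable identifier could be generated like this when the image is uploaded (the salt value and the storage of the mapping are assumptions):

```php
<?php
// Generate a non-guessable identifier for an image name.
$salt = 'replace-with-a-long-random-secret'; // assumed secret, keep it out of version control
$imgName = 'img1.jpg';
$imgID = hash('sha256', $salt . $imgName);

// Store the mapping (imgID => real file path) in the database at upload time.
// imageViewer.php then resolves the ID back to the file and checks the
// current user's permission before serving it.
echo "imageViewer.php?imgID=$imgID";
```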
I'm no coder and I'm running into an issue. What I would like to do is store in a file all the IPs that try to access specific URLs, so that I can build a list of IPs of people who are either checking whether my site is made with a specific CMS or trying to hack their way into it.
I know I can access such data in the server log files, but automating the task would save me a lot of time in the long run.
Is there some code that I could add to the specific URLs that would do this?
<?php
// Set this to an absolute path the web server can write to.
$pathToLogFile = "";
// Append the visitor's IP address to the log file.
file_put_contents($pathToLogFile, "\nIP Address: " . $_SERVER["REMOTE_ADDR"], FILE_APPEND);
?>
Remember to provide an absolute path if you're placing this file in multiple directories, and make sure you initialize a blank file at that path
I need to build a restricted file access system in PHP (I use WordPress for my site).
Here is my idea:
User creates file via a pdf generator
Save this file in a specific folder (with a ID)
Deny access for everybody, including the user who generated the file
Check in an output.php file whether the user is the same one who generated the file
Give the user the link for the file
The thing is that the file should only be downloadable through output.php, and not via a direct URL in the browser or anything else. I know that sounds impossible, but does anybody have an idea? Partial restrictions would also be fine if a complete solution isn't possible.
Thank you very much for your help :)
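The steps above can be sketched roughly as follows, assuming the PDFs are stored outside the web root and ownership is recorded at generation time (the storage path and the `pdf_owner_*` option key are assumptions):

```php
<?php
// output.php — serve a generated PDF only to the user who created it.
// Assumes WordPress is loaded and files live outside the web root.
require_once __DIR__ . '/wp-load.php';

$storageDir = '/var/private/pdfs';      // assumed non-web-accessible location
$fileId = basename($_GET['id'] ?? '');  // basename() blocks path traversal

// Assumed ownership record written when the PDF was generated.
$ownerId = (int) get_option("pdf_owner_$fileId");
if (!is_user_logged_in() || get_current_user_id() !== $ownerId) {
    status_header(404); // hide the file's existence from everyone else
    exit;
}

$path = "$storageDir/$fileId.pdf";
if (!is_file($path)) {
    status_header(404);
    exit;
}

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $fileId . '.pdf"');
header('Content-Length: ' . filesize($path));
readfile($path);
```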
The Social Media Department would like a directory where they can upload images and other media that need to be kept private until we are ready to publish them. Ideally, we would want the user to get a 404 error instead of being prompted to log in or instead of getting an "access denied" message if they put in an URL for a private file.
Because the Social Media Department does not want to have to move images once an article is ready to be published, what they really need is a way for images saved to the WordPress Media Library (or some other folder) to return a 404 error while the articles they belong to are unpublished, and to display for anyone once those articles are published.
Our users like to try to guess what we'll be announcing by putting in random image file names once they know the URL structure for the images.
the only way is to track what you want, or don't want, or both. At some point you have to ask "can this file be served?" It's not hard to code, but it could be an expensive operation per request.
To keep users from guessing names, you can either prepend/append a random string (per Graham Walters) or hash the whole name. Don't forget to suppress autoindexing of the directory via the Options .htaccess command, or a "Nothing to see here folks, move along." index.html file.
If users can somehow get hold of the names (say, via a leak), but there aren't too many "embargoed" files, the embargoed files could be added to an .htaccess blacklist similar to hotlink protection. Return a 404 if anyone requests those files not via your official pages. Remove them from the blacklist once they go live. If you set up the hotlink protection correctly, you may be able to forbid access to whole classes of files (such as by filetype), except for your official pages.
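As a sketch, an .htaccess blacklist along these lines could return a 404 for embargoed files requested from outside your own pages (the file names and the referer domain are assumptions):

```apache
# Suppress directory listings
Options -Indexes

# Return 404 for embargoed files unless the request comes from our own pages
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule ^(images/launch-teaser\.jpg|images/new-product\.png)$ - [R=404,L]
```

Once a file goes live, remove it from the RewriteRule pattern.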
I have a web app that lets users store files which contain sensitive information.
So far I've written code so that if they wish to view their files, they go through view.php?id=xx and a check is done against a database to confirm that they are allowed to look at said file. As an example, John uploads "information.pdf" to the folder "uploads", found at "www.mysite.com/uploads", so the file's exact path is "http://www.mysite.com/uploads/information.pdf"; in the database this same file has an ID of, say, 2, so he would get to it via view.php?id=2.
Question
How do I stop anyone from just going to the exact path and looking at his sensitive file?
What I've done
Written the code to only allow access to files if users go through my website, not directly.
I have looked at the recommended questions for the same title, however have had no luck.
Any help would be greatly appreciated.
Don't put it in a publicly accessible path like http://www.mysite.com/uploads/. Put it outside htdocs and only allow access through your view.php
If you want to give the owner a download facility, just create a download.php that checks permissions the same way as view.php, but instead of viewing the file it lets the user download it.
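A minimal sketch of that pattern, assuming files live outside the web root (the storage path and the `user_owns_file()` / `get_file_name()` database helpers are assumptions):

```php
<?php
// download.php?id=2 — stream a private file stored outside the web root.
session_start();

$storageDir = '/var/private/uploads'; // assumed non-public location
$fileId = (int) ($_GET['id'] ?? 0);

// user_owns_file() is a hypothetical helper that checks the database
// to confirm the logged-in user may access this file ID.
if (!isset($_SESSION['user_id']) || !user_owns_file($_SESSION['user_id'], $fileId)) {
    http_response_code(403);
    exit;
}

$name = get_file_name($fileId); // hypothetical DB lookup, e.g. "information.pdf"
$path = $storageDir . '/' . basename($name);

// Content-Disposition: attachment triggers a download instead of inline viewing.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($name) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
```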
I am currently trying to develop an image uploading website by using CodeIgniter.
The thing is, I came across an issue today and I would really appreciate any kind of help in order to solve it.
So basically, the site is working. The problem is that the files are not private. A user may want to ensure that the files they upload are only visible to them, and not to someone who just guesses a bunch of URLs (e.g. user1 uploads image1, which he wants to keep private, to [localhost/upload_script/files/image1.jpg]; user2 can access image1 by guessing and typing the URL [localhost/upload_script/files/image1.jpg], which is what we don't want to happen).
I have done some research and I think that this would probably require another controller for serving the files (which checks for session data).
I have been "playing" with sessions etc in PHP for quite some time in the past, but I am not that familiar with them in CodeIgniter.
Is this the only way? I don't think I need to create separate directories for each user, do I? Can you please tell me how to head to the right direction or give me an example?
Thanks in advance,
harris21
In order to protect the files, you will need to keep them outside of your web root; otherwise people will always be able to URL-hack their way round.
I have used the very handy mod_xsendfile for apache (if you have that kind of access to your server) which will allow you to serve files that can be protected by access control and not accessed without the appropriate credentials.
Code snippet that you could put in your CI controller to display an image (adapted from the mod_xsendfile page):
...
if ($user->isLoggedIn()) {
    header("X-Sendfile: $path_to_somefile");
    header('Content-Type: image/jpeg');
    exit;
}
...
If you cannot install mod_xsendfile then your only other option would be to use readfile() as TheShiftExchange says.
Use PHP to return the images, and keep the image directory outside the web root. This way, before serving an image you can check the user's credentials via a session variable, ensuring they are allowed to view it; otherwise you can redirect the user straight back to the website with a message that they do not have access. Serving images like this is slower than serving them directly via the webserver (Apache, nginx, ...), but it gives you control over who downloads them.
To be more exact, save the image details in a database, for example having columns: id, file_path, title, uid. Everytime a user wants to download an image for example calling http://domain.com/files/download/3 you can check if image with id 3 can be downloaded for the currently logged in user. You need to write your own controller that will be doing that.
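A sketch of such a controller in CodeIgniter (a CI 3-style class is assumed, as are the `images` table layout and the session key):

```php
<?php
// application/controllers/Files.php — handles http://domain.com/files/download/3
class Files extends CI_Controller
{
    public function download($id)
    {
        // Look up the image row: id, file_path, title, uid.
        $image = $this->db->get_where('images', ['id' => (int) $id])->row();

        // Only the owner (uid) may download; adapt to your auth scheme.
        if (!$image || $this->session->userdata('user_id') != $image->uid) {
            show_404(); // hide whether the file exists
            return;
        }

        // file_path points outside the web root, e.g. /var/private/files/...
        $this->load->helper('download');
        force_download($image->title, file_get_contents($image->file_path));
    }
}
```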
I am doing a similar thing here http://www.mediabox.si/ you can check how images are served. I am allowing thumbnail images and I am watermarking larger images visible to ordinary visitors.
The ONLY way is to store the images outside the public_html. Otherwise by definition you are opening the file to direct access.
Use a controller to check if the user is allowed to access the file and the php function readfile() to serve the file
You can read some code at one of my other questions here: Does this PHP function protect against file transversal?
And this is actually VERY fast - you won't notice a performance hit at all
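A sketch of the readfile() approach with a basic path-traversal guard, since that is the concern the linked question raises (the storage directory and the file parameter are assumptions):

```php
<?php
// view.php?f=information.pdf — serve a file stored outside public_html.
session_start();

$baseDir = '/home/site/private_uploads'; // assumed non-public directory

if (!isset($_SESSION['user_id'])) {
    http_response_code(403);
    exit;
}

// basename() strips any directory components, blocking "../" traversal.
$file = basename($_GET['f'] ?? '');
$path = realpath($baseDir . '/' . $file);

// realpath() must resolve to a real file that is still inside $baseDir.
if ($path === false || strpos($path, $baseDir . '/') !== 0) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($path));
readfile($path);
```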