Currently, I am putting all uploaded files in my public directory, which means that anyone can download them.
However, I don't want anyone other than the user who created a file to be able to see it. Right now, I have no control over that. If I store the files in another directory, I could use middleware to protect them.
But I'm stuck on where to store the user-uploaded files.
Where is the best directory to put them? I don't have an external server; I just have a VPS.
Laravel has a storage folder designed especially for this case. It isn't reachable from outside your server, so you have to serve files from it through Laravel.
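For example (a minimal sketch, assuming Laravel 5.5+ with the default auth middleware; the route paths and the 'document' field name are just illustrations):

    // routes/web.php
    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\Route;
    use Illuminate\Support\Facades\Storage;

    // Upload: store() writes under storage/app, outside the public directory.
    Route::post('/files', function (Request $request) {
        $path = $request->file('document')->store('user-files/'.$request->user()->id);
        return response()->json(['path' => $path]);
    })->middleware('auth');

    // Download: paths are namespaced by user ID, so users only reach their own files.
    Route::get('/files/{filename}', function (Request $request, $filename) {
        $path = 'user-files/'.$request->user()->id.'/'.basename($filename);
        abort_unless(Storage::exists($path), 404);
        return Storage::download($path);
    })->middleware('auth');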
I'm creating a file-sharing service that runs through a mobile app. There's a folder on the server that hosts user uploads. I know the uploads folder should normally be kept outside the public HTTP directory in these scenarios, but I'm hosting the code on an online hosting service that doesn't allow that.
So far, here are the security measures I've taken:
Files inside the folder are named with randomly generated IDs, while all the file information (name, type, etc.) is stored in the database.
The folder itself is protected with .htaccess (Deny from all), so nobody can access any data inside except scripts hosted on the server.
When a user wants to download a file, my idea is to make a script that copies the required file to a temporary folder and adds a record in the database so that a cron job deletes the temp copy two hours after the request. A rough sketch follows.
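Something like this (the table and column names are made up, and $pdo is assumed to be an open PDO connection):

    <?php
    // Look up the stored (random) name of the requested file.
    $stmt = $pdo->prepare('SELECT stored_name FROM files WHERE id = ?');
    $stmt->execute([$_GET['id']]);
    $file = $stmt->fetch(PDO::FETCH_ASSOC);

    // Copy it into a web-accessible temp folder under a fresh random name.
    $tempName = bin2hex(random_bytes(16));
    copy(__DIR__.'/uploads/'.$file['stored_name'], __DIR__.'/temp/'.$tempName);

    // Record the copy so the cron job can delete it two hours from now.
    $pdo->prepare('INSERT INTO temp_files (name, expires_at) VALUES (?, ?)')
        ->execute([$tempName, date('Y-m-d H:i:s', time() + 2 * 3600)]);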
How efficient is my method? Can a PHP script copy a large number of files without putting too much pressure on the server? And what alternative ways are there to protect the folder's data?
Thanks for taking the time to read this.
I have a framework that I've written. I created a package for said framework that allows me to track my employee's hours, and gives them a place to dump files for the accountant. (Invoices, etc.)
The problem is that these file dumps are accessible through the browser. I could use a .htaccess file to prevent the files from being served up at all, but the problem is that I would like the accountant and the employee to be able to download their files.
A solution might be to read the files with PHP, create a temporary copy, have the user or accountant download that copy, and then delete the copy. But this poses two problems:
1) It's going to be time- and resource-intensive, especially for large files.
2) For however short a time, there will be a copy of the file accessible to anyone who knows the URL.
The other solution would be to put the files outside of the public folder, but the problem is that I would like this package to be portable, and a lot of my servers are shared.
What method could I use to be able to serve the files only when authenticated, and avoid the flaws I described above?
I was wondering, is it possible to access a folder that is not inside the web server's document root? For example, I have a XAMPP installation, and inside the htdocs folder I have a web app called MySite, which has an Uploads folder.
What I want to do is redirect all my uploads so that, instead of being saved into MySite\Uploads, they are saved into D:\Data\Uploads.
Is this possible? I presume this has already been asked and answered many times, but I wasn't able to find the right answer, maybe because I haven't pinned down the right question.
Please help.
It is possible, if the web server has write permissions on that folder.
For example, if you do:
file_put_contents("absolute_path_where_you_want_to_save", $file_contents);
It will work, as long as the user executing the PHP script has the necessary (write) permissions.
This is especially useful when reading files that are outside the web root, for example to retrieve the MySQL user and password from a file that can't be read even if someone gains access to your web server's folders.
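Tying this back to the XAMPP question above, a minimal sketch (assuming an upload field named 'file', that D:\Data\Uploads exists and is writable by the web server user, and that the config.ini path is likewise just an illustration):

    <?php
    // Save an upload to a folder outside htdocs; forward slashes are fine on Windows.
    // basename() strips any path components from the client-supplied name.
    $target = 'D:/Data/Uploads/' . basename($_FILES['file']['name']);
    if (move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
        echo 'Saved outside the web root.';
    }

    // Reading works the same way, e.g. credentials kept outside the document root:
    $config = parse_ini_file('D:/Data/config.ini');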
I have a website with a lot of confidential data and custom-made code. I have hired a developer to do the design and some simple PHP integration for me.
To prevent him from seeing all the files, I made a test environment in a subfolder, like mywebsite.com/testfolder.
Now I want him to be able to include the db_test.php, function.php and parameter.php files located in the root folder when executing his scripts (for example, mywebsite.com/testfolder/mainfile.php), but not to download them (with a PHP script or by any other means). The idea is to prevent him from seeing the code and have him just use it as it is.
This also means that his access to the root folder should be completely restricted, except for the files mentioned above.
I have created a test database and a separate user for him, so the database side is secure.
I have also created an FTP user that can only access testfolder over FTP.
What I am concerned about is that he might run a PHP script that gives away all the secrets in the root folder.
I have myself been able to list and download files by running a simple PHP script from testfolder.
Please suggest how to make this work, as I am planning to build a virtual team that will work on the website with restricted access to its various resources.
RULE NUMBER ONE: never develop on a live project.
You can create a development environment (i.e., a separate web site) somewhere else, put some meaningless files and/or databases there, and allow your developers full access. Then, from time to time, you update your working copy from the repository (you have set up an hg/git repo, haven't you?), review and test the changes, and only then upload the files to your main web site.
I have a site that people upload large (2-3 MB) files to, in large quantities, so I need to store them on an external drive (my Drobo). How can I upload files to a folder on the server, and how can I write a PHP script that retrieves them and lets users download them?
Thanks,
Joey Sacchini
To do this, simply move your files into an accessible space.
http://php.net/manual/en/function.move-uploaded-file.php
Be sure to consider the implications of this though. Once you move an uploaded file to an open directory, anyone can access it. This is very dangerous. Imagine someone uploading a PHP script.
It is best to create a script that fetches files from a location outside the web root. At a basic level, store each file's properties, such as its original name (on disk you should rename files to something random) and MIME type, in the database, then send the file to the client with readfile().
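A minimal sketch of that download script (the files table, its columns, and the /srv/uploads path are assumptions, and $pdo is an open PDO connection):

    <?php
    // ... after verifying that the logged-in user may access this file ...
    $stmt = $pdo->prepare('SELECT disk_name, original_name, mime_type FROM files WHERE id = ?');
    $stmt->execute([$_GET['id']]);
    $file = $stmt->fetch(PDO::FETCH_ASSOC);
    if (!$file) {
        http_response_code(404);
        exit;
    }

    $path = '/srv/uploads/' . $file['disk_name'];  // a directory outside the web root
    header('Content-Type: ' . $file['mime_type']);
    header('Content-Disposition: attachment; filename="' . $file['original_name'] . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);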
For downloading backups to your own personal hard drive, just use SFTP.
This is not a quick answer: you need to understand how to upload, retrieve, and save files on the server, set write permissions for PHP, and a few other things. I suggest these links to get you started:
http://www.w3schools.com/php/php_file_upload.asp
http://www.tizag.com/phpT/fileupload.php
Also visit the PHP reference manual for some great examples.
Well, you can keep the uploaded files outside the server's document root. So if your server root is /www/htdocs, you can keep the files in, say, /uploaded. Use something like:
move_uploaded_file($_FILES['file']['tmp_name'], '/uploaded/' . basename($_FILES['file']['name']));
This way your files will be inaccessible to the outside world.