File upload using FileSystem in Laravel 5.1?

I want to upload a file in Laravel. I am using only the local server, not any cloud storage.
From the docs, I found two methods related to file upload:
the normal method (Request::file('file')->move(dest, filename))
using the Filesystem
But I could not understand the actual difference between these methods.
If I am using the normal method,
$file->move('path', $fileName);
where should the path be located: storage/app or public/uploads (new)?
How can I upload files using the Filesystem?

The "normal" method only lets you save the uploaded file in the local filesystem, in any place you choose, as long as PHP has write access to that location. This method can only be used for the "save" operation and only for uploaded files.
The "filesystem" method is more flexible, as it adds a layer of abstraction over the place you write to. The filesystem configuration is stored in a separate config file and is transparent to your code. You can easily change the underlying storage (e.g. from local to cloud) or change paths without any changes to the code. The filesystem also gives you a lot of additional helper methods to operate on the files it stores, like listing all files, checking existence, and removing files. An additional advantage is that it can be used with any files, not only the ones uploaded by the user during the current request.
Answering your second question: you decide where to store files in both the normal and the filesystem method. In the normal method you pass the path; in the filesystem method you configure the paths in the filesystems.php config.
And how do you upload the file using the Filesystem? You don't use the filesystem to upload the file, you use it to save the uploaded file. The upload process is the same, but instead of calling $uploadedFile->move() you do:
Storage::put('file key', file_get_contents(Request::file('file')->getRealPath()));
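For context, here is a minimal controller sketch of that approach in Laravel 5.1. It is only an illustration: the 'file' input name, the uploads/ prefix and the controller itself are assumptions, not anything the framework prescribes.

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Str;

class UploadController extends Controller
{
    public function store(Request $request)
    {
        // The uploaded file as it arrived with the request.
        $uploaded = $request->file('file');

        // Generate a random name instead of trusting the client-supplied one.
        $name = Str::random(40) . '.' . $uploaded->getClientOriginalExtension();

        // Write through the default disk configured in config/filesystems.php
        // (the 'local' disk points at storage/app out of the box).
        Storage::put('uploads/' . $name, file_get_contents($uploaded->getRealPath()));

        return $name;
    }
}

With the default 'local' disk this ends up under storage/app/uploads; switching the default disk in filesystems.php would move it to S3 or another backend without touching this code.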

The difference between these two methods is about usage.
You use the Filesystem service to work on your local/cloud file system, like creating, moving, or deleting files. You could use the Storage::put() method to store the uploaded file, of course, but you would have to get the file from the request either way. So you normally just use $request->file('photo')->move($destinationPath); as described in http://laravel.com/docs/5.1/requests#files to move the file where you want it to be. The Filesystem service is not meant to handle uploads itself; that is what the Request is for.
The question of where to put the files is one you have to answer yourself. The default path for storing files is storage/app. You can put them in public/uploads, but that is discouraged, because anyone who knows the URL can download the files. It really depends on what the file is meant for. If it is, say, a profile picture, then it can go in public/uploads. If the file is private, it should not go there, but in storage/app instead.
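For comparison, a hedged sketch of the "normal" move() call, assuming a hypothetical 'photo' input and storage/app/uploads as the chosen destination (any writable path would do):

// Inside a controller method that receives the request.
$photo = $request->file('photo');

// storage_path() resolves to the storage directory; the 'app/uploads'
// subfolder is just an example choice, not a Laravel requirement.
$destination = storage_path('app/uploads');

// Generate the final name yourself rather than reusing the client's.
$fileName = uniqid('img_') . '.' . $photo->getClientOriginalExtension();

$photo->move($destination, $fileName);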

Related

Can I define a folder to download to, using force_download()?

I need to define a folder where a downloaded file is placed.
Is it possible to achieve a download into a specific folder using the force_download() function of CodeIgniter's framework?
force_download() is part of the CI Download Helper:
Generates server headers which force data to be downloaded to your desktop. Useful with file downloads. The first parameter is the name you want the downloaded file to be named, the second parameter is the file data.
That said, the file will simply be downloaded to the user's designated download folder, wherever that is on their local disk. You can use this approach to make files downloadable for any user.
What you are looking for is the CI FTP Class:
Downloads a file from your server. You must supply the remote path and the local path, and you can optionally set the mode. Example:
$this->ftp->download('/public_html/myfile.html', '/local/path/to/myfile.html', 'ascii');
You must make sure that each time you call this, the user supplies you with a valid local path where the downloaded file will be stored.
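A rough sketch of how that call is usually wired up in a CodeIgniter controller; the connection settings and both paths are placeholders you would supply yourself:

class Files extends CI_Controller
{
    public function fetch()
    {
        $this->load->library('ftp');

        // Placeholder credentials; replace with your own server details.
        $config = array(
            'hostname' => 'ftp.example.com',
            'username' => 'your-user',
            'password' => 'your-password',
            'debug'    => TRUE,
        );

        $this->ftp->connect($config);

        // Remote path on the server, then the local path the user asked for.
        $this->ftp->download('/public_html/myfile.html', '/local/path/to/myfile.html', 'ascii');

        $this->ftp->close();
    }
}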

Security in uploading any file types to server in php [duplicate]

I am allowing users to upload files to my server. What possible security threats do I face and how can I eliminate them?
Let's say I am allowing users to upload images to my server, either from their system or from the net. Now, to check even the size of these images I have to store them in my /tmp folder. Isn't that risky? How can I minimize the risk?
Also, let's say I am using wget to download the images from the links that users submit in my form. I first have to save those files on my server to check whether they actually are images. And what if a prankster gives me a URL and I end up downloading an entire website full of malware?
First of all, realize that uploading a file means that the user is giving you a lot of data in various formats, and that the user has full control over that data. That's a concern even for a normal form text field; file uploads are the same, and a lot more. The first rule is: don't trust any of it.
What you get from the user with a file upload:
the file data
a file name
a MIME type
These are the three main components of the file upload, and none of them is trustworthy.
Do not trust the MIME type in $_FILES['file']['type']. It's an entirely arbitrary, user-supplied value.
Don't use the file name for anything important. It's an entirely arbitrary, user-supplied value. You cannot trust the file extension or the name in general. Do not save the file to the server's hard disk using something like 'dir/' . $_FILES['file']['name']. If the name is '../../../passwd', you're overwriting files in other directories. Always generate a random name yourself to save the file as. If you want, you can store the original file name in a database as metadata.
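A small sketch of that idea: keep the client's name only as metadata and save under a name you generate yourself (the uploads directory and the extension handling here are assumptions for the example; the extension should still be whitelisted):

$original  = $_FILES['file']['name'];                      // keep only as metadata
$extension = strtolower(pathinfo($original, PATHINFO_EXTENSION));

// Name the stored file after random data, never after user input.
$storedName = md5(uniqid(mt_rand(), true)) . ($extension !== '' ? '.' . $extension : '');

$targetDir = '/var/uploads';                               // outside the web root
move_uploaded_file($_FILES['file']['tmp_name'], $targetDir . '/' . $storedName);

// e.g. INSERT INTO uploads (stored_name, original_name) VALUES (?, ?)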
Never let anybody or anything access the file arbitrarily. For example, if an attacker uploads a malicious.php file to your server and you're storing it in the webroot directory of your site, a user can simply go to example.com/uploads/malicious.php to execute that file and run arbitrary PHP code on your server.
Never store arbitrary uploaded files anywhere publicly, always store them somewhere where only your application has access to them.
Only allow specific processes access to the files. If it's supposed to be an image file, only allow a script that reads images and resizes them to access the file directly. If this script has problems reading the file, it's probably not an image file, flag it and/or discard it. The same goes for other file types. If the file is supposed to be downloadable by other users, create a script that serves the file up for download and does nothing else with it.
If you don't know what file type you're dealing with, detect the MIME type of the file yourself and/or try to let a specific process open the file (e.g. let an image resize process try to resize the supposed image). Be careful here as well, if there's a vulnerability in that process, a maliciously crafted file may exploit it which may lead to security breaches (the most common example of such attacks is Adobe's PDF Reader).
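One way to detect the MIME type yourself is PHP's Fileinfo extension, which inspects the file contents rather than the client-supplied headers. A minimal check might look like this (the whitelist is only an example, and even this can be fooled, which is why handing the file to a dedicated process is still recommended):

$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime  = finfo_file($finfo, $_FILES['file']['tmp_name']);
finfo_close($finfo);

// Example whitelist; adjust to whatever your application actually accepts.
$allowed = array('image/jpeg', 'image/png', 'image/gif');

if (!in_array($mime, $allowed, true)) {
    // Not an accepted type: discard the temp file and reject the upload.
    unlink($_FILES['file']['tmp_name']);
    exit('Unsupported file type.');
}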
To address your specific questions:
[T]o check even the size of these images I have to store them in my /tmp folder. Isn't it risky?
No. Just storing data in a file in a temp folder is not risky if you're not doing anything with that data. Data is just data, regardless of its contents. It's only risky if you're trying to execute the data or if a program is parsing the data which can be tricked into doing unexpected things by malicious data if the program contains parsing flaws.
Of course, having any sort of malicious data sitting around on the disk is more risky than having no malicious data anywhere. You never know who'll come along and do something with it. So you should validate any uploaded data and discard it as soon as possible if it doesn't pass validation.
What if a prankster gives me a url and I end up downloading an entire website full of malware?
It's up to you what exactly you download. One URL will result at most in one blob of data. If you are parsing that data and are downloading the content of more URLs based on that initial blob that's your problem. Don't do it. But even if you did, well, then you'd have a temp directory full of stuff. Again, this is not dangerous if you're not doing anything dangerous with that stuff.
One simple scenario would be: if you use an upload interface with no restrictions on the type of files allowed for upload, an attacker can upload a PHP or .NET file with malicious code that can lead to a server compromise.
Refer to http://www.acunetix.com/websitesecurity/upload-forms-threat.htm, which discusses the common issues.
Also refer to http://php.net/manual/en/features.file-upload.php.
Here are some of them:
When a file is uploaded to the server, PHP will set the variable $_FILES['uploadedfile']['type'] to the MIME type provided by the web browser the client is using. However, file upload form validation cannot depend on this value alone. A malicious user can easily upload files using a script or some other automated application that allows sending of HTTP POST requests, which lets him send a fake MIME type.
It is almost impossible to compile a list that includes all possible extensions an attacker could use. For example, if the code is running in a hosted environment, such environments usually allow a large number of scripting languages, such as Perl, Python, Ruby etc., and the list can be endless.
A malicious user can easily bypass such a check by uploading a file called ".htaccess" containing a line of code similar to: AddType application/x-httpd-php .jpg
There are common rules to avoid general issues with file uploads:
Store uploaded files outside your website root folder, so users won't be able to overwrite your application files or directly access the uploaded files (for example, in /var/uploads while your app is in /var/www).
Store the sanitised file names in the database and name the physical files after the file's hash value (this also solves the problem of duplicate files: they'll have equal hashes).
To avoid filesystem issues when there are too many files in the /var/uploads folder, consider storing files in a folder tree like this:
file hash = 234wffqwdedqwdcs -> store it in /var/uploads/23/234wffqwdedqwdcs
common rule: /var/uploads/<first 2 hash letters>/<hash>
Install nginx if you haven't done so already: it serves static files like magic, and its X-Accel-Redirect header lets you serve files with permissions checked first by a custom script (see the sketch after this list).
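Here is a sketch combining the last few rules: hashing the stored name, sharding by the first two characters, and serving through nginx. The /protected_uploads location name, the /var/uploads path and the permission check are placeholders; nginx would need a matching internal location block.

// Store: name the file after its hash and shard by the first two characters.
function store_upload($tmpPath)
{
    $hash = sha1_file($tmpPath);
    $dir  = '/var/uploads/' . substr($hash, 0, 2);

    if (!is_dir($dir)) {
        mkdir($dir, 0750, true);
    }

    move_uploaded_file($tmpPath, $dir . '/' . $hash);

    return $hash; // keep this (plus the sanitised original name) in the database
}

// Serve: let your script check permissions, then hand the transfer to nginx.
function serve_upload($hash)
{
    // ... permission checks for the current user go here ...

    header('X-Accel-Redirect: /protected_uploads/' . substr($hash, 0, 2) . '/' . $hash);
    header('Content-Disposition: attachment');
}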

PHP receive file as byte array using Zend_Form

We are using Zend_Form inside a PHP application to build an input file HTML element. We can set the 'destination' of this element, and when calling receive() the file will be saved to the specified location.
We want to be able not to save the file to disc at all, but to grab the file as a byte array and do something else with it.
Is this possible? If it is not possible with Zend_Form, can it be done any other way?
EDIT: The reason we cannot write to disc is that the application runs on Azure, and it seems it does not have write access anywhere, not even in the temp folder. We get an exception from Zend saying 'The given destination is not writeable'.
The only thing that seems viable would be to save the file using the php://memory protocol.
I've never had a reason to implement it, but it looks as simple as setting the save location of the file to php://memory. Here is the link to the manual page: PHP I/O Wrappers.
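For reference, a tiny example of how the php://memory wrapper behaves on its own, independent of Zend or of uploads (as the next answer explains, it does not change where PHP itself writes the uploaded file):

// php://memory is a read/write stream kept entirely in memory.
$fp = fopen('php://memory', 'r+');
fwrite($fp, 'some bytes');

rewind($fp);
$contents = stream_get_contents($fp);   // "some bytes"

fclose($fp);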
All PHP uploads are written to the file system regardless of using Zend or not (see upload_tmp_dir and POST method uploads).
Files will, by default, be stored in the server's default temporary directory, unless another location has been given with the upload_tmp_dir directive in php.ini.
Instead of using receive() to process the upload, try accessing it directly using the $_FILES array, which lets you read the file into a string using file_get_contents() or similar functions. You can, however, still use Zend_Form to create and handle the form in general.
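A minimal sketch of that, assuming the form field is named 'upload'; the is_uploaded_file() check guards against reading anything other than the current request's upload:

$tmpPath = $_FILES['upload']['tmp_name'];

if (is_uploaded_file($tmpPath)) {
    // Read the temporary upload straight into a string (your "byte array").
    $bytes = file_get_contents($tmpPath);

    // ... hand $bytes to whatever needs it, e.g. push it to Azure blob storage ...
}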
You could point upload_tmp_dir at a shared-memory filesystem so that uploaded files are held in memory. Be cautious with this: if someone attempts to upload a very large file, it will go into memory, which could affect performance or your cost of service.
Ultimately, Zend_File_Transfer_Adapter_Http::receive() calls move_uploaded_file() to move the file from its temporary location to the permanent location. In addition, it makes sure the upload is valid, filters it, and marks it as received so it cannot be moved again (as that would fail).

What is the best location on server to upload images?

Among the many folders available on the server by default, like "public_html", "public_ftp" or simply the root, which one is best for uploading and storing users' images safely, so that I can also store a link to them in a MySQL db?
If your clients upload images via an HTTP form, store them in public_html if they need to be accessible from the web.
I recommend you save the files somewhere in public_html and create the following file to restrict public access:
file: upload/.htaccess
deny from all
In your PHP script you can then send the files only to users with access.
In my opinion, store them outside of the public htdocs folder. That way, if someone manages to upload anything other than an image file (such as an evil PHP script), they won't be able to call it (and it won't run).
If your public_html folder is, say, /path/to/website/public_html, I'd store them in /path/to/website/uploaded_images.
Also, make sure that you validate uploads against a white-list of allowed image extensions (such as only allowing .jpg, .gif and .png).
Edit:
You also need to create a script which opens the image file and passes it back through to the user.
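As a rough sketch of such a pass-through script (the id lookup, base path and any permission checks are placeholders for the example):

// Resolve the requested id to a file stored outside the web root.
$id   = basename($_GET['id']);                 // crude sanitising, for the example only
$path = '/path/to/website/uploaded_images/' . $id;

if (!is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

// ... any permission checks would go here ...

$finfo = finfo_open(FILEINFO_MIME_TYPE);
header('Content-Type: ' . finfo_file($finfo, $path));
finfo_close($finfo);

header('Content-Length: ' . filesize($path));
readfile($path);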
Define safely.
If you want to display these images on a site, that will obviously be a folder in public_html, unless you want to use a script which presents the images to the user, possibly altering the image on the fly, changing cache headers, etc. In that case you can move the folder outside of the docroot, anywhere the script has access to those files.
Personally, I store user images, files and everything else in the docroot, protected by .htaccess and accessed via a script which handles user permissions if necessary. The files reside in an /uploaded folder with subfolders up to 2 levels deep, each level storing up to 1024 dirs/files. Files are named by their ids only, without any extension; all file info is stored in the database. It took me some time to implement, but thankfully this is reusable code.
Any folder inside your document root is fine. If you want it to be secure, make sure your script accepts only allowed file types and, as another measure, put an .htaccess file inside that folder:
<FilesMatch "\.php$">
SetHandler None
</FilesMatch>
This will ensure nothing gets executed from inside this directory.
