Secure delete with PHP 5.3.x

Does anyone know a good PHP solution to delete, or better, wipe a file from a Linux system?
Scenario:
A file is encrypted and saved. When a download is requested, the file is copied to a temporary folder and decrypted. This part is already working.
But how do I remove the file from the temporary location after sending it to the user?
I can think of the following options:
1. Open the file via "fopen" and write 0s and 1s into it (probably very slow)
2. Save the file to Memcache instead of the hard disk (could be a problem with my hoster)
3. Use some 3rd-party tool on the command line or as a cronjob (could be a problem to install)
Goal: delete the file from the hard disk without any possibility of recovery (wipe/overwrite)

Call "shred" via exec/system/passthru

Arguably the best approach is to never save the file in its decrypted state in the first place.
Rather, use stream filters to decrypt it on the fly and send it directly to the end user.
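For example, if the file was encrypted with mcrypt's Rijndael-128, a rough sketch using PHP's mdecrypt stream filter could look like this (the filter name, $key, and $iv are assumptions that must match your encryption scheme):
$fp = fopen('/path/to/encrypted.file', 'rb');
// Decrypt transparently as the stream is read; nothing is written to disk.
// $key and $iv must be the ones used at encryption time (assumed here).
stream_filter_append($fp, 'mdecrypt.rijndael-128', STREAM_FILTER_READ,
    array('iv' => $iv, 'key' => $key));
header('Content-Type: application/octet-stream');
fpassthru($fp); // send the decrypted bytes straight to the user
fclose($fp);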
Update
Your option 1 is actually not too bad if you consider this code:
$filename = '/path/to/file';
$size = filesize($filename);

// Overwrite the file with zeros, up to its original size
$src = fopen('/dev/zero', 'rb');
$dest = fopen($filename, 'wb');
stream_copy_to_stream($src, $dest, $size);
fclose($src);
fclose($dest);
You could use /dev/urandom as well, but that will be slower.

Related

How to avoid duplicate file upload but keep the uploader unaware of it?

First of all, I apologize if the question is not clear; I explain it below.
For every file uploaded, I rename the file and record its hash (using the sha1_file function; please suggest better or faster hashing techniques for files in PHP) in a separate DB table, and I check the hash of every new file against it to avoid duplicate files.
This way, anyone uploading a duplicate file gets an error message and the file is not uploaded.
My question is: are there any techniques or algorithms by which I can prevent duplicate file uploads while keeping the duplicate uploader unaware of it, so that they find the file in their account under a different name than the one already present? However, users must not be able to upload banned files by any means.
Yes, you should use xxHash, which is much faster than SHA-1.
According to their benchmarks:
The benchmark uses SMHasher speed test, compiled with Visual 2010 on a
Windows Seven 32-bits box. The reference system uses a Core 2 Duo
@3GHz
SHA1-32 runs at 0.28 GB/s, while xxHash reaches 5.4 GB/s.
The PHP extension only accepts a string as input, so for hashing files you should call the xxhsum binary instead, with something like this in your PHP:
list($hash) = explode(" ", shell_exec("/path/to/xxHash/xxhsum " . escapeshellarg($filePath)));
echo $hash;
Installing xxhash:
$ wget https://codeload.github.com/Cyan4973/xxHash/tar.gz/v0.6.3 -O xx.tar.gz
$ tar xvzf xx.tar.gz
$ cd xxHash-0.6.3; make
Just add some extra logic to your code, possibly using an extra table or extra fields in the existing table (it is up to you; there is more than one way to do it), that saves the file under an alternate name when you discover it is a duplicate, rather than sending an error. I'm not sure, though, whether what you are doing is a good idea from a UI design point of view, since you are doing something different with the user's input in a way the user will notice, without telling the user why.
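A hypothetical sketch of that logic, assuming a PDO connection, a files table keyed by hash, and a user_files table holding per-user display names (all table, column, and variable names here are made up):
$hash = sha1_file($tmpPath);
$stmt = $pdo->prepare('SELECT id FROM files WHERE hash = ?');
$stmt->execute(array($hash));
$fileId = $stmt->fetchColumn();
if ($fileId === false) {
    // New file: store it once under its hash and record it
    move_uploaded_file($tmpPath, '/uploads/' . $hash);
    $pdo->prepare('INSERT INTO files (hash, path) VALUES (?, ?)')
        ->execute(array($hash, '/uploads/' . $hash));
    $fileId = $pdo->lastInsertId();
}
// Duplicate or not, just link the file to this user under their own name
$pdo->prepare('INSERT INTO user_files (user_id, file_id, display_name) VALUES (?, ?, ?)')
    ->execute(array($userId, $fileId, $originalName));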
Use a client-side hashing example (e.g. a JavaScript SHA-1 library) to generate your SHA-1 hash client side before upload.
Save all your uploaded files with their hash as the filename, or have a database table which contains the hash and your local filename for each file; also save the file size and content type.
Before upload, submit the hash from the client side to your server and check for it in the database. If it's not present, commence the file upload. If it is present, fake the upload client side (or whatever you want to do) so the user thinks they have uploaded their file.
Create a column in your users table for uploaded files. Store a serialised associative array in this column with hash => users_file_name as key => value pairs. Unserialize it and display it to each user so they keep their own file names, then use readfile to serve them the file with the correct name, selecting it server side by the hash.
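For instance, updating the per-user map might look like this (a sketch; the uploads column name and surrounding variables are assumptions):
$map = unserialize($row['uploads']) ?: array();
$map[$hash] = $usersFileName; // remember the name this user chose
$stmt = $pdo->prepare('UPDATE users SET uploads = ? WHERE id = ?');
$stmt->execute(array(serialize($map), $userId));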
As for your URL question: create a page for the downloads, but include the user in the URL as well, e.g. mysite.com/image.php?user=NewBee&image=filename.jpg
Query the database for files uploaded by NewBee and unserialize the array. Then:
$upload = $_GET['image'];
foreach ($array as $hash => $filename) {
    if ($filename == $upload) {
        $file = $hash;
    }
}
Search the database for the path to your copy of that file; then, using readfile, you can output the same file with whatever name you want.
header("Content-Description: File Transfer");
header("Content-type: {$contenttype}");
header("Content-Disposition: attachment; filename=\"{$filename}\"");
header("Content-Length: " . filesize($file));
header('Pragma: public');
header("Expires: 0");
readfile($file);
You could create an extra table which links uploaded files (i.e. entries in your table with file hashes) to user accounts. This table can contain an individual file name for every file belonging to a specific user (so the same file can have a different name per user). With current technologies you could also consider creating the file hash in the browser via JavaScript and uploading the file only if there isn't already a file with that hash in your database; if there is, you can simply link this user to the existing file.
Addition because of a comment:
If you want the same file to be accessible through multiple URLs, you can use something like Apache's mod_rewrite. I'm no expert with it, but you can look here for a first idea. You could update the .htaccess dynamically from your upload script.
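A first-idea sketch of such a rule (untested; the URL layout is just an example):
# .htaccess - map pretty per-user URLs onto the download script
RewriteEngine On
RewriteRule ^files/([^/]+)/([^/]+)$ image.php?user=$1&image=$2 [L,QSA]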

Download and rename a file via url with PHP

I have this URL
www1.intranet.com/reportingtool.asp?settings=var&export = ok
There I can download a report. The file name of the report includes a timestamp, e.g. 123981098298.xls, and varies every time I download it.
I want to have a script with this functions:
<?php
//Download the File
//rename it to **report.xls**
//save it to a specified place
?>
I don't have any idea after searching Stack Overflow and Googling this topic :(
Is this generally possible?
The simplest scenario
You can download the report with file_get_contents:
$report = file_get_contents('http://www1.intranet.com/reportingtool.asp?...');
And save it locally (on the machine where PHP runs) with file_put_contents:
file_put_contents('/some/path/report.xls', $report);
More options
If downloading requires control over the HTTP request (e.g. because you need cookies or HTTP authentication), then it has to be done through cURL, which enables full customization of the request.
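For example (a sketch; the cookie is a placeholder for whatever authentication your intranet requires):
$ch = curl_init('http://www1.intranet.com/reportingtool.asp?settings=var&export=ok');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow any redirects
curl_setopt($ch, CURLOPT_COOKIE, 'session=PLACEHOLDER');
$report = curl_exec($ch);
curl_close($ch);
file_put_contents('/some/path/report.xls', $report);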
If the report is large, it could be streamed directly to the destination instead of doing read/store/write in three steps (for example, using fopen/fread/fwrite).
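A minimal sketch of that streaming approach:
$src = fopen('http://www1.intranet.com/reportingtool.asp?settings=var&export=ok', 'rb');
$dest = fopen('/some/path/report.xls', 'wb');
stream_copy_to_stream($src, $dest); // copy in chunks rather than all at once
fclose($src);
fclose($dest);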
This may not work depending on your security settings, but it's a simple example:
<?php
$file = file_get_contents('http://www1.intranet.com/reportingtool.asp?settings=var&export=ok');
file_put_contents('/path/to/your/location/report.xls', $file);
See file_get_contents and file_put_contents.

Security of unzipping user submitted files

Not so much of a coding problem here, but a general question relating to security.
I'm currently working on a project that allows user submitted content.
A key part of this content is the user uploads a Zip file.
The zip file should contain only mp3 files.
I then unzip those files to a directory on the server, so that we can stream the audio on the website for users to listen to.
My concern is that this opens us up for some potentially damaging zip files.
I've read about 'zipbombs' in the past, and obviously don't want a malicious zip file causing damage.
So, is there a safe way of doing this?
Can I scan the zip file without unzipping it first, and if it contains anything other than MP3s, delete it or flag a warning to the admin?
If it makes a difference, I'm developing the site on WordPress.
I currently use the built-in upload features of WordPress to let the user upload the zip file to our server (I'm not sure if there's any form of security within WordPress already that scans the zip file?)
Code to extract only MP3s from the zip and ignore everything else:
$zip = new ZipArchive();
$filename = 'newzip.zip';
if ($zip->open($filename) !== TRUE) {
    exit("cannot open <$filename>\n");
}
for ($i = 0; $i < $zip->numFiles; $i++) {
    $info = $zip->statIndex($i);
    $file = pathinfo($info['name']);
    // Directories and extensionless entries have no 'extension' key
    if (isset($file['extension']) && strtolower($file['extension']) == "mp3") {
        file_put_contents(basename($info['name']), $zip->getFromIndex($i));
    }
}
$zip->close();
I would use something like id3_get_version (http://www.php.net/manual/en/function.id3-get-version.php) to ensure the contents of the file are really MP3, too.
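A rough sketch, assuming the PECL id3 extension is installed (note this only shows that an ID3 tag is present, not that the audio itself is valid MP3 data):
$version = id3_get_version('extracted/track.mp3'); // bitmask of tag versions
if ($version & ID3_V2_3) {
    // the file carries an ID3v2.3 tag - likely a real MP3
}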
Is there a reason they need to ZIP the MP3s? Unless there are a lot of text frames in the ID3v2 info in the MP3s, the file size will actually increase with the ZIP due to storage of the dictionary.
As far as I know, there isn't any way to scan a ZIP without actually parsing it. The data is opaque until you run each bit through the Huffman dictionary. And how would you determine which file is an MP3? By file extension? By frames? MP3 encoders follow a loose standard (decoders have a more stringent spec), which makes it difficult to scan the file structure without false negatives.
Here are some ZIP security risks:
1. Comment data that causes buffer overflows. Solution: remove comment data.
2. ZIPs that are small in compressed size but inflate to fill the filesystem (the classic ZIP bomb). Solution: check the inflated size before inflating; check the dictionary to ensure it has many entries and that the compressed data isn't all 1s.
3. Nested ZIPs (related to #2). Solution: stop when an entry in the ZIP archive is itself ZIP data. You can determine this by checking for the central directory's marker, the number 0x02014b50 (hex, always little-endian in ZIP - http://en.wikipedia.org/wiki/Zip_%28file_format%29#Structure).
4. Nested directory structures, intended to exceed the filesystem's limits and hang the deflating process. Solution: don't unzip directories.
So, either do a lot of scrubbing and integrity checks, or at the very least use PHP to scan the archive: check each file for its MP3-ness (however you do that - by extension plus the presence of MP3 frame headers? You can't rely on those being at byte 0, though: http://en.wikipedia.org/wiki/MP3#File_structure) and its uncompressed (inflated) size (http://www.php.net/manual/en/function.zip-entry-filesize.php). Bail out if an inflated file is too big, or if there are any non-MP3s present.
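Here's a minimal sketch of the inflated-size check (the 100 MB cap is an arbitrary example threshold):
$maxBytes = 100 * 1024 * 1024;
$zip = new ZipArchive();
if ($zip->open('upload.zip') === TRUE) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $stat = $zip->statIndex($i);
        // 'size' is the entry's declared uncompressed size
        if ($stat['size'] > $maxBytes) {
            $zip->close();
            exit('Archive entry too large - possible zip bomb');
        }
    }
    $zip->close();
}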
Use the following code to list the file names inside a .zip archive:
$zip = zip_open('test.zip');
while ($entry = zip_read($zip)) {
    $file_name = zip_entry_name($entry);
    $ext = pathinfo($file_name, PATHINFO_EXTENSION);
    if (strtoupper($ext) !== 'MP3') {
        notify_admin($file_name);
    }
}
Note that this code only looks at the extension, meaning a user can upload anything that has an MP3 extension. To really check whether a file is an MP3 you'll have to unpack it; I would advise you to do that in a temporary directory.
After the file is unpacked you may analyze it using, for example, ffmpeg or whatever. Having detailed data about bitrate, track length, etc. will be interesting in any case.
If the analysis fails you can flag the file.
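For instance, a rough sketch that shells out to ffmpeg to probe the extracted file (the ffmpeg availability and the flag_file() helper are assumptions):
// ffmpeg invoked with only an input prints stream info to stderr
$out = shell_exec('ffmpeg -i ' . escapeshellarg($tmpFile) . ' 2>&1');
if (strpos($out, 'Audio: mp3') === false) {
    flag_file($tmpFile); // hypothetical helper that flags it for the admin
}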

Gzcompressed or plain string

I have a file plain.cache which is a little over 10MB, and I made a gzcompressed file gz.cache out of the original plain.cache file. Then I made two separate files which load each of the mentioned cache files, and I was kind of surprised that the page load speed of both was almost the same. So my question is: am I right to conclude that the gzcompressed file does not in any way benefit the load speed of the page? I would now conclude that the gzuncompress I use in gz.php "makes" the exact same string as when I read it from the plain file. Given all these statements, the general question is: how can I (if it is done this way at all) increase the load speed by compressing the file with gzcompress?
The code of the files is as follows:
_makeCache.php, in which I make the gzcompressed version of the plain.cache file:
$str = file_get_contents("plain.cache");
$strCompressed = gzcompress($str, 9); // 9 = maximum compression level
$file = "gz.cache";
$fp = fopen($file, "w");
fwrite($fp, $strCompressed);
fclose($fp);
plain.php:
echo file_get_contents("plain.cache");
gz.php:
echo gzuncompress(file_get_contents("gz.cache"));
Your HTTP server is most likely compressing the output of plain.php automatically on the fly, using gzip as well, and the client decompresses it transparently. So you should see almost no difference.
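One quick way to verify this (a sketch; the URL is a placeholder for wherever plain.php is served) is to request the page with gzip allowed and inspect the Content-Encoding response header:
$ctx = stream_context_create(array('http' => array(
    'header' => "Accept-Encoding: gzip\r\n",
)));
file_get_contents('http://localhost/plain.php', false, $ctx);
foreach ($http_response_header as $h) {
    if (stripos($h, 'Content-Encoding') === 0) {
        echo $h, "\n"; // e.g. "Content-Encoding: gzip"
    }
}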

Best way to store an image from a url in php?

I would like to know the best way to save an image from a URL in PHP.
At the moment I am using
file_put_contents($pk, file_get_contents($PIC_URL));
which is not ideal. I am unable to use curl. Is there a method specifically for this?
Using file_get_contents is fine, unless the file is very large. In that case, you don't really need to hold the entire thing in memory.
For a large retrieval, you could fopen the remote file, fread it, say, 32KB at a time, and fwrite it locally in a loop until the whole file has been read.
For example:
$fout = fopen('/tmp/verylarge.jpeg', 'wb');
$fin = fopen('http://www.example.com/verylarge.jpeg', 'rb');
while (!feof($fin)) {
    $buffer = fread($fin, 32 * 1024);
    fwrite($fout, $buffer);
}
fclose($fin);
fclose($fout);
(Devoid of error checking for simplicity!)
Alternatively, you could forgo the URL wrappers and use a class like PEAR's HTTP_Request, or roll your own HTTP client code using fsockopen etc. This would enable you to do efficient things like sending If-Modified-Since headers if you are maintaining a cache of remote files.
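A rough sketch of such a conditional request with fsockopen (host, path, and date are placeholders):
$fp = fsockopen('www.example.com', 80, $errno, $errstr, 10);
fwrite($fp, "GET /verylarge.jpeg HTTP/1.0\r\n"
    . "Host: www.example.com\r\n"
    . "If-Modified-Since: Sat, 01 Jan 2011 00:00:00 GMT\r\n"
    . "Connection: close\r\n\r\n");
// A "304 Not Modified" status line means the cached copy is still current;
// on a 200 you would keep reading the body from $fp.
$status = fgets($fp);
fclose($fp);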
I'd recommend using Paul Dixon's strategy, but replacing fopen with fsockopen(). The reason is that some server configurations disallow URL access for fopen() and file_get_contents(). The setting may be found in php.ini and is called allow_url_fopen.
