I'm trying to let users upload files to my website, but unfortunately some of them seem to get corrupted when I read them back. I've tried both images and HTML files, and all the images come through corrupt (the HTML files come through fine).
To upload the files I'm using a standard HTML form and the PHP $_FILES array. I'm then using the following code to read the contents of the file:
$filename = $_FILES['varname']['tmp_name'];
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
Unfortunately, the value of $contents is now slightly different from the file I uploaded (here's a snippet from the top of the file):
Original file:
ˇÿˇ·ExifII*ˇÏDucky<ˇÓAdobed¿ˇ€Ñ
New file:
ˇÿˇ· Exif II* ˇÏ Ducky < ˇÓ Adobe d¿ ˇ€ Ñ
As you can see, there's a difference in the spacing. Any ideas what could be causing this? Am I handling the file read incorrectly for binary files? It seems odd that it works fine for any text files I upload.
Thanks!
I usually output files like this:
header("Content-Disposition: attachment; filename=\"$fileName\"");
readfile("$HOME_DIR/uploads/$fileName");
exit();
Anyway, to debug your problem, you should first work out which phase is failing: upload or download? To check, go to your webserver, download the file via FTP, and open it in a binary editor. If it is already corrupt, you need to investigate your upload phase; otherwise it's the other way around.
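A quick way to tell the two phases apart is to compare checksums; the paths below are placeholders:
// If these already differ, the upload phase corrupted the file
echo md5_file('/path/to/original.jpg'), "\n";     // local copy (placeholder path)
echo md5_file('/var/www/uploads/file.jpg'), "\n"; // copy on the server (placeholder path)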
How do you print $contents? Are you sure this is a problem with reading the file?
I guess that maybe this is a problem with PRINTING the file to the output... Try printing it the binary way. Something like:
$data = unpack("C*", $contents);
foreach ($data as $v) {
    echo $v, ' ';
}
and compare that with a binary dump of the original file...
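Or, to locate exactly where the two copies diverge, a small loop over the raw bytes also works; the original file's path is a placeholder:
// Find the first byte at which the uploaded copy differs from the original
$orig = file_get_contents('/path/to/original.jpg'); // placeholder path
$len = min(strlen($orig), strlen($contents));
for ($i = 0; $i < $len; $i++) {
    if ($orig[$i] !== $contents[$i]) {
        printf("first difference at byte %d: %02x vs %02x\n", $i, ord($orig[$i]), ord($contents[$i]));
        break;
    }
}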
I want to download various feeds from some publishers. The trouble is that, first of all, they are zipped as .gz and, second, they are not in the right format. You can download one of the feeds and check it out. They don't have any file extension... so I'm forced to add the .csv myself.
My question now is: how can I unzip those files from the different URLs?
I already know how to rename them, but how do I unzip them?
I searched around and found this:
//This input should be from somewhere else, hard-coded in this example
$file_name = '2013-07-16.dump.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while (!gzeof($file)) {
    // Read buffer-size bytes
    // Both fwrite and gzread are binary-safe
    fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
But with those feeds it doesn't work...
Here are two example files: file one | file two
Do you have an idea? - Would be very grateful!
Greetings!
This works on Windows 10 + PHP 7.1.4.
The following code has the same effect:
ob_start();
readgzfile($file_name);
file_put_contents($output_filename, ob_get_contents());
ob_end_clean();
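For files that fit in memory, the same result can be had in one line with gzdecode(), without output buffering:
// Decompress in one step; only sensible when the archive fits in memory
file_put_contents($output_filename, gzdecode(file_get_contents($file_name)));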
Or you can try using the gzip command to decompress the file and then work with the result:
Program execution Functions
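For example, something like this; it assumes the gzip binary is available on the server's PATH, and the file names are placeholders:
// -d decompresses, -c writes to stdout so the original .gz is kept
$in  = escapeshellarg('2013-07-16.dump.gz');  // placeholder name
$out = escapeshellarg('2013-07-16.dump.csv'); // placeholder name
shell_exec("gzip -dc $in > $out");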
I have a blob field in a database, in which I store values resulting from this:
mysqli_real_escape_string($mysql_link, file_get_contents($_FILES['file']['tmp_name']));
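(For illustration, the full insert looks something like this; the table and column names here are made up:)
// Hypothetical table/column names; $mysql_link is the connection handle
$data = mysqli_real_escape_string($mysql_link, file_get_contents($_FILES['file']['tmp_name']));
mysqli_query($mysql_link, "INSERT INTO files (data) VALUES ('$data')");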
I created a script to download this file from the database.
The script works and the file is downloaded, but there is an issue:
If the file stored there is an image, like JPEG or PNG, everything is just fine. But if it is another type, such as PDF, the PDF reader can't open the downloaded file and says it is damaged. I don't know why this happens with some file types and not with others.
Here's the download script:
function download($filedata, $filename) {
    header('Content-Type: ' . $filedata['mime']);
    header('Content-Disposition: attachment; filename="' . $filename . '.' . $filedata['extension'] . '"');
    header('Pragma: no-cache');
    ob_clean();
    echo $filedata['data'];
    exit;
}
$filedata is the result fetched from the database. I checked it, and the values are correct for each file; after all, the images work perfectly.
It is really tripping me up!
For this PDF example, the variable values would be as follows:
$filedata['mime'] = "application/pdf";
$filedata['extension'] = "pdf";
$filedata['data'] = the blob content;
$filename = uniqid();
Thanks in advance!
UPDATE:
I ran a test with PDF files: I selected a set of working PDFs from my local machine, uploaded them all into the database using the file_get_contents() mechanism cited above, then downloaded them all back using the download script, also cited above. The result: some of the downloaded PDF files worked, some didn't.
A friend mentioned the encoding of the files, which could explain why some files work after download and some do not. Could that be it? And how could I fix it?
UPDATE:
I used echo mb_detect_encoding($filedata['data']); exit; and it printed "UTF-8" for both working and non-working files. So it's not an encoding issue. Any other ideas?
I am downloading a gzipped CSV and writing the unzipped string to a file, using:
$bufferSize = 4096; // read up to 4 KB per call
$file = gzopen($this->getTmpZipFileName(), 'rb');
$outPutFile = fopen($uncompressedFileName, 'wb');
while (!gzeof($file)) {
    fwrite($outPutFile, gzgets($file, $bufferSize));
}
At some point during this process something breaks on a space (" "): the space is treated as a new line, which of course 'breaks' the CSV.
I believe it has something to do with the uncompressing of the gzip file. If I dump out
var_dump(gzread($file, 100000));
die();
I get the same issue.
Uncompressing the CSV through the terminal, the file is fine.
I am at a loss as to what else I can try in order to open the file correctly.
Any help will be much appreciated.
It turns out that when I was creating the gzip file, something was messed up with the compression. A file from another source works as expected. This drove me mad!!
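For anyone else hitting this, a quick way to verify whether the .gz itself is intact is to try decompressing it in memory, assuming it fits:
// gzdecode() returns false if the gzip stream is broken
$raw = file_get_contents($this->getTmpZipFileName());
if (@gzdecode($raw) === false) {
    echo "gzip stream is corrupt\n";
}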
I'm trying to debug this issue by posting raw PNG image data to the server with the help of Postman. Here's a screenshot, which might help to understand the issue:
On the server I'm receiving the file as follows:
$png = $GLOBALS["HTTP_RAW_POST_DATA"];
Then I write the data to a new file:
$fh = fopen($myFile, 'w') or die("can't open file");
fwrite($fh, $png);
fclose($fh);
The file gets saved correctly, but it now has a different file size: 417 KB instead of the original 279 KB.
Now, of course, I can't do any image operations, as none of the functions recognize the file as a valid image (getimagesize, for example, returns bool(false)).
I have debugged this process to the point where the issue must be somewhere in the file operations, but I don't understand why the result isn't the very same file type and size as the original when the only thing I am doing is writing the same raw data.
UPDATE:
I've now compared the encodings of the original file and the uploaded one: the former is in ISO-8859-1 and displays correctly; the latter is in UTF-8 and is about 138 kB larger.
I've now managed to convert the file on the server to ISO-8859-1:
fwrite($fh, iconv("UTF-8", "ISO-8859-1", $png));
The resulting file now has the same size (279 kB), but it is still not recognized as a PNG image; some information still seems to get lost.
UPDATE (1):
I've examined the issue further and found that the original file is exactly 4 bytes bigger than the generated file, so the resulting PNG appears to be corrupted.
UPDATE (2):
I'm now able to save the file and open it as a valid PNG. The following code seems to be saving the image correctly:
$input = fopen("php://input","r+");
$destination = fopen($myFile, 'w+');
stream_copy_to_stream($input, $destination);
fclose($input);
fclose($destination);
However, when trying to open the file with the imagecreatefrompng function, I get a 500 error. I'm now trying to figure out whether it's a memory issue in PHP.
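To test the memory theory, I'm using a rough pre-flight estimate; the RGBA assumption and the overhead factor here are guesses, not exact figures:
// Rough estimate: width * height * 4 bytes (RGBA) plus GD overhead (factor is a guess)
$info = getimagesize($myFile);
if ($info !== false) {
    $estimate = $info[0] * $info[1] * 4 * 1.7;
    $limitMb  = (int) ini_get('memory_limit'); // assumes a value like "128M"
    if ($estimate > $limitMb * 1024 * 1024 - memory_get_usage()) {
        echo "imagecreatefrompng would likely exhaust memory_limit\n";
    }
}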
The problem might be the way you test your POST, by copying the "binary" data into a text field.
If you paste the same data into a text editor, you won't get a valid image file either when saving it with the .png extension.
Try building a simple form with a file field to test your upload.
I use nginx for uploads and haven't had a problem, but I use the standard PHP way of uploading files as per: http://www.php.net/manual/en/features.file-upload.post-method.php
I would suggest trying that.
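A minimal version of that standard approach looks something like this; the field name and target path are placeholders:
// upload.php -- expects a multipart/form-data POST with a file field named "userfile":
// <form action="upload.php" method="post" enctype="multipart/form-data">
//     <input type="file" name="userfile"><input type="submit">
// </form>
if ($_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    // move_uploaded_file() is binary-safe and checks the source is a real upload
    move_uploaded_file($_FILES['userfile']['tmp_name'],
                       '/tmp/' . basename($_FILES['userfile']['name']));
}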
Try using <?php $postdata = file_get_contents("php://input"); ?> to get the raw data. I sometimes use it to get data sent from an AJAX POST in CakePHP.
I have a page with an HTML5 drag-and-drop upload feature, and the file is uploaded using the PUT method. If I upload large image files, only part of the image gets saved on the server. I'm using the following PHP code to save the file:
$putdata = fopen("php://input", "r");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "w");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
Is there anything wrong with this? Please help.
I think it is because the entire file has not been completely uploaded yet when you try to read it, so the read will sometimes return zero bytes even though there is still data being uploaded.
Maybe you can try using the feof function to check whether there is any more data to be read?
See http://www.php.net/manual/en/function.feof.php
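A sketch of the loop with an explicit EOF check, along the lines of that suggestion:
// Keep reading until the input stream reports end-of-file
$putdata = fopen("php://input", "rb");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "wb");
while (!feof($putdata)) {
    fwrite($fp, fread($putdata, 8192));
}
fclose($fp);
fclose($putdata);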
If you are on Windows, you should add "b" to the mode parameter of fopen(); see the manual. It is a good idea to add the flag anyway, for code portability.
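For example (the variable names are placeholders):
// "b" suppresses newline translation on Windows and is harmless elsewhere
$in  = fopen($filename, "rb"); // read in binary mode
$out = fopen($target, "wb");   // write in binary mode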