I'm trying to build a logging system using MongoDB and GridFS in PHP. I can write data to the file initially, but once I close the stream I don't know how to append data to it later. This is how I write to it:
$bucket = DB::connection('mongodb')->getMongoDB()->selectGridFSBucket();
$stream = $bucket->openUploadStream('my-file-name.txt');
$contents = "whatever text here\n"; // double quotes so \n is a real newline
fwrite($stream, $contents);
fclose($stream);
I tried retrieving it and appending data to the stream, but it doesn't work. This is my attempt:
$bucket = DB::connection('mongodb')->getMongoDB()->selectGridFSBucket();
$stream = $bucket->openDownloadStreamByName('my-file-name.txt');
fwrite($stream, $contents); // fails: download streams are read-only
I also tried fopen on the stream, but no luck. And I don't know how to retrieve the file's ID after writing data to it.
I couldn't find a simple way to append, so I ended up replacing the previous file with a new one and deleting the old one.
$stream = $bucket->openDownloadStream($this->file_id);
$prev_content = stream_get_contents($stream);
// delete old data chunks
DB::connection('mongodb')->collection('fs.chunks')->where('files_id', $this->file_id)->delete();
// delete old file document
DB::connection('mongodb')->collection('fs.files')->where('_id', $this->file_id)->delete();
$new_content = $prev_content . $dt . $msg . "\n"; // append to log body
$new_filename = $filename_prefix;
$stream = $bucket->openUploadStream($new_filename);
fwrite($stream, $new_content);
fclose($stream);
$new_file = DB::connection('mongodb')->collection('fs.files')->where('filename', $new_filename)->first();
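As for retrieving the file ID: instead of querying fs.files by filename, the MongoDB PHP library exposes Bucket::getFileIdForStream(), which returns the _id for an open GridFS stream. A minimal sketch, reusing $bucket and $new_filename from above (note it must be called before fclose):
$stream = $bucket->openUploadStream($new_filename);
$file_id = $bucket->getFileIdForStream($stream); // the _id of the file being written
fwrite($stream, $new_content);
fclose($stream);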
UPDATE:
For future readers: after using this system for a few months, I realized it is not a good solution. Files and chunks can get corrupted, and the process is super slow. I switched to logging to a plain txt file and just storing a reference to the file in MongoDB; much better :)
I want to download different feeds from some publishers. The trouble is, first, that they are zipped as .gz and, second, that they are not in the right format. You can download one of the feeds and check it out. They don't have any file extension, so I'm forced to add the .csv myself.
My question now is: how can I unzip those files from the different URLs?
I know how to rename them, but how do I unzip them?
I already searched for it and found this one:
//This input should be from somewhere else, hard-coded in this example
$file_name = '2013-07-16.dump.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while (!gzeof($file)) {
// Read buffer-size bytes
// Both fwrite and gzread are binary-safe
fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
But with those feeds it doesn't work...
Here are two example files: file one | file two
Do you have an idea? I would be very grateful!
Greetings!
Windows 10 + PHP 7.1.4: it works.
The following code has the same effect.
ob_start();
readgzfile($file_name);
file_put_contents($out_file_name, ob_get_contents());
ob_end_clean();
Or you can use the gzip command to decompress the file and then work with the result; see PHP's Program execution Functions.
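For the URL part of the question, here's a minimal sketch (assuming allow_url_fopen is enabled and the feeds fit in memory; the URL and output name are placeholders, not the actual publisher feeds):
$feed_url = 'http://example.com/feed.gz'; // placeholder URL
$gz = file_get_contents($feed_url);       // download the gzipped feed
$csv = gzdecode($gz);                     // decompress in memory (zlib, PHP >= 5.4)
file_put_contents('feed.csv', $csv);      // save with the .csv extension added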
I'm using PHP to stream a file as a download. Part of this uses readfile:
<?php
// headers etc...
// ...
readfile('file.exe');
?>
This is a binary file, but for various reasons (that are not relevant to this question) I need to store this file as base64 or similar.
How do I stream a file with readfile when it is stored base64-encoded?
I guess there are many ways that lead to success here, but I'm looking for the best & most convenient.
Using stream filters should help:
$inFile = 'file.exe';
$outFile = 'php://output';
$inHandle = fopen($inFile, 'r');
$outHandle = fopen($outFile, 'w');
stream_filter_append($inHandle, 'convert.base64-decode');
stream_copy_to_stream($inHandle, $outHandle);
fclose($inHandle);
fclose($outHandle);
If the file is not too big, you can read it using file_get_contents and then decode it with base64_decode:
<?php
$content = file_get_contents('file.exe');
echo base64_decode($content);
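For completeness, here's a sketch of how the base64-encoded copy could be produced in the first place, using the complementary convert.base64-encode filter (the file names are just examples):
$in = fopen('file.exe', 'rb');
$out = fopen('file.exe.b64', 'wb');
stream_filter_append($out, 'convert.base64-encode');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);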
I'm trying to get chunked uploads working on a form in my Laravel 4 project. The client-side bit works so far: the uploads are chunked in 2MB pieces, and data is being sent from the browser. There's even a handy progress bar in place to show the upload progress.
The problem is on the PHP side: I'm unable to write the contents of the upload stream to disk, and I always end up with a 0-byte file. The idea is to append the chunks to the already-uploaded file as they arrive.
The project is built on Laravel 4, so I'm not sure whether Laravel reads the php://input stream and does something with it. Since php://input can only be read once, that would mean the stream is already empty by the time my controller tries to read it.
The controller looks as follows:
public function upload()
{
$filename = Config::get('tms.upload_path') . Input::file('file')->getClientOriginalName();
file_put_contents($filename, fopen('php://input', 'r'), FILE_APPEND);
}
The file is being created, but its length always remains at 0 bytes. Any ideas how I can coax the contents of the php://input stream out of the system?
AFAIK fopen returns a file pointer, not a stream, so it's probably not good as a parameter for file_put_contents.
Can you try this workaround instead of your file_put_contents?
$putdata = fopen("php://input", "r");
$fp = fopen($filename, "a");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
The answer to this is simple: I needed to turn off multipart/form-data and use file_get_contents("php://input") to read the contents, passing the result to file_put_contents() like so:
file_put_contents($filename, file_get_contents("php://input"), FILE_APPEND);
This works and fixes my problems.
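One caveat: with multipart/form-data turned off, Input::file('file') is no longer populated, so the filename has to arrive some other way. Here's a sketch assuming the client sends it as a query parameter (the parameter name is hypothetical):
public function upload()
{
    $name = basename(Input::get('filename')); // hypothetical query parameter; basename() guards against path traversal
    $path = Config::get('tms.upload_path') . $name;
    file_put_contents($path, file_get_contents('php://input'), FILE_APPEND);
}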
I have a page with an HTML5 drag-and-drop upload feature, and the file is uploaded using the PUT method. If I upload large image files, only part of the image gets saved on the server. I'm using the following PHP code to save the file:
$putdata = fopen("php://input", "r");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "w");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
Is there anything wrong with this? Please help.
I think it's because the entire file has not been completely uploaded yet when you try to read it, so the read sometimes returns zero bytes even though data is still arriving.
Maybe you can try using the feof function to check whether there is any more data to be read?
See http://www.php.net/manual/en/function.feof.php
If you are on Windows, you should add "b" to the mode parameter of fopen(); see the manual. BTW, it is a good idea to add the flag anyway, for code portability.
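Here's a minimal sketch combining both suggestions, binary-safe modes plus an feof-driven loop (the path and chunk size are just examples):
$putdata = fopen("php://input", "rb");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "wb");
while (!feof($putdata)) {
    fwrite($fp, fread($putdata, 8192));
}
fclose($fp);
fclose($putdata);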
I'm using CodeIgniter and I can't figure out how to unzip files!
PHP itself has a number of functions for dealing with gzip files.
If you want to create a new, uncompressed file, it would be something like this.
Note: This doesn't check whether the target file exists first, doesn't delete the input file, and doesn't do any error checking. You really should fix those before using this in production code (a sketch of those checks follows, after the code).
// This input should be from somewhere else, hard-coded in this example
$file_name = 'file.txt.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while(!gzeof($file)) {
// Read buffer-size bytes
// Both fwrite and gzread are binary-safe
fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
Note: This deals with gzip only. It doesn't deal with tar.
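Here's a sketch of the missing checks mentioned above; how you handle each failure is up to you (die() is just a placeholder):
// refuse to clobber an existing output file
if (file_exists($out_file_name)) {
    die("$out_file_name already exists");
}
$file = gzopen($file_name, 'rb');
if ($file === false) {
    die("Could not open $file_name");
}
$out_file = fopen($out_file_name, 'wb');
if ($out_file === false) {
    die("Could not open $out_file_name");
}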
gzopen is way too much work. This is more intuitive:
$zipped = file_get_contents("foo.gz");
$unzipped = gzdecode($zipped);
This also works on HTTP pages when the server is sending gzipped data.
If you have access to system():
system("gunzip file.sql.gz");
Use the functions implemented by the Zlib Compression extension.
This snippet shows how to use some of the functions made available by the extension:
// open file for reading
$zp = gzopen($filename, "r");
// read 3 chars
echo gzread($zp, 3);
// output the rest of the file, then close it
gzpassthru($zp);
gzclose($zp);
Download the Unzip library, then include or autoload it:
$this->load->library('unzip');