We are trying to read our website as a file in PHP, but fread() only reads 255 characters. Can anyone help us with this problem?
Code:
$filename = "oururl";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
Also, when we try to read the file through file_get_contents(), the same thing happens: we get only 255 characters.
You're trying to get the size of a remote source/file specified by a URL. The filesize() function can only determine a file's size if the URL wrapper supports/provides file statistics. In your case, the http or https URL wrapper would be used:
$filename = "http://www.someurl.com";
But the stateless HTTP/HTTPS protocols don't provide file statistics, so filesize() fails and fread() is called with a bogus length.
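Since you can't rely on filesize() for a remote URL, read until end-of-file instead. A minimal sketch (same placeholder URL as above):

$filename = "http://www.someurl.com";
$handle = fopen($filename, "r");
$contents = '';
while (!feof($handle)) {
    $contents .= fread($handle, 8192); // read in 8 KB chunks until the stream ends
}
fclose($handle);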
Related
I have been trying to get the raw contents of a PDF file using PHP, to POST to an API using cURL. I have tried using file_get_contents() (though the resulting file is not readable), and I have tried:
$filename = "/files/example.pdf";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
Then I POST $contents, and again the resulting file is not readable by a PDF reader.
Any ideas?
Thanks
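One approach worth trying: let cURL build the upload itself rather than embedding the raw bytes by hand. A sketch using CURLFile (PHP 5.5+); the endpoint URL and the 'file' field name are placeholders, not part of the original question:

$ch = curl_init('https://api.example.com/upload'); // hypothetical endpoint
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    // CURLFile sends the file as multipart/form-data without corrupting the binary data
    'file' => new CURLFile('/files/example.pdf', 'application/pdf', 'example.pdf'),
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

If the API instead expects the raw bytes in the request body, passing the string from file_get_contents() directly as CURLOPT_POSTFIELDS, with an explicit Content-Type header, is the usual alternative.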
I'm trying to get chunked uploads working on a form in my Laravel 4 project. The client-side bit works so far: the uploads arrive in 2 MB chunks, and data is being sent from the browser. There's even a handy progress bar in place to show the upload progress.
The problem is on the PHP side, as I'm unable to write the contents of the upload stream to disk; the system always ends up creating a 0-byte file. The idea is to append the chunks to the already-uploaded file as they arrive.
The project is built on Laravel 4, and I'm not sure if Laravel reads the php://input stream and does something with it. Since php://input can only be read once, that could mean that by the time my controller actually tries to read the stream, it is already empty.
The controller looks as follows:
public function upload()
{
    $filename = Config::get('tms.upload_path') . Input::file('file')->getClientOriginalName();
    file_put_contents($filename, fopen('php://input', 'r'), FILE_APPEND);
}
The file is being created, but its length always remains 0 bytes. Any ideas how I can coax the contents of the php://input stream out of the system?
Note that fopen() returns a stream resource, not a string. file_put_contents() does accept a stream resource as its data parameter, but as a workaround you can try copying the stream explicitly instead of your file_put_contents() call:
$putdata = fopen("php://input", "r"); // raw request body
$fp = fopen($filename, "a");          // append to the target file
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
The answer to this is simple: I needed to turn off multipart/form-data and use file_get_contents("php://input") to read the contents, then pass the result to file_put_contents(), like so:
file_put_contents($filename, file_get_contents("php://input"), FILE_APPEND);
This works and fixes my problems.
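For context, a sketch of how the full controller might look after that change. Note that Input::file() is no longer available once multipart/form-data is turned off, so the filename has to come from somewhere else; a query parameter named 'filename' is assumed here, not taken from the original code:

public function upload()
{
    // 'filename' is a hypothetical query parameter; basename() guards
    // against path traversal in the client-supplied name
    $filename = Config::get('tms.upload_path') . basename(Input::get('filename'));
    file_put_contents($filename, file_get_contents('php://input'), FILE_APPEND);
}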
Please advise me on the most efficient way to save large files from PHP's input stream.
An iOS developer sends large video content to my server, and I have to store it in files.
I read the input stream with the video data and write it to a file, for example like this:
$http_raw_post_data = '';
$handle = fopen("php://input", "rb");
while (!feof($handle)) {
    $http_raw_post_data .= fread($handle, 8192);
}
fclose($handle);
Which function is better to use: file_get_contents(), fread(), or something else?
I agree with @hek2mgl that treating this as a multipart form upload would make the most sense, but if you can't alter your input interface, you can use file_put_contents() on a stream instead of looping through the file yourself.
$handle = fopen("php://input", "rb");
if (false === file_put_contents("outputfile.dat", $handle))
{
    // handle error
}
fclose($handle);
It's cleaner than iterating through the file, and it might be faster (I haven't tested).
Don't use file_get_contents() in this case, because it would load the entire contents of the file into a string in memory.
FROM PHP DOC
file_get_contents() is the preferred way to read the contents of a file into a string. It will use memory mapping techniques if supported by your OS to enhance performance.
I'm sure you just want to create the movie file on your server; this is a more memory-efficient way:
$in = fopen("php://input", "rb");
$out = fopen('largefile.dat', 'wb'); // "b" for binary safety on Windows
while (!feof($in)) {
    fwrite($out, fread($in, 8192)); // copy in 8 KB chunks
}
fclose($in);
fclose($out);
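An equivalent, slightly shorter variant is to let PHP do the chunked copy internally with stream_copy_to_stream():

$in = fopen("php://input", "rb");
$out = fopen('largefile.dat', 'wb');
// copies the stream in chunks without loading the whole body into memory
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);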
If you use nginx as your web server, I recommend the nginx upload module, which supports resumable uploads.
Found this: https://stackoverflow.com/a/11373078/530599, which is great, but
how about stream_filter_append($fp, 'zlib.inflate', STREAM_FILTER_*)?
I'm looking for another way to decompress the data on the fly.
$fp = fopen($src, 'rb');
$to = fopen($output, 'wb');
// some filtering here?
stream_copy_to_stream($fp, $to);
fclose($fp);
fclose($to);
Here $src is some URL to http://.../file.gz, for example 200+ MB :)
Added test code that works, but in two steps:
<?php
$src = 'http://is.auto.ru/catalog/catalog.xml.gz';

// step 1: download the gzipped file as-is
$fp = fopen($src, 'rb');
$to = fopen(dirname(__FILE__) . '/output.txt.gz', 'wb');
stream_copy_to_stream($fp, $to);
fclose($fp);
fclose($to);

// step 2: decompress via the compress.zlib:// wrapper
copy('compress.zlib://' . dirname(__FILE__) . '/output.txt.gz', dirname(__FILE__) . '/output.txt');
Try gzopen(), which opens a gzip (.gz) file for reading or writing. If the file is not compressed, it transparently reads it, so you can safely read a non-gzipped file.
$fp = gzopen($src, 'rb');
$to = fopen($output, 'w+b');
while (!feof($fp)) {
    fwrite($to, gzread($fp, 2048)); // writes decompressed data from $fp to $to
}
fclose($fp);
fclose($to);
One of the annoying omissions in PHP's stream filter subsystem is the lack of a gzip filter. Gzip is essentially content compressed using the deflate method, wrapped in a fixed 10-byte header and an 8-byte trailer (a CRC-32 checksum plus the uncompressed size); the related zlib format instead uses a 2-byte header and an Adler-32 checksum. The zlib.inflate filter expects raw deflate data, so if you just attach it to the stream, it's not going to work. You have to skip the header bytes before attaching the filter.
Note that there's a serious bug with stream filters in PHP 5.2.x, due to stream buffering: PHP fails to pass data already in the stream's internal buffer through the filter. If you do a short fread() to skip the header before attaching the inflate filter, there's a good chance it's going to fail. The call to fread() causes PHP to try to fill up its buffer, so even if you ask for only a few bytes, PHP might actually read many more (say, 1024) from the physical medium to improve performance. Due to the aforementioned bug, the extra buffered bytes would never get sent to the decompression routine.
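A minimal sketch of the header-skipping approach, assuming a plain gzip stream whose header is the fixed 10 bytes (no optional filename, comment, or extra fields) and a PHP version not affected by the buffering bug above:

$fp = fopen($src, 'rb');
// skip the fixed 10-byte gzip header; assumes no optional header fields are present
fread($fp, 10);
stream_filter_append($fp, 'zlib.inflate', STREAM_FILTER_READ);
$to = fopen($output, 'wb');
stream_copy_to_stream($fp, $to); // inflated data; the 8-byte gzip trailer is not validated
fclose($fp);
fclose($to);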
I have a page with an HTML5 drag-and-drop upload feature, and the file is uploaded using the PUT method. If I upload large image files, only part of the image gets saved on the server. I'm using the following PHP code to save the file:
$putdata = fopen("php://input", "r");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "w");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
Is anything wrong with this? Please help.
I think it's because the entire file has not been completely uploaded yet when you try to read it, so the read will sometimes return zero bytes even though there is still data being uploaded.
Maybe you can try using the feof() function to check whether there is any more data to be read?
See http://www.php.net/manual/en/function.feof.php
If you are on Windows, you should add "b" to the mode parameter of fopen() (see the manual). In any case, it is a good idea to add this flag for code portability...