I have used this code to copy a 45 MB zip file from server A to server B.
<?php
set_time_limit(0);
$file = 'https://www.xxxxx.com/Products.zip';
$newfile = 'Products.zip';
if ( copy($file, $newfile) ) {
echo "Copy success!";
}else{
echo "Copy failed.";
}
?>
After copying 17 MB it gives a server error.
I have used some other code to download or copy from server to server, like:
<?php
set_time_limit(0);
$url = 'https://www.mydomaind.com/Products.zip';
$path = 'Products.zip';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
curl_close($ch);
file_put_contents($path, $data);
echo 'done';
?>
Another one
<?php
set_time_limit(0);
$source = 'https://www.mydomaind.com/Products.zip';
$destination = 'Products.zip';
$data = file_get_contents($source);
$handle = fopen($destination, "w");
fwrite($handle, $data);
fclose($handle);
echo 'done';
?>
These last two snippets download or copy files of around 5 MB easily.
But when I try the same job for 50 MB, it gives an error.
Please help me figure out how I can do that.
Thanks
You're hitting a PHP limit.
Look inside php.ini and raise the limit:
; Maximum allowed size for uploaded files.
upload_max_filesize = 40M
; Must be greater than or equal to upload_max_filesize
post_max_size = 40M
P.S. You should not use PHP for sending files between servers.
Better to use SSH and SCP. PHP can launch commands with exec(); then you use the shell to send the file.
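A minimal sketch of that exec()+scp idea; the host, user, and paths below are hypothetical placeholders, and escapeshellarg() guards the arguments:

```php
<?php
// Build an scp command with safely quoted arguments.
// All names here are placeholder assumptions, not from the question.
function build_scp_command(string $localFile, string $remoteTarget): string {
    return 'scp ' . escapeshellarg($localFile) . ' ' . escapeshellarg($remoteTarget);
}

// Hypothetical usage; key-based SSH auth is assumed so no password prompt blocks exec().
$cmd = build_scp_command('Products.zip', 'user@serverB:/var/www/Products.zip');
// exec($cmd, $output, $exitCode);  // uncomment to actually run it
```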
If you are hitting your limit, here is another SO answer on how to set it in .htaccess
Changing upload_max_filesize on PHP
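For reference, when the host allows per-directory overrides (mod_php only), the .htaccess version looks roughly like this; the 100M/256M values are arbitrary examples, not recommendations:

```apache
php_value upload_max_filesize 100M
php_value post_max_size 100M
php_value memory_limit 256M
php_value max_execution_time 300
```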
Sorry to respond here, mods, I can't comment yet!
Have you already tried splitting the zip into two or three parts? IMHO that would solve your problem without any errors from the server side.
Try using this code so that you do not use up the available RAM on your server.
function chunked_copy() {
    # Write with a buffer, 1 MB at a time (adjustable).
    $buffer_size = 1048576;
    $ret = 0;
    $fin = fopen("http://www.example.com/file.zip", "rb"); # source
    $fout = fopen("file.zip", "wb");                       # destination
    while (!feof($fin)) {
        $ret += fwrite($fout, fread($fin, $buffer_size));
    }
    fclose($fin);
    fclose($fout);
    return $ret; # return the number of bytes written
}
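For what it's worth, PHP's built-in stream_copy_to_stream() performs the same buffered loop internally, so the idea above can be reduced to a few lines. A sketch (the function name is mine; the source can be a URL or a local path):

```php
<?php
// Copy a source stream to disk without loading it all into RAM.
// stream_copy_to_stream() moves the data in internal chunks.
function stream_download(string $srcUrl, string $dstPath): int {
    $fin  = fopen($srcUrl, 'rb');   // source (URL or local path)
    $fout = fopen($dstPath, 'wb');  // destination
    $bytes = stream_copy_to_stream($fin, $fout);
    fclose($fin);
    fclose($fout);
    return $bytes; // number of bytes copied
}
```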
Related
I have a 197 GB text file whose contents I want to read and push into a MySQL database. I know I can't load that big a file into the PHP buffer and read it as a whole, so I want to read a few hundred lines at a time and keep reading until I have gone through the whole file.
I am trying it with this, but the page returns nothing:
<?php
$i = 0;
$handle = fopen("./data/200gbfile.txt", "r") or die("Couldn't get handle");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        echo $line . "<br />";
        if ($i > 100) {
            exit;
        }
        $i++;
    }
    fclose($handle);
} else {
    echo "Error Opening File!";
}
?>
Is there a limit on the maximum file size that PHP's settings allow it to handle?
EDIT: for the 197 GB file in question, fopen fails to return anything and the output page just goes blank.
You can read the file in chunks to save memory:
For example:
$fd = @fopen("./data/200gbfile.txt", "r");
while (!feof($fd)) {
    $data = fread($fd, 1024); // read the file in 1 KB chunks
    // handle the current data (split it into lines, for example)
}
fclose($fd);
I have no idea whether that works with a 100+ GB file, though.
Edit: the @ with fopen is required, as suggested by Roman.
You can use ini_set('memory_limit', '16M'); to set the memory limit accordingly, but I don't know whether it will handle such a huge file; I have never tested that.
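To read a few hundred lines at a time, as the question asks, a batching loop over fgets() could look like this. This is only a sketch: the callback stands in for whatever MySQL insert logic you use, and the function name is mine:

```php
<?php
// Read a large file in batches of $batchSize lines so memory use stays flat.
// $handleBatch receives each batch, e.g. to issue one multi-row INSERT.
function process_in_batches(string $path, int $batchSize, callable $handleBatch): int {
    $fh = fopen($path, "r");
    $batch = [];
    $total = 0;
    while (($line = fgets($fh)) !== false) {
        $batch[] = rtrim($line, "\n");
        if (count($batch) === $batchSize) {
            $handleBatch($batch);   // process a full batch
            $total += count($batch);
            $batch = [];
        }
    }
    if ($batch) {                   // flush the final partial batch
        $handleBatch($batch);
        $total += count($batch);
    }
    fclose($fh);
    return $total;
}
```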
I encode a file with a special watermark to be downloaded by the user.
I would like to be able to start the download while ffmpeg is still encoding the file.
I have a PHP page that creates the watermark, launches ffmpeg, and starts the download with X-Sendfile.
The download doesn't start; adding sleep(15) lets the download start, but then I receive only what has been encoded by that point.
I am using Apache2 as the web server with the X-Sendfile mod.
Okay, I finally found out how to do it.
Using X-Sendfile was a wrong idea.
Using MP4 was a wrong idea too, because at the end of the encoding the software rewrites 4 bytes at the beginning of the file.
So this is the code I use to be able to send a file such as WebM or FLV while ffmpeg is encoding the output:
shell_exec('ffmpeg command');
sleep(2); // wait a few seconds for the software to start
$foutput = "where/is/your/output.file";
header('Content-Disposition: attachment; filename="file.name.for.the.user"');
header("Keep-Alive: timeout=200, max=500");
// sending by chunk
set_time_limit(3600);
$file = fopen($foutput, 'rb');
$files = fstat($file);
$filesize = $files['size'];
$last = 0;
echo fread($file, $filesize);
sleep(5); // wait a little for the software to add new bytes
$last = $filesize;
$files = fstat($file);
$filesize = $files['size'];
$done = 0;
while ($done < 2) {
    if ($filesize != $last) {
        fseek($file, $last);
        $read = $filesize - $last;
        echo fread($file, $read);
        $last = $filesize;
        sleep(5);
        $files = fstat($file);
        $filesize = $files['size'];
    } else {
        fclose($file);
        return 0;
    }
}
I'm a coding noob and had trouble with my while loop, but the code works really well without any issues.
Do not use filesize(); that function doesn't cope with a file that changes size all the time, and it just returns 4096 in that case.
The download will be slow; this is normal. You receive, chunk by chunk, whatever the software has produced; with ffmpeg the transfer runs between 200 and 1200 kB/s.
In the code below I...
open a text file, write four characters to it, and close it again,
re-open it, read the contents, and use filesize to report the size of the file. (It's 4, as it should be),
manipulate those contents and tack on four more characters as well. Then I write that new string to the text file and close it again,
use filesize again to report how big the file is.
To my surprise, the answer it gives is still 4 even though the actual size of the file is 8! Examining the contents of the file proves that the write works and the length of the contents is 8.
What is going on??
By the way, I have to use fread and fwrite instead of file_get_contents and file_put_contents. At least I think I do. This little program is a stepping stone to using flock() so I can read the contents of a file and rewrite them while making sure no other processes use the file in between. And AFAIK flock() doesn't work with file_get_contents and file_put_contents.
Please help!
<?php
$filename = "blahdeeblah.txt";

// Write 4 characters.
$fp = fopen($filename, "w");
fwrite($fp, "1234");
fclose($fp);

// Read those characters, manipulate them, and write them back (also increasing the file size).
$fp = fopen($filename, "r+");
$size = filesize($filename);
echo "size before is: " . $size . "<br>";

$t = fread($fp, $size);
$t = $t[3] . $t[2] . $t[1] . $t[0] . "5678";
rewind($fp);
fwrite($fp, $t);
fclose($fp);

// "filesize" returns the same number as before even though the file is larger now.
$size = filesize($filename);
echo "size after is: " . $size . " ";
?>
From http://php.net/manual/en/function.filesize.php
Note: The results of this function are cached. See clearstatcache() for more details.
When you open a file with the fopen() function, you can obtain the proper size at any time using the fstat() function:
$fstat=fstat($fp);
echo 'Size: '.$fstat['size'];
Example:
$filename = 'blahdeeblah.txt';
$fp = fopen($filename, 'a');

$size = @filesize($filename);
echo 'Proper size (obtained by filesize): ' . $size . '<br>';
$fstat = fstat($fp);
echo 'Proper size (obtained by fstat): ' . $fstat['size'] . '<br><br>';

fwrite($fp, '1234');
echo 'Writing 4 bytes...<br><br>';

$fstat = fstat($fp);
echo 'Proper size (obtained by fstat): ' . $fstat['size'] . '<br>';
fclose($fp);

$size = @filesize($filename);
echo 'Wrong size (obtained by filesize): ' . $size;
Note that the cached value is used only within the current script. When you run the script again, filesize() reads the new (proper) size of the file.
Example:
$filename = 'blahdeeblah.txt';
$fp = fopen($filename, 'a');

$size = @filesize($filename);
echo 'Proper size: ' . $size . '<br>';

fwrite($fp, '1234');
fclose($fp);

$size = @filesize($filename);
echo 'Wrong size: ' . $size;
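Alternatively, clearstatcache() throws away the cached stat data, so a second filesize() call in the same script returns the fresh size. A small self-contained demonstration (the temp-file setup is mine):

```php
<?php
// Demonstrate that clearstatcache() makes filesize() re-read the real size.
$filename = tempnam(sys_get_temp_dir(), 'demo');

file_put_contents($filename, '1234');
$before = filesize($filename);   // 4 bytes, and now cached

$fp = fopen($filename, 'a');
fwrite($fp, '5678');             // the file is now 8 bytes on disk
fclose($fp);

$stale = filesize($filename);    // still reports the cached value
clearstatcache();                // drop the stat cache
$fresh = filesize($filename);    // now reports the real size
```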
I'm using a simple unzip function (as seen below) for my files so I don't have to unzip files manually before they are processed further.
function uncompress($srcName, $dstName) {
    $string = implode("", gzfile($srcName));
    $fp = fopen($dstName, "w");
    fwrite($fp, $string, strlen($string));
    fclose($fp);
}
The problem is that if the gzip file is large (e.g. 50 MB), unzipping it takes a large amount of RAM.
The question: can I parse a gzipped file in chunks and still get the correct result? Or is there a better way to handle extracting large gzip files (even if it takes a few seconds more)?
gzfile() is a convenience function that calls gzopen(), gzread(), and gzclose().
So yes, you can do the gzopen() manually and gzread() the file in chunks.
This will uncompress the file in 4 KB chunks:
function uncompress($srcName, $dstName) {
    $sfp = gzopen($srcName, "rb");
    $fp = fopen($dstName, "w");
    while (!gzeof($sfp)) {
        $string = gzread($sfp, 4096);
        fwrite($fp, $string, strlen($string));
    }
    gzclose($sfp);
    fclose($fp);
}
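For what it's worth, PHP's compress.zlib:// stream wrapper gives the same streamed decompression without an explicit loop; copy() then moves the data in internal chunks. A sketch (function name is mine; the zlib extension is assumed to be enabled):

```php
<?php
// Decompress a .gz file without loading it fully into memory.
// The compress.zlib:// wrapper decodes the gzip stream on the fly.
function uncompress_via_wrapper(string $srcName, string $dstName): bool {
    return copy('compress.zlib://' . $srcName, $dstName);
}
```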
Try with:
function uncompress($srcName, $dstName) {
    $fp = fopen($dstName, "w");
    fwrite($fp, implode("", gzfile($srcName)));
    fclose($fp);
}
The $length parameter of fwrite() is optional.
If you are on a Linux host, have the required privileges to run commands, and the gzip command is installed, you could try calling it with something like shell_exec().
Something a bit like this would do, I guess:
shell_exec('gzip -d your_file.gz');
This way, the file wouldn't be unzipped by PHP.
As a side note:
Take care where the command is run from (or use a switch to tell it "decompress to that directory").
You might want to take a look at escapeshellarg() too ;-)
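A hedged sketch of both points at once: building the command with escapeshellarg() and using gzip's -c flag with an output redirect so the destination path is explicit (the function name and file names are placeholders):

```php
<?php
// Build a gzip decompress command with safely quoted arguments.
// 'gzip -dc' writes to stdout, so the output path is chosen explicitly.
function build_gunzip_cmd(string $srcGz, string $dstPath): string {
    return 'gzip -dc ' . escapeshellarg($srcGz) . ' > ' . escapeshellarg($dstPath);
}

// Hypothetical usage:
// shell_exec(build_gunzip_cmd('your_file.gz', '/tmp/your_file'));
```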
As maliayas mentioned, it may lead to a bug. I experienced an unexpected drop out of the while loop even though the gz file had been decompressed successfully. The whole code looks like this and works better for me:
function gzDecompressFile($srcName, $dstName) {
    $error = false;
    if ($file = gzopen($srcName, 'rb')) {                // open the gz file
        $out_file = fopen($dstName, 'wb');               // open the destination file
        while (($string = gzread($file, 4096)) != '') {  // read 4 KB at a time
            if (!fwrite($out_file, $string)) {           // check if writing was successful
                $error = true;
            }
        }
        // close the files
        fclose($out_file);
        gzclose($file);
    } else {
        $error = true;
    }
    if ($error) {
        return false;
    } else {
        return true;
    }
}
I want to implement something like the tail command in PHP, but how can I watch for data being appended to the file?
I don't believe there's any magical way to do it. You just have to continuously poll the file size and output any new data. This is actually quite easy; the only real thing to watch out for is that file sizes and other stat data are cached in PHP. The solution is to call clearstatcache() before reading any new data.
Here's a quick sample, that doesn't include any error handling:
function follow($file)
{
    $size = 0;
    while (true) {
        clearstatcache();
        $currentSize = filesize($file);
        if ($size == $currentSize) {
            usleep(100); // 100 microseconds; raise this to reduce CPU load
            continue;
        }
        $fh = fopen($file, "r");
        fseek($fh, $size);
        while ($d = fgets($fh)) {
            echo $d;
        }
        fclose($fh);
        $size = $currentSize;
    }
}
follow("file.txt");
$handle = popen("tail -f /var/log/your_file.log 2>&1", 'r');
while (!feof($handle)) {
    $buffer = fgets($handle);
    echo $buffer; // fgets() already includes the trailing newline
    flush();
}
pclose($handle);
Check out php-tail on Google Code. It's a two-file implementation with PHP and JavaScript, and it has very little overhead in my testing.
It even supports filtering with a grep keyword (useful for ffmpeg, which spits out frame rate etc. every second).
$handler = fopen('somefile.txt', 'r');

// move to the end of the file
fseek($handler, filesize('somefile.txt'));

// move to the beginning of the file
fseek($handler, 0);
And you will probably want to consider using stream_get_line.
Instead of polling the file size, you could regularly check the file modification time with filemtime().
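A sketch of that idea; clearstatcache() is still needed, since filemtime() results are cached the same way as filesize() (the function name is mine, not from the answer):

```php
<?php
// Report whether a file changed since the last recorded modification time.
// $lastMtime is updated in place when a change is seen.
function file_changed(string $path, int &$lastMtime): bool {
    clearstatcache();            // drop cached stat data first
    $mtime = filemtime($path);
    if ($mtime !== $lastMtime) {
        $lastMtime = $mtime;
        return true;
    }
    return false;
}
```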
Below is what I adapted from the above. Call it periodically with an AJAX call and append the result to your 'holder' (textarea)... Hope this helps. Thank you to all of you who contribute to Stack Overflow and other such forums!
/* Used by the programming module to output debug.txt */
session_start();
$_SESSION['tailSize'] = filesize("./debugLog.txt");
if ($_SESSION['tailPrevSize'] == '' || $_SESSION['tailPrevSize'] > $_SESSION['tailSize']) {
    $_SESSION['tailPrevSize'] = $_SESSION['tailSize'];
}
$tailDiff = $_SESSION['tailSize'] - $_SESSION['tailPrevSize'];
$_SESSION['tailPrevSize'] = $_SESSION['tailSize'];

/* Include your own security checks (valid user, etc.) if required here */
if (!$valid_user) {
    echo "Invalid system mode for this page.";
}

$handle = popen("tail -c " . $tailDiff . " ./debugLog.txt 2>&1", 'r');
while (!feof($handle)) {
    $buffer = fgets($handle);
    echo "$buffer";
    flush();
}
pclose($handle);