Passing file download via PHP very slow

I have a Windows 2003 dedicated server with XAMPP installed on it.
I have tried serving downloads through PHP scripts such as Zina from pancake.org and phpIndexer, and through PHP functions such as fread, fgets, file, and file_get_contents.
If I download, say, via Apache's mod_dirlisting, the speed is about 1 Mbps; on the same server, going through PHP, the speed drops to around 30 kbps.
Any idea what is causing this? Should I tweak anything in php.ini?

You could try the readfile() function (see the PHP docs). fread() or file_get_contents() read the file into memory and you then send it with print or echo; readfile() reads the file directly into the output buffer, so it should be faster.
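For instance, a minimal sketch of a readfile()-based download; the path is a placeholder for whatever file your script actually serves:

<?php
// Stream a download with readfile() instead of fread()/echo.
$filepath = '/path/to/some/file.mp3';   // placeholder path

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filepath) . '"');
header('Content-Length: ' . filesize($filepath));

// Drop any PHP-level output buffers so readfile() streams straight to Apache.
while (ob_get_level() > 0) {
    ob_end_clean();
}

readfile($filepath);
exit;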

Related

PHP ftp_get download zero bytes files

In a PHP project I need to download CSV files from an FTP server, and I'm using the PHP ftp_* functions to do this.
I'm working on two separate computers: one can download the FTP files with no problem; the other one initiates the FTP connection and opens and creates a file on my disk, but after a few seconds (it looks like a timeout) the script ends with this error:
PHP Warning: ftp_get(): Opening BINARY mode data connection for...
I've already tried using passive mode, the connection is closed at the end of my script, and the strange thing is that this works on another computer and on my server.
So here are my questions:
1) Do you have any idea why this is happening?
2) Is there any configuration in php.ini or Apache needed to enable PHP FTP properly?
Thank you.
Cyril
Maybe you exceeded the maximum execution time.
Try increasing it:
http://php.net/manual/en/function.set-time-limit.php
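If it is a timeout, something along these lines might help; the host, credentials and file names are placeholders, not values from the question:

<?php
set_time_limit(0);                      // 0 = no script execution time limit

$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);                  // passive mode, as already tried in the question

// The FTP data connection has its own timeout (90 seconds by default); raise it too.
ftp_set_option($conn, FTP_TIMEOUT_SEC, 600);

if (ftp_get($conn, 'local.csv', 'remote.csv', FTP_BINARY)) {
    echo "Download complete\n";
}
ftp_close($conn);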

PHP script stops suddenly without any error

I have a PHP script that downloads files via direct links from a remote server that I own. Sometimes they are large files (~500-600 MB) and sometimes small ones (~50-100 MB).
Some code from the script:
$links[0]="file_1";
$links[0]="file_2";
$links[0]="file_3";
for($i=0;$i<count($links);$i++){
$file_link=download_file($links[$i]); //this function downloads the file with curl and returns the path to the downloaded file in local server
echo "Download complete";
rename($file_link,"some other_path/..."); //this moves the downloaded file to some other location
echo "Downloaded file moved";
echo "Download complete";
}
My problem is that if I download a large file and run the script from a web browser, it takes up to 5-10 minutes to complete; the script echoes up to "Download complete" and then dies completely. I always find that the file that was being downloaded before the script died is 100% downloaded.
On the other hand, if I download small files (50-100 MB) from the web browser, or run the script from a command shell, this problem does not occur at all and the script completes fully.
I am using my own VPS for this and do not have any time limit set on the server. There is no fatal error or memory overload problem.
I also used ssh2_sftp to copy the files from the remote server, but I get the same problem when running from a web browser. It always downloads the file, executes the next line and then dies! Very strange!
What should I do to get over this problem?
To make sure you can download larger files, you will have to make sure that:
there is enough memory available for PHP
the maximum execution time limit is set high enough.
Judging from what you said about ssh2_sftp (I assume you are running it via PHP), your problem is the second one. Check your error logs to find out whether that truly is your error. If so, simply increase the maximum execution time in your settings/php.ini and that should fix it.
Note: I would encourage you not to let PHP handle these large files itself. Call an external program (via system() or exec()) that will do the download for you, as PHP still has garbage collection issues.
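A rough sketch of both suggestions; the URL, destination path and the wget call are placeholders, not part of the original script:

<?php
// Option 1: raise the limits for this request (or set them permanently in php.ini).
ini_set('max_execution_time', '0');   // 0 = unlimited
ini_set('memory_limit', '1024M');

// Option 2: hand the transfer to an external tool so PHP only supervises it.
$url  = 'http://example.com/file_1';   // placeholder URL
$dest = '/tmp/file_1';                 // placeholder destination path
exec('wget -q -O ' . escapeshellarg($dest) . ' ' . escapeshellarg($url), $output, $status);
if ($status === 0) {
    echo "Download complete\n";
}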

gzcompress won't produce a valid zipped file?

Consider this:
$text = "hello";
$text_compressed = gzcompress($text, 6);
$success = file_put_contents('file.gz', $text_compressed);
When I try to open file.gz, I get errors. How can I open file.gz in a terminal without calling PHP? (Using gzuncompress works just fine!)
I can't re-encode every file I created, since I now have almost a billion files encoded this way! So if there is a solution... :)
You need to use gzencode() instead.
Luckily for you, the fix is easy: just write a script that opens each of your files one by one, uses gzuncompress() to uncompress that file, and then writes that file back out with gzencode() instead of gzcompress(), repeating the process for all of the files.
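A rough sketch of such a conversion script; the glob pattern and paths are placeholders for however your files are actually laid out:

<?php
// One-off conversion: rewrite zlib-compressed (gzcompress) files as real gzip (gzencode).
foreach (glob('/data/archive/*.gz') as $file) {
    $data = gzuncompress(file_get_contents($file));
    if ($data === false) {
        continue;                                  // skip files that are not gzcompress() output
    }
    file_put_contents($file, gzencode($data, 6));  // write back as a proper .gz file
}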
Alternatively (since you said you "didn't want to recode your files"), you could use uncompress to open the existing files from the command line (instead of gunzip/zcat).
As noted on the gzcompress() manual page:
"This is not the same as gzip compression, which includes some header data. See gzencode() for gzip compression."
As noted above, you don't really have gzipped files. To open your files from a terminal, you need the uncompress utility.

performance of passthru("cat file")

I'm using passthru("cat filepath") in my download script. My concern is that it might use a lot of server resources.
What is the difference between directly linking to a file in a public directory and downloading a file using passthru("cat filepath") in PHP?
What is the difference between directly linking to a file in a public directory and downloading a file using passthru("cat filepath") in PHP?
The difference is that linking directly to a file does not invoke PHP, while running a PHP script which in turn runs cat causes both PHP and cat to be invoked. This will take up a moderate amount of extra memory, but won't cause noticeable server load under most circumstances.
I was using readfile(), but that function can't be used for files larger than 2 GB.
You might want to find a better solution than passing all of the file contents through PHP, in that case. Look into X-Sendfile support in your web server software of choice.
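For example, with Apache and the mod_xsendfile module installed (other servers have equivalents such as X-Accel-Redirect on nginx), a sketch could look like this; the path is a placeholder:

<?php
$filepath = '/var/files/big-download.iso';   // placeholder path

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filepath) . '"');
header('X-Sendfile: ' . $filepath);   // the web server streams the file itself; PHP is done here
exit;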
Don't use passthru() for that; you're opening yourself up to CLI injection, and the performance is terrible. readfile() exists for exactly this purpose.
readfile($filepath);
There is a small overhead when passing a file through PHP compared to a direct link, but we are usually talking about milliseconds. However, the browser will not be able to request a 206 Partial Content response when using readfile() unless you code support for it yourself or use something like PEAR::HTTP_Download.
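If you do want resumable downloads without an extra library, a minimal sketch of Range/206 handling might look like this; only simple "start-" and "start-end" ranges are handled, and the content type is illustrative:

<?php
function serve_with_ranges($filepath) {
    $size  = filesize($filepath);
    $start = 0;
    $end   = $size - 1;

    // Honour a simple "bytes=start-end" Range header if the browser sent one.
    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        $start = (int)$m[1];
        if ($m[2] !== '') {
            $end = (int)$m[2];
        }
        header('HTTP/1.1 206 Partial Content');
        header("Content-Range: bytes $start-$end/$size");
    }

    header('Content-Type: application/octet-stream');
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . ($end - $start + 1));

    // Stream only the requested byte range, in small chunks.
    $fp = fopen($filepath, 'rb');
    fseek($fp, $start);
    $remaining = $end - $start + 1;
    while ($remaining > 0 && !feof($fp)) {
        $chunk = fread($fp, min(8192, $remaining));
        echo $chunk;
        $remaining -= strlen($chunk);
    }
    fclose($fp);
}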
EDIT: It seems you are using passthru() because readfile() apparently doesn't handle files over 2 GB properly (I have never had that problem with readfile(); in fact, I just tested it with a 7.2 GB file and it worked fine). In that case, at least escape your parameters:
function readfile_ext($filepath) {
    if (!file_exists($filepath))
        return false;
    passthru('cat ' . escapeshellarg($filepath));
    return true;
}
Instead of passthru('cat filepath'), use the native PHP readfile('filepath'), which has better performance.
Both methods will still be slower than simply linking directly to the file, though, since PHP has a certain overhead.

Apache / PHP output caching

I'm working on a PHP script which generates large (multi-MB) output on the fly without knowing the length in advance. I am writing directly to php://output via fwrite() and have tried both standard output and Transfer-Encoding: chunked (encoding the chunks as required), but no matter what I try the browser waits until all the data has been written before displaying the download dialog. I have also tried flush()ing after the headers and after each chunk, but this makes no difference.
I'm guessing that Apache is caching the output, as the browser would normally display the dialog after receiving a few kB from the server.
Does anyone have any ideas on how to stop this caching and flush the data to the browser as it is generated?
Thanks,
J
First of all, as BlaM mentioned in his comment, if output buffering is enabled in the PHP configuration it won't work, so it would be useful to know your phpinfo().
Next, try whether it works with a big file that is stored on your web server, outputting it using readfile(). And, together with this, check that you send the correct headers. Hints on how to use readfile() and send the correct headers are provided here: StackOverflow: How to force a file download in PHP
And while you are at it, call ob_end_flush() or ob_end_clean() at the top of your script.
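As a rough sketch of combining those pieces; the filename, headers and the row generator below are illustrative, not taken from the question:

<?php
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="export.csv"');

// Drop PHP's own output buffers, then flush after every chunk.
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);

for ($i = 0; $i < 100000; $i++) {
    echo generate_row($i);   // generate_row() is a hypothetical chunk generator
    flush();                 // push the chunk out of PHP towards Apache and the client
}
// Note: Apache modules such as mod_deflate can still buffer on their side,
// so compression may need to be disabled for this URL.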
