PHP readfile vs. file_get_contents

I have used the following code to push a generated zip file for download:
// push to download the zip
header('Content-type: application/zip');
header('Content-Disposition: attachment; filename="'.$zip_name.'"');
readfile($zip_name);
This code works fine, but for unknown reasons it was not working until I tried:
// push to download the zip
header('Content-type: application/zip');
header('Content-Disposition: attachment; filename="'.$zip_name.'"');
echo file_get_contents($zip_name);
I am curious what is happening in each case.

readfile() reads the file directly into the output buffer, while file_get_contents() loads the whole file into memory as a PHP string; when you echo the result, that data is copied from memory into the output buffer, so memory use grows with the file size (potentially twice over, once for the string and once for the buffer), whereas readfile() streams in small chunks.
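You can see the difference with a quick test script (illustrative only; large.zip is a placeholder file name, not from the question). Run it from the CLI with stdout thrown away, e.g. php demo.php > /dev/null, and compare the two peak-memory figures printed to stderr:

<?php
$file = 'large.zip'; // placeholder: any large file

readfile($file); // streams chunk by chunk; peak memory stays flat
fwrite(STDERR, 'after readfile:          ' . memory_get_peak_usage(true) . "\n");

echo file_get_contents($file); // whole file held in a PHP string first
fwrite(STDERR, 'after file_get_contents: ' . memory_get_peak_usage(true) . "\n");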

Related

PHP: How to start serving a file that is being written, before it is completely written?

I want to "pipe" a file through PHP.
My PHP script calls a bash script through shell_exec(). The bash script downloads a file and prints its data. The PHP script then continues to send the file as the response:
if (!file_exists("./$filename")) shell_exec("./get-file $file_id \"$filename\"");
if (file_exists($filename)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($filename));
    readfile($filename);
    unlink($filename);
    exit;
}
This works okay. But is there some way to "pipe" the file while it's being downloaded? Say the file is 500 MB: instead of waiting for the file to finish downloading on the server and then serving it to the client, is there some way to send the content as it's being downloaded? The reason is obvious: I want to minimize the total time the whole process takes. If what I ask is possible, it could cut the total time roughly in half. Thank you.
Edit: I found this, but that solution is based on curl. In my case, the file is being downloaded by an external script, so I want to run a loop in which the PHP script checks whether new bytes have been written to the file and sends them on to the client, without closing the connection. Note that the file is not text-based; it's an audio file.
Edit: Now I'm closer: I modified the external script to write the file to stdout, effectively feeding its data to shell_exec(). Now, how can I send the data before shell_exec() has finished? I guess I'll have to use something like ob_flush(), but some help would be appreciated. Thanks again.
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
// This sends the data AFTER the exec finishes. I want to send them as they arrive, real-time
echo shell_exec("get-file $file_id");
die();
OK, I found the answer in other posts.
I just use passthru() instead of shell_exec() and it seems to be working great.
Thanks everybody! <3
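For reference, here is a minimal sketch of the passthru() approach (the get-file command and $file_id come from the question above; the placeholder values, buffering cleanup, and escaping are assumptions on top of it, not a tested script):

<?php
$file_id  = $_GET['id'] ?? '';  // assumed input, per the question's context
$filename = 'output.mp3';       // placeholder name for the download

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');

// Make sure nothing buffers output between the command and the client.
while (ob_get_level()) {
    ob_end_clean();
}

// Unlike shell_exec(), passthru() writes the command's stdout straight
// to the response as it is produced, so the client starts receiving
// bytes while the external download is still running.
passthru('get-file ' . escapeshellarg($file_id));
exit;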

Download Large File with PHP readfile(); Download is stopping around 890 MB

I wrote a small script and it works OK! However, when I try to download a large file (more than 1 GB), the download stops at around ~880 MB.
Am I doing something wrong? Or is there a better solution for downloading big files with PHP?
This is my code:
<?php
set_time_limit(0);
ini_set('memory_limit', '5000M');
//This file is 4GB in size
$url = 'http://example.com/files/file_name.zip';
$headers = get_headers($url, TRUE); //collecting header information from the file URL
$filesize = $headers['Content-Length']; //File size
while (ob_get_level()) ob_end_clean();
header('Content-Type: application/zip');
header('Content-Disposition: filename=' . basename($url));
header('Content-length: ' . $filesize);
header('Cache-Control: no-cache');
header("Content-Transfer-Encoding: chunked");
ob_flush();
flush();
readfile($url);
exit;
I suspect that you're hitting a memory-limit error. Normally readfile() avoids using much memory, but depending on your settings it can consume RAM in the process. Check your error logs to confirm that you're seeing a memory-allocation error, and if you are, try getting rid of the while loop, ob_flush(), and flush(), and just add something like this after the headers:
if (ob_get_level()) {
    ob_end_clean();
}
readfile($url);
Also, don't pump your memory limit up to 5 GB... if this is working correctly, it should need hardly any memory.
If that's not it, can you confirm that you're serving a remotely hosted ZIP file (i.e. you don't have direct access to the ZIP file on the current server)?
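If memory still turns out to be the problem, another option is to skip readfile() and stream the remote file in fixed-size chunks yourself, so memory use stays constant no matter how large the file is. A rough sketch (the URL is a placeholder, allow_url_fopen must be enabled, and error handling is minimal):

<?php
$url = 'http://example.com/files/file_name.zip'; // placeholder

$src = fopen($url, 'rb');
if ($src === false) {
    http_response_code(502);
    exit('Could not open source file');
}

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . basename($url) . '"');

// Read and forward 8 KB at a time; memory stays flat for any file size.
while (!feof($src)) {
    echo fread($src, 8192);
    flush();
}
fclose($src);
exit;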

Download file from server with PHP using headers

I have a XAMPP web server and am trying to download a file using headers. I don't know what is wrong, but the file doesn't start downloading and doesn't appear in the browser; in the HTTP response I get the source of the file!
header("Content-disposition: attachment;filename=d:\\archive\\result.csv");
header("Content-type: application/pdf");
readfile("sample.pdf");
Can anyone help me, please?
Try this. The filename in Content-Disposition is only the name the browser will suggest when saving; it must not be a local path. Point readfile() at the real path on the server instead:
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="result.csv"');
readfile('D:\\archive\\result.csv');

How to Increase Speed of Downloading an MP3

I'm using the following code to force-download some MP3 files that are stored on my server. This works, but it takes over a minute to download one MP3 file, even a file that is only 2.5 MB. Something seems wrong for it to take that long. Any ideas what I can do to make this download a lot faster?
$fullfilename = $_GET['file'];
$filename = basename($fullfilename);
header('Content-type: audio/mpeg');
header("Content-Disposition: attachment; filename=\"{$filename}\"");
header("Content-Length: " . filesize($fullfilename)); // size of the actual file, not just its basename
header('Content-Transfer-Encoding: binary');
ob_clean();
flush();
readfile($fullfilename);
exit;
It depends on the internet connection between the server and your browser. PHP cannot do anything about it.
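That said, if you want to rule out server-side buffering as a factor, one thing to try is streaming the file in chunks and flushing after each one instead of relying on a single readfile() call. A sketch (the mp3/ storage directory is an assumption, and basename() here is only a minimal guard against path traversal through $_GET['file']):

<?php
$filename = basename($_GET['file']);
$path = __DIR__ . '/mp3/' . $filename; // assumed storage directory

if (!is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: audio/mpeg');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . filesize($path));

$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192); // send 8 KB, then push it to the client
    flush();
}
fclose($fp);
exit;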

One Time Download Links and large files

I'm using this script:
http://www.webvamp.co.uk/blog/coding/creating-one-time-download-links/
to allow users to download files (one time). Everything works fine with small files. Now I'm trying to do the same with a larger file, 1.2 GB. Instead of forcing the user to download the file, the script shows the relative path to the file! Is there any way to modify the script, or is it a fault of the server configuration?
Thanks for the help!
Looking at the code, I think it fails on large files due to a memory limitation: the script reads the whole file into memory via file_get_contents() before sending it. I suspect files over 1 GB will cause memory problems.
Try replacing the following lines in the download.php script:
//get the file content
$strFile = file_get_contents($strDownload);
//set the headers to force a download
header("Content-type: application/force-download");
header("Content-Disposition: attachment; filename=\"" . str_replace(" ", "_", $arrCheck['file']) . "\"");
//echo the file to the user
echo $strFile;
to:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($strDownload));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($strDownload));
ob_clean();
flush();
readfile($strDownload);
This may help.
Note from the manual: "readfile() will not present any memory issues, even when sending large files, on its own."
