I am currently developing a PHP application in which my server (a dedicated server) downloads a file, and the user should be able to download the same file at the same time.
Here is an example:
The server starts downloading a file at time A.
The user wants to download this file at time A + 3 seconds (for example).
I have already solved the case "the user downloads the file faster than the server". What I don't know is how to write a PHP script so that the user downloads the full file (that is, the announced size must be the full size of the file, not just the part that has arrived on the server at time A + 3 seconds). I already have this:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$data['name'].'";');
header('Content-Transfer-Encoding: binary');
header('Content-Length: '.$data['size']);
readfile($remoteFile);
But it doesn't work: the user only downloads the bytes that are currently on the server (which corrupts the file), not the full file.
If you have any solution, thank you.
You could probably pipe the file manually, by opening the connection and reading until you're past all headers. Then once you've figured out the Content-Length, send that to the user and just echo all remaining data you get (do use flush() and avoid output buffers).
Pseudocode(-ish):
open the connection
# grab the headers
while you haven't received all HTTP headers:
    read more
look for the Content-Length header
send the Content-Length header to the user
# grab the file
while the rest of the response isn't done:
    read more
    send it to the user
    flush the buffers
done
Expanding on Tom's answer, you can use cURL to greatly simplify the algorithm by using the CURLOPT_HEADERFUNCTION and CURLOPT_WRITEFUNCTION callbacks (CURLOPT_READFUNCTION is for uploads) - see curl_setopt().
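A rough sketch of that cURL approach follows. It assumes, as in the question, that `$remoteFile` is the URL being fetched and `$data['name']` comes from your own database; treat it as a starting point, not a finished implementation.

```php
<?php
// Sketch only: $remoteFile and $data['name'] are assumed from the question's
// code. CURLOPT_HEADERFUNCTION sees each upstream response header as it
// arrives; CURLOPT_WRITEFUNCTION receives each body chunk.

function content_length_from_header(string $headerLine): ?string {
    if (stripos($headerLine, 'Content-Length:') === 0) {
        return trim(substr($headerLine, strlen('Content-Length:')));
    }
    return null;
}

function proxy_download(string $remoteFile, array $data): void {
    $ch = curl_init($remoteFile);

    curl_setopt($ch, CURLOPT_HEADERFUNCTION, function ($ch, $headerLine) use ($data) {
        if (($length = content_length_from_header($headerLine)) !== null) {
            // We now know the full size: announce it to the user up front.
            header('Content-Type: application/octet-stream');
            header('Content-Disposition: attachment; filename="' . $data['name'] . '"');
            header('Content-Length: ' . $length);
        }
        return strlen($headerLine);   // cURL expects the number of bytes handled
    });

    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
        echo $chunk;
        flush();                      // pass each chunk to the client immediately
        return strlen($chunk);
    });

    curl_exec($ch);
    curl_close($ch);
}

// Usage: proxy_download($remoteFile, $data);
```

Make sure output buffering is off (as Tom noted), or the chunks will pile up in memory instead of reaching the user.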
Don't send the Content-Length header. It's not required as long as you're using HTTP/1.1 (your web server almost certainly is). The drawback is that the browser can't show the download time/size remaining.
Related
I'm randomly getting download errors from a link on a page. I also simplified the link to a directory for easy usage in emails to users.
On the main page the link looks like this:
<a href="http://myPage.com/Mac" target="_blank" id="macDownloadButton" class="downloadbutton w-button">Download Mac version</a>
On my server, that's a directory with an index.php in it which looks like this:
<?php
// mac version
$file="http://www.myPage.com/downloads/myApp_Mac.zip";
$filename="myApp_Mac.zip";
header('Content-Transfer-Encoding: binary');
header('Accept-Ranges: bytes');
header('Content-Length: ' . filesize($file));
header('Content-Encoding: none');
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=' . $filename);
readfile($file);
exit;
?>
Again, the reason I do this is so it's a simple link to send to users in email like, "http://myPage.com/Mac" and "http://myPage.com/Windows".
The weird thing is that it mostly works...but sometimes it doesn't.
What am I doing wrong?
It's hard to know precisely what's wrong unless you check for errors on your readfile() call.
But you're invoking your web server from your web server here when you specify a filename starting with http. You're doing
readfile('http://www.myPage.com/downloads/myApp_Mac.zip');
where you could just as easily do
readfile('../downloads/myApp_Mac.zip');
and read the zip file from the local file system to send to your user.
What's more, filesize('../downloads/myApp_Mac.zip'); will yield a numerical value quickly and send it in the Content-Length header. That will allow the browser, by knowing the total size of the file you're sending, to display a meaningful progress bar.
You should remove the Accept-Ranges header; the php program you showed us doesn't honor range requests. If you lie to the browser by telling it you do honor those requests, the browser may get confused and corrupt the downloaded copy of your file. That will baffle your user.
Your Content-Disposition header is perfect. It defines the filename to be used on your user's machine in the downloads folder.
Simple operations are more reliable, and this may help you.
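Putting those points together, a corrected index.php might look roughly like this. The relative path is an assumption (it supposes downloads/ sits one level above the Mac/ directory); adjust it to your actual layout.

```php
<?php
// Sketch of a corrected index.php: read the zip from the local file
// system instead of going back through the web server over HTTP.
// The relative path below is an assumption; adjust to your layout.
$file     = '../downloads/myApp_Mac.zip';
$filename = 'myApp_Mac.zip';

if (!is_readable($file)) {
    http_response_code(404);
    exit('File not found.');
}

header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));   // fast and exact on a local file
header('Content-Disposition: attachment; filename=' . $filename);
// Note: no Accept-Ranges header, since this script doesn't honor range requests.
readfile($file);
exit;
?>
```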
The reason you got "stat failed" with the link as an error message is this: stat(2) is an operating-system call that operates on files in local and mounted file systems.
As previously mentioned by O. Jones you should definitely always use your local file path.
Most of my previous issues have been browser related, where I needed to tweak/add an HTTP header, and in one case I needed to send all the HTTP headers in lowercase, but I haven't had an issue like that in years. My personal recommendation would be to use a solid download library/function - it will make a noticeable difference to your productivity as well as rule out most browser-related issues you may come across.
I have used the CodeIgniter download helper for the last 3 years and recommend it for 99% of use cases. At the very least I would recommend you read through its code - you will probably find a few cases you have never even considered, such as clearing the output buffer, MIME detection and even a special case for Android 2.1, as well as a few headers you may or may not need.
If all else fails, I have no idea what server you're running this on, but if you continue to have issues I would recommend monitoring which processes your machine is running while paying close attention to RAM and IO usage. I have encountered bad/misbehaving services that periodically use 99% of my IO or RAM for short intervals at a time, causing a few really odd and unexpected errors.
I generated an XML file and force its download with these headers:
header('Content-disposition: attachment; filename="export.xml"');
header('Content-type: application/xml; charset=utf8');
readfile('export.xml');
But before the download I see a dialog saying this file can be harmful to my computer. How do I get rid of this dialog? Maybe my headers are wrong?
upd: Well, nothing worked. I set up a test on my test hosting; you can check it here: the site with the generating link, and the XML file as-is: export.xml
Try changing application/xml to text/xml. Your browser probably thinks that application means executable.
Try this :
<?php
header('Content-disposition: attachment; filename="export.xml"');
header('Content-Type: text/xml; charset=utf-8');
readfile('export.xml');
?>
Note: this does not solve your issue, but it did solve an issue I had on my computer that gave the same notice (Windows, Chrome, Apache web server, PHP 5.4.10). I leave it here for future visitors.
Some browsers do not only look for the headers but also for the "filename" in the URL.
For example if you download a PHP file that contains XML, the browser might identify it as a dangerous file (because it can be executed on your system or is not within some whitelist or what not):
http://example.com/xml-download.php
A simple solution is to make this file not end with .php any longer, for example by adding a ?:
http://example.com/xml-download.php?
And continue with that to even signal the filename that way:
http://example.com/xml-download.php?export.xml
(the last one is not necessary but can be useful especially with some older browsers)
At the current time I'm using this code to output a file to a user.
header('Content-type: application/octet-stream');
header('Content-Disposition: attachment; filename='.$or.'');
readfile($file);
The code, however, doesn't tell the browser how large the file is, and it can't output large files (e.g. 1 GB). I want the code to tell the browser the actual size of the file and to be able to output large files.
For large files, you need to use chunked transfer. In most cases the underlying web server (Apache/Nginx/whatever) will have facilities to do that; I recommend you use them.
That way your code does not tie up a worker thread for ages, and the run-away timer will not cut in in the middle of your download (which would upset your users).
By the way, you are talking about a file download, not an upload - an upload would be user to server. Your tag is wrong.
readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
header("Content-Length: $value");
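If you do want explicit control over the streaming (rather than relying on readfile()), a chunked sender looks roughly like this. The path and headers in the usage comment are placeholders, not your actual setup:

```php
<?php
// Sketch: send a file in fixed-size chunks so memory use stays flat no
// matter how large the file is. Pair it with a Content-Length header
// computed via filesize() so the browser can show progress.
function send_file_chunked(string $file, int $chunkSize = 8192): int {
    $sent   = 0;
    $handle = fopen($file, 'rb');
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        echo $chunk;
        $sent += strlen($chunk);
        flush();                    // hand each chunk to the client right away
    }
    fclose($handle);
    return $sent;                   // total bytes sent
}

// Usage (path and filename are placeholders):
//   header('Content-Type: application/octet-stream');
//   header('Content-Length: ' . filesize($file));
//   header('Content-Disposition: attachment; filename="' . basename($file) . '"');
//   send_file_chunked($file);
```

As the answer above notes, make sure output buffering is off (check ob_get_level()), or the chunks will accumulate in memory anyway.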
I'm working on a PHP script which generates large (multi-MB) output on the fly without knowing the length in advance. I am writing directly to php://output via fwrite() and have tried both standard output and using Transfer-Encoding: chunked (encoding the chunks as required) but no matter what I try the browser waits until all the data is written before displaying a download dialog. I have tried flush()ing too after the headers and after each chunk but this also makes no difference.
I'm guessing that Apache is caching the output as the browser would normally display after receiving a few kB from the server.
Does anyone have any ideas on how to stop this caching and flush the data to the browser as it is generated?
Thanks,
J
First of all, as BlaM mentioned in his comment, if output buffering is enabled in the PHP configuration, it won't work, so it would be useful to know your phpinfo().
Next, try whether it works with a big file that is stored on your web server, output using readfile(). And, together with this, check whether you send the correct headers. Hints on how to use readfile() and send the correct headers are provided here: StackOverflow: How to force a file download in PHP
And while you are at it, call ob_end_flush() or ob_end_clean() at the top of your script.
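Pulling those suggestions together, a minimal sketch looks like this. The generate_chunks() generator is a hypothetical stand-in for whatever produces your multi-MB output on the fly:

```php
<?php
// Sketch: stream output to the browser as it is generated.
// generate_chunks() is a hypothetical stand-in for your real producer.
function generate_chunks(): Generator {
    for ($i = 0; $i < 4; $i++) {
        yield str_repeat('x', 4096);
    }
}

// Per the advice above: make sure no output buffer swallows the chunks.
while (ob_get_level() > 0) {
    ob_end_clean();
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="export.dat"');
// No Content-Length header: with HTTP/1.1 the server falls back to
// chunked transfer encoding, so the download dialog can appear at once.

foreach (generate_chunks() as $chunk) {
    echo $chunk;
    flush();   // push the chunk through PHP; Apache may still buffer (e.g. mod_deflate)
}
```

If Apache's mod_deflate is compressing the response, it will buffer it regardless of flush(), so try disabling compression for this URL as well.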
<?php
$filename= './get/me/me_'.rand(1,100).'.zip';
header("Content-Length: " . filesize($filename));
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=foo.zip');
readfile($filename);
?>
Hi,
I have this simple code that forces a random file download; my problem is that if I call the script two or more times from the same browser, the second download won't start until the first has completed or been interrupted. Thus I can download only one file at a time.
Do you have any clue?
This may be related to PHP's session handling.
Using the default session handler, when a PHP script opens a session it locks it. Subsequent scripts that need to access it have to wait until the first script is finished with it and unlocks it (which happens automatically at shutdown, or by session_write_close() ). This will manifest as the script not doing anything till the previous one finishes in exactly the same way you describe.
Clearly you aren't starting the session explicitly, but there's a config flag that causes the session to start automatically: session.auto_start - http://www.php.net/manual/en/session.configuration.php
Either use phpinfo() to determine if this is set to true, or look in your config. You could also try adding session_write_close() to the top of the script, see if it makes the issue go away.
Just guesses - there could be different reasons.
First, your server could restrict the number of parallel connections or child processes. But I guess this isn't the problem.
Second, it is more likely that the client restricts the number of connections. A "normal" browser opens only two connections at a time to a given server; modern browsers allow up to 8 (?) connections. This is a simple restriction to avoid problems that could occur with slow servers.
One workaround could be to place every download on a "virtual" subdomain.
give it a try!
Just to say that session_write_close() solved the problem for me.
I was using session_destroy() (which worked), but that was no good if I needed to keep session data :)
All you need to do is place session_write_close() just before you start streaming the file data.
Example:
<?php
$filename= './get/me/me_'.rand(1,100).'.zip';
session_write_close();
header("Content-Length: " . filesize($filename));
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=foo.zip');
readfile($filename);
?>
I'd further investigate Ralf's suggestion about the server restrictions and start with checking the log files to ensure that the second request is received by the server at all. With that knowledge you can eliminate one of the possibilities and at least see which side the problem resides on.
From the client's browser (you didn't mention which one it is): if it's Firefox, try installing the Live HTTP Headers extension to see what happens to the request you send and whether the browser receives any response from the server side.
As far as I can find, there is no php configuration setting that restricts max downloads or anything like that - besides, such a configuration is outside the scope of php.
Therefore, I can come to only two conclusions:
The first is that this is browser behaviour, see if the problem is repeated across multiple browsers (let me know if it is). The HTTP spec does say that only two connections to the same domain should be active at any one time, but I wasn't aware that affected file downloads as well as page downloads. A way of getting round such a limitation is to allocate a number of sub-domains to the same site (or do a catch-all subdomains DNS entry), and when generating a link to the download, select a random sub domain to download from. This should work around the multiple request issue if it is a browser problem.
A second and much more unlikely option (and this only applies if you are using Apache) is that your MaxKeepAliveRequests configuration option is set to something ridiculously low and KeepAlives are enabled. However, I highly doubt that is the issue, so I suggest investigating the browser possibility.
Are you getting an error message from the browser when the second download is initiated, or does it just hang? If it just hangs, this suggests it is a browser issue.