I'm building a web-based file management interface for our clients, where I'm trying to stream a download from a remote (FTP) server to the client without downloading it to the local web server first.
I've found that readfile() does exactly what is needed, working perfectly both for web-based downloads and for public FTP servers. The problem is that when I specify credentials via the FTP URL, it apparently no longer works. I've found other reports of this online, but so far no solutions or workarounds.
$file_url = 'ftp://username:password@198.2.148.130/198.2.148.130%20port%2025665/server.properties';
header('Content-Type: application/octet-stream');
header("Content-Transfer-Encoding: binary");
header("Content-Disposition: attachment; filename=\"" . basename($file_url) . "\"");
readfile($file_url);
Is there any workaround that would make this operate as expected? I am stumped as to how this can be fixed; it seems like a bug more so than a limitation.
The URL-encoded spaces (%20) in the URL seem to be the culprit.
Just use literal spaces:
$file_url = 'ftp://username:password@198.2.148.130/198.2.148.130 port 25665/server.properties';
It's perfectly valid to specify the credentials this way with PHP's FTP URL wrapper.
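If the username or password themselves contain reserved characters (`:`, `@`, `/`), a small helper that percent-encodes only the credentials, while leaving the path's spaces literal as the answer above suggests, avoids the problem. The `build_ftp_url()` name below is my own, not part of PHP:

```php
<?php
// Build an ftp:// URL for PHP's FTP stream wrapper: percent-encode only
// the user and password (which may contain ':' or '@'); leave the path
// as-is, with literal spaces, per the answer above.
function build_ftp_url(string $user, string $pass, string $host, string $path): string
{
    return 'ftp://' . rawurlencode($user) . ':' . rawurlencode($pass)
         . '@' . $host . '/' . ltrim($path, '/');
}
```

Usage would then be `readfile(build_ftp_url('username', 'p@ssword', '198.2.148.130', '198.2.148.130 port 25665/server.properties'));`, keeping awkward credentials out of the hand-written URL string.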
So my company stores all PDF files privately in Amazon S3.
When the user requests one, our system pulls it from Amazon S3 and then serves it to the user with the following code:
header("Cache-Control: public");
header("Pragma: public");
header("Expires: 0");
header("Content-Description: File Transfer");
header('Content-Disposition: attachment; filename="'.$fileName.'"');
header('Content-Length: ' . strlen($res->body));
header("Content-type: application/pdf");
header("Content-Transfer-Encoding: binary");
header('Connection: close');
echo $res->body;
$res is the response returned from Amazon, with the file content in $res->body.
I see randomly slow download speeds when users download the PDF files, especially for the larger PDFs (~5 MB) compared to the rest, which are only 800 KB-1.5 MB.
Solutions tried:
1) Removing the Content-Length header doesn't help.
2) Setting EnableSendfile off in httpd.conf doesn't help either.
I also checked the server to make sure its workload wasn't the cause.
Speed tests of both the server and the user's workstation look good too.
Does anyone have an idea what's causing this slowness?
It might be that multiple people from your company share the internet connection.
You should do a full scan of the network, but even then you're not alone on the internet, and Amazon's "virtual" server environments are shared as well.
From my understanding, the issue is simply that it takes time to fetch the file from S3 in order to return it to the user.
Do yourself a favor:
create a signed URL valid for a short period of time
redirect the user to that URL
Don't worry: creating signed URLs doesn't expose any private information that compromises your security (if you do it correctly).
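With the official AWS SDK for PHP (v3) the two steps above take only a few lines. This is a sketch, assuming the SDK is installed via Composer and using placeholder region, bucket, and key names:

```php
<?php
// Sketch: redirect the client to a short-lived S3 presigned URL instead
// of proxying the PDF bytes through PHP. Assumes the AWS SDK for PHP v3
// (composer require aws/aws-sdk-php); bucket/key are placeholders.
require 'vendor/autoload.php';

$s3 = new Aws\S3\S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest',
]);

$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'my-private-pdfs',
    'Key'    => 'reports/example.pdf',
]);

// The URL stops working after 10 minutes; the account
// credentials themselves are never exposed to the client.
$presigned = $s3->createPresignedRequest($cmd, '+10 minutes');

header('Location: ' . (string) $presigned->getUri());
exit;
```

The client then downloads directly from S3, so your server's bandwidth and PHP memory are no longer in the path.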
I'm using Retrofit 2.0.0-beta2 and I need to download some files from my PHP server. My first approach, which worked, was to use the GET method directly with the file's relative server path, and I was getting the correct bytes.
Now I've tried something more secure that delivers the file based on some checks: it fetches the file path from the DB and checks that the user session is valid. This works in browser tests; both Chrome on PC and Chrome on Android correctly download some photos.
I'm serving the file using the X-Sendfile header like so:
header("X-Sendfile: $file_name");
header("Content-type: image/jpeg");
header('Content-Disposition: attachment; filename="' . basename($file_name) . '"');
The Android-side call looks like this:
@Streaming
@GET("/card/download")
Call<ResponseBody> getCard(@Query("filename") String filename);
All I'm getting when opening the files is the echoed text response from the server. Is there any way I can receive the "correct" files?
Apparently there was some sort of problem installing the mod (the X-Sendfile module).
I also updated OkHttp to version 2.7.0.
Is there any way to just "stream it through", without downloading the file and then sending the data in the body of the request? At the moment I am limited by the memory allocated to PHP, as I have to hold the content of the entire file in memory before sending it.
I am working on an enterprise App Store for Apple iOS apps (and Android too, but that is fine with a link). They require the download domain to be the same as the server, and the server itself is just too small to host so many files. It also requires a valid SSL certificate, which I have on the server.
I have already tried linking directly to S3, but it won't work; the only way to do this is to serve the files from the server. Hence my question about passing the file through, hiding the original location, or, I don't know, maybe mounting S3 as a drive?
A proxy using fpassthru needs to be done, as suggested by @AD7six, as follows:
$handle = @fopen('file path or in this case url to file on S3', 'rb');
header('Cache-Control: no-cache, must-revalidate');
header('Pragma: no-cache'); //keeps ie happy
header('Content-Disposition: attachment; filename=app.ipa');
header('Content-type: application/octet-stream');
header('Content-Length: '.$fileInfo['size']); // taken from a previous S3 API call to get object info
header('Content-Transfer-Encoding: binary');
ob_end_clean(); // apparently very important for bigger files
fpassthru($handle); // proxy stream file through your server
exit();
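If memory is still a concern, a manual fread() loop behaves like fpassthru() but with an explicit, bounded buffer, so only one chunk is ever held in memory at a time. A minimal sketch; the `stream_file()` helper and the chunk size are my own, not from the answer above:

```php
<?php
// Stream a file (local path or remote URL, e.g. an S3 object URL) to the
// client in fixed-size chunks, keeping memory use bounded by $chunkSize.
function stream_file(string $source, int $chunkSize = 8192): int
{
    $handle = fopen($source, 'rb');
    if ($handle === false) {
        return 0; // caller decides how to report the failure
    }
    $sent = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === false) {
            break;
        }
        echo $chunk;        // straight to the output buffer
        flush();            // push it toward the client immediately
        $sent += strlen($chunk);
    }
    fclose($handle);
    return $sent;           // total bytes streamed
}
```

It would be called in place of `fpassthru($handle)` above, after sending the same headers and `ob_end_clean()`.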
Have you read the official documentation of S3?
See this answer: How to create download link for an Amazon S3 bucket's object?
There is no need to push the data through your server to send it to the client. If you need protected downloadable items, there is also a way to give the download link a token that is only valid for a given amount of time.
I am currently using this to redirect to an FTP server:
header('Location: ' . $result['url']);
$result['url'] contains FTP URLs like this:
ftp://username:password@server
The problem is that anyone can open the console and see the security credentials.
I know that you can hide a download URL by using readfile like so:
header("Content-Description: File Transfer");
header("Content-Length: " . filesize($variable));
header('Content-Disposition: attachment; filename="'.basename($variable).'"');
header("Content-Transfer-Encoding: binary");
readfile($variable);
Is there something similar for FTP URL redirects?
Short answer: it's not possible.
Long answer: it's not possible at all. Even if it were possible, whatever gets passed to the client would be easy to discover, and thus insecure.
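What you can do instead is keep the credentials server-side and proxy the file through PHP, as the readfile() snippet in the question already hints. A sketch; the `ftp_proxy_download()` name and the way it assembles headers are my own:

```php
<?php
// Proxy a remote file through the server so the FTP credentials embedded
// in the URL never reach the client. Works for any source PHP's stream
// wrappers understand (ftp://, http://, or a local path for testing).
function ftp_proxy_download(string $sourceUrl, string $downloadName): bool
{
    $size = @filesize($sourceUrl); // may be false for some remote streams
    header('Content-Description: File Transfer');
    header('Content-Disposition: attachment; filename="' . $downloadName . '"');
    header('Content-Transfer-Encoding: binary');
    if ($size !== false) {
        header('Content-Length: ' . $size);
    }
    return readfile($sourceUrl) !== false; // stream the bytes to the client
}
```

Usage: `ftp_proxy_download('ftp://username:password@server/file.zip', 'file.zip');` — the FTP URL exists only in server-side code, so the console never shows it.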
I have a file
/file.zip
A user comes to
/download.php
I want the user's browser to start downloading the file. How do I do that? Does readfile open the file on the server? That seems like an unnecessary thing to do. Is there a way to return the file without opening it on the server?
I think you want this:
$attachment_location = $_SERVER["DOCUMENT_ROOT"] . "/file.zip";
if (file_exists($attachment_location)) {
header($_SERVER["SERVER_PROTOCOL"] . " 200 OK");
header("Cache-Control: public"); // needed for internet explorer
header("Content-Type: application/zip");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($attachment_location));
header("Content-Disposition: attachment; filename=file.zip");
readfile($attachment_location);
die();
} else {
die("Error: File not found.");
}
readfile will do the job OK and pass the stream straight back to the web server. It's not the best solution, because PHP keeps running for the whole time the file is being sent. For better results you'll need something like X-Sendfile, which is supported on most web servers (if you install the correct modules).
In general (if you care about heavy load), it's best to put a proxying web server in front of your main application server. This frees up your application server (for instance Apache) more quickly, and proxy servers (Varnish, Squid) tend to be much better at transferring bytes to clients with high latency or clients that are generally slow.
If the file is publicly accessible, just do a simple redirect to the URL of your file.
If the file is public, then you can just serve it as a static file directly from the web server (e.g. Apache) and make download.php redirect to the static URL. Otherwise, you have to use readfile to send the file to the browser after authenticating the user (remember the Content-Disposition header).