I have many files stored on a NAS. The NAS is attached to a server as a network drive (let's say it is mapped to Y:).
I use XAMPP to serve my application, which is built in PHP. The application lets users download the files from the NAS directly over HTTP instead of FTP.
Can I expose the files on the NAS so they can be downloaded through an HTTP URL, like example.com/files/the-file.zip?
XAMPP is installed on the C: drive.
Note: the XAMPP htdocs directory is already reachable through a domain, so this is not a domain-pointing problem.
You could do it through PHP:
<?php
$downloadFolder = 'Y:/';
$fileName = $downloadFolder . $_GET['file'];

// Resolve the requested path and make sure it cannot escape the download folder.
// realpath() collapses any ../ sequences and returns false for missing files,
// so this also covers the existence check.
$realBase = realpath($downloadFolder);
$realPath = realpath($fileName);
if ($realPath === false || strpos($realPath, $realBase) !== 0) {
    throw new RuntimeException('Someone tried to escape');
}

// As seen in http://php.net/readfile:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($realPath) . '"');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($realPath));
readfile($realPath);
exit;
Or find a way to serve the folder at the web-server level (for example with an Alias directive, or rules in .htaccess), but that gives you less control over security handling: the files become directly accessible, and you probably don't want to serve files from other directories or other drives.
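If you do go the web-server route, a minimal httpd.conf sketch looks like the following (Apache 2.4 syntax; the paths are assumptions). Note that a drive letter mapped under your own user account is not necessarily visible to the account the Apache service runs under, so a UNC path may be required instead:
# Hypothetical httpd.conf excerpt: expose Y:/ as /files
Alias "/files" "Y:/"
<Directory "Y:/">
    Options -Indexes
    Require all granted
</Directory>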
I have a PHP script for downloading a MySQL database backup, which works properly, but I want the file to be downloaded to a specific folder created on the D: drive.
$backup_file_name = $database_name . '_backup_' . time() . '.sql';
$fileHandler = fopen($backup_file_name, 'w+');
$number_of_lines = fwrite($fileHandler, $sqlScript);
fclose($fileHandler);
// Download the SQL backup file to the browser
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=' . basename($backup_file_name));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($backup_file_name));
file_put_contents('D:\dbbackups', $backup_file_name);
ob_clean();
flush();
readfile($backup_file_name);
exec('rm ' . $backup_file_name);
but the file is downloaded inside the project folder. Any help would be highly appreciated.
From PHP, you can't.
Your server-side PHP application has no knowledge of, or control over, the application or device that is making the HTTP request. All it does is return some data and headers to the requesting client.
Your server / PHP has no idea whether:
- the client device even has a D: drive (or even runs an OS which uses drive letters), or whether a specific folder exists within it;
- the client will even treat the response as a file and try to save it somewhere.
And even if it did know the above, then
your server would have no permissions to access the client-side device or its storage media.
If what you're suggesting were possible, it would be a big security / privacy problem. But it would still be impractical even then, because of my first point.
What you can do to help yourself in this situation, though, is write your own client-side program which makes the HTTP request to your server to execute the PHP, receives the data in the response, and saves it to the location you want. Or, if you're doing this via a browser, you can set the browser's default download location to that folder.
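For illustration, here is a minimal client-side sketch in PHP, assuming it runs on the machine that owns the D: drive; the URL and target folder are hypothetical:
<?php
// Request the backup script over HTTP and stream the response straight to disk.
// copy() accepts URL wrappers (requires allow_url_fopen), so it never buffers
// the whole file in memory. Adjust $url and $target for your setup.
$url    = 'https://example.com/backup.php';
$target = 'D:/dbbackups/backup_' . time() . '.sql';

if (!copy($url, $target)) {
    exit('Download failed');
}
echo 'Saved to ' . $target;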
When I run the function below, it locates and reads the file, displaying the results correctly in CSV format in my Chrome Dev Tools preview tab. But it's not downloading. If I link directly to the file in my browser it downloads, so it doesn't appear to be an .htaccess issue. I've used the example in the documentation and many variations of it found here on Stack Overflow, but the results are the same: the file displays in my preview tab in dev tools (the same goes for Firefox), but there is no download. My code:
public function download()
{
    $file = $this->dir;
    if (file_exists($file)) {
        header('Content-Description: File Transfer');
        header('Content-Type: application/csv');
        header('Content-Disposition: attachment; filename=' . $file);
        header('Expires: 0');
        header('Cache-Control: must-revalidate');
        header('Pragma: public');
        header('Content-Length: ' . filesize($file));
        ob_clean();
        flush();
        readfile($file);
        exit;
    }
}
I'm developing locally with the latest Wamp server. When I push/pull to my remote, the result is the same.
From your question, it sounds like you might be trying to download your file via an AJAX request.
If so, I don't believe you can do this. Instead you could open the link to the file in a new window, which will successfully download the file.
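For example, instead of requesting the script with XMLHttpRequest or fetch, point the browser itself at it, e.g. via a plain link (<a href="download.php?file=report.csv">) or a window.open() call with that same (hypothetical) URL; the browser will then honor the Content-Disposition: attachment header and save the file.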
I have fried my brain all day on this, researching SO until my eyes are bleary. I need to know: how do I access files placed outside the site root?
Background: Apache 2.0 dedicated server running Linux.
Code: PHP and MySQL
Reason: I want the files to be secured against someone typing the file path and filename into a browser.
This can't be that difficult... but my splitting head says otherwise. Any help would be absolutely appreciated.
Have a look at the answers to this question, which seem to be doing more or less the same thing.
A quick summary: readfile() or file_get_contents() are what you're after. This example comes from the readfile() page:
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
I don't recommend allowing the $file variable to be set using user input! Think about where the filenames are coming from before arbitrarily returning files in the response.
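If the filename does have to come from the request, a minimal sketch of one defensive approach (the folder path and parameter name are assumptions) is to strip any directory components and confirm the resolved path stays inside the known storage folder:
<?php
// Hypothetical example: only serve plain filenames from one fixed folder
$baseDir  = realpath('/var/uploads');              // assumed storage folder outside the web root
$name     = basename($_GET['file'] ?? '');         // basename() drops any ../ or directory prefix
$realPath = realpath($baseDir . '/' . $name);

// Serve only if the resolved path is a real file still inside $baseDir
if ($name !== '' && $realPath !== false && strpos($realPath, $baseDir . '/') === 0) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $name . '"');
    header('Content-Length: ' . filesize($realPath));
    readfile($realPath);
    exit;
}
http_response_code(404);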
Are you trying to access files outside of the site root? Then you can look at this link on Stack Overflow.
And this is the official doc from Apache.
Otherwise, you don't have to do any special handling to prevent others from accessing files outside the site root: the web server simply won't serve anything outside its document root.
I have the following problem with a server; there are many variables at play. This is what happened: everything was working perfectly when I developed a small webapp with Zend on my desktop on Fedora. Then I transferred the app to a server on DreamHost and everything worked fine.
The problem comes with a client that needed a server in China: they are behind the Great Firewall and wanted to transfer their files much faster. They had huge files, around 3.4 GB. So they gave me a Windows 2003 virtual machine, and they couldn't change it to Linux, and that is when everything went on a downward spiral.
Basically, my app had a folder outside the document root, where all the files were to be uploaded via FTP. My app read the files and only allowed logged-in users to download them.
This is my plugin/controller:
<?php
class Mapache_Controller_Plugin_AssetGrabber
    extends Zend_Controller_Plugin_Abstract
{
    public function dispatchLoopStartup(Zend_Controller_Request_Abstract $request)
    {
        if ($request->getControllerName() != 'assets')
            return;

        $auth = Zend_Auth::getInstance();
        if (!$auth->hasIdentity())
            throw new Exception('Not authenticated!');

        //$file = APPLICATION_PATH . '/../assets/' . $request->getActionName();
        $file = APPLICATION_PATH . '/..' . $_SERVER['REQUEST_URI'];
        $file = str_replace("_", " ", $file);
        // echo $file;
        if (file_exists($file)) {
            header("Content-type: " . $this->new_mime_content_type($file));
            header('Content-disposition: attachment;');
            //readfile('$file');
            echo file_get_contents($file);
        }
    }

    function new_mime_content_type($filename)
    {
        // new finfo() returns an object, not a resource, so test the instance
        $result = new finfo();
        if ($result instanceof finfo) {
            return $result->file($filename, FILEINFO_MIME_TYPE);
        }
        return false;
    }
}
The only thing I changed for it to work on Windows was adding
$file = "d:/" . $_SERVER['REQUEST_URI'];
so it knows to look on the D: drive of the server.
So basically I have another PHP file that does a scandir, lists all the files and directories, and creates a link to URL/assets/folder/file. It works fine on my test server, even with big files. But when I try to download a 200 MB zip file from the Windows server, I get a corrupted zip file of 228 or 229 bytes, like just the header.
The server has XAMPP with Zend installed. I am going crazy.
Migrating back to my DreamHost server will take days. I installed an rsync client on the server and started copying the files, but in the last 4 hours it has only copied 400 MB, so my 30 GB of data will take days.
When I log on to the server with rdesktop and try to download the files, I still get 228-byte files instead of 200 MB or even 3.4 GB.
It runs Windows 2003.
Do I have to configure something on the Apache server, something in httpd.conf or php.ini?
I found the answer: I needed to download the file properly, streaming it in chunks:
header('Content-Description: File Transfer');
header('Content-Type: ' . $this->new_mime_content_type($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($file));

set_time_limit(0);

// Read and emit the file in 8 KB chunks so it never has to fit in memory at once
$filex = fopen($file, 'rb');
while (!feof($filex)) {
    print(fread($filex, 1024 * 8));
    ob_flush();
    flush();
}
fclose($filex);
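For context on why this fixes the truncated downloads: the original echo file_get_contents($file) tries to load the entire file into memory, so a 200 MB or 3.4 GB file can exceed PHP's memory_limit and cut the response off after the headers, which would explain the 228-byte results. Reading in fixed 8 KB chunks and flushing each one keeps memory use constant no matter how large the file is.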
I am currently creating a PHP website which allows administrators to upload a variety of documents (pdf, doc, docx, xls) which can then be downloaded at a later date. These can only be accessed by administrators after they have logged in. Up until this point, I have been storing files above the web root and then serving them via a PHP script, hence preventing direct access to the files. This does work, but it never seems like an ideal way to do it, as it relies on setting the correct headers via PHP for the file download, which does not always give the correct results in all browsers. I can't really see any other way of doing it that would also stop the files being publicly accessible to anyone who knew where they were located.
What process would you usually use to store and serve files on a web server that should not be publicly accessible?
Sample PHP:
<?php
if (TRUE === $_SESSION['logged_in']) {
    $file = '/full/path/to/useruploads/secret.pdf';
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>