URL to file in non web accessible directory (Readfile()? Fopen()?...) - php

So files are uploaded to a non-web-accessible directory on my server, but I want to provide a URL or some form of download access to these files. Below is my attempt, but it isn't working.
$destination = $_SERVER["DOCUMENT_ROOT"] . "/../Uploads/" . $random;
mkdir($destination);
move_uploaded_file($temp, $destination."/".$name);
$final = $server."/".$destination."/".$name;
$yourfile = readfile('$final');
and I then echo out $yourfile:
<?php echo $yourfile; ?>
elsewhere.
I either get a "failed to open stream" error or a huge long string. Is there any solution to just download the file on request via URL?
EDIT: I want to keep the directory non-web-accessible.

readfile() outputs the content directly; it does not return it. Alternatively, read the manual page on file_get_contents().
readfile('$final'); is never going to succeed unless the file is literally named "$final". Use double quotes or no quotes at all so the variable is actually interpolated.
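A minimal sketch of what a working version could look like for this layout; the request parameters and the Uploads path are assumptions taken from the question. The key points are to give readfile() the filesystem path (not a URL, and not a single-quoted '$final') and to send the download headers before it runs:
<?php
// download.php - hypothetical names; adjust to your upload layout
$random = basename($_GET['dir'] ?? '');
$name   = basename($_GET['file'] ?? '');
$path   = $_SERVER['DOCUMENT_ROOT'] . '/../Uploads/' . $random . '/' . $name;
if ($random === '' || $name === '' || !is_file($path)) {
    http_response_code(404);
    exit('File not found');
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path); // streams the file to the browser; the return value is just the byte count
exit;
A link such as <a href="download.php?dir=abc123&file=report.pdf">download</a> then serves the file on request while the Uploads directory itself stays non-web-accessible.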
Your question has been answered a few hundred times already. There's no need for you to post your issue four times in a row.
PHP display/download directory files outside the webserver root
How to serve documents from outside the web root using PHP?
Display all images from outside web root folder using PHP
https://stackoverflow.com/search?q=php%20readfile%20from%20outside%20docroot

Related

How to only allow PHP to download images from my server?

I have a server that contains a simple php file for downloading images and a folder containing those images.
<?php
$filepath = "myFiles/" . $_POST["file"];
if (file_exists($filepath)) {
    $file = fopen($filepath, "r") or die();
    echo fread($file, filesize($filepath));
    fclose($file);
}
?>
This download.php file as well as the myFiles folder are both located in the www/html/ folder.
I am trying to figure out a way to make it so that my PHP script can access my image files while keeping the files locked away from regular visitors. My problem is that if I set permissions so that the files can't be viewed through the browser, then the PHP script can't access them either. So either both have access or neither does.
Am I on the correct track? How could I make it so that I can download my images using a PHP script while keeping the images otherwise inaccessible?
That isn't something you can handle with Linux file system permissions alone; you can put the permissions back to what they were initially for the files.
Instead, if you have a /home folder, I would recommend putting the original files to hide there. Check with your web host whether you have one.
Otherwise, if you absolutely have to put everything in www, then put the files to hide in a new subfolder, e.g. "hidden-files", and place a .htaccess file in that folder to block direct browser access. The .htaccess file can be a one-liner containing the Deny from all directive.
This way your files can only be reached by proxying them through download.php.
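A rough sketch of what download.php could then look like; the "hidden-files" folder name and the POST field are carried over from the question and the answer, everything else is an assumption:
<?php
// download.php - minimal sketch; assumes the images sit in a hidden-files/
// subfolder next to this script, blocked from the browser by the .htaccess above
$name = basename($_POST['file'] ?? ''); // basename() blocks ../ traversal
$path = __DIR__ . '/hidden-files/' . $name;
if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}
header('Content-Type: ' . mime_content_type($path));
header('Content-Length: ' . filesize($path));
readfile($path); // PHP can still read the file even though the browser cannot request it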

Serve static files from another server while keeping the process invisible

What I am trying to achieve is to serve files from an S3 bucket for multiple websites without showing the real path of the items. The files are stored in the S3 bucket like: bucket/website
Example:
domain: domain.com
static: static.domain.com
I do not want to create S3 buckets for every website, so I want to store all the files in a bucket folder and serve them from there with a script.
I currently got:
<?php
$path = "domain";
$file = "filename";
// Save a copy of the file
file_put_contents($file, fopen($path . '/' . $file, 'r'));
// Set the content type and output the contents
header('Content-Type: ' . mime_content_type($file));
readfile($file);
// Delete the file
unlink($file);
?>
but it's not that clean, since I save the file and then output it. Saving the file could also cause a mess, because different websites might have files with the same name.
I'm also lacking the .htaccess file to make the proper rewrites, so it would appear that you got the file from static.domain.com/file and not static.domain.com/script.
Any suggestions on how to realize this would be great! Many thanks in advance!
Configure Apache to proxy the request to S3. Something like:
RewriteRule /path/to/static/files/(.*) http://my.bucket.s3.amazon.com/$1 [P]
This means a request to your web server for /path/to/static/files/foo.jpg will cause your web server to fetch the file from my.bucket.s3.amazon.com/foo.jpg and serve it as if it came from your web server directly.
However, note that this somewhat negates the point of S3 for the purposes of offloading traffic. It just offloads the storage, which is fine, but your web server still needs to handle every single request, and in fact has some overhead in terms of bandwidth due to having to fetch it from another server. You should at the very least look into caching a copy on-server, and/or using a CDN in front of all this.
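If you would rather keep the PHP approach from the question instead of the Apache proxy, here is a minimal sketch of streaming the object without ever writing it to disk (the bucket hostname is a placeholder, and this assumes allow_url_fopen is enabled):
<?php
$path = "domain";   // per-website folder inside the bucket, as in the question
$file = "filename";
$remote = 'https://my.bucket.s3.amazonaws.com/' . $path . '/' . $file; // hypothetical bucket URL
$stream = fopen($remote, 'rb');
if ($stream === false) {
    http_response_code(404);
    exit;
}
// Guess the content type from the file name, since no local copy exists to inspect.
$types = ['css' => 'text/css', 'js' => 'application/javascript', 'png' => 'image/png', 'jpg' => 'image/jpeg'];
$ext = strtolower(pathinfo($file, PATHINFO_EXTENSION));
header('Content-Type: ' . ($types[$ext] ?? 'application/octet-stream'));
fpassthru($stream); // stream straight to the client: no temporary file, no unlink()
fclose($stream);
?>
This removes the name-collision problem entirely, because nothing is ever saved on the web server.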

File location issue

I'm having trouble putting data in safe locations. What I want to do is allow my localhost to access the files to create my pages but prohibit all other access.
I started out trying to write a .htaccess file to prevent access to subfolders, but read here that this was a poor way to do things (and I was getting into a tangle anyway), so, following advice, I tried moving the files out of the public_html directory:
The structure is:
bits_folder/
    images/
        testimage.jpg
    files/
        testfile.php
public_html/
    application/
        callingfile.php
With this layout, I get error 404 if I try to access anything in bits_folder from the browser, as desired. callingfile.php however does not seem able to access the testimage, but can include the php testfile.
callingfile.php:
require("../../bits_folder/files/testfile.php"); // works and displays the file's echo
<img src="../../bits_folder/images/testimage.jpg"> <!-- gives a broken image -->
Both files (testimage and testfile) are in the folders where they should be.
I am assuming that the reason for this behaviour is that the img is an HTTP request made after the page is served and will thus be denied, but I am no server expert. Is this the case? Can this be overcome? Should I be doing this another way?
Only place scripts and files that PHP itself uses outside public_html. Images and other resources referenced as src or otherwise linked in HTML/JavaScript cause the browser to request them, and the web server will refuse to serve them from outside the public directory.
Your browser will get access denied for www.example.com/../../bits_folder/images/testimage.jpg
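It can be overcome with a small proxy script inside public_html that reads the image on the browser's behalf. A minimal sketch, assuming the directory layout above; the script name image.php and the ?name= parameter are made up for illustration:
<?php
// public_html/application/image.php - serves images from bits_folder for the browser
$name = basename($_GET['name'] ?? ''); // basename() strips any path components
$path = __DIR__ . '/../../bits_folder/images/' . $name;
if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}
header('Content-Type: ' . mime_content_type($path));
header('Content-Length: ' . filesize($path));
readfile($path);
The page then uses <img src="image.php?name=testimage.jpg"> instead of linking into bits_folder directly.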

Calling All Files From A Different Directory Located Above Domain Name Directory

I am attempting to create a script in PHP which reads and includes all files from a directory which is above the domain name directory.
For example:
My domain name is example.com and located in /var/www/html/example.com/ and I want /var/www/html/example.com/file.php to be able to read from:
/var/www/html/videos/video1/ which contains index.html and the folders:
/var/www/html/videos/video1/images/ and /var/www/html/videos/video1/scripts/
e.g. www.example.com/file.php?dir=/var/www/html/videos/video1/index.html
If I use include('/var/www/html/videos/video1/index.html') it only pulls in the HTML file, and that part works perfectly. However, none of the files in the images and scripts folders are able to load.
I don't want to copy or call each file separately. I want to only have to call index.html and have the browser behave as if it were in that directory, automatically loading any file within it.
I know this works because Moodle uses this method (in file.php) to protect learning files by storing them in a moodledata folder which is one level above the public folder.
I've had a look but cannot make sense of it and I've searched the Internet to achieve the method I have explained above but have not had any success.
The reason I want to do this is to avoid having duplicate video files on the server for other sites that are hosted on the same server.
Many thanks in advance and for taking the time to assist.
$dir = realpath(dirname(__FILE__)."/../");
That would be the directory you are looking for; require files relative to it.
To output a different kind of file, look at readfile(), which writes straight to the output buffer, or file_get_contents(), whose result can be held in a variable.
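A rough sketch of a Moodle-style file.php using PATH_INFO, so that a request such as /file.php/video1/index.html makes the relative links inside index.html (its images/ and scripts/ references) resolve back through the same script; the base directory is taken from the question, the rest is an assumption:
<?php
// file.php - serves anything under /var/www/html/videos via /file.php/<relative path>
$base = '/var/www/html/videos';
$rel  = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';
$path = realpath($base . $rel);
// Refuse anything that escapes the base directory or does not exist.
if ($path === false || strpos($path, $base . '/') !== 0 || !is_file($path)) {
    http_response_code(404);
    exit;
}
$types = ['html' => 'text/html', 'css' => 'text/css', 'js' => 'application/javascript',
          'png' => 'image/png', 'jpg' => 'image/jpeg', 'mp4' => 'video/mp4'];
$ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
header('Content-Type: ' . ($types[$ext] ?? 'application/octet-stream'));
readfile($path);
Because the browser sees index.html at www.example.com/file.php/video1/index.html, its relative requests for the images and scripts folders automatically come back to file.php as well.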
See if you can load an image in the browser directly: if you can, it's probably a problem with your code; if you can't, it may be a permissions issue.

PHP, wamp, share local file, href='c:\path\file.doc',

Wampserver 2.2
PHP
$path is outside the www-root.
$path = 'file:///c:/path/files';
$file = 'file.txt';
echo "<a href='$path/$file'>" . $file . "</a><br />";
How do I make this accessable so visitors can download $file? Nothing is happening when I click the link. This is a part of a small and simple document management system.
You are mixing up two concepts:
the file path on the server where a file is stored, and
the URL the client uses to find that file.
It is one of the more important functions of a web server to keep these two locations separate.
If you want to serve a file from outside of your www-root, you need to create a helper script inside your www-root that does the download. If you write it in PHP, look at the fpassthru() or readfile() functions.
Outsiders can't access files on your computer just by being given a link to them! That would be a huge security issue.
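A minimal sketch of such a helper script, assuming the c:/path/files location from the question and a hypothetical ?file= parameter:
<?php
// download.php - lives inside the www-root and serves files stored outside it
$name = basename($_GET['file'] ?? ''); // basename() prevents directory traversal
$path = 'c:/path/files/' . $name;
if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('File not found');
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
$fp = fopen($path, 'rb');
fpassthru($fp); // stream the file to the visitor
fclose($fp);
The link in the page then becomes <a href="download.php?file=file.txt">file.txt</a> instead of a file:// path.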
See this answer here
Allow users to download files outside webroot
